Copyright © eContent Management Pty Ltd. International Journal of Multiple Research Approaches (2014) 8(1): 74–86.



What constitutes effective learning experiences in a mixed methods research course? An examination from the student perspective

Cheryl Poth, Centre for Research in Applied Measurement and Evaluation, Department of Educational Psychology, Faculty of Education, University of Alberta, Edmonton, AB, Canada

Abstract: Researchers are increasingly tasked with integrating multiple data sources to address complex issues, yet methodological training has to date failed to prepare researchers adequately to meet these new demands (e.g., Leech & Onwuegbuzie, 2010). To examine, from the student perspective, the experience and impact of a new doctoral-level mixed methods research course, an embedded mixed methods design was used in which quantitative data were embedded within a qualitative case study bounded by the duration of the course and its participants. The findings shed new light on the inadequacy of a single mixed methods course for preparing course participants to undertake mixed methods dissertation research, as well as the untapped potential of the course for building research skills beyond planning across three methodologies. Implications for teaching about mixed methods are discussed.

Keywords: student perspective, methodological training, mixed methods teaching

Methodological training should enable new researchers to 'read and critically evaluate research findings from a wide array of methods while being expert in a specific methodological orientation' (Raudenbush, 2005, pp. 30–31).

As the integration of diverse data sources is increasingly necessary for addressing complex issues, a pressing need exists for researchers who possess a degree of methodological literacy across a wide array of methods. Yet methodological training to date has not adequately prepared researchers for meeting these new demands (e.g., James, 2012; Leech & Onwuegbuzie, 2010). Methodological literacy has been defined as 'handling the basic operations of classical logic and general methodology of problem solving. The most important are methods of data gathering and analysis' (Kalous, 2007, p. 257). Missing from this definition is attention to additional knowledge and skills related to planning and dissemination; for example, during planning it is necessary to consider the paradigmatic frameworks within which methods and methodologies can be mixed. These additional foci are important given the essential role methodologically literate researchers within emerging research areas play in determining the extent to which information is applied to complex societal issues. For the purposes of this paper, I operationally define methodological literacy as possessing the knowledge and skills necessary for making informed decisions during the research process involving planning (e.g., developing research questions, knowledge of research designs, identifying paradigmatic frameworks, selecting data collection instruments), conducting (e.g., designing data collection instruments, collecting data, analyzing data, and interpreting findings), and disseminating (e.g., selecting avenues and preparing written and oral presentations).

Mixed methods research is widely recognized as an approach involving combining qualitative (i.e., words and images) and quantitative (i.e., numerical) data to address research problems either within a single study or across a series of studies (e.g., Creswell & Plano Clark, 2011; Johnson, Onwuegbuzie, & Turner, 2007). In so doing, I situate mixed methods research as a 'third methodology' alongside quantitative and qualitative approaches (Johnson & Onwuegbuzie, 2004; Tashakkori & Teddlie, 2003). In recognition of today's pluralist social inquiry field, I acknowledge that these three methodologies represent a narrow set of approaches and that many others also exist (e.g., action research, critical social science, critical race theory), each with its own history and procedures. Yet this paper is limited to discussing the development of methodological literacy related to three methodologies: qualitative, quantitative, and mixed methods. Specific to the history of the field of mixed methods research, Creswell and Plano Clark (2011) categorize the stages of development into five overlapping periods (i.e., formative, paradigm debate, procedural development, advocacy and expansion, and reflective). The reflective period has been characterized by three foci: (a) assessing the current state of the field, (b) looking to the future, and (c) engaging in a constructive critique of recent advances within the field (Creswell & Plano Clark, 2011). With so much happening within the emerging field itself and the multiple perspectives involved, Greene (2007, 2010) notes it is not surprising that opportunities to learn about mixed methods research within many graduate-level programs remain limited; Capraro and Thompson (2008) note this is especially evident when compared with quantitative research (e.g., statistics, measurement) and qualitative research (e.g., narrative, discourse analysis). Thus, a pressing need exists for further research exploring how research methods courses effectively integrate learning about quantitative, qualitative, and mixed methods research approaches.

One might assume that literature guiding how researchers are afforded comprehensive methodological training opportunities would be well established. However, the extensive research undertaken on doctoral curricula during the past two decades has been limited to revealing trends in the differing emphasis of methodological approaches within and across fields of study at different time periods; for example, the prominence of quantitative training in psychology in the late 1980s (Aiken, West, Sechrest, & Reno, 1990) and the lack of mixed methods research training in education in the mid-2000s (Capraro & Thompson, 2008). Increasing interest during the 2000s led to improved access and subsequently greater presence of mixed methods research within the health and medicine and education literature (Ivankova & Kawamura, 2010).
Indeed, a more inclusive approach to methodological education providing effective learning opportunities will be critical for preparing researchers to address the complex nature of educational research problems facing society (James, 2012). A growing yet limited body of literature dedicated to teaching mixed methods research courses has emerged during the past decade (e.g., Christ, 2009; Creswell & Plano Clark, 2011; Onwuegbuzie & Leech, 2006). These researchers identified one of the pressing challenges for institutions as the availability of faculty members with expertise in mixed methods. In this paper, faculty members are defined as instructors who are members of a Faculty (also referred to as a College) within higher education institutions. Faculties may or may not be departmentalized. While the issue remains that many faculty members lack previous course experiences as a learner, numerous mixed methods workshops and online courses have begun to address the limited access to course resources and mentors to guide the development of mixed methods courses previously identified by Creswell, Tashakkori, Jensen, and Shapley (2003).

In her opening article in the 2010 special issue of the International Journal of Multiple Research Approaches, Jennifer Greene describes the challenges faced by those teaching mixed methods. Among the challenges related to the newness of the field are the lack of teaching exemplars and mentors to follow, which influences decisions related to selecting course content and identifying prerequisite skills. To begin to address some of these issues, two of the papers within the special issue documented teaching doctoral-level mixed methods research courses within specific instructional contexts (i.e., cohort-based and online environments). The latter paper highlighted the advantages of online environments for providing students with opportunities to learn from one another (Ivankova, 2010). Two additional papers considered the instructional content of such courses; whereas one advances curriculum reform of research methodology courses as a means of addressing statistics anxiety, the other forwards the importance of providing students opportunities to engage in a critical examination of their own philosophical assumptions. The former paper strongly argued for a more integrated approach to learning about research methods (Onwuegbuzie, Leech, Murtonen, & Tahtinen, 2010).
Although the special issue contributed greatly to the discussion related to the teaching of mixed methods, many issues remain to be addressed, including the need for diverse examples to guide how instructional opportunities are provided to learn about mixed methods research. This paper, using an empirical example of the introduction of a new doctoral-level mixed methods research course, contributes important implications for research methods instructors in general
and specifically for those who teach or plan to teach mixed methods research courses. These implications are intended to inform the creation of learning environments that build relevant skills for preparing students to undertake mixed methods research. The study was guided by the mixed methods question 'What are the course experiences that students identify as effective for developing mixed methods research knowledge and skills?' An embedded mixed methods design was used in which quantitative data were embedded within a qualitative case study bounded by the duration of the course and its participants for the purpose of generating a comprehensive understanding of the course experience and impact from the students' perspective.

Study context
Increasing demand from students requesting opportunities to gain knowledge and skills related to mixed methods research, and the lack of such opportunities within the department, provided the impetus for developing a doctoral-level mixed methods research course. The study context reflected the typical situation highlighted by Capraro and Thompson (2008): although the majority of doctoral programs required completion of at least one quantitative course, there was no explicit requirement for either a qualitative or a mixed methods course. Course development began during winter 2010, and the course was initially offered during the 2011 winter term (January to April) through the Department of Educational Psychology, within a Faculty of Education at a research-intensive Western Canadian university.

Course organization
The organization of the mixed methods research course was influenced by pedagogical and logistical considerations. Instructional decisions were guided by literature related to effective learning environments as well as literature specific to the topic of teaching mixed methods.
In particular, a 12-step model outlined by Fink (2003) was used to enhance the learning environment by emphasizing alignment of learning goals, teaching and learning activities, and feedback and assessment. Specifically, the teaching and learning activities were intended to create significant learning experiences, defined by Fink as a two-dimensional experience where
the learning process creates an outcome that is characterized by change. To that end, the instructor sought to build on students' existing research skills and then extend them to mixed methods research skills; for example, when teaching about mixed methods research designs, the instructor began with a review of the designs related to qualitative and quantitative research and then discussed the designs specific to mixed methods research. Further instructional decisions were informed by research related to instructing mixed methods research courses, specifically the use of online environments (Ivankova, 2010), collaborative instructional approaches (Baran, 2010), and instruction related to the intended use of the skills (Greene, 2010). The course content was organized around four topics related to the research process (planning, evaluating, disseminating, and future directions). Each topic had a specific learning goal (e.g., the planning topic included applying key characteristics of mixed methods research designs to create a study proposal) and was aligned with a summative assessment focused on measuring achievement of the learning goal, where a grade was assigned (e.g., the planning topic used a research proposal that contributed 30% toward the course grade). The four summative assessments were a research proposal, an article critique, a resource review, and an issue examination. The research proposal required students to identify (a) a study goal and research objectives, by stating the salient problems addressed and the rationale for the study; (b) a theoretical framework, by situating the research within relevant literature; (c) a methodology, by providing a rationale for how the design, data sources, and analysis procedures address the study objectives; and (d) issues, by addressing any concerns related to validity, logistics, and limitations. The article critique required students to locate an empirical example of mixed methods research in their field of study or interest.
Students were then required to (a) provide a brief summary of the research covering study objectives, the rationale for using mixed methods, research questions, and design, including methods of data collection, analysis, and interpretation; (b) discuss strengths and weaknesses of the research in terms of the qualitative, quantitative, and mixed methods aspects; and (c) propose alternative approaches for overcoming the weaknesses. The resource review required students
to read their selected resource and prepare a written review organized generally into three sections (introduction, evaluation, and conclusion) and reflective of a review that would be accepted in an academic journal. Finally, the issue examination required students to select an issue from a list, or identify their own, and to facilitate a discussion of the issue supported by a handout summarizing its major points. It is important to note that the intended purpose of the course was to introduce the foundational elements of mixed methods research and to provide opportunities to develop skills that students could then use to conduct their research studies (often in the form of a dissertation). The teaching and learning activities for the 13 weekly 3-hour classes focused on offering opportunities to develop relevant topic skills through interactive formative assessments (with no grades assigned) by peers and the instructor. These formative opportunities were supplemented by support beyond the lecture, with feedback available from the instructor individually (e.g., through office hours and email) or more publicly
from both the instructor and peers using an online discussion board through a learning management system (LMS) (see Table 1). Weekly readings related to the focus of each class were assigned from the course text, the 2nd edition of Designing and Conducting Mixed Methods Research (Creswell & Plano Clark, 2011), and supplemented, on average, by three articles chosen by the instructor with the intention of representing multiple perspectives. The course outline and syllabus are available through the Mixed Methods International Research Association website (mmira.org) by following the links to the author's resources.

Methodology
An embedded mixed methods research design was used whereby findings from one strand (in this case, themes from the qualitative analysis) are enhanced by the findings from the other strand (in this case, findings from the quantitative analysis) (Creswell & Plano Clark, 2011). According to Yin (2009), a case study design is useful for describing the implementation and evaluation of an intervention (in this case,

Table 1: Matrix of alignment between learning goals, teaching and learning activities, and assessments

Topic/week | Learning goals / teaching and learning activities | Summative / formative assessment
Planning | Apply key characteristics of MM to create a study design | Research proposal (30%)
1 | Explore history and unique contributions of MM |
2 | Connect worldviews with research approach | Draft philosophical stance
3 | Examine purposes of qualitative, quantitative and MM approaches | Draft research purpose
4 | Investigate qualitative, quantitative and MM research questions | Draft research questions
5 | Compare rationale for MM designs | Draft rationale for design
6 | Construct visual representation of MM design | Draft visual representation
7 | Align data collection, analysis, and integration strategies with designs | Draft data collection and analysis procedures
Evaluating | Apply key characteristics of MM to a published study | Article critique (20%)
8 | Identify key features for evaluating | Draft structure for critique
9 | Connect key features with emerging literature | Draft critique
Disseminating | Apply key characteristics of MM to a publicly available resource | Resource review (20%)
10 | Identify avenues for MM reporting | Draft structure for review
11 | Connect resource features with emerging literature | Draft review
Future directions | Explore a current issue in MM | Issue examination (30%)
12 | Identify key issues | Draft rationale for issue
13 | Connect issues with emerging literature | Draft issue examination

the course) within a real-life context and is ideal for examining the relationships between constructs (in this case, experience and impact). The rationale for using this overall design is that a single data set is not sufficient to answer the mixed methods research question. The qualitative data collection, analysis, and interpretation are enhanced and augmented by the collection of secondary quantitative data (e.g., Creswell & Plano Clark, 2011).

Study participants
Fourteen participants chose to be part of the study; however, as only those who completed both the pre- and post-course questionnaires were eligible for inclusion, 13 participants were included in the analysis. The majority of participants were enrolled in doctoral programs offered by the Faculty of Education (e.g., program evaluation, special education, school psychology, counseling psychology), with additional representation from various Faculties of Health Sciences (e.g., Rehabilitation Science, Pharmacy). Participants' research interests encompassed K-12 educational contexts (e.g., learning disabilities and reading) as well as post-secondary learning environments (e.g., assessing clinical competence, accommodating students with special needs). Although diverse in their methodological preparation, a greater number of students reported quantitative coursework and experiences than qualitative; however, none of the participants reported any previous coursework or mixed methods research experiences.

Data collection procedures
The research reported in this study received ethical clearance through the University of Alberta's Research Ethics Board. To mitigate potential power issues, all data collection activities were embedded within the course activities; an external research assistant recruited students enrolled in the course to participate in the study at the beginning of the semester; the instructor was not present during any of the data collection; and, finally, the instructor did not have access to who had chosen to participate in the research study until the final course grades were submitted. Three data sources (pre- and post-course written questionnaires, mid- and end-of-course written evaluations, and an end-of-course focus group) captured the student perspective at three time points (see Figure 1): at the beginning of the semester (pre-questionnaires), during week 7 of 13 (mid-point course evaluations), and at course completion (post-questionnaires, end-of-course evaluations, and the end-of-course focus group). Where possible, pseudonyms were used to link the students' responses across the data sources, the exception being the mid-point and end-of-course evaluations, which were collected anonymously.

Figure 1: Visual representation of the embedded mixed methods design exploring the student experiences and impact of the MM course. [Figure not reproduced: a QUAL strand (pre-course questionnaire on background experiences and demographics, mid-course evaluation and end-of-course focus group and evaluation on course experiences) leading to an integrated analysis of course experiences, and a quan strand (pre- and post-questionnaires on knowledge and skills) leading to a change-over-time analysis, converge in an integrated discussion.]

Questionnaires
Each questionnaire contained the same quantitative items for the purpose of assessing change in research skills. At each time point, participants rated their perceived levels of competence across three methodological research skills (i.e., qualitative, quantitative, and mixed methods). For each methodology, participants rated their competence in the eight
steps of the research process on a 5-point Likert-format scale (1 = little to no competence to 5 = extremely competent) across two categories: planning (e.g., developing research questions, knowledge of research designs, identifying paradigmatic frameworks, selecting data collection instruments) and conducting (e.g., designing data collection instruments, collecting data, analyzing data, and interpreting findings). The items were researcher-developed yet followed the basic research steps outlined in introductory social science research design texts (e.g., Babbie, 2007; Creswell, 2013). Each questionnaire also contained a section of researcher-developed qualitative open-ended questions; these items differed between the pre- and post-course questionnaires. At the beginning of the course, the items served the purpose of collecting participant demographic information: 11 open-ended qualitative items asked students about their backgrounds related to program choices, completed research methods courses, research experiences, motivation for taking the course, and areas of research and methodological interest (e.g., What background experiences and courses do you bring in terms of quantitative and qualitative methods? What is your current area of research interest?). At the end of the course, nine open-ended qualitative items focused on course experiences (e.g., 'What supported your learning in this course?' 'To what extent did each of the assessments help prepare you for undertaking future mixed methods research projects?'), and two additional open-ended questions asked about intended use of mixed methods research skills and time spent per week on course readings and assessments. Prior to administration, the questionnaires were piloted using think-aloud protocols (Willis, 2005) with two graduate students to inform the clarity of instructions and items. Thirty minutes of class time was allocated, and 13 participants completed the questionnaires.
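Because the pre- and post-course questionnaires shared the same quantitative items, the change-over-time analysis described later in the paper (a paired means comparison, with a difference claimed when p < 0.05 and Cohen's d is at least 0.50) can be sketched as follows. The ratings below are invented for illustration, and the paired-samples form of Cohen's d (mean difference divided by the standard deviation of the differences) is an assumption; the paper does not state which variant was used.

```python
# Sketch of the pre/post competence comparison: 13 participants rate one
# skill on a 5-point scale before and after the course. The ratings are
# hypothetical; only the scale, n = 13, and the decision rule follow the paper.
from math import sqrt
from statistics import mean, stdev

pre = [2, 1, 2, 3, 2, 1, 2, 2, 3, 1, 2, 2, 1]   # invented pre-course ratings
post = [4, 3, 3, 4, 4, 3, 4, 3, 4, 3, 3, 4, 3]  # invented post-course ratings

diffs = [b - a for a, b in zip(pre, post)]
d = mean(diffs) / stdev(diffs)  # paired-samples Cohen's d (one common variant)

# Paired t statistic; with df = 12, |t| > 2.179 corresponds to p < .05 (two-tailed).
t = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))
claim_difference = abs(t) > 2.179 and d >= 0.50
```

With real data, this comparison would be repeated for each of the eight research-process steps within each of the three methodologies.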
Course evaluations
Two types of written evaluations were administered in class during the course: an informal evaluation at the mid-point (week 7) and a formal evaluation at the end of the course (week 13). The purpose of the mid-point evaluation was to informally elicit feedback about participants' experience to inform
instructional decisions, and students were guided by the oral instructions, 'What would you like your instructor to start doing in this course, stop doing in this course, and continue doing in this course?' The mid-point evaluation was researcher-created yet followed the critical incident questionnaire that Brookfield (2006, p. 35) describes in The Skillful Teacher as an 'attempt by teachers to study their classrooms in order to find out what and how students are learning.' Ten minutes of class time was allocated, and 12 participants completed the evaluation. The following day, the teaching assistant provided the instructor with a typed summary of these comments. The institution requires the end-of-course evaluation for the purpose of assessing student experience and instructor effectiveness. Students rated their agreement with eight instructor- and course-focused items (i.e., quality of instructor and course). Nine students completed the ratings on a 5-point Likert-format scale (1 = strongly disagree to 5 = strongly agree).

Focus group
Following course completion and after grades were assigned, seven participants volunteered to be involved in a focus group, held in a location where a private conversation could take place and led by a facilitator supported by a note-taker (Krueger & Casey, 2000). The purpose of the focus group was to provide an opportunity for participants to reflect on their course experiences and further discuss the course impact on their research skills as well as their intended use of mixed methods research skills. The 60-minute focus group was audio-recorded and transcribed verbatim.

Data analysis procedures
The analysis was undertaken in four steps: (a) separate analysis of each of the data sources, (b) combining themes from individual qualitative data sources, (c) bringing together findings from individual quantitative data sources, and (d) generating the integrated mixed methods themes.
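Step (d) is elaborated later in the paper as a qualitative-dominant cross-over analysis rendered in matrices. As a rough illustration of what such a joint display holds, themes could anchor rows that pair qualitative and quantitative evidence; in this sketch the theme names follow the paper's integrated findings, but the quotes and means are invented placeholders.

```python
# Hypothetical joint display: quantitative results organized under the
# qualitative themes. Theme names follow the paper's integrated findings;
# the quotes and means are invented placeholders.
joint_display = {
    "Mixed methods as a distinct research approach": {
        "qualitative": "'MM let me see things neither strand showed alone.'",
        "quantitative": {"pre_mean": 1.4, "post_mean": 3.1},
    },
    "Focus on planning research skills": {
        "qualitative": "'The proposal made me justify every design choice.'",
        "quantitative": {"pre_mean": 2.2, "post_mean": 3.9},
    },
}

# Render one row per theme, mirroring a side-by-side comparison table.
for theme, cell in joint_display.items():
    q = cell["quantitative"]
    print(f"{theme}: {q['pre_mean']} -> {q['post_mean']} | {cell['qualitative']}")
```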
Separate analysis of each of the data sources
Descriptive statistics (mean, standard deviation, and frequency) were generated from the separate analysis of the quantitative items from the
questionnaires and end-of-course evaluation items using SPSS Statistics 19. It should be noted that it was not possible to generate any meaningful psychometric information about the questionnaires due to the small sample size and the nature of the items (i.e., researcher-developed self-reports). Separate inductive analyses were conducted on each of the open-ended qualitative items from the pre- and post-questionnaires, the mid-point course evaluation summary, and the focus group transcript. Files were first uploaded to ATLAS.ti (2012) and analyzed using a constant comparative method (Charmaz, 2006) to create preliminary code structures. Once the codes were defined, the list was given to a second and third researcher to assess fidelity in coding among them. When inter-rater reliability reached 90%, the code list was applied to the remaining data. Throughout the iterative analysis process, the researchers used memos to document emerging insights and evolving understandings of patterns in the data. Summaries for each data source were generated and distributed to participants, and feedback was sought pertaining to errors and omissions. These efforts (i.e., member checks with participants and inter-rater comparisons among coders) were completed to enhance the trustworthiness, credibility, and dependability of the data analysis procedures (Lincoln & Guba, 1985).

Combining themes from individual qualitative data sources
Nine common themes emerged from comparing qualitative findings across the data sources. Two researchers reviewed these themes and, where overlap existed, refined them to five.

Bringing together findings from individual quantitative data sources
Patterns across the three quantitative data sources (i.e., pre- and post-questionnaires and end-of-course evaluations) were examined.
Specifically, inferential analyses (means comparisons from pre- to post-test) were performed across the two questionnaires to assess change over time in knowledge and skills. To estimate the effect size of
the course, Cohen's d was calculated because of its usefulness in conveying the magnitude of a difference (Cohen, 1988). An effect size of 0.13 is considered small, 0.47–0.72 medium, and 0.73 and higher large. Although the questionnaire data included three sets of eight variables, the sample size was too small (N = 13) to conduct analysis beyond a means comparison. Given the small sample size, a difference was claimed for each variable if the difference was significant at the 0.05 level of significance and the effect size was at least 0.50.

Generating the integrated mixed methods themes
The integration of the findings from the qualitative and quantitative strands was guided by a pragmatic approach (Greene, 2007; Teddlie & Tashakkori, 2009). Specifically, a basic type of qualitative-dominant cross-over mixed analysis (Onwuegbuzie, Leech, & Collins, 2011) was employed, where the themes generated by the qualitative analysis were used as the organizational framework on which to integrate the quantitative data. To represent the integrated qualitative and quantitative themes, Plano Clark, Garrett, and Leslie-Pelecky (2010) have suggested the use of matrices. Matrices, such as comparison tables and joint displays, were created to compare qualitative and quantitative data at the individual and group levels. Thus, the first step combined the findings from the individual data sources within each of the qualitative and quantitative strands using a partially ordered meta-matrix for a visual representation summarizing the findings (Miles & Huberman, 1994). The second step integrated the findings, where each strand was compared, consolidated, and finally represented in matrices (Onwuegbuzie & Combs, 2010).

Integrated findings
The integrated analysis revealed four major themes: mixed methods as a distinct research approach; focus on planning research skills; relevant course goals, activities and assessments; and embedded opportunities for accessible and specific feedback.
The following section is organized
by presenting the findings related to subthemes within the major themes (see Table 2).

Mixed methods as a distinct research approach
Participants reported recognizing mixed methods research as distinct from qualitative or quantitative research, with a specific role in generating new understandings that were not accessible through either qualitative or quantitative research alone. The need for earlier and sustained exposure to mixed methods was a common comment on the end-of-course evaluations: 'It would have been helpful if I learned this stuff earlier and then learned more about it right now.' Finally, a focus group participant identified an advanced mixed methods course as necessary for continuing their learning: 'What I would like to see is another course added as a follow-up to look at some of the analysis and interpretation concepts that we learned, and given the opportunity to learn them more in-depth.' This advanced course was perceived to be particularly useful if it provided opportunities to
apply theory to practice related to analysis: 'We did a presentation on data transformation and it is one thing to talk about it, quantitating and qualitizing data, but how do you actually … take data and transform it.'

Focus on planning research skills
Assessing change in research skills was challenging because of differences in levels of skill at the beginning of the course. Indeed, noteworthy differences were revealed by the comparison between the overall pre-quantitative and pre-mixed methods skills ratings (SD = 10.78, p
