Journal of Interactive Learning Research (2012) 23(1), 5-28
Factor Validity of the Motivated Strategies for Learning Questionnaire (MSLQ) in Asynchronous Online Learning Environments (AOLE)

Moon-Heum Cho
Kent State University-Stark
[email protected]

Jessica Summers
University of Arizona
[email protected]
The purpose of this study was to investigate the factor validity of the Motivated Strategies for Learning Questionnaire (MSLQ) in asynchronous online learning environments (AOLE). To check factor validity, confirmatory factor analysis (CFA) was conducted with 193 cases. The CFA showed that the fit of the original measurement model for the motivation and learning strategies scales of the MSLQ was not satisfactory in AOLE. Exploratory factor analysis (EFA) was then conducted to find alternative factor structures for the MSLQ in AOLE. EFA with the motivation and learning strategies scales of the MSLQ revealed five and four latent factors, respectively. Finally, issues and implications for improving the factor validity of the MSLQ in AOLE are discussed.
Keywords: Factor validity; MSLQ; Self-Regulated Learning; Asynchronous Online Learning Environments
Cho and Summers
Factor Validity of the Motivated Strategies for Learning Questionnaire (MSLQ) in Asynchronous Online Learning Environments (AOLE)

The Motivated Strategies for Learning Questionnaire (MSLQ) describes multi-faceted constructs explaining students’ Self-Regulated Learning (SRL) experiences, represented by motivation and learning strategies scales. The MSLQ was developed in the early 1990s by a group of researchers at the University of Michigan (Pintrich, Smith, Garcia, & McKeachie, 1991). The MSLQ contributes to understanding college students’ SRL as well as the relationships between motivation and cognition. Because of its solid theoretical background (e.g., Expectancy x Value Theory and Goal Theory) and ease of administration, the MSLQ has been used and translated in many studies conducted across several different countries (Duncan & McKeachie, 2005). Consequently, the MSLQ has become one of the most commonly used research tools for SRL research across diverse subject areas and with different audiences (e.g., high school and college students) (Duncan & McKeachie, 2005). Recently, however, some researchers have questioned the factor validity of the MSLQ (Benson, 1998; Gable, 1998; Muis, Winne, & Jamieson-Noel, 2007). In her review of the MSLQ, Benson (1998) questioned its 15-factor model and argued that the MSLQ should be tested with different numbers of factors to verify whether the 15-factor model fits best. In a review of the MSLQ as a research tool, Gable (1998) noted that the reported model fit came from a single sample (n = 380) and that, to support a valid argument, the MSLQ should be tested with additional samples. Muis, Winne, and Jamieson-Noel (2007) conducted confirmatory factor analysis (CFA) with college students in order to check the stability of the rehearsal, elaboration, organization, critical thinking, self-regulation, and adjusted self-regulation factors in the MSLQ.
Muis and her colleagues found that the model fit indices of the learning strategies scale were not as stable as Pintrich and his colleagues had reported (Pintrich et al., 1993). These results show that the MSLQ should be tested with diverse samples in order to investigate its factor validity.
Factor Validity of MSLQ in Online
Factor Validity of the MSLQ in Asynchronous Online Learning Environments (AOLE)

In recent years, researchers have applied the MSLQ to measure students’ SRL in technology-mediated learning contexts. Despite the necessity of testing the MSLQ with diverse samples in different contexts, online learning contexts have rarely been used to test the factor validity of the MSLQ, or only a small portion of the MSLQ items have been tested for factor validity in AOLE. Recently, Hadwin et al. (2007) selected the 10 MSLQ items most relevant to studying in gStudy, a cognition-support software tool for learning. In that study, Hadwin et al. clustered the participants into four groups based on their MSLQ scores. However, they found that students’ actual SRL behaviors were very different even when students were in the same cluster, and on that basis they challenged the factor validity of the MSLQ. Other research investigating the factor validity of the MSLQ online was conducted by Richardson (2007), who used the motivation scale of the MSLQ to measure students’ self-regulation in distance education. When he implemented the MSLQ, he changed certain terms to reflect his own institution’s culture regarding distance learning, replacing class, instructor, and test with course, tutor, and assignment, respectively. Richardson conducted principal component analysis and found two factors in the motivation scale of the MSLQ. As can be seen, factor validity studies of the MSLQ have used either the motivation or the learning strategies scale, or a small portion thereof, and no researchers have conducted factor validity studies with the entire MSLQ in AOLE. One reason researchers have used only a small portion of the MSLQ for online studies may be that the MSLQ was originally developed for traditional classrooms. Consequently, many items do not directly reflect students’ online learning experiences.
The items’ lack of attention to the online nature of learning may have led researchers to use only a small portion of the items and to adjust them to their research contexts. Although this approach makes the MSLQ usable as a research tool, it does not provide a fundamental solution for using the MSLQ in AOLE. A more active research approach is necessary to investigate the possibility of using the MSLQ in online learning. Investigating the factor validity of the MSLQ in its entirety provides insights into the usefulness of the MSLQ as a research tool for online SRL research.
Current Research Questions

Thus, the overarching purpose of this research was to investigate the factor validity of the MSLQ, with all of its items, in AOLE and to find a way to use the MSLQ as an SRL research tool for AOLE. More specifically, the primary research questions guiding this study are as follows:

1. Does the original MSLQ model fit asynchronous online learning environments?
2. If not, what alternative factor structure better captures asynchronous online students’ SRL?

The MSLQ has reported factor validity for motivational and cognitive factors using CFA (Pintrich et al., 1993). CFA is one of the most common methods used to report factor validity; therefore, in order to answer the first research question, a CFA was conducted using Mplus. Although factor validity was reported in the MSLQ manual (Pintrich et al., 1991) and in a research article (Pintrich et al., 1993), neither reports an EFA. EFA has been used most commonly as a statistical method to reduce the number of variables and to find latent factors in measurement development (Henson & Roberts, 2006). EFA has also been used to find alternative factor structures for existing measures in diverse research settings, such as different countries, institutions, and participant groups. Therefore, EFA was used to answer the second question, using SPSS.
Method

Participants

A total of 193 online students at a large mid-western research university in the United States participated in this study during the 2005 and 2006 fall semesters. Students took diverse online courses including educational psychology, statistics, nursing, journalism, instructional design, multi-media development, special education, and library science. The number of participants in each course varied from 3 to 12 students. The average number of participants in each course was 6.23. Of the 193 participants, most (n = 153) were female, over 26 years of age (n = 166), native English speakers (n = 178), graduate students (n = 169), and had taken at least one other online course before taking this course (n = 158). None of the online courses had a face-to-face meeting, and all of them were delivered completely through learning management systems such as Blackboard, Sakai, and WebCT.
Procedure

Students responded to an online MSLQ from the middle to the end of their course in the 2005 and 2006 fall semesters. Respondents were recruited via the online discussion boards in each online course, and the survey was delivered online. Students who visited the online survey read the purpose of the study; if they agreed to the online consent form, the survey was administered.

Online course contexts

All the courses in this study were delivered in a text-heavy asynchronous online format in which all descriptions and activities were mediated by text. The structure of the online courses was similar; students were given weekly or bi-weekly assignments and were expected to manage their learning responsibly on their own. Most of the communication between students and instructors, and among students, took place through either asynchronous discussion boards or email. In order to understand students’ learning tasks, the syllabi from the 31 asynchronous online courses were analyzed using the content analysis procedure of Eberly, Newton, and Wiggins (2001). All of the online learning tasks were categorized into six themes. The numbers of syllabi used for the analyses were as follows: two from educational psychology, one from educational statistics, one from curriculum and instruction, two from journalism, three from special education, six from library science, seven from instructional design, five from multimedia development, and four from nursing. Based on the analysis, the number of activities on which students were evaluated in an asynchronous online course varied from two to twelve, with an average of six per course, categorized in the following way: 1) individual projects, 2) discussion for knowledge construction, 3) examinations, 4) individual projects with peer feedback, 5) discussion for sharing information, and 6) group projects with a group artifact.
The results showed that the most common learning task in this study involved individual projects (48.2%), where students were required to develop and process their own projects in a given situation by choosing or narrowing their topic, finding resources, analyzing, synthesizing and producing a final product.
Table 1
Six Types of Learning Tasks in Asynchronous Online Courses

Individual Projects (48.2%). Students are required to process their own projects in a given situation by choosing or narrowing the topic, finding resources, analyzing, synthesizing, and producing the final product. They can interact with either the instructor or peers, but when they do interact, it is mostly with the instructor.

Discussion for Knowledge Construction (14.3%). Students are required to interact with others after they read the assigned articles or book chapters. They are expected to post messages and reply to their peers’ postings. By doing so, students share their opinions about specific topics and experiences and build knowledge. Students are also expected to articulate their own ideas explicitly and to self-reflect by reviewing what they have done and analyzing or comparing it to the work of others or of experts.

Examination (11.6%). Students are required to take take-home exams with either essay or multiple-choice questions, or to take a quiz at a designated time with a time limit. Students are not allowed to interact with peers on these types of tasks.

Individual Projects with Peer Feedback (8.9%). Students are required to process their own tasks in a given situation by choosing or narrowing the topic, finding resources, analyzing, synthesizing, and producing the final product. They are also required to get feedback from peers and to improve their products based on that feedback.

Discussion for Sharing Information (7.1%). Students are required to find a source, summarize it, and share it with others by posting it to a discussion board. They are also encouraged to help each other.

Group Project with Group Artifact (6.3%). With shared goals, students are required to collaborate with team members in a given situation. As a group they are required to choose or narrow the topic, find resources, analyze, synthesize, and produce group artifacts. In addition, they are expected to share, exchange, or explain ideas, divide roles, and provide feedback to one another.
The second most common learning task involved discussion for knowledge construction (14.3%). This task required students to read an article or textbook and share or reflect on their online experiences. Many times students were required to participate in the discussion regularly. The third most common learning task was an examination (11.6%). However, the asynchronous online examination differed somewhat from a traditional examination in most cases, either because it was a take-home exam or a technology-mediated exam in which students logged onto the learning system and took the exam at a designated time. Some online instructors asked students to be in a physical place such as a library while taking their exam, so they could find answers using library sources. The fourth most common task was individual projects with peer feedback (8.9%). This type of task required students to do their own projects and then to provide feedback on others’ projects and receive feedback from others to improve their own. The fifth most common task was discussion for sharing information (7.1%). For example, some online courses required students to post a resource with a summary in order to help other classmates by sharing information voluntarily. The least common categorized task was a group project with a group artifact (6.3%). Specifically, students were required to collaborate to complete the project, which required frequent or intensive discussion and agreement among the group members. The remaining 3.6% of the tasks were not categorized because there was little commonality among them.

Measures

The full 81-item MSLQ was used for this research. As in the paper version of the MSLQ, students’ responses were recorded on a 7-point scale ranging from (1) not at all true of me to (7) very true of me. The
MSLQ consists of two scales: a motivation scale and a learning strategies scale. The motivation scale (total items n = 31) consists of six subscales: intrinsic goal orientation (n = 4), extrinsic goal orientation (n = 4), task value (n = 6), control of learning beliefs (n = 4), self-efficacy for learning and performance (n = 8), and test anxiety (n = 5). The learning strategies scale (total items n = 50) consists of nine subscales: rehearsal (n = 4), elaboration (n = 6), organization (n = 4), critical thinking (n = 5), metacognitive self-regulation (n = 12), time and study environment management (n = 8), effort regulation (n = 4), peer learning (n = 3), and help seeking (n = 4).
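The scale composition just described can be restated as a quick sanity check on the item counts; the dictionaries below simply transcribe the subscale sizes listed in the text (they are a bookkeeping aid, not part of the MSLQ itself):

```python
# Subscale item counts as reported in the Measures section (transcribed, not invented).
motivation = {
    "intrinsic goal orientation": 4,
    "extrinsic goal orientation": 4,
    "task value": 6,
    "control of learning beliefs": 4,
    "self-efficacy for learning and performance": 8,
    "test anxiety": 5,
}
learning_strategies = {
    "rehearsal": 4,
    "elaboration": 6,
    "organization": 4,
    "critical thinking": 5,
    "metacognitive self-regulation": 12,
    "time and study environment management": 8,
    "effort regulation": 4,
    "peer learning": 3,
    "help seeking": 4,
}

# The subscale counts sum to the scale totals and the full 81-item instrument.
assert sum(motivation.values()) == 31
assert sum(learning_strategies.values()) == 50
assert sum(motivation.values()) + sum(learning_strategies.values()) == 81
```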
Results

Confirmatory factor analysis of the original MSLQ model

Motivation scale. In order to investigate the factor validity of the MSLQ in AOLE, CFAs were conducted separately for the motivation and learning strategies scales of the MSLQ, as Pintrich and his colleagues did (Pintrich et al., 1993). Several model fit indices were calculated to assess the fit of the MSLQ measurement model: chi-square; the chi-square/degrees of freedom (df) ratio; the Comparative Fit Index (CFI) and the Tucker-Lewis Index (TLI); and the Root Mean Square Error of Approximation (RMSEA) and the Standardized Root Mean Square Residual (SRMR). For the motivation scale of the MSLQ, χ2 (419) = 1089.70 (p < .001), χ2/df = 2.60, CFI = .80, TLI = .78, RMSEA = .09 (90 percent C.I. .084-.10), and SRMR = .09. According to the criteria for a good model fit (Hu & Bentler, 1999), one desires a non-significant chi-square statistic, χ2/df < 2, CFI and TLI ≥ .96, and SRMR ≤ .10, or RMSEA ≤ .06 and SRMR ≤ .10. Such combination rules help to minimize the twin threats of rejecting the right model and retaining the wrong model. However, others have put forward less stringent criteria for model fit, such as CFI and TLI ≥ .90 (Marsh, Balla, & McDonald, 1988) and RMSEA and SRMR between .05 and .10 (Browne & Cudeck, 1993), which is more realistic for smaller data sets in educational settings. Using either the stringent or the less stringent criteria, the model fit indices of the motivation scale of the MSLQ were not satisfactory. However, the model fit was close to meeting the less stringent criteria, which suggests potential to improve the model fit of the motivation scale of the MSLQ in AOLE.

Learning strategies scale. A CFA was conducted with the learning strategies scale of the MSLQ, yielding the following model fit indices: χ2 (1139)
= 2781.17 (p < .001), χ2/df = 2.27, CFI = .67, TLI = .64, RMSEA = .08 (90 percent C.I. .076-.084), and SRMR = .11. The chi-square was significant at p < .001, which is common, although non-significance is desired, and χ2/df was close to 2. However, CFI and TLI were far below .90, and both RMSEA and SRMR were well above .05. Therefore, the indices did not satisfy even the most liberal criteria for a good fit, and with the current data set it was determined that the original measurement model for the MSLQ learning strategies scale needs to be improved before it can be used as a measure of students’ SRL in AOLE.

Summary of the CFA results. The current study compared the factor validity of the MSLQ reported by Pintrich et al. (1993) to that of the MSLQ in AOLE. In this comparison, the MSLQ tended to have poorer factor validity in AOLE than in Pintrich et al.’s (1993) original measurement model. In general, the results showed that, with this data set, the six-factor and nine-factor models do not fit AOLE. As Benson (1998) and Gable (1998) suggested, it is necessary to test different factor structures for the MSLQ in diverse situations (e.g., different subjects and target populations).

Exploratory factor analysis: An alternative model

In order to find alternative models of the MSLQ in AOLE, EFAs were conducted separately with the motivation and learning strategies scales of the MSLQ.

EFA with motivation scale. The 31 items from the motivation scale of the MSLQ were used for the EFA. An initial principal-axis factor analysis was conducted to check whether the assumptions necessary for EFA were met. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy for the initial EFA was .87. A value closer to 1 indicates that patterns of correlations are relatively compact; therefore, factor analysis should yield distinct and reliable factors (Dillon & Worthington, 2003).
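Two of the fit indices reported in the CFAs above follow directly from the chi-square statistic, its degrees of freedom, and the sample size (CFI and TLI additionally require the baseline model’s chi-square, which is not reported here, so they are omitted). A minimal sketch using the motivation-scale values reported above (N = 193); the function names are ours, not part of Mplus:

```python
import math

def chi_sq_ratio(chi2, df):
    """Chi-square / degrees-of-freedom ratio; values near or below 2 are desirable."""
    return chi2 / df

def rmsea(chi2, df, n):
    """RMSEA = sqrt(max(chi2 - df, 0) / (df * (n - 1))), the usual point estimate."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def meets_liberal_criteria(cfi, tli, rmsea_val, srmr):
    """Less stringent cutoffs discussed above: CFI, TLI >= .90; RMSEA, SRMR <= .10."""
    return cfi >= 0.90 and tli >= 0.90 and rmsea_val <= 0.10 and srmr <= 0.10

# Motivation scale CFA as reported above: chi2(419) = 1089.70, N = 193.
print(round(chi_sq_ratio(1089.70, 419), 2))  # → 2.6
print(round(rmsea(1089.70, 419, 193), 2))    # → 0.09

# Reported motivation-scale indices fail even the liberal cutoffs (CFI = .80, TLI = .78).
print(meets_liberal_criteria(0.80, 0.78, 0.09, 0.09))  # → False
```

Both recomputed values match the χ2/df = 2.60 and RMSEA = .09 reported in the text for the motivation scale.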
Also, Bartlett’s test of sphericity was significant at the .001 level, indicating that the correlation matrix is not an identity matrix; therefore, there are some relationships between the variables. An initial principal-axis extraction was performed. The initial communalities ranged from .36 to .82. Only two items’ initial communalities were lower than .40; both came from extrinsic goal orientation: question 11 (communality = .39) and question 13 (communality = .36). Seven factors had eigenvalues greater than 1, but the original motivation scale of the MSLQ consisted of six factors. The scree plot suggested five to seven factors. It was decided to analyze the data using five-, six-, and seven-factor solutions and to find the most appropriate number of factors to explain the data set. An oblique rotation (Direct Oblimin in SPSS, delta = 0) was used because the motivation subscales were hypothesized to be correlated, as Pintrich et al. (1993) indicated. The intercorrelations among factors ranged from -.01 to .48. Four criteria were applied: (a) each item’s loading value had to be greater than .40, (b) items with an absolute loading value greater than .35 on more than one factor were deleted, (c) items whose two largest loadings differed by an absolute value of less than .15 were deleted, and (d) each factor had to have a minimum of three items (Pett, Lackey, & Sullivan, 2003). After exploring the factor structure for each number of forced factors, the five-factor solution was determined to be the most interpretable. It was chosen over the other solutions for the following reasons: (a) the five-factor solution resulted in the most robust factor structure, (b) in the six-factor solution, only two items loaded on two of the factors (e.g., intrinsic goal orientation and control of learning beliefs), and (c) in the seven-factor solution, no items loaded on the last factor. The five factors explained a total of 60.35% of the variance. Table 2 presents factors, factor loadings, communalities, and reliability statistics. The first factor accounted for 29.79% of the variance (eigenvalue = 9.23) and included six items from task value and one item from intrinsic goal orientation. The second factor accounted for an additional 12.49% of the variance (eigenvalue = 3.87) and consisted of five items from test anxiety. The third factor accounted for another 7.52% of the variance (eigenvalue = 2.33).
This factor consisted of five items from self-efficacy for learning and performance. The fourth factor accounted for 5.75% of the variance (eigenvalue = 1.78) and consisted of six items: two from self-efficacy for learning and performance, two from intrinsic goal orientation, and two from control of learning beliefs. The fifth factor accounted for 4.80% of the variance (eigenvalue = 1.49) and consisted of four items from extrinsic goal orientation. Extracted communalities for the 27 retained items ranged from .25 to .85 after oblique rotation. The internal consistency estimates were moderate to high: .91 (Factor 1), .84 (Factor 2), .92 (Factor 3), .77 (Factor 4), .65 (Factor 5), and .82 (total).
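The four item-retention rules applied above can be expressed as a small filter over a pattern-loading matrix. The sketch below is a hypothetical illustration, not the study’s analysis: the loading matrix and the helper `retain_items` are made up for demonstration, and rule (c) is read as requiring an item’s two largest absolute loadings to differ by at least .15:

```python
def retain_items(loadings, primary=0.40, cross=0.35, gap=0.15, min_items=3):
    """Apply the four retention rules described above to a pattern matrix.

    loadings: list of rows (one per item) of pattern coefficients, one per factor.
    Returns (kept item indices, surviving factor indices).
    """
    kept = []
    for i, row in enumerate(loadings):
        mags = sorted((abs(v) for v in row), reverse=True)
        top = mags[0]
        second = mags[1] if len(mags) > 1 else 0.0
        loads_primary = top > primary                  # (a) one loading above .40
        single_factor = sum(m > cross for m in mags) <= 1  # (b) no second loading above .35
        clear_gap = (top - second) >= gap              # (c) top two loadings differ by >= .15
        if loads_primary and single_factor and clear_gap:
            kept.append(i)
    # (d) a factor survives only if at least `min_items` kept items load on it
    assigned = [max(range(len(loadings[i])), key=lambda f: abs(loadings[i][f])) for i in kept]
    factors = [f for f in range(len(loadings[0])) if assigned.count(f) >= min_items]
    return kept, factors

# Hypothetical 5-item, 2-factor pattern matrix (illustrative numbers only).
demo = [[0.82, 0.05],
        [0.61, 0.10],
        [0.45, 0.02],
        [0.44, 0.38],   # cross-loads above .35, so rule (b) deletes it
        [0.08, 0.72]]
items, factors = retain_items(demo)
print(items)    # → [0, 1, 2, 4]
print(factors)  # → [0]  (factor 2 keeps only one item, below the three-item minimum)
```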
Table 2
Results from the Exploratory Factor Analysis with the Motivation Scale of the MSLQ with Oblique Rotation (Oblimin; delta = 0)

Item | F1 | F2 | F3 | F4 | F5 | M | SD | h² | α
TSV q27. Understanding the subject matter of this course is very important to me. | .90 (.89) | -.08 (-.05) | .00 (.34) | -.04 (.36) | .08 (.09) | 5.91 | 1.09 | .79 | .89
TSV q23. I think the course material in this class is useful for me to learn. | .82 (.85) | -.02 (-.04) | .03 (.34) | .04 (.41) | -.02 (.01) | 6.03 | 1.04 | .72 | .89
TSV q26. I like the subject matter of this course. | .78 (.83) | -.05 (-.09) | .02 (.34) | .12 (.47) | -.08 (-.07) | 5.72 | 1.28 | .72 | .89
TSV q10. It is important for me to learn the course material in this class. | .78 (.76) | -.00 (.03) | -.00 (.28) | -.04 (.30) | .10 (.13) | 6.05 | .98 | .59 | .90
TSV q17. I am very interested in the content area of this course. | .76 (.79) | .00 (-.01) | -.06 (.25) | .13 (.45) | -.07 (-.04) | 5.72 | 1.27 | .65 | .90
IGO q22. The most satisfying thing for me in this course is trying to understand the content as thoroughly as possible. | .71 (.79) | .02 (-.02) | .13 (.40) | .06 (.41) | -.04 (.01) | 5.65 | 1.25 | .64 | .90
TSV q4. I think I will be able to use what I learn in this course in other courses. | .56 (.57) | .18 (.16) | .09 (.24) | -.03 (.23) | .02 (.10) | 5.63 | 1.31 | .36 | .92
TSA q19. I have an uneasy, upset feeling when I take an exam. | .07 (.07) | .79 (.77) | .07 (-.10) | -.05 (-.04) | -.02 (.23) | 3.07 | 1.94 | .61 | .79
TSA q28. I feel my heart beating fast when I take an exam. | .04 (.07) | .78 (.76) | .06 (-.10) | .03 (.02) | -.02 (.23) | 2.93 | 1.93 | .59 | .80
TSA q8. When I take a test I think about items on other parts of the test I can’t answer. | -.04 (-.03) | .64 (.70) | -.11 (-.23) | .11 (.03) | .10 (.29) | 3.08 | 1.71 | .50 | .81
TSA q14. When I take tests I think of the consequences of failing. | -.03 (-.04) | .64 (.67) | -.06 (-.20) | .01 (-.05) | .07 (.26) | 3.39 | 2.10 | .45 | .81
TSA q3. When I take a test I think about how poorly I am doing compared with other students. | -.02 (-.08) | .55 (.65) | -.23 (-.34) | .05 (-.06) | .16 (.32) | 2.54 | 1.72 | .48 | .82
SLE q21. I expect to do well in this class. | .01 (.35) | .03 (.19) | .93 (.92) | .00 (.29) | -.05 (.01) | 6.18 | 1.01 | .85 | .89
SLE q31. Considering the difficulty of this course, the teacher, and my skills, I think I will do well in this class. | .08 (.39) | .05 (-.19) | .87 (.89) | -.01 (.29) | .00 (.05) | 6.10 | 1.02 | .80 | .89
SLE q5. I believe I will receive an excellent grade in this class. | .04 (.31) | -.05 (-.17) | .78 (.79) | -.04 (.21) | .04 (.08) | 5.95 | 1.11 | .63 | .91
SLE q20. I’m confident I can do an excellent job on the assignments and tests in this course. | .05 (.39) | -.13 (-.32) | .74 (.84) | .16 (.42) | -.04 (-.04) | 5.82 | 1.10 | .75 | .90
SLE q29. I’m certain I can master the skills being taught in this class. | .14 (.47) | -.19 (-.30) | .52 (.71) | .31 (.53) | .08 (.05) | 5.98 | 1.02 | .66 | .92
SLE q6. I’m certain I can understand the most difficult material presented in the readings for this course. | .01 (.38) | -.15 (-.19) | .19 (.43) | .66 (.73) | .11 (.06) | 5.60 | 1.31 | .60 | .71
IGO q16. In a class like this, I prefer course material that arouses my curiosity, even if it is difficult to learn. | .18 (.45) | -.04 (-.10) | .00 (.26) | .61 (.69) | -.08 (-.10) | 5.83 | 1.16 | .51 | .73
SLE q15. I’m confident I can understand the most complex material presented by the instructor in this course. | .03 (.37) | -.25 (-.29) | .22 (.47) | .56 (.65) | .15 (.08) | 5.65 | 1.23 | .57 | .74
CLB q9. It is my own fault if I don’t learn the material in this course. | .04 (.28) | .14 (.08) | .05 (.18) | .53 (.56) | -.08 (-.04) | 5.20 | 1.56 | .33 | .74
IGO q1. In a class like this, I prefer course material that really challenges me so I can learn new things. | .25 (.42) | -.13 (-.13) | -.14 (.13) | .50 (.57) | -.03 (-.07) | 5.59 | 1.21 | .39 | .75
CLB q25. If I don’t understand the course material, it is because I didn’t try hard enough. | -.03 (.18) | .17 (.15) | -.01 (.09) | .49 (.47) | .00 (.05) | 4.62 | 1.53 | .25 | .77
EGO q11. The most important thing for me right now is improving my overall grade point average, so my main concern in this class is getting a good grade. | -.09 (-.12) | -.03 (.22) | -.18 (-.17) | .00 (-.10) | .66 (.64) | 3.30 | 1.74 | .46 | .57
EGO q7. Getting a good grade in this class is the most satisfying thing for me right now. | .19 (.18) | .01 (.22) | -.05 (.03) | -.05 (.01) | .61 (.62) | 4.77 | 1.53 | .42 | .58
EGO q13. If I can, I want to get better grades in this class than most of the other students. | -.09 (.02) | .04 (.15) | .16 (.17) | .08 (.08) | .50 (.51) | 4.51 | 2.02 | .30 | .59
EGO q30. I want to do well in this class because it is important to show my ability to my family, friends, employer, or others. | .08 (.09) | .21 (.32) | .10 (.08) | -.09 (-.04) | .44 (.51) | 4.31 | 1.89 | .31 | .59

Note. There are 27 items. Unique pattern coefficients > .40 are in bold in the original table. Pattern coefficients are presented first, followed by structure coefficients in parentheses. Analysis is based on 193 observations. Motivation items range from 1 to 7; Likert scale anchors ranged from 1 = not at all true of me to 7 = very true of me. Internal consistency estimates for Factors 1, 2, 3, 4, and 5 were α = .91, .84, .92, .77, and .65, respectively. TSV = Task Value; IGO = Intrinsic Goal Orientation; SLE = Self-Efficacy for Learning and Performance; CLB = Control of Learning Beliefs; EGO = Extrinsic Goal Orientation; h² = item communality at extraction; α = Cronbach’s alpha if item deleted.
Table 3 Results from the EFA with the Learning Strategies of the MSLQ with Oblique Rotation (Oblimin; delta = 0) Factor loadings
CRT q51
CRT q71
CRT q66
ELB q62
CRT
q47
ELB q81
ELB q64
Items
F1
F2
F3
F4
M
SD
h2
a
I treat the course material as a starting point and try to develop my own ideas about it
.77 (.74)
.08 (.27)
-.13 (.11)
-.05 (.08)
4.74
1.62
.57
.89
Whenever I read or hear an assertion or conclusion in this class, I think about possible alternatives.
.74 (.75)
.08 (.28)
-.03 (.21)
-.03 (.11)
4.93
1.49
.56
.89
I try to play around with ideas of my own related to what I am learning in this course.
.68 (.65)
-.06 (.13)
-.01 (.18)
-.01 (.07)
5.58
1.31
.43
.90
I try to relate ideas in this subject to those in other courses whenever possible.
.65 (.66)
-.05 (.15)
.09 (.28)
-.03 (.06)
5.73
1.09
.45
.90
When a theory, interpretation, or conclusion is presented in class or in the readings, I try to decide if there is good supporting evidence.
.65 (.70)
.14 (.35)
.00 (.23)
.03 (.18)
4.74
1.59
.51
.89
I try to apply ideas from course readings in other class activities such as lecture and discussion.
.63 (.68)
.04 (.26)
.08 (.29)
.07 (.19)
5.44
1.57
.48
.89
When reading for this class, I try to relate the material to what I already know.
.62 (.63)
-.09 (.12)
.08 (.26)
.04 (.12)
6.06
1.13
.40
.90
Factor Validity of MSLQ in Online
CRT q38
MSR
q61
MSR q41
MSR q76
ELB q53
ELB
q69
ORZ q42
19
I often find myself questioning things I hear or read in this course to decide if I find them convincing.
.59 (.53)
.06 (.15)
-.19 (-.02)
-.18 (-.09)
4.36
1.77
.34
.90
I try to think through a topic and decide what I am supposed to learn from it rather than just reading it over when studying for this course.
.55 (.60)
.15 (.32)
.04 (.23)
.01 (.15)
4.98
1.41
.39
.89
When I become confused about something I’m reading for this class, I go back and try to figure it out.
.53 (.57)
-.10 (.06)
.28 (.42)
-.12 (-.04)
6.07
.94
.42
.90
When studying for this course I try to determine which concepts I don’t understand well.
.50 (.60)
.17 (.40)
.07 (.27)
.18 (.33)
4.94
1.50
.45
.89
When I study for this class, I pull together information from different sources, such as lectures, readings, and discussions.
.50 (.61)
.04 (.31)
.22 (.41)
.25 (.37)
5.24
1.46
.50
.89
Table 3 (continued)

| Label | Item | Factor 1 | Factor 2 | Factor 3 | Factor 4 | M | SD | h² | α |
|---|---|---|---|---|---|---|---|---|---|
| | I try to understand the material in this class by making connections between the readings and the concepts from the lectures. | .46 (.56) | .11 (.34) | .12 (.31) | .23 (.35) | 5.05 | 1.72 | .41 | .90 |
| | When I study for this course, I go through the readings and my class notes and try to find the most important ideas. | .42 (.53) | .07 (.26) | .25 (.40) | .08 (.20) | 5.49 | 1.47 | .36 | .90 |
| MSR q54 | Before I study new course material thoroughly, I often skim it to see how it is organized. | .42 (.48) | .14 (.30) | -.01 (.16) | .11 (.22) | 5.03 | 1.78 | .27 | .90 |
| MSR q36 | When reading for this course, I make up questions to help focus my reading. | .25 (.41) | .73 (.75) | -.12 (.05) | -.08 (.19) | 2.94 | 1.78 | .63 | .86 |
| ORZ q63 | When I study for this course, I go over my class notes and make an outline of important concepts. | -.03 (.25) | .73 (.74) | .22 (.31) | -.03 (.24) | 3.34 | 1.96 | .59 | .86 |
| PEL q34 | When studying for this course, I often try to explain the material to a classmate or friend. | .27 (.42) | .71 (.75) | -.16 (.02) | -.06 (.21) | 2.95 | 1.73 | .63 | .86 |
| ORZ q32 | When I study the readings for this course, I outline the material to help me organize my thoughts. | -.01 (.24) | .66 (.67) | .16 (.26) | -.03 (.22) | 3.35 | 2.08 | .48 | .86 |
| ELB q67 | When I study for this course, I write brief summaries of the main ideas from the readings and my class notes. | .04 (.38) | .64 (.69) | .12 (.24) | .06 (.30) | 3.05 | 1.98 | .49 | .86 |
| MSR q55 | I ask myself questions to make sure I understand the material I have been studying in this class. | .18 (.37) | .62 (.71) | -.06 (.11) | .13 (.37) | 3.89 | 1.86 | .55 | .86 |
| REH q72 | I make lists of important items for this course and memorize the lists. | -.08 (.13) | .59 (.62) | .02 (.10) | .15 (.35) | 2.59 | 1.70 | .41 | .87 |
| ORZ q49 | I make simple charts, diagrams, or tables to help me organize course material. | .00 (.21) | .56 (.59) | .09 (.18) | .06 (.27) | 3.35 | 2.06 | .36 | .87 |
| REH q39 | When I study for this class, I practice saying the material to myself over and over. | -.05 (.09) | .44 (.45) | -.05 (.02) | .09 (.23) | 2.52 | 1.63 | .21 | .88 |
| EFR q37r | I often feel so lazy or bored when I study for this class that I quit before I finish what I planned to do. | .04 (.23) | -.03 (.06) | .67 (.67) | -.06 (.02) | 5.59 | 1.54 | .46 | .81 |
| TSM q43 | I make good use of my study time for this course. | .24 (.45) | -.06 (.18) | .66 (.74) | .19 (.29) | 5.17 | 1.48 | .64 | .80 |
| TSM q77r | I often find that I don’t spend very much time on this course because of other activities. | -.12 (.13) | .13 (.22) | .66 (.65) | .06 (.17) | 4.72 | 1.81 | .45 | .81 |
| EFR q60r | When course work is difficult, I either give up or only study the easy parts. | .18 (.31) | -.13 (-.01) | .60 (.62) | -.06 (-.00) | 6.00 | 1.30 | .43 | .82 |
| TSM q52r | I find it hard to stick to a study schedule. | -.09 (.10) | .07 (.11) | .58 (.56) | -.05 (.03) | 4.36 | 1.92 | .32 | .82 |
| EFR q74 | Even when course materials are dull and uninteresting, I manage to keep working until I finish. | .29 (.43) | -.16 (.05) | .53 (.61) | .11 (.17) | 5.74 | 1.23 | .46 | .82 |
| TSM q70 | I make sure that I keep up with the weekly readings and assignments for this course. | .27 (.44) | .02 (.22) | .47 (.57) | .15 (.26) | 5.64 | 1.57 | .43 | .82 |
| TSM q73 | I attend this class regularly. | .10 (.27) | -.01 (.17) | .43 (.49) | .26 (.33) | 6.10 | 1.37 | .32 | .82 |
| TSM q80r | I rarely find time to review my notes or readings before an exam. | -.00 (.14) | .10 (.12) | .43 (.43) | -.12 (-.04) | 5.55 | 1.44 | .20 | .83 |
| MSR q33r | During class time I often miss important points because I’m thinking of other things. | -.05 (.10) | .09 (.12) | .41 (.40) | -.07 (.01) | 5.29 | 1.55 | .17 | .83 |
| HPS q68 | When I can’t understand the material in this course, I ask another student in this class for help. | -.04 (.10) | .10 (.36) | -.05 (.05) | .79 (.82) | 2.97 | 1.91 | .68 | .73 |
| HPS q75 | I try to identify students in this class whom I can ask for help if necessary. | .11 (.22) | .01 (.30) | -.03 (.10) | .78 (.79) | 3.56 | 2.16 | .64 | .77 |
| PEL q45 | I try to work with other students from this class to complete the course assignments. | -.09 (.00) | .01 (.22) | -.07 (-.00) | .72 (.70) | 3.03 | 1.86 | .50 | .78 |
| PEL q50 | When studying for this course, I often set aside time to discuss course material with a group of students from the class. | -.01 (.11) | .18 (.36) | -.06 (.03) | .55 (.60) | 2.28 | 1.56 | .39 | .83 |
Note. There are 38 items. Pattern coefficients greater than .40 indicate an item’s primary factor. Pattern coefficients are presented first, followed by structure coefficients in parentheses. Analysis is based on 193 observations. Learning Strategies items were rated on a 7-point Likert scale ranging from 1 = not at all true of me to 7 = very true of me. Internal consistency estimates for Factors 1, 2, 3, and 4 were α = .90, .88, .83, and .82, respectively. CRT = Critical Thinking; ELB = Elaboration; MSR = Metacognitive Self-Regulation; ORZ = Organization; PEL = Peer Learning; REH = Rehearsal; EFR = Effort Regulation; TSM = Time and Study Environment Management; HPS = Help Seeking; h² = item communality at extraction; α = Cronbach’s alpha coefficient if item deleted.

EFA with Learning Strategies Scale. All 50 items on the Learning Strategies scale of the MSLQ were used for EFA. The Kaiser-Meyer-Olkin (KMO) measure for the initial EFA was .86, and Bartlett’s test of sphericity was significant at the .001 level. Both the KMO measure and Bartlett’s test indicated that the sample was adequate to proceed. An initial principal-axis extraction was performed. The initial communalities ranged from .39 to .85; only two items had communalities lower than .40: question 56 (.39) and question 80 (.39). The factors were assumed to correlate somewhat, and intercorrelations among the factors ranged from .13 to .33. Consequently, principal axis factor (PAF) analysis with oblique rotation (Oblimin, delta = 0) was conducted. The same four criteria used in the EFA of the motivation scale of the MSLQ were applied to the EFA of the learning strategies scale. Twelve factors had eigenvalues greater than 1; however, the original learning strategies scale of the MSLQ consisted of nine factors. The scree plot suggested four to six factors. It was therefore decided to analyze the data using four-, five-, and six-factor solutions to find the most appropriate number of factors to explain the data set. After exploring the factor structures of these solutions, the four-factor solution was determined to be the most interpretable. It was chosen over the other solutions for two reasons: (a) it was the most robust structure, yielding items with strong factor loadings and fewer cross-loadings than the other solutions, and (b) in the forced five- and six-factor solutions, the last factor ended up with no items. The four factors explained 44.41% of the variance. Table 3 presents the factors, factor loadings, communalities, and reliability statistics. The first factor accounted for 23.72% of the variance (eigenvalue = 11.86). This factor includes five items from Critical Thinking, five items from Elaboration, four items from Metacognitive Self-Regulation, and one item from Organization. The second factor accounted for 9.15% of the variance (eigenvalue = 4.57). The factor consists of three items from Organization, two items from Metacognitive Self-Regulation, two items from Rehearsal, one item from Elaboration, and one item from Peer Learning.
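The adequacy check and factor-retention criteria described above (Bartlett's test of sphericity, eigenvalues greater than 1, and percent of variance explained) can be sketched with standard formulas. This is an illustrative sketch on synthetic data, not the authors' analysis code, and the function names are our own:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(data):
    """Bartlett's test of sphericity: tests whether the item correlation
    matrix differs from an identity matrix (a precondition for EFA)."""
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)
    statistic = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return statistic, chi2.sf(statistic, df)

def kaiser_retention(data):
    """Eigenvalues of the correlation matrix. The Kaiser criterion retains
    factors with eigenvalues > 1; each eigenvalue divided by the number of
    items gives that factor's proportion of variance explained."""
    R = np.corrcoef(data, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending order
    n_retained = int(np.sum(eigvals > 1))
    variance_explained = eigvals / len(eigvals)
    return eigvals, n_retained, variance_explained

# Synthetic example: 6 items driven by one latent trait plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
data = latent + 0.5 * rng.normal(size=(200, 6))
stat, p = bartlett_sphericity(data)          # p is essentially 0 here
eigvals, k, ve = kaiser_retention(data)      # one dominant eigenvalue
```

As in the study, these criteria are only starting points; the scree plot and the interpretability of the rotated solution drive the final choice of the number of factors.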
The third factor accounted for 6.79% of the variance (eigenvalue = 3.39). The factor consists of six items from Time and Study Environment Management, three items from Effort Regulation, and one item from Metacognitive Self-Regulation. The fourth factor accounted for 4.76% of the variance (eigenvalue = 2.38). The factor consists of two items from Help Seeking and two items from Peer Learning. Extracted communalities for the 38-item inventory ranged from .17 to .68 after oblique rotation. The internal consistency estimates of the four subscales were moderate to high: .90 (Factor 1), .88 (Factor 2), .83 (Factor 3), .82 (Factor 4), and .91 (Total).

Summary of the EFA. EFA with the MSLQ revealed an alternative
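The internal consistency estimates reported above, and the alpha-if-item-deleted column in Table 3, follow the standard Cronbach formula. A minimal numpy sketch (illustrative only; the function names and synthetic data are ours):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def alpha_if_deleted(items):
    """Alpha recomputed with each item removed in turn, i.e. the
    'alpha if item deleted' column reported in Table 3."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    return np.array([cronbach_alpha(np.delete(items, j, axis=1))
                     for j in range(k)])

# Synthetic example: four coherent items plus one pure-noise item.
rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 1))
good = latent + 0.5 * rng.normal(size=(300, 4))
items = np.hstack([good, rng.normal(size=(300, 1))])
a_full = cronbach_alpha(items)
aid = alpha_if_deleted(items)  # deleting the noise item raises alpha
```

An item whose deletion raises alpha (as the noise item does here) is contributing little to the subscale's internal consistency, which is how the α column in Table 3 is read.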
model of the MSLQ in AOLE. It was found that five motivation factors and four learning strategy factors represented the latent variables in the data set. Regarding the motivation scale of the MSLQ, the EFA showed that test anxiety, self-efficacy for learning and performance, and extrinsic goal orientation loaded as in Pintrich et al.’s original subscales. However, not all of the items loaded the same way as in the original model of the MSLQ. One item from intrinsic goal orientation loaded with the task value items. In addition, two items from self-efficacy for learning and performance, two items from intrinsic goal orientation, and two items from control of learning beliefs loaded together and clustered around a new factor characterized as “positive attitude on learning material.” The recovery of test anxiety may be of particular interest. According to the analysis of syllabi from the courses in which this research was conducted, examinations accounted for 11.6% of the total learning tasks students pursued, and not all of the classes had exams. The format of online examinations also differed from that of traditional examinations: online students took take-home or computer-generated exams. Despite these differences between online and traditional exams, test anxiety loaded as a separate factor, similar to the original factors of the MSLQ. A possible explanation is that students responded to the items based on their exam experiences in other face-to-face courses, or interpreted the exam items as they would in face-to-face learning environments. Also, the intrinsic goal orientation and control of learning beliefs items were not recovered separately. When the EFA was forced to a six-factor structure, the intrinsic and control belief items were recovered separately. However, the six-factor solution was not chosen because each factor should have at least three items (Pett, Lackey, & Sullivan, 2003).
The results suggest that future researchers add more items to intrinsic goal orientation and control of learning beliefs and re-test the factor structures using EFA. The EFA result with the motivation scale of the MSLQ was similar to Artino and McCoach’s (2008) recent study. Using Expectancy Theory, Artino and McCoach developed a two-factor structure (task value and self-efficacy) for the Online Learning Value and Self-Efficacy Scale (OLVSES) in multimedia-rich U.S. Navy online training programs. As in Artino and McCoach’s research, the motivation scale of the MSLQ revealed task value and self-efficacy factors, which explained a great portion of motivation in AOLE. The scores for each motivation factor indicated that students who participated in this study had high task value (M = 5.8) and high self-efficacy for learning and performance (M = 6.0).

In contrast to the EFA with the motivation scale of the MSLQ, in which three of the original factors were recovered, the EFA with the learning strategies scale of the MSLQ revealed very different factors of learning strategies used in AOLE. None of the original learning strategies subscales was recovered as found in the MSLQ. Rather, the EFA with the learning strategies scale revealed new factors in asynchronous learning environments. In addition, it was not easy to characterize the new factors from the clustered items. Although the interpretation is tentative, Factor 1 represents learning strategies relevant to reading, and Factor 2 represents learning strategies relevant to course material in general. Factor 3 describes self-management of time, learning materials, and class participation. Factor 4 describes learning strategies relevant to interaction with others. The scores for each learning strategy factor indicated that many online students use Factor 1 (M = 5.2) and Factor 3 (M = 5.4) strategies, while fewer students use Factor 2 (M = 3.11) and Factor 4 (M = 2.96) strategies.

Implications for Future Research

The results of the confirmatory and exploratory factor analyses suggest that the 15 factors of the MSLQ be reconsidered in AOLE. The statistical analyses of the CFA and EFA with the MSLQ also suggested the need to modify the current version of the MSLQ to measure students’ SRL in AOLE. When modifying the items of the MSLQ, consideration of the learning task is helpful because learning tasks significantly impact students’ motivation and learning strategies. The analysis of the asynchronous online syllabi revealed three main characteristics of asynchronous online learning that future researchers need to consider when revising or adding new items to the MSLQ: opportunity for choice in projects, the importance of interaction with others, and use of discussion boards. Many of the learning tasks (e.g., individual projects and even examinations) provided students with opportunities to choose topics and manage their projects.
Therefore, when students work on projects, they are expected to actively interact with their instructors and manage their tasks on their own (Dabbagh & Kitsantas, 2005). Based on the nature of individual project-oriented tasks, possible items to be added to or revised in the MSLQ are “I actively interact with my course instructor to clarify a course task” or “I request further information from my online instructor whenever it is necessary.” In addition, many of the learning tasks (e.g., individual tasks with peer feedback, group tasks with group artifacts, discussion for knowledge construction, or discussion for sharing information) were accomplished through
student interaction. In AOLE, students are expected to socially interact and learn with one another (Hill, Wiley, Nelson, & Han, 2004; Whipp & Chiarelli, 2004). In asynchronous social interactions, students, as a group, work together as they interpret tasks, set rules for their own groups, and divide their roles. The nature of social tasks creates interdependence among group members. If a member does not share information, play his or her role, or encourage others, the other members may become frustrated (Hill et al., 2004). In particular, in asynchronous social interactions, the communication and decision-making processes take longer than in traditional learning contexts because time in asynchronous online learning is distributed. Therefore, students are expected to regularly check other members’ postings and respond to others in a timely manner (Whipp & Chiarelli, 2004). From this point of view, possible items to be added to or revised in the MSLQ are “I regularly log in to the online course to monitor others’ postings” or “I respond to others in a timely manner.” It was also found that the use of discussion threads as a learning tool is common in AOLE (Hill et al., 2004). The nature of asynchronous discussion threads makes students more interactive with other students. In their research on interaction between students and the teacher in AOLE, Heckman and Annabi (2006) found that students were the main contributors to asynchronous discussions. In AOLE, teachers post questions and multiple students answer those questions, while other students interact with one another by asking questions, answering questions, or arguing with one another. In order to involve students actively in discussions and create quality discussions, students are expected to articulate their ideas and interact effectively with others.
From this description, possible items to be added to or revised in the MSLQ are “I elaborate my ideas in a specific way” or “I check others’ postings to evaluate my own comprehension of the material.”

Conclusion

The current research provides online SRL researchers with some insights into the use of the MSLQ for measuring online SRL. Unlike previous studies designed to evaluate the factor validity of the MSLQ, in which only a small portion of the MSLQ items were used, the current study used all of the items of the MSLQ in AOLE. The study found that the six-factor and nine-factor structures of the motivation and learning strategies scales of the MSLQ did not satisfy the CFA criteria and need to be improved before being used to measure SRL in AOLE. Therefore, online SRL researchers should be aware of the limitations of the MSLQ as a research tool in AOLE. However, the EFA results showed that the MSLQ has the potential to measure asynchronous online students’ SRL. A revision of the MSLQ was also suggested for measuring online SRL by adding or revising items that take into account three characteristics of AOLE: opportunity for choice in projects, interaction with others, and discussion participation.

References

Artino, A. R., & McCoach, D. B. (2008). Development and initial validation of the online learning value and self-efficacy scale. Journal of Educational Computing Research, 38, 279-303.

Benson, J. (1998). [Review of the Motivated Strategies for Learning Questionnaire]. In J. C. Impara & B. S. Plake (Eds.), The thirteenth mental measurements yearbook (pp. 680-681). Lincoln, NE: Buros Institute of Mental Measurements.

Dabbagh, N., & Kitsantas, A. (2005). Using web-based pedagogical tools as scaffolds for self-regulated learning. Instructional Science, 33, 513-540.

Dillon, F. R., & Worthington, R. L. (2003). The Lesbian, Gay, and Bisexual Affirmative Counseling Self-Efficacy Inventory (LGB-CSI): Development, validation, and training implications. Journal of Counseling Psychology, 50(2), 235-251.

Duncan, T. G., & McKeachie, W. J. (2005). The making of the Motivated Strategies for Learning Questionnaire. Educational Psychologist, 40(2), 117-128.

Eberly, M. B., Newton, S., & Wiggins, R. A. (2001). The syllabus as a tool for student-centered learning. The Journal of General Education, 50, 56-74.

Gable, R. K. (1998). [Review of the Motivated Strategies for Learning Questionnaire]. In J. C. Impara & B. S. Plake (Eds.), The thirteenth mental measurements yearbook (pp. 681-682). Lincoln, NE: Buros Institute of Mental Measurements.

Hadwin, A. F., Nesbit, J. C., Jamieson-Noel, D., Code, J., & Winne, P. H. (2007). Examining trace data to explore self-regulated learning.
Metacognition and Learning, 2, 107-124.

Heckman, R., & Annabi, H. (2006). How the teacher’s role changes in on-line case study discussions. Journal of Information Systems Education, 17, 141-150.

Henson, R. K., & Roberts, J. K. (2006). Use of exploratory factor analysis in published research: Common errors and some comment on improved practice. Educational and Psychological Measurement, 66(3), 393-416.

Hill, J. R., Wiley, D., Nelson, L. M., & Han, S. (2004). Exploring research on internet-based learning: From infrastructure to interactions. In D. H. Jonassen (Ed.), Handbook of research on educational communications and technology (pp. 422-460). Mahwah, NJ: Lawrence Erlbaum Associates.

Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1-55.

Marsh, H. W., Balla, J. R., & McDonald, R. P. (1988). Goodness-of-fit indexes in confirmatory factor analysis: The effect of sample size. Psychological Bulletin, 103, 391-410.

Muis, K. R., Winne, P. H., & Jamieson-Noel, D. (2007). Using a multitrait-multimethod analysis to examine conceptual similarities of three self-regulated learning inventories. British Journal of Educational Psychology, 77, 177-195.

Pett, M. A., Lackey, N. R., & Sullivan, J. J. (2003). Making sense of factor analysis: The use of factor analysis for instrument development in health care research. Thousand Oaks, CA: Sage Publications.

Pintrich, P. R., Smith, D. A., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the Motivated Strategies for Learning Questionnaire. Ann Arbor, MI: The Regents of the University of Michigan.

Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement, 53, 801-813.

Richardson, J. T. E. (2007). Motives, attitudes, and approaches to studying in distance education. Higher Education, 54, 385-416.

Whipp, J. L., & Chiarelli, S. (2004). Self-regulation in a web-based course: A case study. Educational Technology Research & Development, 52, 5-22.