Measuring High School Students' Attitudes Toward Computing

Daniel Heersink
Colorado School of Mines
Mathematical and Computer Science
1500 Illinois St., Golden, CO 80439
1-303-273-3860
[email protected]

Barbara M. Moskal
Colorado School of Mines
Mathematical and Computer Science
1500 Illinois St., Golden, CO 80439
1-303-273-3867
[email protected]

ABSTRACT
Many projects are underway throughout the United States that seek to increase the appeal of technology as a field of study. To better understand the impact of such projects, validated instruments are needed that measure students' attitudes and beliefs about both computer science and information technology. This paper describes the development and validation of two assessment instruments: one measures attitudes and beliefs about computer science; the other measures attitudes and beliefs about information technology. The questions that comprise these instruments are identical with the exception of the use of the terms "computer science" and "information technology". Both instruments sought to measure five constructs: confidence, interest, gender, usefulness, and professional. Based on the results of factor analyses, high school students are able to distinguish among these constructs in computer science but not in information technology. This raises questions as to what high school students understand about the field of information technology.

Categories and Subject Descriptors
Research Studies

General Terms
Measurement, Human Factors

Keywords
Assessment, outreach, high school computing

1. INTRODUCTION
As the employment demand for science and technology majors increases, the number of students enrolling in computing degree programs in the United States is declining [21]. Enrollment in introductory programming courses and in computer science and information technology degree programs has decreased over the last eight years. Studies [17,24,26,28] have sought to better understand the student attitudes and beliefs that contribute to this decline. Most of these studies use qualitative research techniques, and many have focused on female students.

Women are severely underrepresented in computing [29], earning only 20% of the computer science bachelor's degrees awarded across the nation in 2006 [19]. Studies have examined gender differences that influence enrollment in computing-related degree programs [3,4,24]. High school girls, in particular, have identified the following reasons for their lack of interest in computing: i) a lack of female role models, ii) limited or no knowledge of the applications of computing, iii) interest in things other than computers, and iv) negative perceptions of computing as "nerdy" [15]. The broader population of students has expanded this list to include: i) a perception that the number of jobs in computing is decreasing, ii) a lack of familiarity with computing fields, and iii) the incorrect perception that computing professionals spend the majority of their time programming and rarely use computing in problem solving [5].

Today's students grew up with technology and video games. Through these experiences, they have come to view technology as fast-paced and exciting; yet their first programming experiences are often tedious and boring [2,16,22]. Studies have further shown that many students lack confidence in their basic programming skills [9]. The dot-com bust has also had a negative impact on students' perceptions of the field and of its professionals, especially among young women [7,25].

One method of reversing the damaging trend of decreasing numbers of students pursuing computing degrees is to change students' perceptions of computing fields. Many studies that have sought to reverse this trend have relied on qualitative methods to measure students' attitudes and beliefs with respect to computing [8,10,13,14]. Qualitative methods provide detailed information concerning the participating group; however, they are limited in the extent to which they can be generalized beyond that group.

Some studies have sought to address this limitation by developing surveys that measure students' attitudes toward computer science. For example, Wiebe, Williams, Yang and Miller [27] created a survey that seeks to measure attitudes toward computer science based on an instrument originally designed to measure students' attitudes toward mathematics [10]. A major assumption of their work is that the same factors that discourage participation in mathematics also discourage participation in computer science; the authors present no evidence to support this assumption. Other instruments [20] have been developed to examine students' attitudes with respect to computer science once students have enrolled in a computer science course. These instruments are limited because a major obstacle is getting students to enroll in that first computer science course.



The study presented here introduces two surveys that are designed to measure students' beliefs and attitudes with respect to computing across five constructs. These constructs were selected based on the qualitative research in computing concerning factors that discourage student participation. The first instrument is designed for computer science and the second for information technology. Factor analyses were used to examine the effectiveness of each instrument in measuring the following constructs with a high school population: Confidence Construct (C), students' confidence in their own ability to learn computing skills; Interest Construct (I), students' interest in computing; Gender Construct (G), students' perceptions of computing as a male field; Usefulness Construct (U), students' beliefs in the usefulness of learning computing; and Professional Construct (P), students' beliefs about professionals in computing. The primary difference between the two instruments is the replacement of "computer science" in the computer science instrument with "information technology" in the information technology instrument. The computer science instrument was originally designed for a first-year college population [11,18], and this study represents the first effort to validate the instrument with a high school population. Representative questions from each construct can be found in Table 1; the full surveys are available by request from the authors. High school students were selected as the target population for this investigation because high school is a period in which students form opinions about future majors and careers.
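For reference in the analysis sketches that appear later in this paper (illustrations only, not the authors' code), the five constructs and the item-prefix codes used in Table 1 can be captured in a small Python mapping:

import typing

# Construct codes used throughout the surveys (items C4, I5, G1, U6, P1, ...).
CONSTRUCTS: typing.Dict[str, str] = {
    "C": "Confidence: confidence in one's ability to learn computing skills",
    "I": "Interest: interest in computing",
    "G": "Gender: perceptions of computing as a male field",
    "U": "Usefulness: beliefs in the usefulness of learning computing",
    "P": "Professional: beliefs about professionals in computing",
}

def construct_of(item: str) -> str:
    # e.g. construct_of("C8") returns the Confidence description.
    return CONSTRUCTS[item[0]]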


2. RESEARCH QUESTIONS
Two research questions guided this investigation:
1. Based on a factor analysis, is there evidence to support that the computer science and information technology surveys measure the five intended constructs when used with a high school population?
2. Are the two instruments equally effective for measuring students' attitudes and beliefs within the two fields (i.e., computer science and information technology)?

3. METHODS
This section begins with a description of the two assessment instruments, followed by descriptions of the participating student population, the administration procedure, and the analysis process.

3.1 Instruments
Both instruments, the computer science and the information technology surveys, were originally designed with 52 statements in Likert format (Strongly Agree, Agree, Disagree, Strongly Disagree). A neutral option was not included in order to force a decision between agreement and disagreement. The only difference between the computer science instrument and the information technology instrument was the replacement of the phrase "computer science" with "information technology." As a result of an initial factor analysis and Cronbach's alpha computed on an undergraduate student population, the computer science survey was reduced to 38 statements prior to the current administration [11,18]. The information technology instrument was tested in this study for the first time and was therefore administered in its original 52-statement format.
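As an illustration of this kind of item-reduction step, the following minimal Python sketch computes Cronbach's alpha for a matrix of numerically coded responses (see Section 3.4). The matrix layout and function name are assumptions, not the authors' implementation.

import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    # scores: one row per respondent, one column per survey item,
    # already converted to the 1-4 numeric coding of Section 3.4.
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)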

Table 1. Representative questions from each construct

Confidence:
C4. I can learn to understand computing concepts.
C8. I am uncertain that I can achieve good grades (C or better) in computing courses.

Interest:
I5. I like to use computer science to solve problems.
I9. I think computer science is interesting.

Gender:
G1. I doubt that a woman could excel in computing courses.
G15. Men and women can both excel in computing courses.

Usefulness:
U6. Knowledge of computing skills will not help me secure a good job.
U7. I do not use computing skills in my daily life.

Professional:
P1. Doing well in computer science does not require a student to spend most of his/her time at a computer.
P2. A student who performs well in computer science will probably not have a life outside of computers.

3.2 Subjects
The data reported here are restricted to students whose parents or legal guardians signed consent to participate in NSF DRL-0623808 or DRL-0737679. No information is available on students whose parents did not provide consent. All subjects self-selected to enroll in the summer enrichment programs. The computer science version of the attitudes survey was administered to 77 high school students who participated in one of four workshops offered during the summer of 2008 in the following states: North Carolina, South Carolina, Mississippi, and California. The information technology version of the attitudes survey was administered to 63 high school students who participated in a workshop during the summer of 2008 in Indiana. In both groups, attending students ranged in grade level from high school freshmen to seniors.

3.3 Administrative Process
The attitude survey was administered to each of the participating groups in a pre/post format: the pretest occurred prior to workshop instruction and the posttest immediately following it. For the purpose of this investigation, however, the goal was to confirm that the survey measures students' attitudes with respect to computer science or information technology without intervention. Therefore, this analysis examines pretest results only.

The study presented here was partially supported by the National Science Foundation (NSF) DUE-0512064 and was conducted in collaboration with NSF DRL-0623808 and NSF DRL-0737679. The latter two projects required validated assessment instruments to support the measurement of students' attitudinal changes resulting from interventions in computer science and information technology, respectively; the former project supports the validation of these instruments. The ideas and opinions expressed are those of the authors and are not necessarily reflective of those of the NSF or of the investigators of the collaborative projects.


3.4 Analysis Methods
Responses to each survey were converted to numerical scores. For positively phrased questions, "Strongly Agree" was coded as a "4", "Agree" as a "3", "Disagree" as a "2" and "Strongly Disagree" as a "1". Negatively phrased questions were coded in reverse order, so that a positive response always corresponded to a higher numerical value. For the gender questions, a gender-neutral attitude received a higher numerical score.
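This coding scheme might be implemented as follows (a sketch; the function name is an assumption):

# 4-point Likert coding; negatively phrased items are reverse-coded so that
# a more positive (or, for gender items, more gender-neutral) attitude
# always maps to a higher score.
LIKERT = {"Strongly Agree": 4, "Agree": 3, "Disagree": 2, "Strongly Disagree": 1}

def code_response(response: str, positively_phrased: bool) -> int:
    value = LIKERT[response]
    return value if positively_phrased else 5 - value

# Example: C8 ("I am uncertain that I can achieve good grades ...") is
# negatively phrased, so "Strongly Disagree" codes to 4.
assert code_response("Strongly Disagree", positively_phrased=False) == 4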

A confirmatory factor analysis using a promax [1] rotation was performed on the numerically coded survey responses. The promax rotation is oblique, as opposed to the traditional orthogonal rotation. Oblique rotations allow for correlations between the different factors, and in a survey such as this one, the five constructs are expected to be correlated. Moreover, when the factors are only minimally correlated or uncorrelated, an oblique rotation yields factor loadings very close to those of an orthogonal rotation. Thus, the oblique rotation is the preferable method of analysis when the factors are anticipated to correlate across constructs [6].
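One way to reproduce this style of analysis is with the Python factor_analyzer package, which supports promax rotation. The file name, the use of an exploratory fit (rather than the paper's confirmatory framing), and the factor-column labels below are assumptions for illustration, not the authors' tooling.

import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical file of numerically coded pretest responses, one column per item.
responses = pd.read_csv("cs_pretest_coded.csv")

fa = FactorAnalyzer(n_factors=5, rotation="promax")
fa.fit(responses)

# Factor columns are labeled only for inspection; an exploratory analysis
# does not guarantee that factors emerge in any particular order.
loadings = pd.DataFrame(fa.loadings_, index=responses.columns,
                        columns=["C", "I", "G", "U", "P"])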

A common practice in factor analysis is to set a cutoff for acceptable factor loadings between 0.3 and 0.4. Loadings within or above this range are considered significant; factor loadings below this range are examined individually to determine their significance [6]. In this study, a cutoff of 0.35 was used.
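Continuing the sketch above, the cutoff can be applied to the loading matrix to flag items for individual examination (variable names carried over from the previous sketch):

CUTOFF = 0.35

significant = loadings.abs() >= CUTOFF       # loadings treated as significant
cross_loaders = significant.sum(axis=1) > 1  # items loading on multiple factors
weak_items = ~significant.any(axis=1)        # items below the cutoff on every factor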

4. RESULTS
This section begins with a description of the participating student population, followed by a discussion of the results of the factor analyses on the computer science and information technology instruments.

4.1 Descriptions
Both surveys included three demographic questions at the end of the survey. The first question asked for gender. The second asked for grade level, with the four options being ninth, tenth, eleventh, or twelfth grade. The final question asked for ethnicity with the following possible choices: i) American Indian; ii) Asian; iii) Black or African American; iv) Hispanic; v) White; vi) Multi-Racial; vii) Other, Please Specify; and viii) Choose not to Respond.

Forty-five percent of respondents to the computer science instrument were female. Forty percent of the computer science survey respondents were in the ninth grade, twenty-four percent in the tenth grade, twenty-four percent in the eleventh grade and twelve percent in the twelfth grade. Approximately half (48%) self-identified as White, twenty-six percent as Black or African American, twelve percent as Asian and six percent as Other, with the remaining eight percent (representing two students) being American Indian, Hispanic, Multi-Racial, and Choose not to Respond. Their exact responses are not identified here to ensure the confidentiality of this smaller segment of the student population.

Sixty-six percent of respondents to the information technology survey were female. The respondents were distributed among the four grade levels as follows: nineteen percent in the ninth grade, thirty-seven percent in the tenth grade, thirty percent in the eleventh grade and fourteen percent in the twelfth grade. The majority of respondents self-identified as White (65%). Twelve percent self-identified as Black or African American, nine percent as Multi-Racial, seven percent as Asian and six percent as Hispanic. One person (1% of respondents) chose not to respond.

4.2 Computer Science
This section addresses the results of the factor analysis on the computer science instrument. The results of the promax rotation factor analysis indicated that question P1, "A student who performs well in computer science will probably NOT have a life outside of computers", factored into the usefulness construct instead of the professional construct. This may indicate that responding students viewed this statement as a reflection of the extent to which computer science is useful to their own lives rather than as a stereotype concerning computer science as a profession. Also, question U6, "I do NOT use computing skills in my daily life", factored into the confidence construct instead of the anticipated usefulness construct. Students may not use computing skills in their own lives because they are not confident in their skills.

Question P1 was removed and the factor analysis was performed on the remaining 37 questions. This analysis resulted in all questions factoring into the intended constructs, with two exceptions. Question U6 continued to load highly on the confidence construct, while also loading on the intended usefulness construct. Question P2 loaded below the 0.35 cutoff established for this study; because it still loaded highest on its intended construct, it was not removed. A third factor analysis, conducted after removing the problematic question U6, negatively impacted the loadings of other statements, resulting in the decision to retain U6 in the analysis but as a measure of confidence rather than usefulness.
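The drop-and-refit step described above might look as follows, continuing the earlier sketch (the item column name "P1" is an assumed label):

# Remove the item that loaded on an unintended factor and refit,
# mirroring the iterative procedure described above.
responses_37 = responses.drop(columns=["P1"])
fa_refit = FactorAnalyzer(n_factors=5, rotation="promax")
fa_refit.fit(responses_37)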

4.3 Information Technology
As was the case for the computer science version of the instrument, a promax rotation factor analysis was completed on the collected data. Only the gender construct loaded as expected, and none of the statements from the professional construct loaded on a single factor. This may be an indication that high school students do not know what it means to be a professional in information technology. The professional construct statements were eliminated from the data and a second factor analysis was completed. Its results indicated the existence of only two factors: a gender construct and a second construct comprising usefulness, confidence and interest in information technology. Based on this analysis, it appears that with respect to information technology, the constructs under investigation are ill-defined for a high school population.

Based on this analysis, the information technology survey has been shortened to include only twenty questions, ten for each of the two factors. The questions selected for this revised instrument were those that loaded highest on each factor. Additional data will be collected to validate the revised instrument.


5. DISCUSSION
This article reports on the ongoing validation of a computer science attitudes survey and the initial efforts to develop an information technology attitudes survey. Both instruments were administered during summer workshops to self-selected high school students, and both were analyzed using factor analysis. With the removal of one question from the computer science instrument and the reassignment of another question to a new construct, the factor analysis provided support that the instrument was measuring the five intended constructs for a high school population. Based on the factor analysis, the information technology instrument was measuring only two factors: gender and a general factor collectively addressing students' interest, confidence, and perceived usefulness of information technology. It is possible that interest, confidence and usefulness were reflected in the responses but that these factors were so highly correlated that they became indistinguishable through the analysis process. Another interpretation is that high school students were unable to distinguish their attitudes and beliefs because they do not know what information technology is. The latter interpretation is of greater concern to the computing community and warrants future investigation: it may indicate that students are unaware that there are options in computing outside of computer science, which limits the potential growth of information technology.

The results do provide strong evidence that the computer science survey measures students' attitudes and beliefs with respect to computer science for the five intended constructs, addressing the first component of research question 1. The five constructs measured through this instrument, and the statements that comprise each construct, resulted from a review of the literature and expert feedback [18]. The factor analysis indicates that the questions that comprise these constructs are correlated. The results are less promising with respect to the information technology survey. Based on the factor analysis, only the gender construct appears to be effectively measured through this instrument. In response to the second half of research question 1, the evidence does not support that the information technology survey measures the five intended constructs; three of the constructs, however, can be measured as a single general construct.

Based on this analysis, the information technology instrument is not yet ready for general use; additional research and validation are needed. As a result of the current investigation, this instrument has been rewritten with only two constructs, a gender construct and a general interest construct, and has been shortened to 20 questions. An open-ended question has been added that asks students to define the term "information technology". This instrument was pilot tested in the summer of 2009 and an analysis is underway. In response to the second research question, only the computer science instrument is ready for general use in the measurement of students' attitudes and beliefs with respect to the five intended constructs.

These results also raise concerns about the common practice of altering validated surveys to fit the needs of new programs without additional validation. In this case, the term "computer science" was changed to "information technology". The factor analysis of the first instrument provides evidence to support its validity; the factor analysis of the second provides evidence to challenge its validity in its original survey form. What appeared to be a minor language alteration significantly impacted the instrument's capability to provide accurate and interpretable measurements. Unfortunately, the same methods that were used in the development of the computer science instrument, a literature review and expert review, cannot be used for the information technology survey: much of the available research does not distinguish between the two fields, and many computer science instructors also act as information technology instructors, blurring the professional distinction between the fields.

The study presented here has three known limitations. First, factor analyses do not define constructs; rather, they provide statistical evidence as to correlations. In this investigation, expert review was used to define and confirm the five constructs. Second, different populations of students were used in the study of each instrument; variations among these groups may have had unexpected impacts on the reported results. Third, all of the participating students were self-selected. Students who self-select to participate in a summer workshop may respond differently from those who do not.

6. ACKNOWLEDGEMENTS
The authors would like to thank the NSF for its support of this research effort (NSF DUE-0512064) as well as the collaborating projects (NSF DRL-0623808; NSF DRL-0737679). The authors acknowledge the contributions of Wanda Dann, Carnegie Mellon University, Steven Cooper, Purdue University, and Mark Guzdial, Georgia Tech, in the development of the original 52-question version of the computer science survey, and Alka Herriger, Purdue University, in the instrument's conversion to an information technology survey. Wanda, Steven and Alka further supported the validation of these instruments with a high school population through the collaborative projects (NSF DRL-0623808; NSF DRL-0737679). Ashlyn Munson, Andy Hoegh and Nathan Behrens each participated as graduate research assistants during various components of the survey development process. The late Randy Pausch, Carnegie Mellon, developed the Alice software, which inspired the need for the original computer science survey.


7. REFERENCES
[1] Abdi, H. 2003. Factor rotations in factor analyses. In Lewis-Beck, M., Bryman, A., and Futing, T. (eds.), Encyclopedia for Research Methods for the Social Sciences. Sage, Thousand Oaks, CA, 792-795.
[2] Bailey, M. W., Coleman, C. L., and Davidson, J. W. 2008. Defense against the dark arts. In Proceedings of the 39th SIGCSE Technical Symposium on Computer Science Education (Portland, OR, USA, March 12-15, 2008). SIGCSE '08. ACM, New York, NY, 315-319.
[3] Beyer, S., DeKeuster, M., Walter, K., Colar, M., and Holcomb, C. 2005. Changes in CS students' attitudes towards CS over time: an examination of gender differences. In Proceedings of the 36th SIGCSE Technical Symposium on Computer Science Education (St. Louis, Missouri, USA, February 23-27, 2005). SIGCSE '05. ACM, New York, NY, 392-396.
[4] Beyer, S., Rynes, K., Perrault, J., Hay, K., and Haller, S. 2003. Gender differences in computer science students. In Proceedings of the 34th SIGCSE Technical Symposium on Computer Science Education (Reno, Nevada, USA, February 19-23, 2003). SIGCSE '03. ACM, New York, NY, 49-53.
[5] Carter, L. 2006. Why students with an apparent aptitude for computer science don't choose to major in computer science. In Proceedings of the 37th SIGCSE Technical Symposium on Computer Science Education (Houston, Texas, USA, March 3-5, 2006). SIGCSE '06. ACM, New York, NY, 27-31.
[6] Costello, A. B. and Osborne, J. 2005. Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation, 10, 7.
[7] Craig, A., Paradis, R., and Turner, E. 2002. A gendered view of computer professionals: preliminary results of a survey. SIGCSE Bull. 34, 2 (Jun. 2002), 101-104.
[8] Dewitt, P. 2007. Effectiveness of Alice in computer science education. Master's Thesis, Colorado School of Mines.
[9] D'Souza, D., Hamilton, M., Harland, J., Muir, P., Thevathayan, C., and Walker, C. 2008. Transforming learning of programming: a mentoring project. In Proceedings of the Tenth Conference on Australasian Computing Education - Volume 78 (Wollongong, NSW, Australia, January 2008). S. Hamilton and M. Hamilton, Eds. Conferences in Research and Practice in Information Technology Series, vol. 315. Australian Computer Society, Darlinghurst, Australia, 75-84.
[10] Fennema, E. and Sherman, J. A. 1976. Fennema-Sherman mathematics scales. JSAS: Catalog of Selected Documents in Psychology, 6, 31.
[11] Hoegh, A. and Moskal, B. 2009. Examining science and engineering students' attitudes toward computer science. In Proceedings of the Frontiers in Education Conference, San Antonio, Texas.
[12] Hutchinson, A. 2005. A statistical analysis of outcomes in an educational assessment: Impact of the Alice curriculum on male and female performances and attitudes at community colleges. Master's Thesis, Colorado School of Mines.
[13] Hutchinson, A., Moskal, B., Dann, W., and Cooper, S. 2008. Impact of the Alice curriculum on community college students' attitudes and learning with respect to computer science. In Proceedings of the Annual Meeting of the American Society for Engineering Education, Pittsburgh, PA.
[14] Hutchinson, A., Moskal, B., Dann, W., Cooper, S., and Navidi, W. 2008. The Alice curricular approach: A community college intervention in introductory programming courses. Innovations 2008: International Network for Engineering Education Research. Begell House Publishing, Chapter 15, 157-176. ISBN 978-0-9741252-8-2.
[15] Jepson, A. and Perl, T. 2002. Priming the pipeline. SIGCSE Bull. 34, 2 (Jun. 2002), 36-39.
[16] Ladd, B. C. 2006. The curse of Monkey Island: holding the attention of students weaned on computer games. J. Comput. Small Coll. 21, 6 (Jun. 2006), 162-174.
[17] Lewis, C. 2007. Attitudes and beliefs about computer science among students and faculty. SIGCSE Bull. 39, 2 (Jun. 2007), 37-41.
[18] Moskal, B., Behrens, N., Guzdial, M., Tew, A., Dann, W., and Cooper, S. 2007. Computer science assessment instrument development: Evaluating attitudes and outcomes. In Proceedings of the National STEM Assessment Conference, Washington, DC, 194-201.
[19] National Science Foundation. 2006. Table C-14. Bachelor's degrees, by race/ethnicity, citizenship, sex, and field: 2006.
[20] Palaigeorgiou, G. E., Siozos, P. D., Konstantakis, N. I., and Tsoukalas, I. A. 2005. A computer attitudes scale for computer science freshmen and its educational implications. J. of Computer Assisted Learning, 21, 5, 330-342.
[21] Patterson, D. A. 2005. Restoring the popularity of computer science. Commun. ACM 48, 9 (Sep. 2005), 25-28.
[22] Rankin, Y., Gooch, A., and Gooch, B. 2008. The impact of game design on students' interest in CS. In Proceedings of the 3rd International Conference on Game Development in Computer Science Education (Miami, Florida, February 27 - March 3, 2008). GDCSE '08. ACM, New York, NY, 31-35.
[23] Reges, S. 2006. Back to basics in CS1 and CS2. In Proceedings of the 37th SIGCSE Technical Symposium on Computer Science Education (Houston, Texas, USA, March 3-5, 2006). SIGCSE '06. ACM, New York, NY, 293-297.
[24] Rowell, G. H., Perhac, D. G., Hankins, J. A., Parker, B. C., Pettey, C. C., and Iriarte-Gross, J. M. 2003. Computer-related gender differences. In Proceedings of the 34th SIGCSE Technical Symposium on Computer Science Education (Reno, Nevada, USA, February 19-23, 2003). SIGCSE '03. ACM, New York, NY, 54-58.
[25] Schulte, C. and Knobelsdorf, M. 2007. Attitudes towards computer science: computing experiences as a starting point and barrier to computer science. In Proceedings of the Third International Workshop on Computing Education Research (Atlanta, Georgia, USA, September 15-16, 2007). ICER '07. ACM, New York, NY, 27-38.
[26] Schulte, C. and Magenheim, J. 2005. Novices' expectations and prior knowledge of software development: results of a study with high school students. In Proceedings of the First International Workshop on Computing Education Research (Seattle, WA, USA, October 1-2, 2005). ICER '05. ACM, New York, NY, 143-153.
[27] Wiebe, E., Williams, L., Yang, K., and Miller, C. 2003. Computer Science Attitude Survey. Technical Report TR-2003-01, Department of Computer Science, NC State University, Raleigh, NC.
[28] Wiedenbeck, S. 2005. Factors affecting the success of non-majors in learning to program. In Proceedings of the First International Workshop on Computing Education Research (Seattle, WA, USA, October 1-2, 2005). ICER '05. ACM, New York, NY, 13-24.