Development of the Science, Technology, Engineering and Mathematics – Active Listening Skills Assessment (STEM-ALSA)

Kerrie G. Wilkins, Bianca L. Bernstein
Counseling and Counseling Psychology Program, Arizona State University, Tempe, AZ, USA

Caroline J. Harrison
CareerWISE Project Manager/Post-Doctoral Researcher, Arizona State University, Tempe, AZ, USA

Jennifer M. Bekki
Department of Engineering, Arizona State University, Mesa, AZ, USA

Robert K. Atkinson
School of Computing, Informatics, and Decision Systems Engineering, Arizona State University, Tempe, AZ, USA
Abstract: The purpose of this investigation was to develop the STEM Active Listening Skills Assessment (STEM-ALSA), a conceptually grounded instrument designed to measure four components of active listening, a key element of communication in an academic setting. The STEM-ALSA is comprised of three unique scales that measure a person's knowledge (12 items), ability to apply (25 items), and self-efficacy (5 items) with respect to active listening. Two pilot studies were conducted with N = 99 upper-level undergraduate students enrolled in STEM disciplines to develop and evaluate the instrument. Results of exploratory factor analyses identified a unidimensional factor structure for each of the three scales, with total scores showing adequate internal consistency reliability estimates. The STEM-ALSA provides a mechanism for measuring active listening skills among students in STEM.

Keywords: assessment; communication; graduate students in STEM
I. INTRODUCTION
Effective communication skills are an essential commodity in today's workplace. In engineering, technology, and the sciences, communicating disciplinary knowledge to the public is considered critical. Interpersonal communication skills (ICS) are also recognized as key transdisciplinary capabilities necessary for career success [1]. For example, an estimated 50-75% of scientists' work involves communicating with others individually, in small groups, and in teams [2]. The Engineer of 2020 report [3] and other researchers [4], [5], [6] have also highlighted the fact that communication skills will become increasingly important as engineers are required to communicate in globally diverse and interdisciplinary teams. However, despite the expressed importance of these communication skills, numerous research studies, reports from industrial recruiters, and anecdotal evidence from educators indicate that graduates of science, technology, engineering, and mathematics (STEM) disciplines are inadequately equipped to communicate effectively in the workplace [7]. This has led organizations such as the Society of Manufacturing Engineers to list "lack of communication skills" among the top "competency gaps" in engineers' education [8].

In addition to plaguing recent graduates of STEM disciplines who work in industry, inadequate training in ICS at the undergraduate level has implications for graduate studies in STEM. It has been stated that "science these days…increasingly draws on skills in written and oral communication with scientists and non-scientists alike" [9]. For the most part, however, the oral communications that are central in students' daily practices are conversational and informal. Additionally, students who are able to communicate their needs to their graduate program are more likely to complete their degrees [10]. In fact, recent research suggests that graduate students who take a more active role in developing ICS are more successful in graduate school [11], [12].

An ICS identified as particularly important for communicating across disciplines is "active listening" [13]. Active, or empathic, listening can be traced to Carl Rogers [14] and is a cornerstone of his humanistic psychology [15]. Since its introduction, active listening has been found to play an important role in effective communication and has become a mainstay of communication training programs across a variety of fields [16]. In addition to being widely recognized as an important interpersonal communication skill, active listening has also been shown to be a teachable skill. For example, research has shown that the listening skills of counseling students [17] and helpline volunteers [18] improved with active listening training. Moreover, the effects of such training persisted for at least nine months in the case of the helpline volunteers [19]. Yet, despite the importance of the skill and the evidence that it can be learned, there are few empirically validated instruments available to measure active listening skills.

This paper discusses the development of the STEM Active Listening Skills Assessment (STEM-ALSA), which is comprised of three unique measures: knowledge of, self-efficacy in the domain of, and skill in the application of active listening. The context of all the items in the instrument is the advisor-advisee relationship. Advising is at the heart of the institutional and interpersonal structures that make up graduate education [20]. Consequently, it is imperative that the advisor-advisee relationship take on a supportive stance. This is especially the case for female doctoral students, for whom the graduate program milieu is often described as a "chilly climate" [21], [22].

The STEM-ALSA was developed for a specific study within the CareerWISE research program. In the following sections, we provide a brief description of CareerWISE to situate the motivation for the instrument development, outline the process of instrument development, and describe the scales within the instrument in more detail.
II. CareerWISE RESEARCH PROGRAM
The CareerWISE research program is a large, NSF-funded, multidisciplinary research program housed at Arizona State University. The program strives both to understand the reasons that women in STEM doctoral programs drop out and to develop and disseminate a resource that strengthens key personal and interpersonal skills so that women will be better equipped to persist in their doctoral degree programs. Built on an extensive foundation of theory and research, the CareerWISE resource (http://careerwise.asu.edu) is an online resilience training program designed to address the personal and interpersonal challenges of women in science and engineering fields by strengthening their personal assets and supports [23]. Key objectives of the resilience training program are to enhance the communication skills of doctoral women and to improve their interpersonal problem-solving skills.

The CareerWISE resource is unique in that it is an individualized program that pairs empirically based pedagogical materials with an interactive simulation environment designed to hone users' ICS. It is the first program of its kind to provide systematic training in ICS customized for female students in STEM. Consequently, instruction in active listening skills is an essential building block of the ICS training within the CareerWISE program.
III. OVERVIEW OF THE INSTRUMENT
The development of the STEM-ALSA was guided by the following definition of active listening: "active listening requires that the listener try to understand the speaker's own understanding of an experience without the listener's own interpretive structures intruding on his or her understanding of the other person" [13, p. 35]. The goal in active listening is to develop a clear understanding of the speaker's concern and also to clearly communicate the listener's interest in the speaker's message [18]. The CareerWISE team specified four subskills for active listening and defined them as indicated in Table I: asking open-ended questions, attending to nonverbal cues, listening for critical information, and perception checking. Items in the instrument were specifically included to assess each of these four elements.

TABLE I. DESCRIPTION OF ACTIVE LISTENING SUBSKILLS

Asking open-ended questions: Questions that are broadly framed to encourage elaboration and allow for responses other than yes or no.
Attending to nonverbal cues: Observing the speaker's nonverbal cues (e.g., facial expressions, hand gestures, posture) for information about the meaning of the speaker's message.
Listening for critical information: Identifying the main points of a speaker's message.
Perception checking: Ensuring that the listener understands the speaker's message by paraphrasing the listener's interpretation of the speaker's feelings and message content.

The STEM-ALSA is comprised of three unique scales. The Knowledge Assessment scale measures the respondent's self-reported current knowledge of active listening. The following are two sample items from this scale: "I know how to restate a speaker's message to verify my understanding" and "I know how to convey nonverbally that I am interested in what the other person is saying." Response options for this scale are arrayed on a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree).

The Self-Efficacy Assessment scale is designed to measure an individual's confidence in her own ability to use active listening skills in an academic setting. Two sample items from this scale are: "I can detect the important messages in a conversation with professors" and "I can ensure that I have understood the speaker's message." Response options for this scale are also arrayed on a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree).
The third scale, Skills Assessment, measures the participant's ability to actually apply her active listening skills. Skills are often measured using an observation rating approach [24]; however, actual observation can be costly and time intensive. To get around this issue, the STEM-ALSA uses a self-report format to measure skill application. In this section of the instrument, a series of scenarios is presented, modeling situations that could realistically occur for a female doctoral student in STEM. Each scenario includes a stated goal for the student. Following each scenario are four responses, each corresponding to a particular course of action that could assist or hinder the student in achieving her desired outcome. Participants are asked to rate the likelihood of achieving the desired goal for each action on a five-point Likert scale with response options ranging from 1 (very unlikely) to 5 (very likely). Figure I gives a sample scenario and its corresponding items.

A month ago, Dr. Simpson asked Sarah to organize an informal bi-weekly meeting at which graduate students from the department would present their research to each other. Today is the first of those meetings, and Sarah is presenting her own research to kick things off. During her presentation, she notices that Dr. Simpson appears to be falling asleep and is not paying attention to what she is saying. For each of the following actions, indicate how likely it is to assist Sarah in getting his attention.
1. Pause for a moment to convey nonverbally that she is waiting for his attention.
2. Talk as usual while ignoring his lack of interest.
3. Catch Dr. Simpson's gaze when he looks up.
4. Speak with more animation and address him by name.

Figure I: Sample item from the Skills Assessment scale
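To make the item format concrete, the sketch below is our own Python illustration (not part of the instrument or the authors' materials; all field names are hypothetical) of how one such scenario item could be represented for administration or scoring.

```python
# A minimal sketch, assuming a plain dictionary layout, of one Skills
# Assessment item: a scenario, the student's goal, four candidate actions,
# and the 5-point likelihood scale used to rate each action.
skills_item = {
    "scenario": (
        "During her presentation, Sarah notices that Dr. Simpson appears "
        "to be falling asleep and is not paying attention."
    ),
    "goal": "Get Dr. Simpson's attention.",
    "actions": [
        "Pause for a moment to convey nonverbally that she is waiting.",
        "Talk as usual while ignoring his lack of interest.",
        "Catch Dr. Simpson's gaze when he looks up.",
        "Speak with more animation and address him by name.",
    ],
    # Each action is rated separately: 1 (very unlikely) ... 5 (very likely)
    "rating_scale": (1, 5),
}
```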
IV. INSTRUMENT DEVELOPMENT METHOD AND RESULTS
The STEM-ALSA instrument was developed in three phases. The first phase consisted of initial item development and expert feedback, the second phase involved piloting and then modifying the instrument based on expert feedback (pilot study #1), and the third phase included a second pilot and associated modifications (pilot study #2).
A. Phase 1 – Initial Item Development
Initially, 16 items were written for the Knowledge Assessment scale, 32 items for the Skills Assessment scale, and 8 items for the Self-Efficacy Assessment scale. Items were developed using the literature on active listening and were examined by an interdisciplinary research team consisting of students and faculty from the disciplines of counseling psychology, engineering, and educational technology. To assess content validity, three experts in psychological measurement and interpersonal communication provided open-ended comments on the original items in each of the three scales. The experts rated each item on content appropriateness and clarity/readability using a 5-point scale that ranged from 1 (not at all appropriate or clear) to 5 (very appropriate or clear). Based on their feedback, several items were revised for clarity. Two of the original content experts then re-evaluated the three revised measures for appropriateness and clarity.

B. Phase 2 – Pilot Study #1 Methods
The purpose of pilot study #1 was to examine the initial factor structure of the items in each of the three scales of the STEM-ALSA. During the pilot, undergraduate women majoring in mathematics, sciences, and engineering at a large southwestern public university were recruited through university organizations and contacts within the departments. Participants were given a short online introduction to communication skills and were then presented with the instrument so that their active listening skills could be assessed. Demographic information was also collected. A total of 72 participants (primarily juniors and seniors, with an average age of 21.6) completed the anonymous online pilot study. The majority of participants reported that they were US citizens with English as their primary language, and over 90% of those who responded stated that they had an advisor with whom they interacted. Participants who completed the study were given a $25 gift card redeemable at the university bookstore.

Using the participant responses, a principal-components analysis was performed on the items in the STEM-ALSA; a separate analysis was performed for each of the three scales. Of note is that responses on the Skills Assessment scale were scored in a unique way. First, three experts were asked to score each item for how likely it would be to produce the desired outcome, and the mean of those values was recorded as the final "best answer." Participant responses were then recoded based on this best answer: responses in the same direction as the best answer (i.e., on the same side of "3," the midpoint of the scale) were scored as "1," and those not in the direction of the best answer were scored as "0." For example, if the best answer for an item, based on expert scoring, was 4.33 and a participant's rating was 5 (also above "3"), the participant's score was recoded to "1." If the participant's rating was instead "2," which is in the opposite direction from the best answer, the score was recoded to "0."
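This direction-based recoding is straightforward to express in code. The sketch below is a minimal Python illustration of the rule as described above, not the authors' scoring script; the text does not specify how a response exactly at the midpoint is handled, so here it is scored 0.

```python
def best_answer(expert_ratings):
    """Final 'best answer' for one response option: the mean expert rating."""
    return sum(expert_ratings) / len(expert_ratings)

def recode(response, best, midpoint=3):
    """Score 1 if the participant's response falls on the same side of the
    scale midpoint as the best answer, else 0. (A response exactly at the
    midpoint is not addressed in the text; it is scored 0 here.)"""
    if best > midpoint:
        return 1 if response > midpoint else 0
    return 1 if response < midpoint else 0

# Example from the text: expert mean 4.33, so a rating of 5 (same side of
# the midpoint) is recoded to 1, and a rating of 2 is recoded to 0.
best = best_answer([4, 4, 5])  # = 4.33
assert recode(5, best) == 1
assert recode(2, best) == 0
```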
C. Phase 2 – Pilot Study #1 Results
Analyses of each of the three scales in the STEM-ALSA indicated that a one-factor solution was most interpretable. In the Knowledge Assessment scale, the factor structure accounted for 33.31% of the variance. All 16 items loaded at or above 0.40, and the internal consistency of the scale was good, with a Cronbach's alpha [25] of 0.84. In the Self-Efficacy scale, three of the eight items were deleted because the factor accounted for only a small amount of their unique variance. After excluding these three items, the structure of the scale was reanalyzed. The revised structure accounted for 62% of the total variance and was comprised of five items that loaded above 0.70. Cronbach's alpha [25] for the total score of the one-factor solution in the Self-Efficacy scale was 0.85. The results were not as good for the Skills Assessment scale: the one-factor solution in that scale, while most interpretable, accounted for only 14% of the total variance, and Cronbach's alpha [25] for the total score was only 0.38.

D. Phase 3 – Pilot Study #2 Methods
After completing pilot study #1, the three experts in interpersonal communication and psychometrics were again consulted to examine the items of the three scales, paying particular attention to the Skills Assessment scale. Based on their analysis, items in the Skills Assessment scale for which the experts disagreed on the general direction of the best response were dropped (e.g., when two of the three experts thought an action was generally a good course of action and one did not). Following the expert input, a second pilot study was conducted to further examine the factor structure of the revised STEM-ALSA. For this pilot, undergraduate students in an introductory computer informatics course at a large southwestern public university were recruited. As in the first pilot study, participants were given a short online introduction to communication skills and were then presented with the instrument so that their active listening skills could be assessed. Demographic information was also collected. Students' majors included liberal arts and communication, as well as computing. A total of 27 participants (primarily sophomores and juniors, with an average age of 21.3) completed the second pilot study. The majority of participants reported that they were US citizens with English as their primary language, and over 76% of those who responded stated that they had an advisor with whom they interacted. Participants who completed the study received course credit for their participation.

E. Phase 3 – Pilot Study #2 Results
Analysis of the Knowledge Assessment scale in pilot study #2 supported the one-factor solution. However, the factor solution accounted for less than 50% of the individual variance for four of the 16 items. Consequently, these four items were excluded from the instrument. The revised structure accounted for 56% of the total variance and included 12 items that loaded above 0.50 (see Table II). Cronbach's alpha for the total score of the one-factor solution was 0.93.
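As a rough illustration of the item screening just described (our own sketch, not the authors' analysis code), the snippet below computes each item's loading on the first principal component of the item correlation matrix and flags items for which that component accounts for less than 50% of the item's variance, i.e., whose squared loading falls below 0.50.

```python
import numpy as np

def first_factor_screen(X, threshold=0.50):
    """X: response matrix, shape (participants, items). Returns loadings on
    the first principal component of the item correlation matrix and a
    boolean mask of items to retain (squared loading >= threshold)."""
    corr = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)   # eigenvalues in ascending order
    pc1 = eigvecs[:, -1]
    pc1 = pc1 if pc1.sum() >= 0 else -pc1     # fix the arbitrary sign
    loadings = pc1 * np.sqrt(eigvals[-1])     # item-component correlations
    keep = loadings ** 2 >= threshold         # variance accounted per item
    return loadings, keep
```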
A one-factor solution in the second analysis of the Self-Efficacy Assessment was also robust and conceptually sound. The measure accounted for 69% of the total variance and included five items that loaded above 0.70 (see Table III). The alpha coefficient for the total score of the one-factor solution was 0.88.

Based on expert suggestions and the statistical results of pilot study #1, a number of items in the Skills Assessment scale were deleted and/or revised. Analysis of the revised, 25-item Skills Assessment scale indicated that a one-factor solution again yielded the most interpretable solution. This factor structure accounted for 16% of the variance, the alpha coefficient for the total score of the one-factor solution was 0.52, and factor loadings ranged from 0.04 to 0.72. Therefore, no support was found for this scale, and further revision is necessary.
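For reference, the internal consistency estimates reported throughout are coefficient alpha [25]: for k items, alpha = (k / (k - 1)) * (1 - (sum of item variances) / (variance of total scores)). The following is a minimal NumPy sketch of that computation, assuming responses are held in an array with one row per participant; it illustrates the statistic, not the authors' code.

```python
import numpy as np

def cronbach_alpha(X):
    """Coefficient alpha [25] for X with shape (participants, items)."""
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = X.sum(axis=1).var(ddof=1)     # variance of participants' totals
    return (k / (k - 1)) * (1 - item_vars / total_var)
```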
V. CONCLUSION
The development and initial validation of the STEM-ALSA is an important preliminary step in filling the gap in empirically validated instruments for measuring active listening. The scales (Knowledge Assessment, Self-Efficacy Assessment, and Skills Assessment) were designed to measure a respondent's perceived knowledge of, self-efficacy in, and ability to apply active listening skills, respectively. The results indicated that two of the three scales in the instrument, Knowledge and Self-Efficacy, demonstrated high internal consistency and fit a unidimensional factor solution.

The STEM-ALSA also contained a unique section for measuring the application of active listening skills. In this section, a scenario-based approach was used, representing a novel approach for measuring the ability to apply active listening skills. However, the analyses suggest that this section of the instrument needs further validation and examination of its factor structure. In future work, we plan to further assess the scenario-based approach of the Skills Assessment scale and to examine the construct validity of the STEM-ALSA scales. Finally, of note is that although the STEM-ALSA was developed for use in the context of a specific study, further studies can evaluate the usefulness of the instrument for measuring academic communication skills in other populations (e.g., STEM undergraduate students) as well as within other professions.

TABLE II. STEM-ALSA KNOWLEDGE ASSESSMENT ITEMS, FACTOR LOADINGS, MEANS, AND STANDARD DEVIATIONS FOR STUDY 2
Item | Factor 1 loading | M | SD
1. I know how to quiet my own thoughts in order to listen carefully. | 0.57 | 3.93 | 0.96
2. I know how to listen and watch for the main points of a speaker's message. | 0.71 | 4.37 | 0.69
3. I know how to identify the overarching message even when other topics come up. | 0.81 | 3.63 | 1.08
4. I know how to ask a question that doesn't give away the answer I'm hoping to receive. | 0.70 | 3.96 | 0.90
5. I know how to ask questions that will encourage the other person to elaborate. | 0.76 | 3.85 | 1.20
6. I know how to find out more about the other person's perspective. | 0.75 | 4.19 | 0.79
7. I know how to ensure that I have understood someone's point of view. | 0.78 | 3.96 | 1.09
8. I know how to restate a speaker's message to verify my understanding. | 0.82 | 4.19 | 0.92
9. I know how to check whether what I heard is what the speaker meant. | 0.87 | 3.81 | 1.04
10. I know how to convey nonverbally that I am interested in what the other person is saying. | 0.73 | 4.11 | 1.05
11. I know how to recognize when someone I am conversing with is distracted. | 0.72 | 4.26 | 0.98
12. I know how to be consistent in what I'm saying and how I'm saying it. | 0.71 | 3.70 | 1.10

Note. Scores on individual items in the STEM-ALSA Knowledge Assessment scale ranged from 1-5. Total scores ranged from 12-60.
TABLE III. SELF-EFFICACY ASSESSMENT ITEMS, FACTOR LOADINGS, MEANS, AND STANDARD DEVIATIONS FOR STUDY 1

Item | Factor 1 loading | M | SD
1. I can detect the important messages in a conversation with professors. | 0.86 | 3.93 | 0.92
2. I can identify the intended meaning of a verbal message even when it is phrased ambiguously. | 0.88 | 4.04 | 0.85
3. I can behave in a manner that is consistent with how I am feeling. | 0.72 | 3.78 | 0.97
4. I can verify my perceptions of what the other person is telling me. | 0.90 | 3.81 | 0.88
5. I can ensure that I have understood the speaker's message. | 0.77 | 3.78 | 0.85

Note. Scores on individual items in the STEM-ALSA Self-Efficacy Assessment scale ranged from 1-5. Total scores ranged from 5-25.
ACKNOWLEDGMENT
We would like to thank the experts who reviewed the STEM-ALSA instrument for their valuable input and suggestions. This work was supported by the National Science Foundation, Grants 0634519 and 0910384. Any opinions, findings, conclusions, and recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
REFERENCES
[1] B. L. Bernstein et al., "The continuing evolution of the research doctorate," in M. Nerad and B. Evans, Eds., Preparing PhDs for a Global Future: Forces and Form in Doctoral Education Worldwide. Rotterdam, Netherlands: Sense Publishers, in press.
[2] A. L. Darling and D. P. Dannels, "Practicing engineers talk about the importance of talk: A report on the role of oral communication in the workplace," Communication Education, vol. 52, pp. 1-16, 2003.
[3] National Academy of Engineering, The Engineer of 2020: Visions of Engineering in the New Century. Washington, DC: National Academies Press, 2004.
[4] J. P. Trevelyan, "Reconstructing engineering from practice," Engineering Studies, vol. 2, no. 3, pp. 175-195, 2010.
[5] N. Spinks, N. L. J. Silburn, and D. W. Birchall, Educating Engineers for the 21st Century: The Industry View. Henley, England: Henley Management College, 2006.
[6] N. Spinks, N. L. J. Silburn, and D. W. Birchall, "Making it all work: the engineering graduate of the future, a UK perspective," European Journal of Engineering Education, vol. 32, no. 3, pp. 325-335, 2007.
[7] D. Vest, M. Long, and T. Anderson, "Electrical engineers' perceptions of communication training and their recommendations for curricular change: Results of a national survey," IEEE Transactions on Professional Communication, vol. 39, no. 1, pp. 38-42, March 1996.
[8] E. L. Allen, A. J. Muscat, and E. D. H. Green, "Interdisciplinary team learning in a semiconductor processing course," in Proceedings, 1996 Frontiers in Education Conference, ASEE/IEEE, 1996.
[9] A. Moore, "What you don't learn at the bench: Conclusions from the EMBO/ELSF-organized meeting on career prospects in the life sciences," EMBO Reports, vol. 3, pp. 1018-1020, 2002.
[10] B. E. Lovitts, Leaving the Ivory Tower: The Causes and Consequences of Departure from Doctoral Study. Lanham, MD: Rowman & Littlefield Publishers, 2001.
[11] T. L. Raoul Tan and D. Potocnik, "Are you experienced? Junior scientists should make the most of opportunities to develop skills outside the laboratory," EMBO Reports, vol. 7, pp. 961-964, 2006.
[12] E. M. Tomazou and G. T. Powell, "Look who's talking too: Graduates developing skills through communication," Nature Reviews Genetics, vol. 8, September 2007.
[13] H. Weger, G. R. Castle, and M. C. Emmett, "Active listening in peer interviews: The influence of message paraphrasing on perceptions of listening skill," The International Journal of Listening, vol. 24, pp. 34-49, 2010.
[14] C. R. Rogers, Client-Centered Therapy. Boston: Houghton Mifflin, 1951.
[15] A. B. Orlov, "Carl Rogers and contemporary humanism," Journal of Russian and East European Psychology, vol. 30, pp. 36-41, 1992.
[16] L. O'Shea, R. Algozzine, D. Hammittee, and D. O'Shea, Families and Teachers of Individuals with Disabilities: Collaborative Orientations and Responsive Practices. Boston: Allyn & Bacon, 2000.
[17] D. McNaughton, D. Hamlin, J. McCarthy, D. Head-Reeves, and M. Schreiner, "Learning to listen: Teaching an active listening strategy to preservice education professionals," Topics in Early Childhood Special Education, vol. 27, pp. 223-231, Winter 2007.
[18] D. H. Levitt, "Active listening and counselor self-efficacy: Emphasis on one micro-skill in beginning counselor training," The Clinical Supervisor, vol. 20, pp. 101-115, 2001.
[19] A. Paukert, B. Stagner, and K. Hope, "The assessment of active listening skills in helpline volunteers," Stress, Trauma, and Crisis, vol. 7, pp. 61-76, 2004.
[20] V. Chapman and T. Sork, "Confessing regulation or telling secrets? Opening up the conversation on graduate supervision," Adult Education Quarterly, vol. 51, pp. 94-107, 2001.
[21] S. Prentice, "The conceptual politics of chilly climate controversies," Gender and Education, vol. 12, pp. 195-207, 2000.
[22] R. Hall and B. Sandler, "The classroom climate: A chilly one for women?" Project on the Status and Education of Women, Association of American Colleges, Washington, DC, 1982.
[23] B. L. Bernstein, "Managing barriers and building supports in science and engineering doctoral programs: Conceptual underpinnings for a new online training program for women," Journal of Women and Minorities in Science and Engineering, vol. 17, pp. 29-50, 2011.
[24] N. Mishima, H. Kubota, and S. Nagata, "The development of a questionnaire to assess the attitude of active listening," Journal of Occupational Health, vol. 42, pp. 111-118, 2000.
[25] L. J. Cronbach, "Coefficient alpha and the internal structure of tests," Psychometrika, vol. 16, pp. 297-334, 1951.