THE AMERICAN JOURNAL OF DISTANCE EDUCATION, 20(4), 231–244 Copyright © 2006, Lawrence Erlbaum Associates, Inc.

Linguistic Discourse Variables as Indicators of Reflective Online Interaction

Mark Hawkes
Department of Instructional Technology
Dakota State University

The purpose of this study was to determine the efficacy of linguistic analysis approaches traditionally applied to spoken discourse for understanding the nature and outcomes of online interaction. The study took place with twenty-eight elementary school teachers in ten suburban Chicago schools involved in a technology-supported, problem-based learning curriculum development effort. The asynchronous and face-to-face communications of participants were monitored to test the utility of linguistic discourse variables for understanding interaction. The evidence showed that similar sense-making and interaction strategies are used in both face-to-face and online dialogue but that the strategies were significantly more prevalent in face-to-face than in online dialogue. When critical reflection was studied as an outcome of both forms of dialogue, asynchronous electronic communication was significantly more reflective than face-to-face discourse.

Distance learning researchers and instructors continue to have a fascination with online interaction. This interest evolves from a desire among online instructors to have learners take part in high-quality online interactions (Slagter van Tryon and Bishop 2005). Because the literature on online learning shows that the quality of dialogue and the development of the learner depend heavily on facilitation, instructors work tirelessly to promote worthwhile exchanges between online learning participants (Keeley 2004; Slocum, Towns, and Zielinski 2004; Taylor 2005). Ultimately, the extent to which learners find value in their online learning experience and are satisfied with the results rests on the quality of those interactions (Dooley, Kelsey, and Lindner 2003).

Correspondence should be sent to Mark Hawkes, Dakota State University, Department of Instructional Technology, Madison, SD 57042. E-mail: [email protected]

The most obvious of the course interactions take place between the instructor and learner in the form of motivational messages (Visser et al. 2002) and exchanges on instructional content (Carswell et al. 2000). Interaction between learners focuses on the content or protocol of the course (Lee and Gibson 2003) or on social exchange (Sonnenwald and Li 2003; Bernard et al. 2004; Roblyer and Weincke 2003). Learner interaction with the instructional content itself involves the nature of the coursework, modalities of content engagement, and learner preferences for engagement strategies (Harris, Dwyer, and Leeming 2003; Weller 2000). The construct of online or distance learning "interaction," as first defined by Moore (1989), is one that continues to evolve and expand. A number of additional, refined definitions of interaction have emerged from the primary categories provided here. Within the distance learning field, there is acceptance of these differing conceptions of interaction. There is criticism, however, of the ways in which interaction is analyzed and measured.

Although the analysis of online dialogue is necessary to determine the nature and occurrence of interaction, much of that analysis is limited to frequency counts of postings and messages. Jonassen (1996, 263) suggested that, when evaluating the quality of students' contributions to asynchronous conferences, quantifiable criteria based on both the number and the length of posted messages be used. Researchers who have followed this advice have used message frequency counts as the basis for interaction (Cunningham-Atkins et al. 2004; Orrill 2002). Student interaction also has been measured as a ratio of student reactions to an initial document posting (Ronteltap and Eurelings 2002). In other cases, quantifiable interaction outcomes were based on message structure, such as Edelstein and Edwards's (2002) criteria of message coherence and relevance to previously posted messages. Others have proposed even more procedurally focused criteria, such as Bauer and Anderson (2000), who advocated assessment that focuses on the degree to which a student uses complex, grammatically correct sentences on a regular basis. Morrison and Ross (1998) recommended that points be deducted for off-topic comments students make in the collaborative discourse. Swan (2001) based interaction on student responses to multiple-choice questions asking whether learners had "a great deal," "sufficient," "insufficient," or "no" interaction with the instructor.

The frequency of postings in online learning, or criteria for quality university-level work such as writing clarity and organizational skills, can be useful gauges of online interaction. However, they do not help us understand how online discourse participants engage in a way that results in some shared understanding or new knowledge relevant to the instruction. Nor do the traditionally practiced approaches to assessing online interaction suitably explain how the social dimensions of interaction support the group structure and dynamics that are necessary for building learning communities (Kreijns, Kirschner, and Jochems 2003). The purpose of this study was to determine the efficacy of linguistic analysis tools for understanding the nature of online interaction.

Conversational Dynamics of Online Dialogue

Although asynchronous online dialogue is constructed almost exclusively from linguistic signs, linguists have thoughtfully considered online communication in their studies on human conversation. Studying online communication helps determine what contributions are made to the conversation by the medium—in this case a computer network—and the human users (Herring 1996, 4). Applying the accumulated expertise of the discipline of linguistics on spoken and written work lends additional insight into the structure, purpose, and composition of online interaction.

Human involvement in conversation is said to take a combination of three forms: self-involvement of the speaker, interpersonal involvement between the speaker and the hearer, and involvement of the speaker with what is being discussed (Chafe 1985, 116). In an analysis of conversation, Tannen (1989, 17) identified a number of linguistic features she suggested are "involvement strategies" between speaker, hearer, and content. Tannen indicated these involvement strategies are a basic force in both conversational and literary discourse. One of the strategies that Tannen (1989, 23) identified was "participation in sense-making." Tannen suggested that discourse is made effective because the more work readers or hearers do to supply meaning, the deeper their understanding is and the greater their sense of involvement is with both the text and the author.

Yet in the context of computer-mediated communication (CMC), the nonverbal cues conveying social or affective information transmitted through vision (facial expressions, posture, gaze), olfaction (cologne/perfume, body odor), or audition (voice volume, inflection, tone) are absent. This is why online interaction has sometimes been characterized as impersonal, unfriendly, and task oriented and sometimes described as leading to disinhibited behavior (Sproull and Kiesler 1986). It is suggested that such qualities could make group problem solving in the online environment ineffective (Hiltz, Johnson, and Turoff 1986).

In the absence of paralinguistic communication cues, one might consider the online environment a poor facilitator of sense-making strategies. However, compensatory linguistic strategies, such as structuring turn taking and providing equal access to the conversational floor, ease the task of posing clarifying questions about what a participant may have said. The online environment may also augment the process of sense making by minimizing overlapping speech that is often a characteristic of face-to-face group dialogue.

Repetition and fixity are additional involvement strategies that Tannen (1989) discussed. Of repetition, Tannen noted that a persistent feature of conversation is the recollection and reshaping of previous bits of dialogue to apply to new contexts (Tannen 1989, 37). This type of involvement strategy bonds individual participants in both conversation and relationships. Fixity is described as the adaptation of a word or phrase and the repeated use of that word or phrase through an episode of discourse by participants (Tannen 1989, 45). Another conversational involvement strategy is spontaneity. Tannen suggested that speakers repeat, rephrase, and echo words in conversation without stopping to think and as an automatic and spontaneous way of participating in conversation (Tannen 1989, 96). The way spontaneity occurs, Tannen instructed, is through the use of first- and second-person pronouns.

Using linguistic discourse analytical approaches to measure the interactive content of online dialogue presents a new opportunity for the field of distance education to determine the extent to which true interaction in online learning environments takes place. However, interaction is only an intermediate effect and should be gauged as a medium leading to a more valuable outcome (Anderson, du Plessis, and Nickel 2001). To make the evaluation of asynchronous interaction consistent with the nature of the discourse itself, the focus must be on the outcomes of the dialogue rather than the structure. In this study, the terminal outcome desired of the interaction was collaborative and critical reflection.

Collaborative and Critical Reflection

Research describes social learning as an act undertaken by multiple group members in which learning involves sharing and examining teaching experiences with peers (Vygotsky 1978). Comparing and contrasting individual beliefs against those of a group constitutes a process by which meaning—and subsequently knowledge—is formed. Collaborative reflective activity is a social-professional process wherein assumptions and beliefs are examined, challenged, rejected, or adopted through discourse and critical inquiry among a group of people who share similar interests. Cooperative dialogue strongly influences what students learn and how they learn it.

Reflection is noted to be an element of collaboration, an affiliate of dialogue, and a critical step in the process of social learning (Darling-Hammond 1996; Putnam and Borko 1996). Reflection is a continual process that engages learners in framing and reframing problems while designing and evaluating solutions. In the context of asynchronous interaction, reflective properties are what, our experience suggests, have interdisciplinary application to learners, courses, and content.

In this study, we attempted to understand online collaborative interaction by analyzing communication from a knowledge creation standpoint. We explored the use of linguistic strategies by participants in electronic communications as a means to overcome the loss of more traditional communication channels used to make cognitive, situational, and social sense. In contrast to content learning outcomes, shared knowledge creation focuses on the processes of interaction and outlines learning as an interpretation of the meaning that is made in communication with others (Stahl 2003). We propose that linguistic analysis tools have the potential to help make learning visible by examining artifacts of online dialogue using spoken discourse variables.

Methods

This research involved twenty-eight fifth- and sixth-grade teachers who volunteered to participate in problem-based curriculum development in a large suburban Chicago elementary school district (see Table 1). The distinguishing feature of the problem-based learning (PBL) model used here is the emphasis on co-development. For teachers, the objective was to establish a rich knowledge-building community of practice around PBL. Participating teachers worked with colleagues from other schools to create and deliver two interdisciplinary instructional units in ways that would uniquely and appropriately engage their students. Persistent contact with co-participants outside of their school was encouraged to engage teachers around highly authentic tasks—curricular and instructional—that applied directly to their classrooms and schools.

To support the yearlong initiative to build teacher capacity for developing PBL curricula, teacher teams of three to four members met on select days after school. Because union policy and district activities limited the face-to-face contact between teachers participating in the project from various schools, network communication channels were used to host some of the ongoing curriculum co-development and communication among teachers.

Table 1. Characteristics of Study Participants

                                  Study Sample        Districtwide
Participant Characteristics       N       %           N       %
Number of teachers                28      100         975     100
Number of schools                 10      39          27      —
Average school size               623     —           657     —
Teacher gender
  Female                          25      89          845     87
  Male                            3       11          130     13
Teaching tenure (average)         18      —           15      —
Teacher ethnicity
  Asian                           0       0           10      1
  Black                           0       0           7       1
  Hispanic                        0       0           10      1
  White                           28      100         948     97

The primary asynchronous communication mode was e-mail. The communication generated through intranet e-mail constituted the primary data by which discourse interaction and the resulting reflective outcomes were examined.

To determine the nature of interaction when teachers communicate under normal circumstances, a baseline of teacher discourse was gathered by audio recording six face-to-face meetings of teacher work groups ranging from the start to the end of the program. Teachers were informed that their meetings were being audio taped as a part of the data collection for project evaluation. Teachers were not aware that their interaction and reflective discourse were the variables of interest in this study.

The recording of the face-to-face meetings ran concurrently with the collection of ongoing group asynchronous communication. Teachers were free to use the communication tools in any way they chose. The interaction was neither moderated nor facilitated, presenting an opportunity to see how and for what purposes network use naturally evolved. Although participation via electronic means was not required of teachers, it was assumed that, given their geographic distance from other team members, teachers might use the communication mode to address the task of developing content for integrated PBL curricular units.

The e-mail interactions between PBL project participants produced 179 usable messages posted over a four-month period. This total does not include messages posted by administrative staff, misposted messages (resent, misdirected), cross-posted messages from other forums, or messages authored by nonproject personnel.

The audio-recorded face-to-face discourse—which served as a comparative frame of analysis to the e-mail—was "chunked" to equalize in size and number the discourse units produced through the two forms of communication. The chunking of face-to-face discourse was guided by principles of distributional accountability (Schiffrin 1987). Groups of exchanges were said to be distributionally accountable when they perform the same function. The chunking process of the face-to-face discourse resulted in 222 distributionally accountable chunks ranging in size from one to twelve utterances (see Table 2).

Table 2. Face-to-Face and Computer-Mediated Dialogue Content Comparison

Communication Mode               Number of Words   Utterances/Messages   Chunks/Messages   Words per Chunk/Post
Face-to-face discourse           19,000            846                   222               86
Computer-mediated discourse      19,303            179                   179               108

After all identifying information (school and individual) was removed from e-mail messages and face-to-face transcripts, and the communication was comparably framed, discourse analysis began. To understand the levels of participant interaction from the perspective of spoken discourse strategies on the two modes of discourse, three common linguistic discourse variables were selected: involvement strategies, conversational cooperation, and sequential accountability. These variables are defined as follows:

Involvement strategies were defined by Tannen (1989) as the use of selected words that increase the involvement of the hearer. Specific strategies include the use of "wh" clauses (what, where, why, when, who), indefinite pronouns (anybody, everyone, everything), and amplifiers (absolutely, intensely, totally).

Conversational cooperation identifies questions as basic conversational components and asserts that participants cooperate in taking a conversation in a common direction through a series of questioning and answering. When participants fail to answer questions raised in dialogue, conversational cooperation is violated, and interactivity is reduced (Grice 1975).

Sequential accountability refers to the meaning a message or utterance has when it is paired with another utterance. Because the order and co-occurrence of an utterance is generally made meaningful through its relationship with a previous utterance, an unrelated utterance is considered sequentially unaccountable and interactively weak (Schiffrin 1987).
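The marker classes just defined lend themselves to a simple tally. The following is a minimal, hypothetical sketch in Python of how occurrences of "wh" clauses, indefinite pronouns, and amplifiers might be counted in a single discourse chunk or e-mail message, together with the answered-to-unanswered question ratio used for conversational cooperation. The word lists are limited to the examples named above, and the article does not state whether the study's coding was done by hand or by machine.

import re

# Illustrative word lists, limited to the examples given in the definitions above;
# the study's full coding categories (after Tannen 1989) are not enumerated here.
WH_CLAUSES = {"what", "where", "why", "when", "who"}
INDEFINITE_PRONOUNS = {"anybody", "everyone", "everything"}
AMPLIFIERS = {"absolutely", "intensely", "totally"}

def tally_markers(text):
    """Count involvement-strategy markers in one discourse chunk or e-mail message."""
    words = re.findall(r"[a-z']+", text.lower())
    return {
        "wh_clauses": sum(w in WH_CLAUSES for w in words),
        "indefinite_pronouns": sum(w in INDEFINITE_PRONOUNS for w in words),
        "amplifiers": sum(w in AMPLIFIERS for w in words),
        "words": len(words),
    }

def cooperation_ratio(answered, unanswered):
    """Ratio of answered to unanswered questions; a higher ratio indicates
    stronger conversational cooperation (Grice 1975)."""
    return answered / unanswered if unanswered else float("inf")

chunk = "Why don't we ask everyone to post absolutely everything they have so far?"
print(tally_markers(chunk))       # wh_clauses: 1, indefinite_pronouns: 2, amplifiers: 1
print(cooperation_ratio(7, 3))    # hypothetical question counts, for illustration only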

Both the face-to-face and computer-mediated conferencing discourse were evaluated on the basis of each discourse variable. In the cases of "wh" clauses, indefinite pronouns, amplifiers, and sequentially unaccountable utterances, each occurrence in both the face-to-face and the computer-mediated discourse was tallied. Conversational cooperation was determined by measuring the ratio of answered to unanswered questions in the discourse. A chi-square test of association illustrated the relationship between the observed proportions of each interaction variable.

To compare the reflectiveness of the interaction, a team of three raters, independent of the school district, was trained to rate each of the face-to-face and electronically produced messages. The training used the Simmons et al. (1989) taxonomy for assessing reflective thinking. This taxonomy of reflective levels for group discourse ranges from a point at which responses merely describe events and appear disconnected from the observer to a point at which responses richly describe events and attempt to explain them in light of theory or principle. Raters judged each of the chunked exchanges in the face-to-face dialogue (n = 222) and the computer-mediated distance discourse using this seven-level rubric.

Results

Assessing Participant Interaction

Specific discourse strategies used to heighten the interaction of participants in the dialogue were examined. Table 3 presents a comparison of the face-to-face and electronic discourse using the chi-square test of association between the observed proportions in each interaction variable. In all five cases, the ratio of the selected interaction variables in face-to-face communication exceeded that found in the e-mail messages between discourse participants. In three of those five variables ("wh" clauses, amplifiers, and conversational cooperation), results were significant at the p < .01 level. The results show that face-to-face discourse is generally more interactive than the e-mail discourse.

Table 3. Chi-Square Values on Discourse Interaction Variables

Discourse Variables              Parameter    CMC (n)   Face-to-Face (n)   χ2       p
Involvement strategies
  "Wh" clauses                   Words        220       326                22.99    .001
  Indefinite pronouns            Words        435       474                2.404    .065
  Amplifiers                     Words        220       288                10.348   .001
Conversational cooperation       Questions    25        95                 10.007   .002
Sequential accountability        Utterances   15        43                 3.009    .064
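As a worked illustration of the chi-square test of association reported above, the sketch below compares a marker count against the remaining words in each corpus (word totals from Table 2) using SciPy. The 2 x 2 contingency structure and the absence of a continuity correction are assumptions; the article does not specify the exact calculation, so the values obtained here are close to, but do not exactly match, those in Table 3.

from scipy.stats import chi2_contingency

# Corpus sizes from Table 2; marker counts from Table 3.
FTF_WORDS, CMC_WORDS = 19_000, 19_303

def marker_chi_square(ftf_count, cmc_count, ftf_total=FTF_WORDS, cmc_total=CMC_WORDS):
    """Chi-square test of association on a 2 x 2 table of marker vs. non-marker
    words in each communication mode (an assumed contingency structure)."""
    table = [
        [ftf_count, ftf_total - ftf_count],   # face-to-face
        [cmc_count, cmc_total - cmc_count],   # computer-mediated
    ]
    chi2, p, dof, _expected = chi2_contingency(table, correction=False)
    return chi2, p

chi2, p = marker_chi_square(ftf_count=326, cmc_count=220)   # "wh" clauses
print(f'"wh" clauses: chi2 = {chi2:.2f}, p = {p:.3g}')      # about 22.6 here vs. 22.99 in Table 3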

Assessing Reflective Content

Reflection was examined as an indicator of the medium's ability to promote interactivity between discourse participants. Talking, sharing, exploring, and analyzing are important interactions in sense making and, by themselves, constitute key components in the critical reflection process. To determine the levels of reflection in the dialogue, ratings were assigned to the e-mail messages and the face-to-face discourse chunks by three independent raters. When the raters were tested for their consistency, interrater reliability analyses achieved an item alpha level of .80 on the face-to-face discourse and .87 on the e-mail discourse.

When the two communication media were compared on their overall level of reflectiveness, an independent t test for equality of means between groups showed that the e-mail communication had significantly higher ratings on the reflective scale than the face-to-face communication (see Table 4). A breakdown of the ratings of reflectiveness in Figure 1 shows the percentages of ratings assigned to each of the seven reflective levels. The majority of messages (70% for face-to-face, 63% for e-mail) were rated at the first and second levels of reflection, which shows that neither the e-mail nor the face-to-face discourse is abundantly reflective. The fact that two very different distributions appeared for each discourse mode, with positive skews of very different strengths (stronger at the higher end for e-mail than face-to-face discourse), suggests that the two media might serve very different purposes, where reflection may have an intended or unintended role.

Table 4. Comparison of Communication Media on Levels of Reflectiveness

Communication Mode               N     M      SD      t      p
Face-to-face discourse           222   2.34   .962    4.14   .001
Computer-mediated discourse      179   2.73   1.049
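For readers who want to retrace the inferential steps, the sketch below runs an independent-samples t test from the summary statistics in Table 4 and includes a small Cronbach's alpha helper of the kind that could underlie the reported interrater item alpha levels. Equal variances are assumed, the ratings matrix is hypothetical, and because the tabled means and standard deviations are rounded, the computed t only approximates the reported 4.14; the article does not name the exact reliability statistic used.

import numpy as np
from scipy.stats import ttest_ind_from_stats

# Independent-samples t test from the Table 4 summary statistics
# (equal variances assumed; rounding means this only approximates t = 4.14).
t, p = ttest_ind_from_stats(mean1=2.73, std1=1.049, nobs1=179,   # computer-mediated
                            mean2=2.34, std2=0.962, nobs2=222)   # face-to-face
print(f"t = {t:.2f}, p = {p:.4f}")

def cronbach_alpha(ratings):
    """Cronbach's alpha for a (messages x raters) matrix of reflectiveness ratings."""
    k = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings by three raters on four messages, for illustration only.
example = np.array([[2, 2, 3], [1, 1, 1], [3, 4, 3], [2, 3, 2]])
print(f"alpha = {cronbach_alpha(example):.2f}")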

Figure 1. Percentage of Rater Observation at Each Reflective Level

Discussion

The results showed that discourse strategies that generally appear in face-to-face dialogue also appear with frequency in electronic communication, which suggests that these linguistic strategies are a reliable basis for comparing and understanding interaction. When these two modes of interaction were compared on their ability to unite listeners and elicit their participation, the face-to-face dialogue appeared significantly more interactive than the e-mail discourse on three of the five variables analyzed.

A variation between the electronic and face-to-face interaction was the latency with which sense-making cues were received. Because of the time-independent nature of electronic communication, content as well as contextual information relays at a slower pace. This finding is consistent with previous research suggesting that the transmission of socioemotional cues and other patterns of communication occurs at a significantly lower rate in electronic communication than in face-to-face communication (Walther 1992, 1993).

Because interaction is only an intermediate step toward a more significant outcome, participant reflection was identified as a variable of interest. In the end, a direct comparison between the two media showed that e-mail communication exhibited more reflective qualities than the face-to-face dialogue.

The media are different in that e-mail-facilitated reflection appeared to be more sustained while involving a greater number of contributors. In addition, e-mail-facilitated reflection is less context focused than that of face-to-face contact. More so than face-to-face discourse, computer-mediated reflection is generated by the inclusion of ideas and theories outside of the teachers' immediate curriculum development experiences.

A process for assessing interaction that accounts for what Zhang and Fulford (1994) suggested are social and instructional aspects of interaction would account for the nature of the conversation and the collaboratively reflective content it produces. Northrup's (2001) notion of interaction attributes and the strategies that facilitate them (interaction with content, collaboration, conversation, intrapersonal interaction, and performance support) could also serve as a suitable framework for assessing overall participant interactivity resulting in situationally relevant outcomes.

In confirming reflection as a desirable and measurable outcome of asynchronous online interaction over face-to-face dialogue, we discovered that reflection is situational—embedded in the tasks, processes, and events of practice. The asynchronous online modality appears to be a way to satisfy knowledge-based approaches to real-time needs and problems. Because e-mail and other asynchronous communications can be used anytime and anywhere system components can be found, they remain a reliable conduit to knowledge-based resources.

Several possible issues combine to constrain the findings of this study. Although the teachers who were a part of this project worked at some distance from each other, their familiarity and experience in working with each other before their e-mail communication may explain the difference in interaction between e-mail and face-to-face discourse. Teacher participants' level of comfort with the technology also could significantly influence the level of reflection a participant demonstrates via e-mail. Despite these limitations, the study establishes the use of spoken discourse variables as a basis for understanding the nature of online interaction and critical reflection as an outgrowth of electronically supported interaction. This is significant as the context of online interaction grows increasingly wider and new approaches to facilitating participant interaction in distance environments proliferate.

References

Anderson, E. C., J. du Plessis, and T. Nickel. 2001. Participation in international teleconferences and discussions: Implicit assumptions. Educational Technology Research and Development 49 (3): 119–123.

Bauer, J. F., and R. S. Anderson. 2000. Evaluating students' written performance in the online classroom. In Principles of effective teaching in the online classroom, ed. R. E. Weiss, D. S. Knowlton, and B. W. Speck, 65–72. San Francisco: Jossey-Bass.
Bernard, R. M., A. Brauer, P. C. Abrami, and M. Sturkes. 2004. The development of a questionnaire for predicting online learning achievement. Distance Education 25 (1): 31–47.
Carswell, L., P. Thomas, M. Petre, B. Price, and M. Richards. 2000. Distance education via the Internet: The student experience. British Journal of Educational Technology 31 (1): 29–46.
Chafe, W. L. 1985. Linguistic differences produced by differences between speaking and writing. In Literacy, language, and learning: The nature and consequences of reading and writing, ed. D. R. Olsen, N. Torrance, and A. Hildyard, 105–123. Cambridge: Cambridge University Press.
Cunningham-Atkins, H., N. Powell, D. Moore, D. Hobbs, and S. Sharpe. 2004. The role of cognitive style in educational computer conferencing. British Journal of Educational Technology 35 (1): 69–80.
Darling-Hammond, L. 1996. The right to learn and the advancement of teaching: Research, policy, and practice for democratic education. Educational Researcher 25 (6): 5–17.
Dooley, K. E., K. D. Kelsey, and J. R. Lindner. 2003. Doc@distance: Immersion in advanced study and inquiry. Quarterly Review of Distance Education 4 (1): 43–50.
Edelstein, S., and J. Edwards. 2002. If you build it, they will come: Building learning communities through threaded discussion. Online Journal of Distance Learning Administration 5 (1): 1–10.
Grice, H. P. 1975. Logic and conversation. In Syntax and semantics, ed. P. Cole and J. Morgan, 41–58. New York: Academic.
Harris, R. N., W. O. Dwyer, and F. C. Leeming. 2003. Are learning styles relevant in Web-based instruction? Journal of Educational Computing Research 29 (1): 13–28.
Herring, S. 1996. Computer-mediated communication: Linguistic, social, and cross-cultural perspectives. Philadelphia: Benjamins.
Hiltz, S. R., K. Johnson, and M. Turoff. 1986. Experiments in group decision making: Communication process and outcome in face-to-face versus computerized conferences. Human Communication Research 13: 225–252.
Jonassen, D. H. 1996. Computers in the classroom: Mindtools for critical thinking. Englewood Cliffs, NJ: Prentice-Hall.

Keeley, P. 2004. Maine LabNet moderators guide. Augusta, ME: Maine Mathematics and Science Alliance.
Kreijns, K., P. A. Kirschner, and W. Jochems. 2003. Identifying the pitfalls for social interaction in computer-supported collaborative learning environments: A review of the research. Computers in Human Behavior 19: 335–353.
Lee, J., and C. C. Gibson. 2003. Developing self-direction in an online course through computer-mediated interaction. The American Journal of Distance Education 17 (3): 173–187.
Moore, M. G. 1989. Three types of interaction. The American Journal of Distance Education 3 (2): 1–6.
Morrison, G. R., and S. M. Ross. 1998. Evaluating technology-based processes and products. In Changing the way we grade student performance: Classroom assessment and the new learning paradigm, ed. R. S. Anderson and B. W. Speck, 69–77. San Francisco: Jossey-Bass.
Northrup, P. 2001. A framework for designing interactivity in Web-based instruction. Educational Technology 41 (2): 31–39.
Orrill, C. H. 2002. Supporting online PBL: Design considerations for supporting distributed problem solving. Distance Education 23 (1): 41–57.
Putnam, R. T., and H. Borko. 1996. Teacher learning: Implications of new views of cognition. In The international handbook of teachers and teaching, ed. B. J. Biddle, T. L. Good, and I. F. Goodson, 673–708. Dordrecht, The Netherlands: Kluwer.
Roblyer, M. D., and W. R. Weincke. 2003. Design and use of a rubric to assess and encourage interactive qualities in distance courses. The American Journal of Distance Education 17 (2): 77–98.
Ronteltap, F., and A. Eurelings. 2002. Activity and interaction of students in an electronic learning environment for problem-based learning. Distance Education 23 (1): 11–22.
Schiffrin, D. 1987. Discourse markers. Cambridge: Cambridge University Press.
Simmons, J. M., G. M. Sparks, A. Starko, M. Pasch, A. Colton, and J. Grinberg. 1989. Exploring the structure of reflective pedagogical thinking in novice and expert teachers: The birth of a developmental taxonomy. Paper presented at the annual conference of the American Educational Research Association, April, San Francisco.
Slagter van Tryon, P. J., and M. J. Bishop. 2005. Identifying "e-mmediacy" strategies for Web-based instruction: A Delphi study. Quarterly Review of Distance Education 6 (4): 318–328.

Slocum, L. E., M. H. Towns, and T. J. Zielinski. 2004. Online chemistry modules: Interaction and effective faculty facilitation. Journal of Chemical Education 81 (7): 1058–1065.
Sonnenwald, D. H., and B. Li. 2003. Scientific collaboratories in higher education: Exploring learning style preferences and perceptions of technology. British Journal of Educational Technology 34 (4): 419–431.
Sproull, L., and S. Kiesler. 1986. Reducing social context cues: Electronic mail in organizational communication. Management Science 32: 1492–1512.
Stahl, G. 2003. Building collaborative knowing: Elements of a social theory of learning. In What we know about CSCL in higher education, ed. J. W. Strijbos, P. Kirschner, and R. Martens, 53–85. Amsterdam: Kluwer.
Swan, K. 2001. Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Education 22 (2): 306–331.
Tannen, D. 1989. Talking voices: Repetition, dialogue, and imagery in conversational discourse. New York: Cambridge University Press.
Taylor, P. 2005. An examination of the effect of facilitation training on the development and practice of participants in an online induction program for teachers of science and mathematics. Ph.D. diss. proposal, Montana State University, Bozeman.
Visser, L., T. Plomp, R. J. Amirault, and W. Kuiper. 2002. Motivating students at a distance: The case of an international audience. Educational Technology Research and Development 50 (2): 94–110.
Vygotsky, L. S. 1978. Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Walther, J. B. 1992. Interpersonal effects in computer-mediated interaction: A relational perspective. Communication Research 19 (1): 52–90.
———. 1993. Impression development in computer-mediated interaction. Western Journal of Communication 57: 381–398.
Weller, M. J. 2000. Creating a large-scale, third generation, distance education course. Open Learning 15 (3): 243–252.
Zhang, S., and C. P. Fulford. 1994. Are interaction time and psychological interactivity the same thing in the distance learning television classroom? Educational Technology 34 (6): 58–64.
