Learning and Assessing Collaborative Problem Solving Skills

Yigal Rosen, Pearson, USA
Iris Wolf, World ORT, Israel

ISTE 2014, Atlanta

Introduction

• PISA 2015 collaborative problem solving assessment framework
• Lack of research on learning and assessment solutions
• Study I: student performance in human-to-human and human-to-agent collaborative assessment tasks
• Study II: ANIMALIA science collaborative problem solving, a joint learning and assessment project with World ORT and the Israeli Ministry of Education

Standardized CPS assessment

To assess CPS skills at the individual level in a standardized way, each student must be matched with various types of group members, tested on a variety of CPS skills, and required to apply those skills in varied contexts. One solution to this standardization problem is to use computer-based agents as the collaborators in the interactions.
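The sketch below illustrates this idea under assumed names and scripted phrases (it is not the actual assessment system): a computer agent plays one fixed collaborator profile in a phrase-based chat, so every student can be paired with the same standardized partner.

```python
# Minimal sketch, assuming an illustrative AgentCollaborator class and phrases;
# not the actual Pearson/PISA assessment system.

from dataclasses import dataclass, field

@dataclass
class AgentCollaborator:
    """Plays one fixed collaborator profile so every student faces the same partner behavior."""
    profile: str                                 # e.g., "cooperative" or "low-initiative"
    script: dict = field(default_factory=dict)   # maps student phrase -> scripted agent reply

    def respond(self, student_phrase: str) -> str:
        # A real system would also branch on task state; here we only look up a scripted reply.
        return self.script.get(student_phrase, "Can you explain what you mean?")

# The same scripted partner can be paired with every student, which is what
# makes individual-level CPS scores comparable across test takers.
agent = AgentCollaborator(
    profile="cooperative",
    script={
        "Let's split the task.": "OK, I will check the left panel and report back.",
        "What did you find?": "My settings did not work. What about yours?",
    },
)
print(agent.respond("Let's split the task."))
```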

Study I: Automatically-scored CPS assessment


Study I: Objectives and Method

Exploring students' CPS performance in two settings: human-to-human (H-H) and human-to-agent (H-A).
• Data collected in the United States, Singapore, and Israel
• 179 14-year-old students: 136 in H-A and 43 in H-H
• Schools were selected based on sufficient technology infrastructure, students proficient in English, and interest in 21st century skills


Study I: Results

Students who collaborated with a computer agent showed a significantly higher level of performance in CPS.

[Chart: Effect sizes, H-A vs. H-H. Shared understanding: 0.6*, monitoring progress: 0.4*, quality of feedback: 0.5*, problem-solving: -0.3]
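For readers interested in how effect sizes such as those in the chart are typically derived, the sketch below computes Cohen's d (standardized mean difference with a pooled standard deviation) between the H-A and H-H groups. The scores are invented for illustration; this is not the study's data or analysis code.

```python
# Hedged sketch: Cohen's d between two groups on one CPS measure.
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d = (mean_a - mean_b) / pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Illustrative (made-up) shared-understanding scores for H-A and H-H students.
h_a_scores = [0.72, 0.65, 0.80, 0.58, 0.77]
h_h_scores = [0.55, 0.60, 0.48, 0.66, 0.52]
print(round(cohens_d(h_a_scores, h_h_scores), 2))
```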

Limitations and Directions for Future Studies

• The study is based on a relatively small and non-representative sample of 14-year-old students in three countries, and on self-reported school achievement.
• The study operationalized communication between the partners in CPS through a phrase-chat to ensure standardization and automatic scoring; other approaches could be considered, including spoken conversations and open chat (a sketch of phrase-chat scoring follows this list).
• Only one type of CPS task has been explored. The comparability of H-A and H-H performance may differ in other problem-solving and collaboration contexts.
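As noted above, the phrase-chat is what makes automatic scoring tractable: each selectable phrase can be pre-tagged with the CPS skill it evidences, so scoring a student's chat log reduces to tallying tags. The phrases and skill labels below are illustrative assumptions, not the study's actual coding scheme.

```python
# Hedged sketch of automatic scoring from phrase-chat selections.
from collections import Counter

# Assumed mapping from selectable phrases to the CPS skill they evidence.
PHRASE_TAGS = {
    "I think we should each try a different setting.": "shared_understanding",
    "What result did you get?": "monitoring_progress",
    "Good idea, that narrows it down.": "quality_of_feedback",
    "Let's try temperature high and humidity low.": "problem_solving",
}

def score_chat(selected_phrases):
    """Count how often each CPS skill is evidenced in a student's chat log."""
    return Counter(PHRASE_TAGS[p] for p in selected_phrases if p in PHRASE_TAGS)

log = ["What result did you get?", "Good idea, that narrows it down."]
print(score_chat(log))
```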

Study II: ANIMALIA Science CPS

• An interdisciplinary team of teachers and students from Israel, the United States, and Mexico is collaborating across a newly-developed eight-week online module
• A total of 312 students (8th and 9th grade) and 16 teachers across 8 schools are participating in the pilot study
• Students work in teams of 4 partners across schools and countries to gather and evaluate information about a fictionalized eco-system
• Data collection will be completed in July 2014 by external and internal teams

The ANIMALIA module includes:
• Professional development for teachers
• Science pretest and posttest
• Setting the stage for the scenario
• Eco-system map
• Interdependency between team members

Challenges

• Language: English, Hebrew, Spanish
• Time zone differences
• Social interactions
• Complexity of the Animalia task: science and collaboration activities
• Alignment with school curriculum
• Teacher support and incentives
• Management and coordination
• Technology

Next Steps

• Language: English as a second language (ESL)
• Time zone differences: synchronous collaboration
• Social interactions: activities embedded into the task
• Complexity of the Animalia task: enrich science and collaboration activities
• Alignment with school curriculum: Animalia in support of a larger curriculum, hands-on science activities, credit to students
• Management and support: onsite World ORT project manager, webinars, technology support

Further reading

• Graesser, A., Foltz, P., Rosen, Y., Forsyth, C., & Germany, M. (in press). Assessing collaborative problem solving. In B. Csapo, J. Funke, & A. Schleicher (Eds.), The nature of problem solving. OECD Series.
• OECD (2013). PISA 2015 Collaborative Problem Solving Framework. OECD Publishing.
• O'Neil, H. F., Jr., & Chuang, S. H. (2008). Measuring collaborative problem solving in low-stakes tests. In E. L. Baker, J. Dickieson, W. Wulfeck, & H. F. O'Neil (Eds.), Assessment of problem solving using simulations (pp. 177-199). Mahwah, NJ: Lawrence Erlbaum Associates.
• Puntambekar, S., Erkens, G., & Hmelo-Silver, C. (2011). Analyzing interactions in CSCL: Methods, approaches and issues. New York: Springer.
• Rosen, Y., & Foltz, P. (in press). Assessing collaborative problem solving through automated technologies. Research and Practice in Technology Enhanced Learning.
• Rosen, Y. (2014). Comparability of conflict opportunities in human-to-human and human-to-agent online collaborative problem solving. Technology, Knowledge and Learning, 18(3).
• Rosen, Y., & Rimor, R. (2012). Teaching and assessing problem solving in online collaborative environment. In R. Hartshorne, T. Heafner, & T. Petty (Eds.), Teacher education programs and online learning tools: Innovations in teacher preparation (pp. 82-97). Hershey, PA: Information Science Reference, IGI Global.

THANK YOU! [email protected] [email protected]
