Proceedings TEEM’18

Sixth International Conference on Technological Ecosystems for Enhancing Multiculturality Salamanca, Spain October 24th – 26th, 2018

Editor: Francisco José García-Peñalvo, University of Salamanca

TEEM’18 is organized by the Research GRoup in InterAction and eLearning (GRIAL) and Research Institute for Educational Sciences (IUCE) at the University of Salamanca.

Sixth International Conference on Technological Ecosystems for Enhancing Multiculturality (TEEM'18)
GRIAL Research Group
Research Institute for Educational Sciences (IUCE)
Paseo de Canalejas 169
37008 Salamanca, Spain

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Association for Computing Machinery (ACM)
2 Penn Plaza, Suite 701
New York, New York 10121-0701
ISBN: 978-1-4503-6518-5

Designed by:
Alejandro Carnicero
Lucía García Holgado
Felicidad García Sánchez
Research GRoup in InterAction and eLearning (GRIAL)


Improving presentation skills in the context of software project management teaching
Terje Samuelsen; Ricardo Colomo-Palacios; Ole Anders Danielsen .... 66

Newton's cradle: a smartphone-based video analysis approach
Pablo Martín-Ramos; Mário S.M.N.F. Gomes; Manuela Ramos Silva .... 71

Motivating Engineering students to Improve Teamwork and Time Management
Sophie Gorgemans; María-Jesús Alonso-Nuez; Jorge Rosell-Martínez .... 78

The application of new teaching methodologies: experience in actual situations
Isabel Revilla; Ana M. Vivar-Quintana .... 86

Experience in the implementation of projects in professional environment in a 1st cycle of studies of Civil Engineering
Diogo Ribeiro; Teresa Neto; Ricardo Santos; Maria de Fátima Portela .... 93

Educational content using Blind/Deaf Communications API
Paula Escudeiro; Bertil Marques; Piedade Carvalho; Ana Barata; Patrícia Queirós; Ana de Sousa; Carlos Dias; Emanuel Rocha; João Ulisses .... 100

Training engineering students for the world of work: a case study
Milagros Huerta; Néstor Mora; Carlota Armillas; Javier Jacob Núñez .... 105

Practical Work and Assessment to Stimulate Students' Participation and Motivation in Fluid Transport Issues
Maria Teresa Sena-Esteves; Cristina Morais; Anabela Guedes; Isabel Brás Pereira; Margarida Ribeiro; Filomena Soares; Celina Pinto Leão .... 113

Outdoor Intelligent Shader. An EPS@ISEP 2018 Project
Christopher Mahon; Manuel Baptista; Marta Majewska; Melanie Tscholl; Sven Bergervoet; Benedita Malheiro; Manuel F. Silva; Cristina Ribeiro; Jorge Justo; Paulo Ferreira; Pedro Guedes .... 122

Evaluation in education and guidance .... 128

Evaluation in education and Guidance: a perspective from 2018
María-José Rodríguez-Conde; Susana Olmos-Migueláñez; Adriana Gamazo; Joe O'Hara .... 129

Construction and validation of a questionnaire to assess student satisfaction with mathematics learning materials
Alién García-Hernández; Teresa González-Ramírez .... 134

Development of information literacy in primary and secondary schools in Castile and León (Spain)
Manuel Lucas-Ledesma; Juan Antonio Hernández-Fuentevilla; Óscar Carbonell-Carqués; Antonio Miguel Seoane-Pardo; María José Daniel-Huerta; Purificación Cardenal-Lubiano .... 139

Big Data in Education: Detection of ICT Factors Associated with School Effectiveness with Data Mining Techniques
Fernando Martínez-Abad; Adriana Gamazo; María José Rodríguez-Conde .... 145

Development and use of mobile technologies that foster students' evaluative judgement: a design-based research
Jaione Cubero-Ibáñez; María Soledad Ibarra-Sáiz; Gregorio Rodríguez-Gómez .... 151

Needs detected by the agents involved in Dual Vocational Training. A single-case study
Marta Virgós Sánchez; Joaquin Lorenzo Burguera .... 157

Using Visualizations to Improve Assessment in Blended Learning Environments
Mikel Villamañe; Ainhoa Alvarez; Mikel Larrañaga; Oscar Hernández-Rivas; Jessica Caballero .... 165


Development and use of mobile technologies that foster students' evaluative judgement: a design-based research

Jaione Cubero-Ibáñez†
EVALfor Research Group, University of Cadiz, Spain
[email protected]

María Soledad Ibarra-Sáiz
EVALfor Research Group, University of Cadiz, Spain
[email protected]

Gregorio Rodríguez-Gómez
EVALfor Research Group, University of Cadiz, Spain
[email protected]

ABSTRACT

Considering the importance of the digital aspects of our society, we must rely on the opportunities offered by technology to improve assessment. This paper presents a study that describes the development and use of mobile technologies in participatory assessment processes, with the intention of encouraging students' critical and evaluative judgement. To this end, design-based research [1] is used to follow a mixed intervention design, in which assessment as learning and empowerment (student participation and feedback) will be put into practice through the use of mobile technologies. The sample will be composed of 1,065 students on different Bachelor's and Master's degree courses at three public universities in Spain. The study sets out to demonstrate the value added by Technology-Enhanced Assessment (TEA) to the emerging field of mobile learning, as well as its contribution to the development of students' evaluative judgement.

CCS CONCEPTS

• Applied computing → Education; Collaborative learning
• Human-centered computing → Ubiquitous and mobile computing; Ubiquitous and mobile computing design and evaluation methods

KEYWORDS

Evaluative judgement, assessment as learning, mobile technologies, Higher Education

ACM Reference format:

Jaione Cubero-Ibáñez, María Soledad Ibarra-Sáiz, and Gregorio Rodríguez-Gómez. 2018. Development and use of mobile technologies that foster students' evaluative judgement: a design-based research. In Proceedings of the 6th International Conference on Technological Ecosystems for Enhancing Multiculturality (TEEM'18) (Salamanca, Spain, October 24-26, 2018), F. J. García-Peñalvo (Ed.). ACM, New York, NY, USA, 6 pages. https://doi.org/10.1145/3284179.3284207

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]. TEEM'18, October 24-26, 2018, Salamanca, Spain © 2018 ACM. ISBN 978-1-4503-6518-5...$15.00 http://dx.doi.org/10.1145/3284179.3284207

1 INTRODUCTION

This research stems from the difficulties faced by university students in making evaluative judgements, as well as in other assessment skills (reasoning, critical thinking, decision-making) which foster strategic, lifelong learning [2]. It is a common criticism that university students lack critical, reflexive and argumentative capacity [3]. This seems to be due to the limited opportunities they receive in their university education to develop the skills they need to become independent and autonomous learners [4]. It is therefore necessary to rethink university practices to equip students with critical assessment skills [5] so that they can consciously and successfully play their role as evaluators in the classroom and throughout their lives [6].

Taking into account today's context of virtualization and digitalization, Technology-Enhanced Assessment (TEA), a broad term that covers the various methods by which technology can be used to support the management and delivery of assessment [7], is a key ally in supporting pedagogy and fostering better assessment practices. This "does not mean simply replacing current types of assessment with digital versions, but instead making use of technology to address some of the operational and pedagogical aspects of assessment" [8].

The objective of this study is to empirically verify the results and the impact obtained from the use of mobile technologies on the development of evaluative judgement in participatory assessment processes, as a means for the training of democratic and critical citizens.

2 THEORETICAL FRAMEWORK

The cornerstone of the proposal is the so-called sustainable assessment, proposed as a type of assessment "that meets the needs of the present and prepares students to meet their own future learning needs" [9], and focusses specifically on assessment as learning and empowerment [10] (see Figure 1), fostering a participatory and transparent assessment in which students can make judgements about their own and their peers' output, favoring self-regulation and the development of other transversal skills required to function in life. This approach is characterized mainly by a) the participation of students in the assessment process through self-assessment or peer-to-peer assessment; and b) the consideration of feedforward as a means of providing information that helps students achieve sustainable learning.

Figure 1: Assessment as learning and empowerment [10]

2.1 Students' evaluative judgement

The term evaluative judgement, as a higher-level cognitive ability required for lifelong learning [11], is relatively new to the higher education literature. It is the capability to make decisions about the quality of work of oneself and others [12]. Two essential, complementary components underpin evaluative judgement: understanding what constitutes 'quality', and applying this understanding through an assessment of the work, whether one's own or that of another person. This second step can be considered the performance of evaluative judgements, which is itself the means by which evaluative judgement is exercised and developed [13].

It is essential, therefore, to provide opportunities for students to exercise and develop their evaluative judgement. This requires active and iterative engagement with the laying down of criteria, the making of judgements on various work samples, dialogic feedback with peers and tutors geared towards the understanding of quality, and the articulation and justification of judgements with a focus on immediate and future tasks [12]. Following publications such as [14-16], we defend the need to emphasize the explicit development of the ability to make evaluative judgements, highlighting that "developing students' evaluative judgement should be a goal of higher education, to enable students to improve their work and to meet their future learning needs" [12].

2.2 Mobile learning

The ubiquity of mobile devices is changing the way people interact with content and with their surroundings. They enable two-way communication in real time, while helping educators to respond efficiently to students' needs. This development is having an impact on both the range and the creation of educational content [17]. Mobile learning, however, involves more than simply using a mobile device to access content and communicate with others: it requires consideration of student mobility. Following [18], we understand it as "the processes (both personal and public) of gaining knowledge via exploration and conversation through multiple contexts between people and interactive technologies". In the same vein, [19] champions mobile learning as a combination of interactions between students, their devices and other people (see Figure 2), arguing that this provides improved collaboration among students, access to information and a deeper contextualization of learning, thereby empowering students by helping them to better assess and select relevant information, redefine their goals and reconsider their understanding of concepts within an ever-changing and growing framework.

Figure 2: A model for framing mobile learning [19].

3 PROPOSAL

3.1 Context of the proposal

This study provides continuity to and extends the research lines of the EVALfor-SEJ 509 Research Group [20], which has a long track record in the design and development of assessment technologies in training contexts. A notable result is the EvalCOMIX® web service [21], which is designed to enhance the electronic assessment capabilities of learning management systems or virtual learning environments such as Moodle, LAMS, etc. This proposal stems from the interest in enhancing students' evaluative judgement through the development and use of mobile technologies. It will be implemented during the 2019/20 academic year in 16 subjects belonging to different Bachelor's and Master's degrees at the Universities of Cadiz (UCA), Salamanca (USAL) and Rovira i Virgili (URIV).

3.2 Research question and operational objectives

The research question that guides the inquiry reflected in this study is as follows: Does student participation in assessment as learning and empowerment processes through the use of mobile technologies foster students' evaluative judgement?

To answer this question, we intend to empirically verify the results and the impact gained from the use of mobile technologies in the development of students' evaluative judgement after engaging in participatory assessment processes. In order to achieve this objective, we have set ourselves the following operational goals:

1. To develop a Conceptual Framework for the study.
2. To design, programme and develop mobile technologies that streamline the implementation of students' evaluative judgement in participatory assessment processes.
3. To experimentally apply these mobile technologies in different subjects within a framework of assessment as learning and empowerment.
4. To evaluate the effects (results and impact) of student participation in assessment by using mobile technologies.

3.3 Methodology

We are proposing this research project on the basis of the design-based research (DBR) methodology, complying with the characteristics that [1] suggest for this methodology: it is situated in a real educational context; it is focused on the design and verification of a meaningful intervention; it uses mixed methods; and it involves a series of iterations (at least two cycles) in which researchers and practitioners work together. Specifically, this study focusses on university classrooms and arises from a detected need, i.e. the development of students' evaluative judgement. The research presents a mixed-approach intervention design (see Figure 3):

Figure 3: Intervention design based on [22].

a) On the one hand, a quasi-experimental pre-test/post-test design will be carried out, taking participation and feedback as independent variables, self-regulation as an intermediate variable (connector), and students' evaluative judgement as the dependent variable, defined in terms of the ability to make judgements, confidence in one's own judgements, and confidence in the judgements made by others. This will all be placed in a context in which mobile technologies are used (see Figure 4). This is a causal model of relationships, in which we want to verify whether the occurrence of the first causes the other. In order to confirm the initial hypothesis, it will be necessary to define the variables (description) and lay down indicators for each one of them to later identify the established relationships.

b) Likewise, a case study will be carried out to explore how students and professors perceive the experience, focusing on aspects such as satisfaction, usefulness of the designed applications, etc.

Figure 4: Causal model of relationship between variables

Both methodologies complement one another and will be specified in at least two continuous cycles (Figure 5) of design validation, implementation, analysis and redesign, leading the different iterations to the enhancement of the prototype (mobile devices) and the refinement of the intervention.

Figure 5: Iterative cycles based on the DBR methodology

3.3.1 Participants. The sample of the study is composed of 1,065 students on different Bachelor's and Master's degrees studying at three universities in Spain. In accordance with the design of the study, two iterative cycles are established to implement the proposal. Table 1 and Table 2 show the specific sample for each period: 1st term and 2nd term of the 2019/20 academic year, respectively.

Table 1. Sample of the pilot study: 1st term 2019/20

Master or Degree | Module | n
Business Administration (UCA) | Project Management (Cádiz, Jerez, Algeciras) | 135
Finance and Accounting (UCA) | Project Management | 25
Early Childhood Education (UCA) | Educational Innovation and Research | 150
Education (USAL) | Assessment Techniques and Instruments | 60
Education (USAL) | Educational Guidance | 60
Master in Secondary Education (USAL) | Educational Guidance | 20
Master in Secondary Education (USAL) | Career Guidance | 20
Business Administration (URIV) | Economics and Business Management | 50
Total | | 520

Table 2. Sample of 2nd intervention: 2nd term 2019/20

Master or Degree | Module | n
Early Childhood Education (UCA) | Systematic Observation | 120
Early Childhood Education (UCA) | Tutoring and Family | 25
Primary Education (UCA) | Educational Foundations | 150
Education (USAL) | Assessment of Courses, Centers and Teachers | 80
Education (USAL) | Educational Guidance | 50
Education (USAL) | Family and Professional Guidance | 20
Journalism and Audiovisual Communication (URIV) | IT Design | 50
Journalism and Audiovisual Communication (URIV) | Communication and Oral Expression | 50
Total | | 545

3.3.2 Instruments. Data will be collected using the following observation and survey techniques (Table 3).

Table 3. Data collection

Information sources | Data collection techniques | Data collection instruments
Tutors | Observation | Tutor diary; Critical incident register
Students | Survey | Initial competence level pre-test questionnaire; Final competence level post-test questionnaire; Questionnaire about the satisfaction and usefulness of the designed applications; Final group interview

Observations will take place using tutor diaries and critical incident registers during implementation. Participating students will be surveyed at the beginning (pre-test) and at the end (post-test) of the intervention on their competence level (evaluative judgement). They will also answer a questionnaire about the satisfaction and usefulness of the designed mobile applications. Finally, in the group interview, general and specific aspects of the suitability of the intervention and of the learning achieved will be discussed.

3.3.3 Data Analysis. Quantitative data will be analyzed using SPSS in order to describe and establish relationships or patterns between the different variables being studied. The qualitative data from the diaries and critical incident registers, the interviews and the open questions on the questionnaire will be analyzed globally following the data analysis process established by [23].
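As an illustration of the kind of quantitative analysis described above, the following is a minimal sketch under assumed names: the CSV file and the columns ej_pre, ej_post, participation, feedback and self_regulation are hypothetical, and the study itself plans to carry out the analysis in SPSS rather than Python.

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# Hypothetical data set: one row per student, with pre/post evaluative
# judgement scores and measures of participation, feedback and self-regulation.
df = pd.read_csv("evaluative_judgement_scores.csv")  # hypothetical file

# Pre-test / post-test comparison of evaluative judgement (paired t-test).
t, p = stats.ttest_rel(df["ej_post"], df["ej_pre"])
print(f"Pre/post difference in evaluative judgement: t = {t:.2f}, p = {p:.4f}")

# Rough check of the causal chain in Figure 4: do participation and feedback
# predict self-regulation, and does self-regulation predict the post-test
# score once the pre-test score is controlled for?
mediator_model = smf.ols("self_regulation ~ participation + feedback", data=df).fit()
outcome_model = smf.ols(
    "ej_post ~ ej_pre + self_regulation + participation + feedback", data=df
).fit()
print(mediator_model.summary())
print(outcome_model.summary())
```

The same descriptive statistics, paired comparisons and regressions can be reproduced directly in SPSS; the sketch only makes explicit which variables play the role of predictor, mediator and outcome in the causal model.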

4 EXECUTION PROCESS

The study will be undertaken in three main phases, each formed by different activities:

a. Design phase
• Development of the conceptual framework: construction of the theoretical model, determining its characteristics, requirements, criteria and methodology; definition of the study variables and establishment of indicators for each one to identify the defined relationships.
• Analysis of existing mobile applications for assessment: identify and analyze the mobile applications currently used in the field of assessment that enable student participation. This allows us to detect what is available and which needs remain unmet.
• Design and development of mobile technologies (an illustrative sketch of the criteria-agreement functionality follows this section):
- Develop an app to agree on assessment criteria, CRITERIAL APP, which allows criteria to be proposed, defined, voted on and selected.
- Introduce improvements into the EvalCOMIX Web Service.
- Adapt the EvalCOMIX Web Service to a mobile application.
- Create a mobile assessment space, where different mobile tools are gathered to facilitate mobile assessment.

b. Implementation phase
This phase will experimentally apply these mobile technologies to different subjects within a framework of assessment as learning and empowerment. This will require teacher follow-up in the face-to-face classroom environment, as well as independent work by students through the designed mobile technologies and the activities programmed on the virtual campus. It will be divided into two stages:
• Awareness: inform students about the benefits and goals of the intervention.
• Exploration: development of the intervention in the selected groups in two iterations.
1. Initial pilot: application in the 1st-term subjects of the 2019/20 academic year (1st iteration).
2. Data collection: application in the 2nd-term subjects of the 2019/20 academic year (2nd iteration).

c. Assessment phase
• Design and validate the data collection instruments.
• Collection of data before application of the programme: apply and analyze data from the initial competence level pre-test questionnaire.
• Collection of data during application of the programme: analyze tutor diaries and critical incident registers.
• Collection of data after the intervention: apply and analyze data from the discussion group with participating tutors, from the final group interviews with students, and from the final competence level post-test questionnaire.

The assessment of the intervention during its development will address both the execution itself and the contextual framework in which it is applied. Special attention will be given to the climate in which it is applied and to the coherence between real needs and programming. The most marked characteristic of the methodological process is that the information collected will be used to tailor the intervention to the needs and characteristics of the recipients and contexts (iterative cycles based on the DBR methodology). The assessment of the results will address achievements and benefits.
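To make the criteria-agreement functionality concrete, the following is a minimal, purely illustrative sketch. The CriteriaBoard and Criterion names and all of their methods are our own assumptions for this example; they do not correspond to the actual CRITERIAL APP or to the EvalCOMIX API.

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """An assessment criterion proposed by a student or tutor (hypothetical model)."""
    author: str
    text: str
    votes: set = field(default_factory=set)  # ids of students who voted for it

class CriteriaBoard:
    """Hypothetical shared space where a class proposes, votes on and selects criteria."""

    def __init__(self):
        self.criteria = []

    def propose(self, author: str, text: str) -> int:
        """Add a new criterion and return its index."""
        self.criteria.append(Criterion(author=author, text=text))
        return len(self.criteria) - 1

    def vote(self, student_id: str, index: int) -> None:
        """Register at most one vote per student for a given criterion."""
        self.criteria[index].votes.add(student_id)

    def select(self, top_n: int = 5):
        """Return the most-voted criteria, i.e. those the group agrees on."""
        return sorted(self.criteria, key=lambda c: len(c.votes), reverse=True)[:top_n]

# Example: a group proposes and votes on criteria for an oral presentation task.
board = CriteriaBoard()
i = board.propose("student_01", "Arguments are supported by evidence")
j = board.propose("student_02", "Conclusions follow from the analysis")
for s in ("student_01", "student_03", "student_04"):
    board.vote(s, i)
board.vote("student_02", j)
print([(c.text, len(c.votes)) for c in board.select(top_n=2)])
```

A mobile front end would expose the propose/vote/select steps to students, while the agreed criteria would then feed the assessment instruments used in the participatory assessment activities.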

5 EXPECTED OUTCOMES

Taking into account today's ever-changing context of unprecedented transformations and increasing global interdependence, this study sets out to contribute towards a participatory and critical society allowing for the inclusion, innovation and reflection of its citizens. More specifically, it envisages:
• Providing mobile technologies that enable student participation in the assessment process.
• Increasing the level of university students' evaluative judgement skills.
• Increasing the degree of university-based innovation in coherence with today's digital context.
• Educating participative, critical, committed students who transfer these skills to a real-life context and contribute to the consolidation of an inclusive and reflective society.

ACKNOWLEDGMENTS

This paper has been made possible thanks to projects financed in the public call: TRANSeval Project, Transforming educational learning assessment, I+D+i 2017/01.

REFERENCES

[1] Terry Anderson and Julie Shattuck. 2012. Design-Based Research: A decade of progress in education research. Educational Researcher, 41(1), 16-25. DOI:10.3102/0013189X11428813
[2] María Soledad Ibarra-Sáiz and Gregorio Rodríguez-Gómez. 2014. Modalidades participativas de evaluación: Un análisis de la percepción del profesorado y de los estudiantes universitarios. Revista de Investigación Educativa, 32(2), 339-361. DOI:10.6018/rie.32.2.172941
[3] Tamsin Haggis. 2003. Constructing images of ourselves? A critical investigation into 'approaches to learning' research in higher education. British Educational Research Journal, 29(1), 89-104.
[4] David Boud and Associates. 2010. Assessment 2020: Seven propositions for assessment reform in higher education. Sydney: Australian Learning and Teaching Council.
[5] Xiongyi Liu and Lan Li. 2014. Assessment training effects on student assessment skills and task performance in a technology-facilitated peer assessment. Assessment and Evaluation in Higher Education, 39, 275-292.
[6] Margaret Price, Chris Rust, Berry O'Donovan, Karen Handley and Rebecca Bryant. 2012. Assessment Literacy: The Foundation for Improving Student Learning. Oxford: OCSLD.
[7] Christine Redecker and Oystein Johannessen. 2013. Changing assessment - Towards a new assessment paradigm using ICT. European Journal of Education, 48(1), 79-96.
[8] University of Reading. 2018. Engage in Assessment: Introducing Technology Enhanced Assessment. Retrieved June 13, 2018 from https://www.reading.ac.uk/engageinassessment/using-technology/eiaintroducing-technology-enhanced-assessment.aspx
[9] David Boud. 2000. Sustainable assessment: rethinking assessment for the learning society. Studies in Continuing Education, 22(2), 151-167.
[10] Gregorio Rodríguez-Gómez and María Soledad Ibarra-Sáiz. 2015. Assessment as Learning and Empowerment: Beyond Sustainable Learning in Higher Education. In Marta Peris-Ortiz and José María Merigó Lindahl (Eds.), Sustainable Learning in Higher Education, Innovation, Technology, and Knowledge Management (pp. 1-20). London: Springer-Verlag. DOI:10.1007/978-3-319-10804-9_1
[11] John Cowan. 2010. Developing the ability for making evaluative judgements. Teaching in Higher Education, 15(3), 323-334. https://doi.org/10.1080/13562510903560036
[12] Joanna Tai, Rola Ajjawi, David Boud, Phillip Dawson, and Ernesto Panadero. 2017. Developing evaluative judgement: enabling students to make decisions about the quality of work. Higher Education, 1-15. Springer Netherlands. https://doi.org/10.1007/s10734-017-0220-3
[13] Joanna Tai, Benedict Canny, Terry Haines, and Elizabeth Molloy. 2016. The role of peer-assisted learning in building evaluative judgement: opportunities in clinical medical education. Advances in Health Sciences Education, 21(3), 659. https://doi.org/10.1007/s10459-015-9659-0
[14] David Boud and Nancy Falchikov. 2006. Aligning assessment with long-term learning. Assessment and Evaluation in Higher Education, 31(4), 399-413.
[15] David Boud and Rebeca Soler. 2016. Sustainable assessment revisited. Assessment & Evaluation in Higher Education, 41(3), 400-413. DOI:10.1080/02602938.2015.1018133
[16] David Carless, Kennedy Chan, Jessica To, Margaret Lo and Elizabeth Barrett. 2018. Developing students' capacities for evaluative judgement through analysing exemplars. In David Boud, Rola Ajjawi, Phillip Dawson and Joanna Tai (Eds.), Developing Evaluative Judgement in Higher Education: Assessment for knowing and producing quality work. London: Routledge.
[17] Adams Becker, S., Cummins, M., Davis, A., Freeman, A., Hall Giesinger, C., and Ananthanarayanan, V. 2017. NMC Horizon Report: 2017 Higher Education Edition. Austin, Texas: The New Media Consortium.
[18] Mike Sharples, Josie Taylor and Giasemi Vavoula. 2006. A Theory of Learning for the Mobile Age. In R. Andrews and C. Haythornthwaite (Eds.), The Sage Handbook of E-learning Research (pp. 221-247). Sage Publications. (hal00190276)
[19] Marguerite Koole. 2009. A Model for Framing Mobile Learning. In M. Ally (Ed.), Mobile Learning: Transforming the Delivery of Education and Training. Edmonton.
[20] EVALfor Research Group - Assessment and Evaluation in Formative Contexts. University of Cadiz and University of Seville. http://evalfor.net
[21] EvalCOMIX® Web Service. http://evalcomix.uca.es
[22] John Creswell. 2015. A Concise Introduction to Mixed Methods Research. Thousand Oaks, CA: SAGE.
[23] Gregorio Rodríguez, Javier Gil and Eduardo García. 1999. Metodología de la Investigación Cualitativa. Archidona, MA: Aljibe.