Preprint submitted for review

Orchestrating Learning Analytics (OrLA): supporting the adoption of learning analytics at the practitioner level

Luis P. Prieto, Tallinn University (Estonia)
María Jesús Rodríguez-Triana, Tallinn University (Estonia) and École Polytechnique Fédérale de Lausanne (Switzerland)
Roberto Martínez-Maldonado, University of Technology Sydney (Australia)
Yannis Dimitriadis, Universidad de Valladolid (Spain)
Dragan Gašević, Monash University (Australia) and University of Edinburgh (UK)

Despite the recent surge of research in learning analytics (LA), its adoption is still a slow and complex process. Previous research has identified knowledge gaps and limited communication among different stakeholders (especially at the practitioner level) as critical factors for the successful adoption of LA innovations. This paper aims to address these issues through the recognition that limited adoption at the practitioner level is not particular to LA technology. Under the label “orchestration”, educational technology researchers have tried to refocus their work to emphasize the classroom-level constraints faced in everyday educational activities, in order to favor adoption. In this paper, we review both the learning analytics and orchestration literature to elicit the main aspects to consider in inter-stakeholder communication about the adoption of LA at the practitioner level. As a result of this review, we propose conceptual frameworks organizing these issues, and tools to support inter-stakeholder communication about LA adoption. We also provide examples that illustrate how researchers and practitioners are using these frameworks and tools to focus communication and decision-making during the adoption of LA innovations in Australia, Spain and Estonia.
Introduction
Aside from its role as a driver of institutional efficiency in higher education, Learning Analytics (LA) is also growing in popularity as an approach to inform everyday teaching and learning practice at all educational levels (Gašević, Dawson, & Siemens, 2015). Yet, despite the increase in research about this kind of practitioner-level use of LA, its large-scale adoption is still slow (Ali, Asadi, Gašević, Jovanović, & Hatala, 2013; Ferguson et al., 2014, 2016). Prompted by the needs of educational institutions (e.g., universities, national ministries) to understand and plan LA adoption, a number of researchers have started looking more systematically at the challenges of adopting LA from an institutional/strategic point of view (Colvin et al., 2015; Gašević, Dawson, & Pardo, 2016; Tsai & Gašević, 2017; Tsai, Moreno-Marcos, Tammets, & Gašević, 2018). One of the crucial challenges for adoption that this growing body of literature identifies is the lack of communication among stakeholders, and the knowledge gaps that some stakeholders (particularly practitioners) may have about LA innovations (Colvin et al., 2015; Ferguson et al., 2014; Macfadyen & Dawson, 2012; Macfadyen, Dawson, Pardo, & Gašević, 2014; Tsai & Gašević, 2017; Tsai et al., 2018).
Slow technology adoption, however, is not a challenge specific to LA. This issue has been extensively studied in the field of technology acceptance, both in general (e.g., the TAM model, see Davis Jr, 1986; Lee, Kozar, & Larsen, 2003) and in education (Aldunate & Nussbaum, 2013; Terzis & Economides, 2011). In fact, for the last decade researchers have been looking at how educational technology impacts everyday teaching and learning practice, and at the severe constraints under which it operates. This strand of research is often referred to as “designing for orchestration” (Dillenbourg, 2013; Roschelle, Dimitriadis, & Hoppe, 2013), where orchestration is defined as “the process of productively coordinating supportive interventions across multiple learning activities occurring at multiple social levels” (Dillenbourg, Järvelä, & Fischer, 2009, p. 12). That is, orchestration is the main activity of practitioners (and other classroom-level actors) in everyday educational practice. In this paper, we contend that this orchestration-related research focus can be very beneficial for understanding the process of LA adoption, and can be used to support inter-stakeholder communication in such adoption processes. To achieve this goal, we must:
a. understand what the communication should be about (i.e., what are the potential knowledge gaps), and
b. scaffold the communication process itself.
This paper presents our work towards overcoming the first of those challenges (i.e., what communication should be about), through a synthesis of existing literature on conceptualizations of LA adoption and of educational technology adoption from the point of view of orchestration. This analysis of the literature has been synthesized into two conceptual frameworks organizing the issues (i.e., the potential knowledge gaps) for inter-stakeholder communication. These frameworks can be seen as “boundary objects” to aid in the communication between the different stakeholder communities (Star & Griesemer, 1989). The paper also presents our initial work towards addressing the second challenge (scaffolding the communication process): a communication instrument (a series of forms for different stakeholders) has been derived from these frameworks. We present illustrative examples of how we are using these conceptual frameworks and the communication instrument in our ongoing efforts involving practitioners and developers in the adoption of different LA technologies in Estonia, Australia and Spain.
The rest of the paper is organized as follows: first, we present our review of the literature on LA and educational technology adoption; next, we present the conceptual frameworks and the communication support tool that synthesize the main topics for communication detected in the review. We then illustrate the use of these boundary objects through three cases of their application in LA adoption processes in different countries. Finally, the implications and future outlook of this line of work are discussed.
Orchestration and Learning Analytics: A review of literature

Review method
To understand what topics and issues are important in the communication among stakeholders regarding the adoption of an LA innovation at a practitioner level, we have reviewed and analyzed literature sources in learning analytics and orchestration-related research. Given the breadth of these two fields, our analysis has identified four categories of papers especially relevant to model important notions in LA innovations and their adoption at the classroom level:
● LA conceptual frameworks: To understand what issues are generally important in LA research and how LA innovations can be conceptualized, we reviewed conceptual frameworks proposed by the LA community.
● LA codes of practice: To understand how LA is applied in everyday practice and what kinds of practices it involves, we reviewed LA “codes of practice”, created by different institutions to guide practitioners in the appropriate use of LA.
● LA adoption: To understand what aspects are important in the process of LA adoption, we reviewed papers looking at this process beyond the account of particular institutions’ case studies.
● Orchestration-related frameworks: To understand how the notion of orchestration may be applicable to LA innovations, we reviewed different conceptual frameworks and reviews on this topic, which go beyond the account of particular empirical studies or the adoption of a particular educational technology proposal.
These literature sources have been sampled from the main venues for each of these areas: for LA, the Learning Analytics and Knowledge (LAK) conference proceedings and the Journal of Learning Analytics; for orchestration, the scientific workshops organized around the topic between 2011 and 2015, and the special section on the topic published in the Computers & Education journal. Additional sources have been taken from queries to databases like Google Scholar, intended to capture “grey literature” (Kitchenham, 2004) and other sources that may have been overlooked. The initial literature search was performed in the spring of 2016, and a complementary search was performed in November 2017 to add recently-appeared sources on these topics. By applying the criteria outlined above for each of the four categories of papers, a total of 38 papers were finally selected for analysis.
The procedure to analyze these papers included the extraction of the following items from each paper: the main contribution of the paper, its scope of application (e.g., learning context), the main needs or questions it addressed, and the topics or issues that the contribution identifies. The topics identified in each literature source were then clustered in a bottom-up fashion (i.e., joining very similar aspects) into 23 overall topics. These 23 topics were then clustered again into a theory-driven, researcher-oriented framework, and into a simplified, more stakeholder-friendly framework (see the section describing the OrLA frameworks below). Figures 1 and 3 represent this process of topic clustering graphically.
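To make the bookkeeping behind this procedure concrete, the following minimal Python sketch shows one way the extracted paper codes could be recorded and merged into overall topics. The paper entries, topic labels and merging table are illustrative stand-ins (the actual coding was performed manually on the 38 selected papers), not the real analysis data.

```python
from collections import defaultdict

# Each reviewed paper is coded with the items extracted during analysis
# (contribution, scope, topics); entries here are illustrative examples only.
papers = [
    {"id": "Greller2012", "category": "LA conceptual frameworks",
     "contribution": "generic framework for LA", "scope": "higher education",
     "topics": ["pedagogical guidelines", "contextual constraints"]},
    {"id": "Sclater2014", "category": "LA codes of practice",
     "contribution": "review of ethical/legal issues", "scope": "institutional",
     "topics": ["ethics and privacy", "pedagogical guidelines"]},
    {"id": "Dillenbourg2013", "category": "Orchestration frameworks",
     "contribution": "design for orchestration", "scope": "classroom",
     "topics": ["constraints and challenges", "design guidelines"]},
]

# Bottom-up clustering: near-identical topic labels are merged under one
# canonical topic (a lookup table stands in for the manual merging step).
canonical = {
    "pedagogical guidelines": "pedagogical guidelines",
    "contextual constraints": "local constraints and challenges",
    "constraints and challenges": "local constraints and challenges",
    "ethics and privacy": "ethics and privacy",
    "design guidelines": "technology design guidelines",
}

topic_sources = defaultdict(set)  # canonical topic -> papers mentioning it
for paper in papers:
    for raw_label in paper["topics"]:
        topic_sources[canonical[raw_label]].add(paper["id"])

for topic, sources in sorted(topic_sources.items()):
    print(f"{topic}: mentioned by {sorted(sources)}")
```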
Below we provide a brief summary of the reviewed literature sources and the main topics mentioned.

Learning Analytics: Frameworks, codes of practice and adoption

LA conceptual frameworks
Since its establishment around 2011, the LA research community has worked intensively to define its goals, main processes, and most important aspects. These self-reflection efforts, materialized into conceptual frameworks, are fundamental for conceptualizing and communicating about LA innovations with the stakeholders who are to adopt them. These frameworks have often addressed stakeholder concerns such as the kinds of support provided by LA solutions, the comparison between different LA systems, or how to choose among them (Bakharia et al., 2016; Cooper, 2012; Siemens & Baker, 2012). Other frameworks have touched upon the underlying assumptions of an LA innovation (Cooper, 2012) or, more broadly, the rationale behind an innovation (Cooper, 2012; Emin-Martinez et al., 2014). Other conceptual frameworks, in turn, have provided guidelines for the design of LA technologies (Knight & Anderson, 2016; Martinez-Maldonado et al., 2016; McPherson, Tong, Fatt, & Liu, 2016).
Another important aspect that LA frameworks can help us understand is how everyday educational practice at the classroom level (i.e., the orchestration) is changed by introducing these innovations. Most commonly, LA frameworks have emphasized pedagogical guidelines for how to use LA solutions effectively (Drachsler & Greller, 2016; Emin-Martinez et al., 2014; Greller & Drachsler, 2012; Steiner, Kickmeier-Rust, & Albert, 2016). Other frameworks have focused on the kinds of affordances offered by LA solutions, and the actionability of LA insights (Vatrapu, 2012).
Another group of LA frameworks has focused on the interaction between LA and local constraints and challenges. Such contextual constraints are mentioned by several authors (Drachsler & Greller, 2016; Greller & Drachsler, 2012; Steiner et al., 2016), with special emphasis on the expectations that different stakeholders (e.g., teachers, students) may have about LA and LA outcomes (Drachsler & Greller, 2012). Other frameworks have stressed the support or facilitation that can be available in the local context (e.g., training, rewards, institutional support), which can make LA more effective and appealing for wider adoption (Davenport, Harris, & Morison, 2010). Multiple LA frameworks have also highlighted the importance of taking into account teacher-specific (or other stakeholder-specific) challenges such as insufficient data literacy, beliefs, and attitudes (Davenport et al., 2010; Greller & Drachsler, 2012; Vatrapu, 2012).
As in many other areas of technology adoption, among the LA frameworks included in the review there exists an implicit assumption that an appraisal of the costs and benefits of an LA innovation can be instrumental for its adoption.
Several frameworks have highlighted the importance of evaluating LA (Papamitsiou & Economides, 2014; Willis III, Campbell, & Pistilli, 2013), with multiple authors explicitly acknowledging the innovation’s added value for students’ learning as the main indicator to be ascertained in such evaluations and appraisals (Greller & Drachsler, 2012; Papamitsiou & Economides, 2014; Willis III et al., 2013). Others have mentioned the synergies with the existing socio-technical ecosystem as also important in such evaluations (Bakharia et al., 2016).
Finally, it should be noted that another common theme appearing in the analyzed LA frameworks is the emphasis on ethics and privacy issues (Drachsler & Greller, 2016; Steiner et al., 2016). For instance, several authors have highlighted the inclusion of users (e.g., teachers, students) in the design process of LA innovations, as a way to achieve more responsible LA innovations (Knight & Anderson, 2016; McPherson et al., 2016).

LA codes of practice
Related to ethics and privacy as major factors for the acceptance and adoption of LA, several “codes of practice” have been put forward by different authors and institutions to guide researchers and practitioners. Many of these codes of practice have indeed a strong focus on ethics in the application of LA (Pardo & Siemens, 2014; Rodríguez-Triana, Martínez-Monés, & Villagrá-Sobrino, 2016; Sclater, 2014; Sclater & Bailey, 2015; Slade & Prinsloo, 2013).
Ethics, however, is not the only topic covered by these codes of practice. We can also find references to the aforementioned appraisal of costs and benefits of LA, indicating the importance of engaging practitioners and other stakeholders so that they can clearly see the potential and tradeoffs of LA (Gunn et al., 2015). Others have also mentioned how finding synergies with existing technologies and practices can be key in better perceiving LA’s benefits (Rodríguez-Triana et al., 2016). Several codes of practice also emphasize different aspects of the LA innovation itself, particularly proposing technology design guidelines (Gunn et al., 2015). Pedagogical guidelines are also mentioned in these codes of practice, related not only to the use of the technology, but also to the pedagogical practices themselves (Pardo & Siemens, 2014; Sclater, 2014; Sclater & Bailey, 2015; Slade & Prinsloo, 2013; Wise, 2014). It is worth mentioning that contextual constraints and challenges (from identifying meaningful data sources, to adapting the solutions to the local workload limitations and learning goals) also feature prominently in some of these sources (Rodríguez-Triana et al., 2016).

Conceptualizing LA adoption
After a few initial years in which LA was considered a novel, emergent approach, LA researchers now consider adoption in actual practice a critical issue for the field, in the sense of large-scale institutional adoption (as noted in the introduction), but also at the classroom level (Ali et al., 2013; Colvin et al., 2015; Gašević et al., 2016; Tsai & Gašević, 2017; Tsai et al., 2018). Among the LA literature sources that explicitly address this kind of adoption, we again find several highlighting the costs and benefits that have to be well understood by stakeholders for such an adoption to occur, especially in terms of LA’s added value for learning (Tsai & Gašević, 2017), and the importance of evaluating LA innovations to more clearly ascertain these benefits (Ali et al., 2013; Tsai & Gašević, 2017).
These adoption-oriented sources discuss many of the issues mentioned in more general LA frameworks, such as the importance of addressing privacy and ethical issues as a necessary stepping stone towards adoption (Gašević et al., 2016). They also pay attention to how the LA innovation is conceptualized, e.g., regarding its assumptions (Colvin et al., 2015) and rationale (Gašević et al., 2016), and how compatible these are with the local context. Referring to the influence of local constraints and challenges as critical factors for adoption (Gašević et al., 2016), several authors find that institutional support, facilitation or training are critical to achieve local adoption (Colvin et al., 2015; Tsai & Gašević, 2017). This institutional support can also be instrumental in tackling teacher-specific challenges like data literacy, which are recognized as one of the main hurdles for LA adoption (Gašević et al., 2016; Tsai & Gašević, 2017). We can also see these sources as addressing topics of educational practice at the classroom level (i.e., orchestration-related issues): taking into account the existing orchestration before the introduction of LA (e.g., in terms of actors and roles), as well as how LA changes these roles (Colvin et al., 2015).
More generally, pedagogical guidelines on the effective use of LA are considered necessary for its sustainable adoption (Tsai & Gašević, 2017).

The adoption of educational technology at the practitioner level: Orchestration
Adding to the aforementioned issues and topics detected in the LA literature, and especially in the newly-emerged sources focused on the adoption of these innovations, we can look at previous work on orchestration, which defines critical issues in educational technology and how it can be designed so that it is more easily adopted at the classroom level (Dillenbourg, 2013; Dimitriadis, Prieto, & Asensio-Pérez, 2013; Nussbaum & Diaz, 2013; Prieto, Wen, Caballero, & Dillenbourg, 2014; Terzis & Economides, 2011). A good amount of this orchestration-related literature emphasizes the proposal of educational technologies that fit the very restrictive constraints and challenges that teachers (and students) face in everyday practice, e.g., in terms of time, attention, energy, or cognitive resources (Dillenbourg, 2013; Prieto, Dimitriadis, Asensio-Pérez, & Looi, 2015; Prieto, Holenko-Dlab, Abdulwahed, Gutiérrez, & Balid, 2011).
These sources also highlight other means of support and facilitation (e.g., lesson plans, guidelines for technology usage) as crucial for the effective adoption of any classroom technology (Dimitriadis et al., 2013; Nussbaum & Diaz, 2013; Terzis & Economides, 2011). As already noted above, innovations that are adapted to teacher-specific challenges are more likely to succeed (Prieto et al., 2015, 2011; Terzis & Economides, 2011), while other authors also consider the users’ perception of the technology as critical for adoption (Terzis & Economides, 2011).
Similarly to the LA literature, in orchestration-related sources the cost-benefit appraisal of the innovation is identified as critical, with several authors focusing on how to evaluate an innovation from the point of view of classroom orchestration (Nussbaum & Diaz, 2013; Prieto et al., 2014; Terzis & Economides, 2011). Several authors also highlight that the main focus of these evaluations should be the added value for student learning (Phiri, Meinel, & Suleman, 2016; Terzis & Economides, 2011), even if affective aspects and the social value of the innovations are also recognized as important (Terzis & Economides, 2011).
One of the main topics in which orchestration research has made significant advances (i.e., where much of its added value lies, compared to the LA literature) is in providing tools and frameworks to conceptualize learning technology innovations, for instance, by establishing design guidelines for “orchestrable” (i.e., more easily adoptable) classroom technologies (Cuendet, Bonnard, Do-Lenh, & Dillenbourg, 2013; Dillenbourg, 2013; Dimitriadis et al., 2013; Phiri et al., 2016; Prieto et al., 2014; Sharples, 2013). These guidelines (which highlight values like minimalism or teacher control) can also be very useful for the design of LA technologies and innovations, as may be works that cover issues associated with usability and learners’ experience (Terzis & Economides, 2011) or how a particular educational technology compares to others (Tchounikine, 2013).
Another area where orchestration-related literature can add great value is in providing tools to characterize orchestration, both before and after the introduction of an innovation: defining the activities and tasks that classroom orchestration entails (Dillenbourg, 2013; Phiri et al., 2016; Prieto et al., 2015, 2011; Tchounikine, 2013), or the importance of actors and roles (i.e., who performs each of these tasks) and how the innovation changes them (Prieto et al., 2015, 2011). In this regard, some of the works on orchestration have also provided pedagogical guidelines that may support the adoption of a technology for a particular pedagogical purpose (Dimitriadis et al., 2013; Hämäläinen & Vähäsantanen, 2011; Nussbaum & Diaz, 2013).
Orchestrating Learning Analytics (OrLA) framework
The synthesis of the existing literature summarized in the previous section reveals plenty of frameworks and factor checklists about learning analytics and orchestration which are relevant to the process of LA adoption, aimed at different groups of stakeholders (teachers, researchers, technology developers). There is, however, not much work on supporting inter-stakeholder dialogue to connect the supply side and the demand side (Ferguson et al., 2016), even though these knowledge gaps and communication issues have been repeatedly identified as crucial by existing studies on LA adoption (Ali et al., 2013; Ferguson et al., 2014; Lonn, Aguilar, & Teasley, 2013; Macfadyen & Dawson, 2012), and even by studies on innovation implementation before the rise of LA (McIntosh, 1979).

Understanding LA adoption: an Activity Theory view
In a first attempt to understand which issues are critical in the process of LA adoption (and in communicating with classroom-level stakeholders about it), we aimed to answer the question: “What issues are important for the adoption of learning analytics in learning and teaching practice?”. We resorted to socio-cultural activity theory (as re-formulated by Engeström, 1987) to structure the results of our literature review, clustering the 23 emergent topics from the literature analysis above into Engeström’s classic structure (which specifically targets human activities that involve some kind of change and tensions - clearly the case of LA adoption). According to Engeström’s conception, human activity systems involve one or more human actors (subject) acting towards a goal (outcome) by producing objects (in a very general sense, including also knowledge or experiences). This production is mediated primarily by tools. However, the activity is also embedded in a socio-cultural context that involves a wider community, with its own rules and division of labor.
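To fix ideas, the minimal sketch below encodes the elements of an activity system as a simple data structure and instantiates it for the classroom-level LA adoption case described next; the concrete field contents are illustrative assumptions, not an exhaustive rendering of Engeström’s formulation or of our framework.

```python
from dataclasses import dataclass, field
from typing import List

# A minimal encoding of Engeström's activity-system elements, as used by
# the OrLA framework; the contents filled in below are illustrative only.
@dataclass
class ActivitySystem:
    subject: str                       # who carries out the activity
    tools: List[str]                   # mediating artifacts (conceptual/technological)
    object: str                        # what the activity works on
    outcome: str                       # what the activity aims to produce
    rules: List[str] = field(default_factory=list)
    community: List[str] = field(default_factory=list)
    division_of_labor: List[str] = field(default_factory=list)

la_adoption = ActivitySystem(
    subject="practitioner (teacher)",
    tools=["current classroom technologies", "pedagogical knowledge",
           "newly-introduced LA innovation"],
    object="everyday teaching/learning practice",
    outcome="new, LA-enhanced practice",
    rules=["ethics and privacy regulations", "institutional policies"],
    community=["students", "researchers", "system developers", "managers"],
    division_of_labor=["who monitors learners", "who acts on LA insights"],
)

print(la_adoption.subject, "->", la_adoption.outcome)
```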
Given our focus on the classroom-level adoption of LA, we can consider practitioners as the main subjects of the activity (although the framework could also be modified to focus on students). These subjects use a series of tools (both conceptual and technological) to perform their teaching/learning practice (the object of the activity). By changing some of these tools (introducing an LA innovation), such practices evolve towards a new LA-enhanced practice (the outcome). The proposed framework also recognizes the mediating role of contextual aspects (rules, community, division of labor), and which factors are critical for each of them. Figure 1 represents how we moved from the four categories of literature in our review (top layer) and the analyzed papers (second layer) to the 23 emergent topics depicted in the third layer. We then categorized these topics according to which element of an activity system each of them made reference to (bottom layer). Thus, we arrive at an activity-theoretical depiction of LA adoption, along with important topics for understanding each of the elements of the activity system (Figure 2). There, we can see, for example, that it is crucial to understand not only the novel affordances of the LA innovation, but also how this newly-introduced element involves costs (in terms of time, effort, etc.) and modifies the mediating role of the rest of the technologies used in the classroom ecosystem (see the tools element at the top of Figure 2).
This conceptual framework can be useful not only for researchers (e.g., to focus the investigation of an LA adoption process); it can also be seen as a first ‘boundary object’ to support communication between multiple stakeholders. Boundary objects have been studied in sociology, in the interactions between different communities, by noting how certain (physical or conceptual) artifacts play a critical role in this communication (Star, 1989; Star & Griesemer, 1989). The notion has later been widely applied in information systems (Huvila, Anderson, Jansen, McKenzie, & Worrall, 2016), computer-supported collaborative work (CSCW) and many other fields. It can also be linked to different communities of practice (Wenger, 1998) working on a common task (as occurs in the case of LA adoption). The framework defined in Figure 2 can thus be considered an ‘ideal type’, in terms of the boundary object taxonomy defined by Star and Griesemer (1989): a “diagram, atlas or other description which in fact does not accurately describe the details of any one locality or thing [...] abstracted from all domains, and may be fairly vague [...] it serves as a means of communicating and cooperating symbolically—a ‘good enough’ road map for all parties”.

Supporting stakeholder communication (I): A simpler conceptual view
One disadvantage of the conceptual framework presented above is that it requires a certain understanding of activity theory, and of what an activity system is, to be used directly by stakeholders. Hence, its usefulness for direct usage as a boundary object is limited, being more useful for a researcher or policy-maker as a tool to analyze or structure inter-stakeholder communication. In order to have a boundary object more directly usable by the different stakeholders, we performed a different topic clustering, aiming at a more easily interpretable view of the adoption process.
Figure 1: Literature review and topical mapping to socio-cultural activity theory (SCAT) notions
Figure 2: OrLA framework representing LA adoption as a socio-cultural activity system
One such simpler interpretation, which can be more natural from the point of view of the stakeholders (e.g., a practitioner), is to think of adoption as an appraisal and decision process (i.e., ‘should I adopt this new LA tool in my classroom?’). The rationale behind this alternative conception of adoption is that such an appraisal and decision (based on costs and affordances) is a recurrent notion across sources in all four categories of literature analyzed (see the section on the literature review). In order to perform this cost/benefit appraisal, the stakeholders need to reflect upon (and, most probably, exchange information about) different crucial aspects of the innovation and the context, which can be grouped along a reduced number of themes (to help organize the large number of topics identified). Figure 3 represents this alternative clustering process, from literature review categories and individual papers, to the 23 emergent topics and the larger themes to consider in this simplified decision process.
The resulting framework, represented graphically in Figure 4, tries to answer the question “Which of the topics detected in the literature most often have uneven knowledge among stakeholders?”, i.e., which are more likely to require communication to achieve common ground. This second framework defines six main areas to touch upon: 1) local constraints and challenges; 2) current orchestration/practice; 3) the new, LA-enhanced orchestration/practice; 4) characteristics of the LA innovation itself; and 5) ethics and privacy issues. The sixth area involves stakeholders establishing common ground on the five previous ones, and making an informed 6) appraisal and joint decision on whether to adopt the LA technology, or what form it should take to be adopted. This second framework can again be seen as an ‘ideal type’ boundary object (Star & Griesemer, 1989), to be used in LA adoption processes (whether linear or iterative). A practitioner could, for instance, use these larger themes to structure consultations or interviews with researchers or system developers. Indeed, the following section provides further examples of how this framework is being used in real LA adoption processes.
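As an illustration of how these six areas could operate as a communication checklist, the sketch below tracks which stakeholders have already contributed their knowledge on each theme and flags the themes still lacking common ground before the joint decision. This flagging logic is our own illustrative reading of the framework, not a prescribed part of it.

```python
# The six OrLA themes, used as a simple communication checklist.
THEMES = [
    "local constraints and challenges",
    "current orchestration/practice",
    "new LA-enhanced orchestration/practice",
    "characteristics of the LA innovation",
    "ethics and privacy",
    "appraisal and joint decision",
]
STAKEHOLDERS = {"teacher", "researcher", "developer"}

# theme -> set of stakeholders who have already contributed their view
contributions = {theme: set() for theme in THEMES}
contributions["local constraints and challenges"] |= {"teacher"}
contributions["characteristics of the LA innovation"] |= {"developer", "researcher"}

def pending(contribs):
    """Return the themes where some stakeholder has not yet weighed in."""
    return {theme: STAKEHOLDERS - who
            for theme, who in contribs.items() if who != STAKEHOLDERS}

for theme, missing in pending(contributions).items():
    print(f"'{theme}' still needs input from: {sorted(missing)}")
```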
Figure 3: Literature review and topical mapping to simplified OrLA framework notions
Figure 4: Simplified OrLA framework

Supporting stakeholder communication (II): Detailed stakeholder forms
While the conceptual frameworks presented above can be useful, there are limits to the usefulness of such ‘ideal type’ abstractions, as they can be understood by stakeholders in very different ways. There exist, however, more concrete kinds of boundary objects that can support such communication, such as ‘standardized forms’ (Star & Griesemer, 1989). We thus developed a set of forms that aim to support the communication process among the three main stakeholders involved in most LA adoption scenarios: researchers, LA system developers, and teachers/practitioners. However, as suggested by the framework in Figure 4, this set could also be extended to other stakeholders like students or legal experts. An illustrative representation of these forms is presented in Figure 5, and a full electronic version is available for detailed inspection at https://tinyurl.com/OrLA-forms.
The communication process implied by these forms involves each stakeholder filling in a form with questions about the topics that they are most likely to have knowledge about, and which other stakeholders might not know about (see Figure 4); these appear in the top-left part of each form (color-coded for each stakeholder). Then, each (still incomplete) form is passed on to the other stakeholders, who read the information provided and add comments and contrasting pieces of their own knowledge, to unearth potential tensions and conflicts. For instance, the form that the teacher completes contains questions regarding what awareness and assessment processes (e.g., monitoring of students’ work, one of the classic orchestration activities) are performed in the concrete educational context before introducing LA (including when they are performed, supporting technologies, or time constraints). In the same form, system developers would note how this would change once the LA tool is introduced (e.g., whether the practices would have to change, or whether there are potential conflicts with the local time constraints). Later on, the researcher would add information on whether the changed practices and constraints are still compatible with the theoretical and pedagogical assumptions of the innovation.
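The pass-around process just described can be summarized in the following minimal sketch, where a form is first filled in by its owning stakeholder and then annotated by the others. The question and answer texts are hypothetical examples, not excerpts from the actual OrLA forms (available at the URL above).

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class StakeholderForm:
    owner: str                                          # who fills in the form first
    answers: Dict[str, str] = field(default_factory=dict)
    comments: List[str] = field(default_factory=list)   # added by the other stakeholders

    def fill(self, question: str, answer: str):
        """The owning stakeholder answers the questions in their form."""
        self.answers[question] = answer

    def annotate(self, stakeholder: str, comment: str):
        """Another stakeholder contrasts the answers with their own knowledge."""
        self.comments.append(f"[{stakeholder}] {comment}")

teacher_form = StakeholderForm(owner="teacher")
teacher_form.fill("How do you currently monitor students' work?",
                  "Weekly review of LMS forum posts, about 1h per group")
# The (still incomplete) form is then passed on to the other stakeholders.
teacher_form.annotate("developer",
                      "The dashboard updates in real time; weekly batching may be unnecessary")
teacher_form.annotate("researcher",
                      "Real-time monitoring may conflict with the reflection-oriented design")

print(teacher_form.answers)
print(*teacher_form.comments, sep="\n")
```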
Figure 5: Stakeholder forms based on the OrLA framework. Colors denote who needs to fill in each form: researchers (blue), system developers (green) and teachers (yellow).
OrLA in action: three illustrative examples
To help the reader understand the usefulness of the OrLA boundary objects presented above, we present below three real examples of their use within LA development and adoption processes in different countries.

Estonia: Using the simplified framework in stakeholder interviews
In the context of a research project on how to support the co-creation of educational innovations using learning analytics (Ruiz-Calleja, Rodríguez-Triana, Prieto, Poom-Valickis, & Ley, 2017), simple LA tools are being piloted with a few volunteer teachers, in actual university courses in which the teachers want to introduce a certain pedagogical innovation. Here, the simplified OrLA framework has been used to structure the conversations and semi-structured interviews with teachers, both before and after the classroom usage of the LA tools. OrLA has been useful to establish a dialogue that invites these volunteering teachers to understand the technology and the innovation, and to consider whether they would adopt them once the initial excitement wears off. For researchers and system developers, these OrLA-based interviews have served to delve deeper into understanding the local constraints, to derive the modifications needed in later iterations of the LA technology, and to gauge the likelihood that teachers will want to use it in next year’s courses. Figure 6 depicts an actual interview guide used by a researcher-developer to talk with the practitioner after she used the LA tool in a few sessions of her course. The interview guides both stakeholders in reviewing relevant pieces of information dealing with all the framework’s main themes. These include, for instance, the intended benefit of increased student engagement, any learning benefits, but also the costs entailed, all under the theme of cost/benefit appraisal (see the green arrows in Figure 6).
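As a rough illustration of how such an interview guide can be derived from the simplified framework, the sketch below maps themes to question templates and flattens them into an ordered guide. The themes and questions shown are illustrative approximations, not the actual guide from Figure 6.

```python
# Each simplified-framework theme maps to one or more question templates;
# the question texts below are hypothetical examples for illustration.
QUESTION_TEMPLATES = {
    "cost/benefit appraisal": [
        "Did you notice any change in student engagement or learning?",
        "How much extra time or effort did using the tool require?",
    ],
    "local constraints and challenges": [
        "Which course constraints (time, workload) affected your use of the tool?",
    ],
    "new LA-enhanced practice": [
        "Would you use the tool again in next year's course? What would change?",
    ],
}

def build_guide(themes):
    """Flatten the selected themes into an ordered list of interview questions."""
    guide = []
    for theme in themes:
        for question in QUESTION_TEMPLATES.get(theme, []):
            guide.append((theme, question))
    return guide

for theme, question in build_guide(list(QUESTION_TEMPLATES)):
    print(f"({theme}) {question}")
```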
Figure 6: Teacher interview guide used in a small-scale LA pilot, marking the simplified OrLA framework themes used to structure the questions.

Australia: Using stakeholder forms in an LA implementation case
The OrLA framework has also been useful in the context of an ongoing project to develop an awareness tool to be used in debriefing sessions after simulation-based learning, in healthcare university settings (see Figure 7, left). The goal of such a debriefing tool is to use data from multimodal learning analytics (Ochoa, 2017) to support teacher-led reflection in the classroom, but also independent work by students, who can access the tool online (see the initial prototype in Figure 7, right). This project is conducted as a collaboration between an LA research group and the local School of Nursing at an Australian university. In this context, the main stakeholders in the conversation about the design, development and adoption of the aforementioned LA tool included two developers/technicians (D1 and D2), one course coordinator (or teacher, T), a simulation manager and a simulation researcher (R1, R2). The OrLA stakeholder forms have been used in the initial stages of the project to generate an understanding of the different (and sometimes conflicting) perspectives of these stakeholders. Due to logistic and scheduling restrictions, these five actors preferred to fill in their initial input to the stakeholder forms individually; then, in a second round of inputs, they responded to each other’s forms, also individually. Only then, in a third round, did the stakeholders actually meet and discuss their different perspectives.
Figure 7: Authentic learning analytics project where the OrLA forms have been put into practice. Left: the learning situation that teachers want to improve consists of the assessment of healthcare simulations. Right: an initial prototype of a reflection tool to highlight simulation mistakes.
Overall, the stakeholder forms were effective in allowing the different actors to speak up, as expressed by two of the nursing researchers: “[the forms] give opportunities for people of different disciplines to voice their thoughts on all aspects of the given task” [R2] and “put together multiple perspectives of the team in a structured way” [R1]. The teacher found them most useful to “share information among the team members [with the goal of] improving practice and student learning, which often does not naturally occur” [T]. The developers/technicians, on the other hand, did not fully understand the orchestration-related questions, but still expressed that they would like to know more about the teaching responsibilities, as these have a clear impact on the system they implement. It should be noted that this LA implementation project is still at a very early stage, and hence the stakeholder forms may be even more useful in later phases of the project, when the LA tool and intervention are more fully defined.

Spain: Using stakeholder forms in an LA workshop with different stakeholders
The OrLA framework and the boundary objects presented above are mainly designed to support an ongoing process of adoption or implementation of LA innovations (as the two examples above illustrate). These conceptual tools, however, can also be used for educational purposes, taking a more long-term perspective: to help different stakeholders think about LA innovations in general, something they can apply in their future practice. This is exactly what we did in the context of a workshop on the topic of learning analytics, held in Spain in 2016, which attracted LA researchers, LA-interested teachers and LA system developers. Thirteen participants (most of them Spanish) from these three stakeholder communities gathered in this 90-minute event. The goal of the event was to discuss the adoption of concrete LA tools (provided by the participant researchers and developers) in concrete authentic educational settings (represented by the participant teachers). The workshop activities included: a) an introduction to the problem of LA adoption and the OrLA framework; b) group work in multi-stakeholder teams of 3-4 people, in which participants filled in the OrLA stakeholder forms and passed them to the other stakeholders in the team to complete; c) discussion of the shared information, and reaching a final decision on whether each concrete LA tool should be adopted in each particular classroom-level setting; and d) wrap-up and feedback about the framework and the forms themselves.
Although this 90-minute format proved too short to use the (rather complex) stakeholder forms in their entirety, participant stakeholders were enthusiastic about the way OrLA structured the conversation and brought in the perspectives of multiple actors (with some participants asking for more information so as to use it in their research/development right away). Rather than yes/no decisions, the workshop exercise yielded “starting points for adoption”, detecting needs for further integration into the existing tool and practice ecosystem, and for further iterations in the adoption process, as very often “lots of adaptations and development [during the adoption process] are needed” [P1].
The workshop also prompted proposals of extending the boundary objects: for instance, some participants suggested that “students should be included” [P2] as another very relevant stakeholder in the process. Participants also suggested highlighting “key points in every section” [P6], for cases in which time is very limited.
Discussion and future work
This paper started out by highlighting one of the main issues identified in the emergent literature on the adoption of learning analytics innovations: the importance (and difficulty) of communication between the different stakeholders involved in the process of adopting an LA innovation at the classroom level (e.g., teachers, students, researchers, or technology developers). After a review of LA and orchestration-related literature, we have proposed the “Orchestrating Learning Analytics” (OrLA) conceptual framework, reified into several boundary objects that can be used to scaffold such inter-stakeholder communication.
This framework and the synthesis of literature from which it stems are the main contributions of this paper to the learning analytics knowledge base. As such, they differ from existing frameworks characterizing LA adoption (such as the Rapid Outcome Mapping Approach used by Macfadyen et al., 2014, or the SHEILA policy framework by Tsai et al., 2018) in that OrLA focuses explicitly on topics
and stakeholders that are critical at the tactical level of everyday educational practice (as opposed to, e.g., a focus on institutional-level policy), while keeping an explicitly multi-stakeholder perspective (as it is precisely intended to support inter-stakeholder communication). Hence, the framework neither competes with nor opposes such top-down policy frameworks, or other bottom-up approaches to LA adoption. We see OrLA as a flexible tool to support stakeholder communication in any kind of LA adoption process.
Although the boundary objects presented here can be used as-is in their current form (as shown in the three international illustrative examples of the previous section), it should be noted that boundary objects cannot really be designed down to their last details, as their very nature is to be adapted and repurposed in contextualized practice by the stakeholder communities that use them (Star & Griesemer, 1989). In that sense, our present proposal is just a starting point for stakeholders to adapt to their particular contexts. This adaptation is indeed already apparent in the customized interview guide of the illustrative example from Estonia, or in the simplifications suggested by stakeholders in the workshop held in Spain.
This emergent quality of boundary objects does not excuse us, however, from evaluating their effectiveness in supporting communication (and iteratively developing them). Indeed, the illustrative examples above are part of ongoing formal evaluation efforts taking place in multiple countries. Despite these efforts, the true proof of the frameworks’ usefulness will be their wider usage in research and innovation, either as-is or in locally-adapted forms, in grassroots or institution-driven LA adoption processes. In this direction, we are already organizing scientific events to disseminate and discuss these boundary objects, adapting and refining them with the help of the broader LAK community (e.g., a workshop being held at the LAK 2018 conference in Sydney).
Aside from these evaluations and dissemination efforts, our future work in this line of research includes the alignment of the OrLA framework with other emergent trends such as “responsible research and innovation” (RRI) (Wickson & Carew, 2014). We believe that early user involvement and multi-stakeholder conversations have a crucial role in LA’s research agenda, if we are to avoid dystopian visions of the application of analytics to educational settings (Griffiths, Brasher, Clow, Ferguson & Yuan, 2016). It is these conversations that the OrLA framework aims to support.
Acknowledgements
This research has been partially funded by the European Union in the context of the CEITER and Next-Lab projects (Horizon 2020 Research and Innovation Programme, grant agreements no. 669074 and 731685), as well as by the Spanish Ministry of Economy and Competitiveness (projects TIN2014-53199-C3-2-R and TIN2017-85179-C3-2-R), the Spanish Ministry of Education, Science and Sports (grant PRX177700140) and the Regional Government of Castilla y León (project VA082U16).
References

Aldunate, R., & Nussbaum, M. (2013). Teacher adoption of technology. Computers in Human Behavior, 29(3), 519–524. doi:10.1016/j.chb.2012.10.017
Ali, L., Asadi, M., Gašević, D., Jovanović, J., & Hatala, M. (2013). Factors influencing beliefs for adoption of a learning analytics tool: An empirical study. Computers & Education, 62, 130–148. doi:10.1016/j.compedu.2012.10.023
Bakharia, A., Corrin, L., de Barba, P., Kennedy, G., Gašević, D., Mulder, R., … Lockyer, L. (2016). A conceptual framework linking learning design with learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge - LAK ’16 (pp. 329–338). New York, NY, USA: ACM Press.
Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., & Fisher, J. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Sydney, NSW: Australian Office for Learning and Teaching.
Cooper, A. (2012). A framework of characteristics for analytics. CETIS Analytics Series, 1(7). Available from http://publications.cetis.org.uk/2012/524
Cuendet, S., Bonnard, Q., Do-Lenh, S., & Dillenbourg, P. (2013). Designing augmented reality for the classroom. Computers & Education, 68, 557–569. doi:10.1016/j.compedu.2013.02.015
Davenport, T. H., Harris, J. G., & Morison, R. (2010). Analytics at work: Smarter decisions, better results. Harvard Business Press.
Davis Jr, F. D. (1986). A technology acceptance model for empirically testing new end-user information systems: Theory and results (Doctoral dissertation). Massachusetts Institute of Technology.
Dillenbourg, P. (2013). Design for classroom orchestration. Computers & Education, 69, 485–492. doi:10.1016/j.compedu.2013.04.013
Dillenbourg, P., Järvelä, S., & Fischer, F. (2009). The evolution of research in computer-supported collaborative learning: From design to orchestration. In N. Balacheff, S. Ludvigsen, T. de Jong, A. Lazonder, & S. Barnes (Eds.), Technology-Enhanced Learning: Principles and Products (pp. 3–19). Springer.
Dimitriadis, Y., Prieto, L. P., & Asensio-Pérez, J. I. (2013). The role of design and enactment patterns in orchestration: Helping to integrate technology in blended classroom ecosystems. Computers & Education, 69, 496–499. doi:10.1016/j.compedu.2013.04.004
Drachsler, H., & Greller, W. (2012). The pulse of learning analytics: Understandings and expectations from the stakeholders. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge - LAK ’12 (p. 120). New York, NY, USA: ACM Press. doi:10.1145/2330601.2330634
Drachsler, H., & Greller, W. (2016). Privacy and analytics: It’s a DELICATE issue. A checklist for trusted learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge - LAK ’16 (pp. 89–98). New York, NY, USA: ACM Press.
Emin-Martinez, V., Hansen, C., Rodríguez-Triana, M. J., Wasson, B., Mor, Y., Ferguson, R., & Pernin, J.-P. (2014). Towards teacher-led design inquiry of learning. eLearning Papers, Special Issue on Learning Analytics and Assessment, 36, 3–14.
Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research. Helsinki, Finland: Orienta-Konsultit.
Ferguson, R., Brasher, A., Clow, D., Cooper, A., Hillaire, G., Mittelmeier, J., … Vuorikari, R. (2016). Research evidence on the use of learning analytics: Implications for education policy. Joint Research Centre.
Ferguson, R., Macfadyen, L. P., Clow, D., Tynan, B., Alexander, S., & Dawson, S. (2014). Setting learning analytics in context: Overcoming the barriers to large-scale adoption. Journal of Learning Analytics, 1(3), 120–144. doi:10.18608/jla.2014.13.7
Gašević, D., Dawson, S., & Pardo, A. (2016). How do we start? State and directions of learning analytics adoption. Oslo, Norway: International Council for Open and Distance Education (ICDE).
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71. doi:10.1007/s11528-014-0822-x
Greller, W., & Drachsler, H. (2012). Translating learning into numbers: A generic framework for learning analytics. Educational Technology & Society, 15(3), 42–57.
Griffiths, D., Brasher, A., Clow, D., Ferguson, R., & Yuan, L. (2016). “Visions of the future” Horizon Report. Learning Analytics Community Exchange (LACE) Public Deliverable D3.2. Available online at http://www.laceproject.eu/deliverables/d3-2-visions-of-the-future-2/
Gunn, C., McDonald, J., Donald, C., Milne, J., Nichols, M., & Heinrich, E. (2015). A practitioner’s guide to learning analytics. In Proceedings of the Australasian Society for Computers in Learning and Tertiary Education - ASCILITE 2015 (pp. 672–675).
Hämäläinen, R., & Vähäsantanen, K. (2011). Theoretical and pedagogical perspectives on orchestrating creativity and collaborative learning. Educational Research Review, 6(3), 169–184. doi:10.1016/j.edurev.2011.08.001
Huvila, I., Anderson, T. D., Jansen, E. H., McKenzie, P., & Worrall, A. (2016). Boundary objects in information science. Journal of the Association for Information Science and Technology.
Kitchenham, B. (2004). Procedures for performing systematic reviews. Keele, UK: Keele University, 33(2004), 1–26.
Knight, S., & Anderson, T. D. (2016). Action-oriented, accountable, and inter(active) learning analytics for learners. In Proceedings of the LAK 2016 Workshop on Learning Analytics for Learners (pp. 47–51). Available from http://ceur-ws.org/Vol-1596/
Lee, Y., Kozar, K. A., & Larsen, K. R. (2003). The technology acceptance model: Past, present, and future. Communications of the Association for Information Systems, 12(1), 50.
Lonn, S., Aguilar, S., & Teasley, S. D. (2013). Issues, challenges, and lessons learned when scaling up a learning analytics intervention. In D. Suthers & K. Verbert (Eds.), Proceedings of the Third International Conference on Learning Analytics and Knowledge - LAK ’13 (p. 235). New York, NY, USA: ACM Press. doi:10.1145/2460296.2460343
Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough: Why e-learning analytics failed to inform an institutional strategic plan. Journal of Educational Technology & Society, 15(3), 149.
Macfadyen, L. P., Dawson, S., Pardo, A., & Gašević, D. (2014). Embracing big data in complex educational systems: The learning analytics imperative and the policy challenge. Research & Practice in Assessment, 9.
Martinez-Maldonado, R., Schneider, B., Charleer, S., Shum, S. B., Klerkx, J., & Duval, E. (2016). Interactive surfaces and learning analytics: Data, orchestration aspects, pedagogical uses and challenges. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge - LAK ’16 (pp. 124–133). New York, NY, USA: ACM Press. doi:10.1145/2883851.2883873
McIntosh, N. E. (1979). Barriers to implementing research in higher education. Studies in Higher Education, 4(1), 77–86.
McPherson, J., Tong, H. L., Fatt, S. J., & Liu, D. Y. (2016). Student perspectives on data provision and use: Starting to unpack disciplinary differences. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge - LAK ’16 (pp. 158–167). New York, NY, USA: ACM Press.
Nussbaum, M., & Diaz, A. (2013). Classroom logistics: Integrating digital and non-digital resources. Computers & Education, 69, 493–495. doi:10.1016/j.compedu.2013.04.012
Ochoa, X. (2017). Multimodal learning analytics. In C. Lang, G. Siemens, A. Wise, & D. Gašević (Eds.), Handbook of Learning Analytics (pp. 129–141). Society for Learning Analytics Research (SoLAR). doi:10.18608/hla17.011
Papamitsiou, Z., & Economides, A. A. (2014). Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence. Journal of Educational Technology & Society, 17(4), 49.
Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438–450. doi:10.1111/bjet.12152
Phiri, L., Meinel, C., & Suleman, H. (2016). Streamlined orchestration: An orchestration workbench framework for effective teaching. Computers & Education, 95, 231–238.
Prieto, L. P., Dimitriadis, Y., Asensio-Pérez, J. I., & Looi, C. K. (2015). Orchestration in learning technology research: Evaluation of a conceptual framework. Research in Learning Technology, 23(1), 28019. doi:10.3402/rlt.v23.28019
Prieto, L. P., Holenko-Dlab, M., Abdulwahed, M., Gutiérrez, I., & Balid, W. (2011). Orchestrating technology enhanced learning: A literature review and a conceptual framework. International Journal of Technology Enhanced Learning (IJTEL), 3(6), 583–598.
Prieto, L. P., Wen, Y., Caballero, D., & Dillenbourg, P. (2014). Review of augmented paper systems in education: An orchestration perspective. Journal of Educational Technology & Society, 17(4), 169–185.
Rodríguez-Triana, M. J., Martínez-Monés, A., & Villagrá-Sobrino, S. (2016). Learning analytics in small-scale teacher-led innovations: Ethical and data privacy issues. Journal of Learning Analytics, 3(1), 43–65.
Roschelle, J., Dimitriadis, Y., & Hoppe, U. (2013). Classroom orchestration: Synthesis. Computers & Education, 69, 523–526. doi:10.1016/j.compedu.2013.04.010
Ruiz-Calleja, A., Rodríguez-Triana, M. J., Prieto, L. P., Poom-Valickis, K., & Ley, T. (2017). Towards a living lab to support evidence-based educational research and innovation. In Á. Hernández-García, M. Caeiro-Rodríguez, & P. J. Muñoz-Merino (Eds.), Proceedings of the Learning Analytics Summer Institute Spain 2017. Aachen: CEUR-WS.
Sclater, N. (2014). Code of practice for learning analytics: A literature review of the ethical and legal issues. Jisc.
Sclater, N., & Bailey, P. (2015). Code of practice for learning analytics. London: Jisc.
Sharples, M. (2013). Shared orchestration within and beyond the classroom. Computers & Education, 69, 504–506. doi:10.1016/j.compedu.2013.04.014
Siemens, G., & Baker, R. S. J. d. (2012). Learning analytics and educational data mining: Towards communication and collaboration. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge - LAK ’12 (p. 252). New York, NY, USA: ACM Press. doi:10.1145/2330601.2330661
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529. doi:10.1177/0002764213479366
Star, S. L. (1989). The structure of ill-structured solutions: Heterogeneous problem-solving, boundary objects and distributed artificial intelligence. In Readings in Distributed Artificial Intelligence (Vol. 2, pp. 37–54). Morgan Kaufmann.
Star, S. L., & Griesemer, J. R. (1989). Institutional ecology, translations and boundary objects: Amateurs and professionals in Berkeley’s Museum of Vertebrate Zoology. Social Studies of Science, 19, 387–420.
Steiner, C. M., Kickmeier-Rust, M. D., & Albert, D. (2016). LEA in private: A privacy and data protection framework for a learning analytics toolbox. Journal of Learning Analytics, 3(1), 66–90.
Tchounikine, P. (2013). Clarifying design for orchestration: Orchestration and orchestrable technology, scripting and conducting. Computers & Education, 69, 500–503. doi:10.1016/j.compedu.2013.04.006
Terzis, V., & Economides, A. A. (2011). The acceptance and use of computer based assessment. Computers & Education, 56(4), 1032–1044.
Tsai, Y.-S., & Gašević, D. (2017). Learning analytics in higher education - challenges and policies: A review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference - LAK ’17 (pp. 233–242). New York, NY, USA: ACM Press. doi:10.1145/3027385.3027400
Tsai, Y.-S., Moreno-Marcos, P. M., Tammets, K., & Gašević, D. (2018). SHEILA policy framework: Informing institutional strategies and policy processes of learning analytics. In Proceedings of the Eighth International Learning Analytics & Knowledge Conference - LAK ’18 (in press). New York, NY, USA: ACM Press.
Vatrapu, R. K. (2012). Towards semiology of teaching analytics. In Workshop Towards Theory and Practice of Teaching Analytics, at the European Conference on Technology Enhanced Learning. Saarbrücken, Germany.
Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge University Press.
Wickson, F., & Carew, A. L. (2014). Quality criteria and indicators for responsible research and innovation: Learning from transdisciplinarity. Journal of Responsible Innovation, 1(3), 254–273. doi:10.1080/23299460.2014.963004
Willis III, J. E., Campbell, J., & Pistilli, M. (2013). Ethics, big data, and analytics: A model for application. EDUCAUSE Review Online. Available from https://er.educause.edu/articles/2013/5/ethics-big-data-and-analytics-a-model-for-application
Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge - LAK ’14 (pp. 203–211). New York, NY, USA: ACM Press. doi:10.1145/2567574.2567588