Int. J. Cont. Engineering Education and Lifelong Learning, Vol. 18, No. 1, 2008


Improving learning design practices through strategic integrated evaluation

Chien-Sing Lee
Faculty of Information Technology, Multimedia University, Cyberjaya 63100, Selangor, Malaysia
E-mail: [email protected]

Abstract: The Strategic INtegrated Evaluation Methodology (SINEM) addresses three problems: first, the time needed to formulate strategies and identify evaluation criteria that meet objectives; second, the diversity of evaluation criteria among evaluation models, which hinders a holistic perspective of evaluation; and third, the need to prioritise these strategies based on the likelihood of each strategy achieving the desired performance in objectives. The first significance of the diversified model-driven knowledge base of strategies and critical success factors lies in the formation of an ontological basis for evaluating and prioritising strategies within or among communities of practice. This leads to the second significance, an ontological basis for road mapping performance improvement initiatives. The third significance is a basis for designing a decision support/performance support system. Simulations on learning design are presented.

Keywords: improving and road mapping performance improvement initiatives in engineering education; knowledge management; reference model; strategic integrated evaluation methodology.

Reference to this paper should be made as follows: Lee, C-S. (2008) 'Strategic integrated evaluation methodology', Int. J. Cont. Engineering Education and Lifelong Learning, Vol. 18, No. 1, pp.139–153.

Biographical notes: Chien-Sing Lee's work on quality improvement originates from workflow quality improvement in instructional design. This interest was later extended to e-commerce and higher education practices as testbeds for the scalability of quality practices. The motivation for scaling quality practices came from having served in government-to-government activities, e.g. the Japanese–Malaysian government tele-education satellite project and UNESCO-related human resource development initiatives under the Malaysian Ministry of Information, where she observed that the deployment of quality practices would greatly reduce error and optimise the use of resources. Her other research interests are computer-supported collaborative learning, games-based learning, mobile learning, ontology and the Semantic Web, knowledge management and change management. She also serves on the editorial boards and programme committees of several international journals and conferences.

Copyright © 2008 Inderscience Enterprises Ltd.

1 Introduction

Blended learning requires a synthesis of analysis, design, development and evaluation of diverse learning approaches, delivery modes, delivery tools and assessment methods in meeting learning objectives. The plethora of options stretches across the 14 dimensions best described by Reeves' (1993) evaluation of what really matters in computer-based education. All 14 dimensions stretch across continuums. The first, second and third dimensions are epistemology, pedagogical philosophy and underlying psychology, respectively: epistemology spans the objectivist–constructivist continuum; pedagogical philosophy, the instructivist–constructivist continuum; and underlying psychology, the behavioural–cognitive continuum. The other dimensions are as follows: goal orientation (sharply focused to unfocused), experiential value (abstract to concrete), teacher role (didactic to facilitative), programme flexibility (teacher-proof to easily modifiable), treatment of errors (frowned upon to valued), origin of motivation (extrinsic to intrinsic), accommodation of individual differences (none to multi-faceted), learner control (none to unlimited), user activity (mathemagenic to generative), cooperativeness (unsupported to integral) and cultural sensitivity (none to integral). The author extends Reeves' 14 dimensions by optimising the choices made along them. Optimisation is achieved by prioritising the blended learning strategies that are more likely to achieve objectives (Pretorius, 2004). The benefits of doing so are clear: (1) better allocation of human and financial resources and (2) better prioritisation of performance improvement initiatives. The aim is to improve learning design by leveraging and developing learning design (Koper et al., 2002) units of activities. Units of activities are chosen because they form the core competencies of any organisation (Hamel, 1994).
In this paper, units of activity are characterised by strategies and critical success factors in relation to their objectives and contexts of use.

1.1 Problems

Three issues are of interest in this paper. First, learning objectives differ from one learner to another. In addition, the critical success factors for achieving these objectives differ due to individual contexts and external influences. Formulating and mapping strategies to objectives and determining evaluation criteria is therefore time-consuming. As such, a performance support system explicating suitable methods or techniques is necessary. Second, learning evaluation models are often regarded as independent of each other, to be used only according to their respective functions. As such, there is a lack of grounded reference to the evaluation criteria (critical success factors) of other models, which may add value to the model currently considered. In short, the potential benefits of synergising these models towards achieving organisational objectives have been little explored. Third, the relevance and usefulness of each strategy in meeting each learning objective has to be determined, so as to identify strategies that will add value to the learner and to prioritise the deployment of strategies and organisational resources.


1.2 Objectives

To address the first problem mentioned earlier, the author proposes a diversified model-driven knowledge base of strategies and critical success factors. The concept of a diversified model-driven knowledge base extends the Object Management Group's Model-Driven Architecture (MDA) concept (Mukerji and Muller, 2003). In MDA-based forward engineering, the software architect first decides on a computation-independent model, which defines the business context (business processes and business concepts). This leads to a platform-independent model, which describes the functionality and behaviour of those business processes and concepts. The most suitable platform-specific model is subsequently determined and implemented. Reverse engineering, on the other hand, involves discovery of new computation-independent models from the platform-specific and platform-independent models. The advantages of MDA are easy instantiation of architectural designs on different implementation platforms, rapid prototyping and reduction in error. Error is reduced as software architects and programmers tap into existing problems and solutions indexed according to different architectural contexts. The author views MDA's computation-independent model as parallel to Reeves' 14 dimensions, the platform-independent model to learning approaches, the platform-specific model to learning methods and the specifics within the platform-specific model to learning techniques. The author adapts the MDA concept to create a knowledge base driven by diverse learning models. The diversified model-driven knowledge base enables retrieval and instantiation of strategies and critical success factors. This in turn creates flexibility, reusability across different domains and needs, and a reduction in the time needed to formulate, map and evaluate strategies against objectives. An example of reuse and reduction in time is in formative and summative evaluation.
Formative evaluation aims at evolutionary improvements through evaluation during the instructional design process, and most data on the effectiveness of instruction are student (user)-driven (Scriven, 1991; Reeves and Hedberg, 2003; Dick and Johnson, 2007). Summative evaluation is usually implemented to derive conclusions at the end of a programme. Given that summative evaluation is more comprehensive than formative evaluation, criteria for evaluation can be added or refined with reference to suitable models captured in the knowledge base. The author addresses the second problem by adding value to the current model through identification and communication of relevant strategies found in other models in the knowledge base. From a pedagogical perspective, strategies correspond to pedagogical techniques, and critical success factors to prerequisites for different learning paths. Identification and prioritisation of learning paths, in terms of likelihood of performance in relation to the learner's context or objectives, can enable reuse and optimise adaptation of the curriculum (learning path) to the learner. Data mining techniques such as those in Lee (2007) can be used to map the learner profile to these paths. As for the third problem, the author utilises the Quality Function Deployment (QFD) methodology (Erikkson and McFadden, 1993). QFD is used to determine the predictive performance quality (significance) of each strategy (how) corresponding to these objectives (what).


1.3 Scope

The sampling of evaluation models in this paper is scoped to those most popularly known in their respective functional areas. It is hoped that starting with the most popular models as the basis for formulating a reference model will lead to deeper investigation and synergy of other models in those functional areas and, consequently, refinement of the reference model to meet the needs of different communities of practice. The author hopes that identifying the effectiveness of these strategies through actual implementation in communities of practice can result in more fruitful collaborative outcomes and refinement of the knowledge in the knowledge base. Wibe and Kommers (2001) have demonstrated, in their study on the policies, dissemination and reality of information and communication technologies in regular education in north-western European countries, that strategy formulation is crucial to decision-makers due to its direct impact on teachers and students and, more importantly, its influence on innovation and change processes. Since the Strategic INtegrated Evaluation Methodology (SINEM) has only been simulated and not tested in actual contexts of use, the pragmatic aspects will need to be included through collaborative initiatives in actual contexts. The outline for the paper is as follows: Section 2 introduces the knowledge management framework, which contextualises the evaluation initiatives in this paper; subsequently, representative evaluation models and QFD are briefly reviewed. Section 3 introduces SINEM and Section 4 presents a simulation of SINEM for learning design best practices. Section 5 concludes this study.

2 Related works

2.1 Learning design evaluation

Scriven (1991) defines evaluation as 'the process of determining the merit, worth and value of things. Evaluations are the product of that process' (p. 139). Merit is derived from the intrinsic value of the product or process being evaluated; worth refers to the market value attributed by stakeholders to the product or process being evaluated; and value involves value judgements (perceived benefits) by the same stakeholders. Our concern is to provide a basis for strategic learning by identifying the merit and value of teaching–learning strategies/methods in meeting learning objectives. The diversified model-driven approach is consequently used to provide recommendations regarding suitable methods and techniques to meet learning objectives. Learning Design, as defined by the IMS Learning Design specifications (Koper et al., 2002), covers three tenets: learning will be more effective (1) when there is careful sequencing of learning activities in a learning workflow, (2) when there is learning by doing (human factor) and (3) when learning designs are captured for future reuse and sharing (codification of knowledge). Learning design leaves the decision regarding which learning approach to use to the instructor. Possible learning approaches are behaviourist, cognitivist or constructivist (Reeves, 1993). Each learning approach is substantiated by methods. The steps within each method are called techniques. Layers of instructional design to address simple to

complex learning needs can be created through layers of approach/method/techniques (Lee, 2004). The author defines learning design evaluation as evaluating which teaching–learning approach, method or technique is suitable for meeting learning objectives. This concept borrows from Dempsey and Litchfield's (2006) Direct Assessment Pyramid Model, which builds on top of Bloom's (1956) knowledge, comprehension, application, analysis, synthesis and evaluation taxonomy. Direct Assessment emphasises that evaluation has to come before learning design. This ensures a direct mapping between learning outcomes and learning design, providing a better basis for assessment. Dempsey and Litchfield place identical elements at the pinnacle of the pyramid, followed by simulated elements, procedural understanding, conceptual understanding and related knowledge. Identical (similar/analogical) elements assess the degree of learning transfer within an actual work environment. Expected learning outcomes are appropriate application of concepts, application of rules, the ability to identify similar elements between the actual scenario and prior learning, the ability to discriminate differences between the actual situation and prior learning and, ultimately, problem-solving skills. Both identical and simulated elements depend on sufficient mastery of procedural and conceptual understanding. Conceptual understanding is more foundational than procedural understanding. Dempsey and Litchfield's most fundamental level of assessment lies with related prerequisite knowledge. Some of the assessment items included are simple physical discriminations (e.g. matching machine screws), knowledge of simple motor skills, labels, facts and summary information (even complex summaries). Amidst these diverse approaches–methods–techniques, however, Merrill (2002) notes that there are commonalities.
He lists the following first principles (commonalities) as evaluative factors in assessing learning design in terms of learning effectiveness:

•  Learning is facilitated when learners are engaged in solving real-world problems (learning by doing):
   – Learners are to be involved in problem identification and not merely problem solving.
   – Learners need to be shown the task that they are going to solve.
   – Learners need to explicitly identify differences from one stage of learning to the other.

•  Learning is facilitated when existing knowledge is activated as a foundation for new knowledge (learning by doing):
   – Learners are asked to recall, relate, describe or apply knowledge from past experience.
   – Learners are provided with relevant experience fundamental to the next task.
   – Learners are given the opportunity to demonstrate their grasp of knowledge.

•  Learning is facilitated when new knowledge is demonstrated to the learner (careful sequencing of learning activities):
   – Demonstration is consistent with the learning goal, e.g. demonstrations of procedures, visualisations of processes and modelling of behaviour.
   – Learners are given suitable guidance, e.g. provision of relevant information, multiple forms of representation (text, graphics, videos, etc.) and contrasts between demonstrations.

•  Learning is facilitated when new knowledge is applied by the learner (reuse):
   – Appropriate feedback and coaching should be provided, inclusive of identification and correction of errors.
   – The problems to be solved should be varied but incremental in complexity.

•  Learning is facilitated when new knowledge is integrated into the learner's world (reuse and sharing):
   – Learners are given opportunities to reflect, discuss and defend their opinions.
   – Learners are given opportunities to create and explore new ways to use their new knowledge.

2.2 Quality Function Deployment (QFD) methodology

Quality Function Deployment or QFD (Barnett and Raja, 1995) is used in this paper to establish functional relationships between objectives and strategies. These correlational relationships are depicted in a matrix. Objectives are specified generally and subsequently in greater detail, forming a network of objective–strategy relationships at different levels of granularity. The predictive quality performance of each objective–strategy functional relationship is rated at 9 for strong, 3 for medium and 1 for weak. Hence, QFD allows not only the quality of each objective–strategy pairing to be quantified but also the determination of priorities.
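The rating scheme above can be sketched in code. The following is an illustrative sketch only (the variable names and the two-objective example are the editor's, not the paper's notation): each objective–strategy cell is rated 9/3/1, row totals summarise coverage per objective, and the highest-rated strategies are kept as priorities.

```python
# Illustrative sketch of the QFD rating scheme: rows are objectives
# ("what"), columns are strategies ("how"), each cell rated 9 (strong),
# 3 (medium) or 1 (weak). Names and data are examples, not the paper's.
STRONG, MEDIUM, WEAK = 9, 3, 1

objectives = ["differentiate compression types",
              "explain video compression techniques"]
strategies = ["Gagne's nine events", "worked examples", "discovery learning"]

matrix = [
    [STRONG, MEDIUM, MEDIUM],  # first objective
    [MEDIUM, STRONG, STRONG],  # second objective
]

# Row totals quantify how well each objective is covered overall.
row_totals = [sum(row) for row in matrix]

# Priorities: for each objective, keep the strategies with the top score.
best = {obj: [s for s, v in zip(strategies, row) if v == max(row)]
        for obj, row in zip(objectives, matrix)}

print(row_totals)  # [15, 21]
print(best)
```

The example cell values anticipate the simulation in Section 4, where the same 9/3/1 ratings are tabulated for concrete learning objectives.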

3 Strategic INtegrated Evaluation Methodology (SINEM)

The Strategic INtegrated Evaluation Methodology (SINEM) originates from work on workflow assessment, WaLwFA. In WaLwFA (Lee and Lim, 2007), the authors looked at software quality improvement from its most fundamental component, i.e. the workflow. Quality can be improved by identifying best practices. We aim to improve internal best practices by incorporating externally identified best practices. As such, mapping and merging of internal and external best practices are necessary. Furthermore, these best practices can lead to the discovery or refinement of meta-models and reference meta-models. Different layers of criteria are identified to evaluate the workflow, and weights are determined by users to identify the importance of each evaluation criterion and the collective weight of each category of criteria. We believe that this will lead to better design of decision support systems. In SINEM, the author extends WaLwFA's focus on quality improvement to the design of a model-driven knowledge base for strategic planning. The model-driven knowledge base, consisting of different models from the literature or from identification of best practices, forms the basis for the meta-models and reference meta-models mentioned in WaLwFA. The weights determined by users in WaLwFA are replaced with the QFD methodology in SINEM to provide the correlation between objectives and strategies. This paper addresses the application of SINEM to the improvement of learning design. Concurrent

work is ongoing in the research team to simulate SINEM for e-commerce and higher education. In the next section, the author presents SINEM and shows how it can be applied to formulate strategies that form the basis for learning design improvement.

4 Simulation on learning design improvement

SINEM for learning design improvement involves the following steps:

Step 4.1  Identify the learning objectives.
Step 4.2  Categorise the objectives according to content evaluation perspectives (these perspectives vary according to what needs to be evaluated; content evaluation perspectives are chosen here as an example of evaluation-driven design, in line with Dempsey and Litchfield's evaluation-driven Direct Assessment).
Step 4.3  Identify the models and corresponding strategies to be referenced.
Step 4.4  Determine common prerequisites across similar strategies (learning methods/techniques).
Step 4.5  Present an intersection between objectives and techniques in a matrix and their alignment with the perspectives in Step 4.2.
Step 4.6  Utilise QFD to determine the likelihood of achieving these objectives and to prioritise the learning methods or techniques.
Step 4.7  Utilise the prerequisites in Step 4.4 to refine and/or create new learning methods/techniques.
Step 4.8  Visualise the optimal learning paths.

4.1 Learning objectives

Let us suppose that the student's learning goal is to learn the concept of multimedia compression. The learning objectives are: (1) to be able to differentiate between image and video compression and (2) to explain the techniques used to compress sample video files.

4.2 Categorise the objectives according to content evaluation perspectives

The content evaluation perspective used here is that of Bloom's (1956) taxonomy. The first objective is categorised under Bloom's taxonomy as comprehension. The second objective is categorised as application of knowledge.

4.3 Identify the models and the methods/strategies to be referenced

As mentioned earlier, the author refers to Reeves' 14 dimensions (mentioned in the Introduction) as the pedagogical framework for this paper. As such, the models and methods/strategies to be used for the above two learning objectives, in line with Reeves' 14 dimensions, are elaborated below. Cognitivism–constructivism theories, and correspondingly cognitivist–constructivist approaches and cognitive psychologies, will be referred to for the simulation on learning

146

C-S. Lee

design due to the sharply focused learning goal of stimulating and refining critical thinking. The experiences are concrete and the teacher plays a facilitative role. As such, the flow in the learning process is easily modifiable with learners generating their own activities to meet the learning goal. Errors are not frowned upon. It is hoped that freedom of expression will create a greater sense of learner control and intrinsic motivation to learn. The author has identified three main cognitivist–constructivist strategies/methods aimed at providing another level of detail to Learning Design for the first and second learning objectives respectively. These strategies/methods are as follows: (1) Gagne’s nine events (1987), (2) worked examples (Sweller and Cooper, 1985) and (3) discovery learning (Bruner, 1963). The following techniques for each strategy/method are extracted from Alessi and Trollip (1991). Techniques within Gagne’s nine events are as follows: gain attention, inform lesson objective, stimulate recall of prior learning, present distinctive features, provide guided learning, elicit performance, provide information feedback, assess performance and enhance retention and transfer. On the other hand, the sequence of techniques for worked examples is as follows: present stimuli and check affordance, provide worked examples, provide discovery learning exercises and assessment. Discovery learning provides a higher degree of freedom suitable for learning of more complex cognitive skills. Corresponding discovery learning techniques are as follows: activating prior knowledge, defining outcomes, modelling outcomes or providing frameworks, presenting a general topic for research, setting up a structure for inquiry presentation and refining thinking. 
Complementary models to the cognitivist–constructivist approach/method/techniques are Bloom's (1956) taxonomy, Learning Design (Koper et al., 2002) and Merrill's (2002) First Principles of Instruction (henceforth referred to as First Principles).
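The strategy–technique listings above can be read as entries in the diversified model-driven knowledge base. The sketch below is a hypothetical data layout (the editor's, not the paper's implementation): each strategy/method indexes its ordered sequence of techniques so that a learning design can retrieve and instantiate them.

```python
# Hypothetical knowledge-base layout: each strategy/method maps to its
# ordered techniques, as listed in the text (from Alessi and Trollip).
knowledge_base = {
    "Gagne's nine events": [
        "gain attention", "inform lesson objective",
        "stimulate recall of prior learning", "present distinctive features",
        "provide guided learning", "elicit performance",
        "provide information feedback", "assess performance",
        "enhance retention and transfer",
    ],
    "worked examples": [
        "present stimuli and check affordance", "provide worked examples",
        "provide discovery learning exercises", "assessment",
    ],
    "discovery learning": [
        "activating prior knowledge", "defining outcomes",
        "modelling outcomes or providing frameworks",
        "presenting a general topic for research",
        "setting up a structure for inquiry presentation and refining thinking",
    ],
}

def techniques_for(strategy):
    """Retrieve the ordered techniques for a strategy, if captured."""
    return knowledge_base.get(strategy, [])

print(len(techniques_for("Gagne's nine events")))  # 9
```

A retrieval like `techniques_for("worked examples")` then yields the technique sequence that Steps 4.4–4.6 evaluate against the objectives.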

4.4 Determine common prerequisites or critical success factors (learning techniques) across similar strategies

In Table 1, Merrill's First Principles complements Learning Design's critical success factors.

4.5 Present an intersection between objectives and critical success factors (techniques) in a matrix and their alignment with the perspectives in Step 4.2

The matrix indicating the correlation, in terms of likelihood of performance, between objectives, strategies and critical success factors is shown in Table 1.

4.6 Utilise QFD to determine the likelihood of strategies/methods achieving objectives and to prioritise the learning methods or techniques

At the first level, the suitability of each strategy is determined for each learning objective. The complexity of the objective to be achieved and the amount of resources required are considered; these two factors are taken into account in determining the QFD score for a strategy. For the first objective, the level of difficulty is minimal. As such, Gagne's nine events of instruction are sufficient to meet the first learning objective.

However, for the second learning objective, students have to explain how to compress sample video files. They have not seen these video files before; the learning objective therefore requires transfer of knowledge from what was learnt in class to a different set of video files. As such, worked examples and discovery learning are deemed more likely to meet the learning objective than Gagne's nine events, as shown in Table 2.

Table 1  Common critical success factors across Learning Design and First Principles

Bloom's objective: Comprehension
Objective: Differentiate between image and video compression
  Learning Design (careful sequencing – Gagne's nine events of instruction):
    gain attention; inform lesson objective; stimulate recall of prior learning;
    present distinctive features; provide guided learning; elicit performance;
    provide information feedback; assess performance; enhance retention and transfer
  First Principles (engaged in solving real-world problems):
    involve learners in problem identification and not just solving the problem;
    show learners the task or the problem that they are going to solve at the end of the module;
    sequence the problem resolution based on gradual increments in difficulty and,
    from stage to stage, explicitly identify differences

Bloom's objective: Application
Objective: Explain techniques to compress sample video files
  Learning Design (careful sequencing – worked examples):
    present stimuli and check affordance (arouse learner with novelty, pose questions);
    provide worked examples; provide discovery learning exercises; assessment
  First Principles (activation of knowledge):
    learners are asked to recall, relate, describe or apply knowledge from past experience;
    learners are provided with relevant experience fundamental to the next task;
    learners are given the opportunity to demonstrate their grasp of knowledge


Table 2  Quality Function Deployment (QFD) to determine suitability of strategy

Bloom's objective  Objective                                        Gagne  Worked examples  Discovery learning  Total
Comprehension      To differentiate between types of compression      9          3                  3            15
Application        To explain techniques to compress sample           3          9                  9            21
                   video files
The second level (Table 3) uses QFD to indicate the correlation between objective and strategy–technique as well as the likelihood of performance. As mentioned in Sub-section 2.2, the QFD indicators are 1 for less likely performance, 3 for moderately likely performance and 9 for most likely performance. For the gain attention technique, both weak and advanced learners are likely to perform well, but for involving learners in identifying the problem, advanced learners are more likely to perform than weaker learners. The lower score for weaker learners thus indicates the need to prepare more resources to help them bridge the gap. Value-added critical success factors from First Principles are indicated in the First Principles column. From Table 3, both weaker and advanced students are deemed able to perform well using Gagne's nine events. However, First Principles is problem based. As such, for techniques which give learners a higher degree of freedom and control, such as identifying problems and stating differences, weaker learners require more help and resources. Some examples of techniques to bridge the gap are indicated in Step 4.7 (Table 4).

Table 3  QFD to identify likelihood of performance for two different user profiles
(LD = Learning Design technique score; FP = First Principles score)

Objective (Comprehension): Differentiate between image and video compression

Learning Design technique       First Principles (engaged in solving       Profile    LD  FP  Total
                                real-world problems)
Gain attention                  Involve learners in problem                Weak        9   3   12
                                identification and not just solving       Advanced    9   9   18
                                the problem
Inform lesson objective         Show learners the task or the problem     Weak        9   9   18
                                that they are going to solve at the       Advanced    9   9   18
                                end of the module
Stimulate recall of prior       –                                          Weak        9   9   18
learning                                                                   Advanced    9   9   18
Present distinctive features    Sequence the problem resolution based     Weak        9   3   12
                                on gradual increments in difficulty       Advanced    9   9   18
                                and, from stage to stage, explicitly
                                identify differences
Provide information feedback,   –                                          Weak        9   9   18
assess performance, enhance                                                Advanced    9   9   18
retention and transfer

Objective (Application): Explain techniques to compress sample video files

Careful sequencing              First Principles (activation of            Profile    LD  FP  Total
(worked examples)               knowledge)
Present stimuli and check       Learners are asked to recall, relate,     Weak        9   9   18
affordance (arouse learner      describe or apply knowledge from          Advanced    9   9   18
with novelty, pose question     past experience
to learner)
Provide worked examples         Learners are provided with relevant       Weak        9   9   18
                                experience fundamental to the             Advanced    9   9   18
                                next task
Provide discovery learning      Learners are given the opportunity        Weak        3   3    6
exercises                       to demonstrate their grasp of             Advanced    9   9   18
                                knowledge
Assessment                      –
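The weak/advanced comparison in Table 3 amounts to a simple gap scan: wherever the weaker profile's score falls below the advanced profile's, a bridging technique is needed in Step 4.7. A minimal sketch, with scores transcribed from Table 3's first objective (the technique names are abbreviated by the editor):

```python
# (technique, weak score, advanced score) for the First Principles column
# of Table 3, first objective; names abbreviated for illustration.
scores = [
    ("involve learners in problem identification",     3, 9),
    ("show learners the task to be solved",            9, 9),
    ("stimulate recall of prior learning",             9, 9),
    ("explicitly identify stage-to-stage differences", 3, 9),
]

# Techniques where weaker learners score below advanced learners are
# flagged for bridging (Step 4.7 / Table 4).
gaps = [name for name, weak, advanced in scores if weak < advanced]
print(gaps)
```

The two flagged techniques here are exactly the rows for which Table 4 proposes discovery learning bridges.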

4.7 Utilise the prerequisites in Step 4.4 to refine and/or to create new learning methods/techniques

Problem areas, which require more thought from the instructor in designing teaching and learning for weaker students, are extracted from Table 3. Guidance to address these problems is obtained from discovery learning. The discovery learning techniques, as mentioned in Sub-section 4.3, are: activating prior knowledge, defining outcomes, modelling outcomes or providing frameworks, presenting a general topic for research, setting up a structure for inquiry presentation and refining thinking. The techniques used to bridge the gap between weaker students and experts are marked as bridging techniques in Table 4.

Table 4  Bridging techniques

Bloom's objective: Comprehension
Objective: Differentiate between image and video compression
  Learning Design (careful sequencing – Gagne): Gain attention
  First Principles (engaged in solving real-world problems): Involve learners in problem identification and not just solving the problem
  Bridging technique: Model outcomes in prototypical examples

  Learning Design (careful sequencing – Gagne): Present distinctive features
  First Principles: Sequence the problem resolution based on gradual increments in difficulty and, from stage to stage, explicitly identify differences
  Bridging technique: Set up a structure for inquiry (develop their stand from the problem identified earlier, research background information, come up with a content outline, elaborate on the content, propose solutions)

Bloom's objective: Application
Objective: Explain techniques to compress sample video files
  Careful sequencing (worked examples): Provide discovery learning exercises
  First Principles (activation of knowledge): Learners are given the opportunity to demonstrate their grasp of knowledge
  Bridging technique: Simulate the above structure for the sample video files explanation (use concept maps where relevant)

4.8 Visualise the optimal learning paths

Experts can follow the conventional path and spend more time by themselves on the research aspect. For the weaker students, to bridge the gap, the path will be longer. Their learning path for the first learning objective is shown in Table 5. A more detailed study on the specification of learning paths is addressed in another paper in this issue (Janssen et al., 2007).

Table 5  Learning path for two different learner profiles

Path 1
  Experts: Gain attention by involving learners in problem identification
  Weaker students: Gain attention by involving learners in problem identification (+ model outcomes in prototypical examples)

Path 2
  Both: Inform lesson objective by showing learners how to carry out the task

Path 3
  Both: Stimulate recall of prior learning

Path 4
  Experts: Present distinctive features through an incremental increase in levels of conceptual difficulty, highlighting similarities and differences from stage to stage
  Weaker students: Develop their stand from the problem identified earlier and research background information (present distinctive features through an incremental increase in levels of conceptual difficulty, highlighting similarities and differences from stage to stage); guide them to come up with a content outline, elaborate on the content and propose solutions

Path 5
  Both: Provide information feedback

Path 6
  Both: Assess performance

Path 7
  Both: Enhance retention and transfer
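The two paths in Table 5 share a common backbone of Gagne-style events, with extra scaffolding inserted for weaker students at the attention and presentation steps. A minimal sketch of deriving a path per learner profile (the names `BASE_PATH`, `WEAKER_SCAFFOLDING` and `learning_path` are hypothetical, not part of SINEM):

```python
# Illustrative sketch: per-profile learning paths from a shared backbone.
# Step texts follow Table 5; identifiers are hypothetical.

BASE_PATH = [
    "Gain attention by involving learners in problem identification",
    "Inform lesson objective by showing learners how to carry out the task",
    "Stimulate recall of prior learning",
    "Present distinctive features with incremental conceptual difficulty",
    "Provide information feedback",
    "Assess performance",
    "Enhance retention and transfer",
]

# Extra scaffolding for weaker students, keyed by 0-based step index.
WEAKER_SCAFFOLDING = {
    0: " (+ model outcomes in prototypical examples)",
    3: " (+ guided inquiry: develop a stand, research background, outline, elaborate, propose solutions)",
}

def learning_path(profile):
    """Build the learning path for an 'expert' or 'weaker' learner profile."""
    if profile == "expert":
        return list(BASE_PATH)
    return [step + WEAKER_SCAFFOLDING.get(i, "") for i, step in enumerate(BASE_PATH)]
```

This keeps the expert path as the unmodified backbone, while the weaker students' path is the same sequence with longer, scaffolded steps, matching the observation that their path will be longer.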

Conclusion

The author has presented SINEM as a methodology that addresses three problems at the learning design level. The first problem is the need to formulate strategies and evaluation criteria (critical success factors) to meet learning objectives. The second problem is the lack of synergy among evaluation models and their corresponding evaluation criteria, which results in a segmented rather than a holistic view of teaching, learning and organisational processes. The third problem arises from the need to prioritise the identified viable strategies.

SINEM maps learning strategies to learning objectives, and faster mapping can be obtained by referencing existing learning design models. Furthermore, SINEM highlights the added value of reference modelling evaluation criteria (critical success factors) and strategies from different models. The diversified model-driven critical success factors and strategies can be instantiated or reused to meet similar learning objectives, resulting in value-added mapping between learning strategies and learning objectives. In addition, the diversified model-driven knowledge base of strategies and critical success factors forms an ontological basis for prioritising strategies and creating road maps for performance improvement initiatives.

Last but not least, SINEM has been shown to be especially useful in designing teaching–learning units of activity for complex concepts, as simulated in the second learning objective example. Thus, SINEM can benefit the design of a decision support or performance support knowledge base. Future work will implement and test SINEM in diverse educational contexts. The author invites collaboration with other researchers and organisations.
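The third problem above, prioritising viable strategies by their likelihood of meeting the objective, can be approximated with a weighted scoring of strategies against critical success factors. The sketch below is an assumption about how such a decision support knowledge base might rank strategies; the function name, the factors and all weights and scores are hypothetical, not SINEM's actual scheme:

```python
# Illustrative sketch: rank strategies by weighted critical success factor
# (CSF) scores. All names, weights and scores here are hypothetical.

def prioritise(strategies, csf_weights):
    """Rank strategies by the weighted sum of their CSF scores, highest first.

    strategies: {strategy_name: {csf_name: score in [0, 1]}}
    csf_weights: {csf_name: weight}, weights summing to 1.
    """
    def weighted(scores):
        return sum(csf_weights[csf] * score for csf, score in scores.items())
    return sorted(strategies, key=lambda name: weighted(strategies[name]), reverse=True)

# Hypothetical example: two bridging strategies scored against three CSFs.
csf_weights = {"learner engagement": 0.5, "time to deploy": 0.2, "transfer": 0.3}
strategies = {
    "worked examples": {"learner engagement": 0.6, "time to deploy": 0.9, "transfer": 0.7},
    "guided inquiry":  {"learner engagement": 0.9, "time to deploy": 0.4, "transfer": 0.8},
}
ranked = prioritise(strategies, csf_weights)
# guided inquiry scores 0.77, worked examples 0.69, so guided inquiry ranks first
```

A ranking of this kind could feed the road-mapping of performance improvement initiatives described above, with higher-ranked strategies scheduled earlier.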


References

Alessi, S.M. and Trollip, S.R. (1991) Computer-Based Instruction: Methods and Development, 2nd ed., Prentice-Hall, Englewood Cliffs, NJ.

Barnett, W.D. and Raja, M.K. (1995) 'Application of QFD to the software development process', International Journal of Quality and Reliability Management, Vol. 12, No. 6, pp.24–42.

Bloom, B.S. (Ed.) (1956) Taxonomy of Educational Objectives, the Classification of Educational Goals – Handbook I: Cognitive Domain, McKay, New York.

Bruner, J. (1963) The Process of Education, Harvard University Press, Cambridge, MA.

Dempsey, J.V. and Litchfield, B.C. (2006) 'Fidelity viewed as a pyramid of assessment', Proceedings of the 2006 International Conference on Teaching and Learning in Higher Education, Centre for the Development of Teaching and Learning, National University of Singapore, pp.189–193.

Dick, W. and Johnson, R.B. (2007) 'Evaluation in instructional design: the impact of Kirkpatrick's four-level model', in Reiser, R.A. and Dempsey, J.V. (Eds.): Trends and Issues in Instructional Design and Technology, Merrill Education/Prentice-Hall, Upper Saddle River, NJ, pp.94–103.

Eriksson, I. and McFadden, F. (1993) 'Quality function deployment: a tool to improve software quality', Information and Software Technology, Vol. 35, pp.491–498.

Gagne, R. (1987) Instructional Technology Foundations, Lawrence Erlbaum Associates, Hillsdale, NJ.

Hamel, G. (1994) 'The concept of core competence', in Hamel, G. and Heene, A. (Eds.): Competence-Based Competition, John Wiley & Sons, Chichester.

Janssen, J., Berlanga, A., Vogten, H. and Koper, R. (2007) 'Towards a learning path specification', International Journal of Continuing Engineering Education and Lifelong Learning (this special issue).

Koper, R., Olivier, B. and Anderson, T. (2002) IMS Learning Design Information Model. Available online at: http://www.imsglobal.org/learningdesign/ldv1p0/imsld_infov1p0.html

Lee, C.S. (2004) 'Reuse in modeling instructional design', World Conference on Educational Multimedia, Hypermedia and Telecommunications, Lugano, Switzerland, AACE.

Lee, C.S. (2007) 'Diagnostic, predictive and compositional modelling with data mining in integrated learning environments', Computers & Education, Vol. 49, No. 3, pp.562–580.

Lee, C.S. and Lim, A.H.L. (2007) 'Layered and weighted methodology to workflow evaluation', International Journal of Electronic Business, Vol. 5, No. 3, pp.380–400.

Merrill, M.D. (2002) 'First principles of instruction', Educational Technology Research and Development, Vol. 50, No. 3, pp.43–59.

Mukerji, J. and Muller, J. (2003) Technical Guide to Model-Driven Architecture: The Model-Driven Architecture Guide Version 1.0.1, Object Management Group Architecture Board.

Pretorius, K. (Ed.) (2004) Capacity Building for Effective Quality Management in South African Technikons, The Committee for Technikon Principals (CTP), Pretoria.

Reeves, T.C. (1993) Evaluating what really matters in computer-based education. Available online at: http://it.coe.uga.edu/~treeves/edit6150/Reeves2p.pdf

Reeves, T.C. and Hedberg, J.G. (2003) Interactive Learning Systems Evaluation, Educational Technology Publications, Englewood Cliffs, NJ.

Scriven, M. (1991) 'Beyond formative and summative evaluation', in McLaughlin, M.W. and Phillips, D.D. (Eds.): Evaluation and Education: At Quarter Century, University of Chicago Press, Chicago, IL, pp.19–64.

Sweller, J. and Cooper, G.A. (1985) 'The use of worked examples as a substitute for problem solving in learning algebra', Cognition and Instruction, Vol. 2, No. 1, pp.59–89.

Wibe, J. and Kommers, P.A.M. (2001) 'Policies, dissemination and reality of information and communication technologies in regular education in north-western European countries', International Journal of Continuing Engineering Education and Lifelong Learning, Vol. 11, Nos. 4/5/6, pp.298–311.