Automatic answering tool for e-learning environment

Mohamed JEMNI & Issam BEN ALI
Ecole Supérieure des Sciences et Techniques de Tunis
Research Unit of Technologies of Information and Communication (UTIC)
5, Av. Taha Hussein, B.P. 56, Bab Mnara 1008, Tunis, TUNISIA
e-mail: [email protected], [email protected]

Abstract
This work is devoted to designing a tool that answers students automatically by exploiting the experience accumulated in past e-learning sessions. The tool constitutes a new component to be incorporated into e-learning environments. In particular, it will be added to PERSO, a system we have been developing for several years. The main objective of PERSO is to introduce intelligence into e-learning environments by automating a set of their features (e-mail answering, student profile determination, dynamic course generation, online assessment, generation of course material at the student's level, ...).

Keywords: e-learning environment; automatic answering; latent semantic analysis; intelligent tutor.
1. Introduction
The goal of the current work of the Research Unit of Technologies of Information and Communication (UTIC) of the Higher School of Sciences and Techniques of Tunis is to improve the efficiency of e-learning by introducing intelligence into e-learning environments and automating a set of their features (e-mail answering, student profile determination, dynamic course generation, online assessment of students, generation of course material at the student's level, ...). In this context, we have been developing for several years a system called PERSO. The goal of PERSO is to design and develop an adaptive hypermedia e-learning system [4, 5], where learners with different learning goals are treated differently, by building a model of the knowledge and preferences of each of them. This model is used to automatically generate a personalized course fitting the needs of each learner. In this paper, we present a new component to be added to our system, which concerns the automatic answering of student questions in an e-learning environment.
2. The automatic answering tool
The aim of this tool is to reduce the lecturers' workload and to give the student an immediate answer (when possible) by exploiting the experience accumulated from answers given to previous students for the benefit of new ones. It is well known that in every learning session of a given course, students may ask the same questions and that tutors (who may be different) may give the same answers. The tool's position in the environment is illustrated in Figure 1.
Fig. 1: The automatic answering tool, mediating between the student, the tutor, and the database of questions/answers
Our approach consists of storing question/answer pairs (with the permission of the tutor) in a database. When a student's e-mail arrives, the tool searches the database for a similar, previously asked and answered question; if one is found, it answers the student automatically with the stored data. Otherwise, the question is submitted to the tutor. The functioning of the tool is summarized in Figure 2.
Fig. 2: Functioning of the answering tool. The student's question is looked up in the database; if a matching entry exists, the stored answer is returned to the student, otherwise the question undergoes fuzzy treatment or is forwarded to the tutor.
2.1 Latent Semantic Analysis of questions
A key step of the approach consists of the semantic analysis of questions in order to compare them with others. To perform this analysis, we calculate the semantic closeness between the current student's question and the previous questions saved in the database. The semantic closeness is calculated using Latent Semantic Analysis (LSA) [13], as sketched in Figure 3.
Fig. 3: Latent semantic analysis. LSA takes the student's question and a stored question and returns a (concept, value) pair, the value measuring the semantic closeness between the two questions.
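The paper gives no implementation details for this step; the prototype itself is built on JSP (see the conclusion). As a purely illustrative sketch, the closeness of Figure 3 could be computed in Python with scikit-learn as below, where every function and variable name is ours, not the tool's actual API.

# A minimal sketch of the LSA comparison of Fig. 3, assuming the stored
# questions are available as plain-text strings (illustrative names).
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def semantic_closeness(new_question, stored_questions, n_concepts=100):
    """Return the LSA closeness between the new question and each
    stored question, as cosine similarities in concept space."""
    corpus = stored_questions + [new_question]
    # Build a TF-IDF weighted term-document matrix.
    tdm = TfidfVectorizer().fit_transform(corpus)
    # Truncated SVD projects the questions onto a small number of
    # latent "concepts": the core dimensionality reduction of LSA.
    k = max(1, min(n_concepts, tdm.shape[1] - 1, len(corpus) - 1))
    concepts = TruncatedSVD(n_components=k).fit_transform(tdm)
    # Cosine similarity in concept space = semantic closeness.
    return cosine_similarity(concepts[-1:], concepts[:-1])[0]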
The treatment of the question depends on the semantic closeness value returned by LSA, as described in Figure 4.
Fig. 4: Treatment of the student's question. For each stored pair (Question i, Answer i) in the database, LSA compares the current question with Question i: if they are similar, the system stops and returns Answer i; if they are different, it skips to question i+1; if the result is not sure, it applies a refinement (fuzzy treatment).
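The three-way decision of Figure 4 can be expressed with two thresholds on the closeness value. The paper does not state concrete values, so the thresholds below are assumptions for illustration only.

# Illustrative thresholds; the paper does not specify concrete values.
SIMILAR = 0.85    # at or above: treat the two questions as the same
DIFFERENT = 0.40  # at or below: treat the two questions as unrelated

def classify(closeness):
    """Map an LSA closeness value to one of the three outcomes of Fig. 4."""
    if closeness >= SIMILAR:
        return "similar"    # stop and return the stored answer
    if closeness <= DIFFERENT:
        return "different"  # skip to the next stored question
    return "not sure"       # hand over to the refinement (fuzzy treatment)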
When the analyzer does not find exactly the same question in the database, the system resorts to other techniques: it applies a refinement treatment to the collected questions whenever LSA cannot reach an accurate decision. This treatment uses the metadata introduced by the tutors who saved the questions and their answers. Every saved question was answered only once, when it was first asked. In this case, the tool reacts in a semi-automatic way: it may interact with the student and take his opinion into account.

2.2 Refinement (Fuzzy treatment)
This section presents the fuzzy treatment adopted when the system cannot recognize the question with high accuracy, i.e., when LSA returns a set of candidate questions with very close semantic closeness values. In this case, the automatic answering tool proceeds in two steps:

Step 1: It refines the result by using the metadata of every candidate question. These metadata contain information related to the subject of the question, such as the related section of the course, the cognitive level, the prerequisite sections... This step leads to a reduced set of candidate questions.

Step 2: The system interacts with the student by presenting him with the reduced set of candidate questions and asking him either to select a question from the list or to reformulate his question and submit it again to the system.

If the system does not succeed in treating the student's question even after this last step, it sends the question to the tutor for answering. Figure 5 describes the fuzzy treatment process.
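A possible reading of Step 1 as code, assuming each stored question carries the metadata fields mentioned above (the field names are ours, not the paper's):

# Step 1 of the fuzzy treatment: filter the candidate questions by
# comparing their tutor-supplied metadata with the context of the
# current question. Field names are assumed for illustration.
def refine_candidates(candidates, context):
    refined = [
        q for q in candidates
        if q["meta"]["section"] == context["section"]
        and q["meta"]["cognitive_level"] == context["cognitive_level"]
    ]
    # Step 2 then shows `refined` to the student, who either picks a
    # question or reformulates his own; an empty list means the
    # question must be forwarded to the tutor.
    return refined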
2.3 Steps of the treatment
To give more insight into the system's work, we can identify two different phases:

2.3.1 Alimentation phase: during this phase, the database is populated with questions and answers under the total control of the tutors. Through an appropriate interface, a tutor may decide to save, or refuse to save, the selected questions and their answers (a possible shape for such an entry is sketched after the two phases).

2.3.2 Exploiting phase: it consists of answering the student automatically or semi-automatically by using the previously stored data. Notice that the efficiency of the tool improves progressively as the database of questions grows.
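By way of illustration, an entry stored during the alimentation phase might look as follows; the actual database schema of the prototype may differ.

# A possible shape for a stored question/answer entry, with the metadata
# fields mentioned in Section 2.2 (illustrative, not the actual schema).
qa_entry = {
    "question": "What is the complexity of binary search?",
    "answer": "O(log n), since each step halves the search interval.",
    "meta": {
        "section": "searching-algorithms",   # related section of the course
        "cognitive_level": "comprehension",  # e.g. a Bloom-style level
        "prerequisites": ["arrays", "big-o-notation"],  # prerequisite sections
    },
}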
Fig. 5: The fuzzy treatment. When the search is not sure whether the student's question exists in the database, the system uses the metadata stored by the tutors to generate a set of candidate questions, from which the student picks the one similar to his own. If a question in the set satisfies the student, its stored answer is returned; otherwise, the question is forwarded to the tutor.
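To tie the previous sketches together, a rough outline of the complete flow of Figures 2, 4, and 5 could look as follows; `ask_student_to_pick` and `forward_to_tutor` stand in for the interactive parts and are hypothetical, as is every other name here.

# End-to-end sketch combining the helpers above (all names illustrative).
def answer(student_question, database, context):
    stored = [entry["question"] for entry in database]
    closeness = semantic_closeness(student_question, stored)
    unsure = []
    for entry, value in zip(database, closeness):
        verdict = classify(value)
        if verdict == "similar":
            return entry["answer"]      # immediate automatic answer
        if verdict == "not sure":
            unsure.append(entry)        # kept for the fuzzy treatment
    candidates = refine_candidates(unsure, context)
    if candidates:
        # Semi-automatic answer: the student picks from the candidates
        # or reformulates the question (hypothetical helper).
        return ask_student_to_pick(candidates)
    return forward_to_tutor(student_question)  # fall back to the tutor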
3. Conclusion
In this paper, we presented an automatic answering tool based on the latent semantic analysis approach. The objective is to exploit the experience accumulated in previous e-learning sessions in order to give the student an immediate answer (when possible). A prototype of the answering tool is currently under development. It is based on open-source software: the MySQL database, the Tomcat web server, JSP, and the Linux operating system. We plan to integrate it into the PERSO system in order to experiment with and evaluate it before making it freely available on the Internet.
References
[1] E. Aïmeur, G. Brassard, H. Dufort & S. Gambs, CLARISSE: A Machine Learning Tool to Initialise Student Models, ITS-02, Sixth International Conference on Intelligent Tutoring Systems, pp. 718-728, Biarritz, France.
[2] I. Arroyo, R. Conejo, E. Guzmán & B.P. Woolf, An Adaptive Web-Based Component for Cognitive Ability Estimation, in Proceedings of the 10th International Conference on Artificial Intelligence in Education, pp. 456-466, San Antonio, Texas, May 19-23, 2001.
[3] D. Billsus & M. Pazzani, Learning and Revising User Profiles: The Identification of Interesting Web Sites, Machine Learning 27, pp. 313-331, Kluwer Academic Publishers, 1997.
[4] H. Chorfi & M. Jemni, PERSO: Towards an Adaptive e-Learning System, Journal of Interactive Learning Research, 15(4), pp. 433-447, 2004.
[5] H. Chorfi & M. Jemni, PERSO: A System to Customize e-Training, 5th International Conference on New Educational Environments, Lucerne, Switzerland, May 26-28, 2003.
[6] H. Chorfi, M. Jemni & E. Aïmeur, A Case-Based Reasoning System to Customize e-Training, 4th International Conference on Information Communication Technologies in Education (ICICTE 2003), Samos Island, Greece, July 3-5, 2003.
[7] P. Cotter & B. Smyth, Wapping the Web - A Case Study in Content Personalisation for WAP-Enabled Devices, in Proceedings of the International Conference on Adaptive Hypermedia and Adaptive Web-based Systems, Trento, Italy, August 2000.
[8] P. Gounon & B. Lemaire, Semantic Comparison of Texts for Learning Environments, in F.J. Garijo, J.C. Riquelme Santos & M. Toro (Eds.), Advances in Artificial Intelligence - IBERAMIA 2002, Berlin: Springer-Verlag, LNCS 2527, pp. 724-733.
[9] D.B. Leake (Ed.), Case-Based Reasoning: Experiences, Lessons, and Future Directions, Menlo Park: AAAI Press/MIT Press, 1996.
[10] M. Jemni & I. Ben Ali, Design of an Automatic Answering Tool for e-Learning Environment, poster, International Conference on Engineering Education, University of Florida, Gainesville, USA, October 16-21, 2004.
[11] F. Marir & I. Watson, Case-Based Reasoning: A Review, The Knowledge Engineering Review, 9(4), 1994.
[12] J. Kay, Stereotypes, Student Models and Scrutability, in G. Gauthier, C. Frasson & K. VanLehn (Eds.), Intelligent Tutoring Systems ITS 2000, Montréal, Canada, Berlin: Springer-Verlag, pp. 19-30, 2000.
[13] T.K. Landauer, P. Foltz & D. Laham, Introduction to Latent Semantic Analysis, Discourse Processes, 25, pp. 259-284, 1998.
[14] N.K. Person, A.C. Graesser, L. Bautista et al., Evaluating Student Learning Gains in Two Versions of AutoTutor, in J.D. Moore, C.L. Redfield & W.L. Johnson (Eds.), Artificial Intelligence in Education: AI-ED in the Wired and Wireless Future, pp. 286-293, Amsterdam: IOS Press, 2001.
[15] R.A. Silveira & R.M. Vicari, Developing Distributed Intelligent Learning Environment with JADE - Java Agents for Distance Education Framework, 2001.