Modeling and using a web-based and tutored portfolio to support certification of professional competence in transfusion medicine

Pascal Staccini (a), MD, PhD, Philippe Rouger (b), MD, PhD

(a) STIC Santé, Université Nice-Sophia Antipolis, Nice, France
(b) Institut National de la Transfusion Sanguine, Paris, France
In order to manage a nationwide assessment program leading to certification of professional competence in blood transfusion throughout France, the National Institute of Blood Transfusion (INTS) and the University of Nice-Sophia Antipolis designed and developed a structured and tutored web-based portfolio. The entire certification process has been approved by the national healthcare agency (HAS). Eleven assessment programs have been written. The structure of this e-portfolio is based on a matrix of actions defined according to standards of practice. For each action, elements of proof are uploaded by the physician and peer-reviewed by an expert (a tutor) before validation. The electronic portfolio stores the complete history of the actions performed by users. This tracking feature generates alerts which are e-mailed to users (physicians and tutors) according to a list of monitored events. After one year of design and development, the application is now in routine use.

INTRODUCTION

Since 2005 [1], all French physicians (GPs and specialists in the private and public sectors) have had to join a lifelong professional competence accreditation process. This process, divided into five-year segments, is part of the mandatory continuing medical education requirement, as defined by law ten years ago [2,3]. Under the auspices of the French Ministry of Hospitalization and Care Management, the High Agency for Healthcare (HAS) [4] is responsible for monitoring how each physician manages his/her certification process. A global schema has been defined requiring each physician to first register with a program administered by an accredited organization. In the area of blood transfusion, the National Institute of Blood Transfusion (INTS) received approval from HAS to organize the evaluation and certification of professional competence throughout France. The main issues INTS has had to cope with are: 1) the difficulty of managing a large number of physicians nationwide; 2) the large number of evaluation programs to deal with; 3) the validity of the proof supplied by the physician; and 4) the need to track all the interactions and exchanges between physicians and INTS. How can the HAS ensure physicians have
genuinely completed their accreditation procedure, with a high level of confidence? Furthermore, the project has had to deal with the emerging concept of "mass" professional assessment in France, in combination with individual evaluation tracks. There is no doubt that, although the web-based application was devised by INTS, questions still remain regarding the features required to support the certification process.

OBJECTIVES

The aims of the project were to: 1) structure and build up a library containing templates of assessment pathways; 2) assist physicians in managing their own records; 3) guide physicians in choosing a program according to their main activity and help them define relevant actions; 4) provide each registered physician with a customized and tutored follow-up; and 5) track the deposit and the validation of documents as proof of the completion of an action. The aim of this article is to describe and discuss the model of the portfolio, the features that have been implemented and the ways the application is used.

METHOD

A group of experts was engaged to design the process of shaping the electronic portfolio. The group comprised a manager, two physicians, two quality engineers and a computer engineer. User needs were determined and modeled using UML; we used an open-source UML editor [7] to produce the information model charts. Development of the first operational prototype took over a year. The project began with a classic elicitation of user needs. We started with a process analysis to define the main steps a physician must perform (4 months). This first modeling phase was followed by a programming phase with elementary tests (6 months). The last 2 months were dedicated to integrated tests before opening the platform. We used open-source tools: an object-oriented framework (Zope [8]), a content management system (Plone [9]) and a relational database management system (PostgreSQL [10]). Thus, the electronic portfolio was developed as a web-based application, including workflow and collaborative working features.
RESULTS

INTS, in association with scientific boards, drafted the appropriate standards. The method used to define the standards was framed by HAS. Each assessment program is led by a steering committee. A scientific committee selects experts to build the program and supervises its content. An evaluation committee checks the way the program is carried out by the physicians. Scientific experts are recruited to define the standards and their criteria. They also devise the measurement grid and write the user guide for the evaluation program. The scientific committee validates the final assessment set (standards, criteria, grid and user guide). The program is then submitted to a panel of physicians for testing. The evaluation committee analyzes the testing feedback and proposes improvements. Once the program is definitively approved, it can be distributed to other physicians.

An evaluation program comprises two kinds of actions carried out by the physician according to the results of a self- or external assessment: 1) long-term actions requiring several answers as proof of continuous improvement of his/her practice; 2) transverse or specific actions that can be performed several times during the evaluation cycle (such as participation in a relevant working group, attending a CME-approved congress, writing a scientific paper or a critical appraisal in the area of blood transfusion, or
any action consistent with the results of the initial evaluation). In accordance with the program description proposed by the group of experts, an information model of the evaluation portfolio was defined (figure 1). An evaluation program is designed as a portfolio, i.e., a collection of items to be documented. A program relates to one professional area or specialty. The portfolio is composed of actions and sub-actions (or activities). Two types of actions are defined: continuous and specific actions. The status of each action and activity can be classified as: in process, completed, validated or not validated. One or several elements of proof (uploaded by the physician) are attached to an activity in order to certify that it has been performed (completion). Each element of proof is given a status: approved or not. Additional elements can accompany an action in order to provide explanations (help documents) or to allow users to exchange comments (related to an action or an element of proof). As an object of the Zope object-oriented platform [8], any new evaluation program inherits properties defined for other objects, such as indexing and retrieval features. The Plone content management system [9] implements Dublin Core indexing attributes, which explains why the information model does not include such metadata attributes.
Figure 1: The information model of the portfolio (UML notation)
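As a concrete reading of this information model, the following sketch (plain Python, written for illustration only; the class and field names are our own and not those of the Zope/Plone implementation) captures the main entities and status values described above.

# Illustrative sketch of the portfolio information model (figure 1).
# This is NOT the production Zope/Plone code; names are invented for clarity.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class ActionType(Enum):
    CONTINUOUS = "continuous"   # long-term actions documented several times
    SPECIFIC = "specific"       # transverse actions (congress, paper, working group...)


class ItemStatus(Enum):
    IN_PROCESS = "in process"
    COMPLETED = "completed"
    VALIDATED = "validated"
    NOT_VALIDATED = "not validated"


@dataclass
class ElementOfProof:
    filename: str                      # document uploaded by the physician
    approved: Optional[bool] = None    # set by the tutor after peer review


@dataclass
class Activity:
    title: str
    status: ItemStatus = ItemStatus.IN_PROCESS
    proofs: List[ElementOfProof] = field(default_factory=list)
    comments: List[str] = field(default_factory=list)        # physician/tutor exchanges
    help_documents: List[str] = field(default_factory=list)  # explanatory documents


@dataclass
class Action:
    title: str
    action_type: ActionType
    activities: List[Activity] = field(default_factory=list)
    status: ItemStatus = ItemStatus.IN_PROCESS


@dataclass
class Portfolio:
    """A copy of an evaluation program assigned to one physician."""
    program_name: str   # one professional area or specialty
    physician: str
    actions: List[Action] = field(default_factory=list)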
Four distinct roles or functions interact through the platform: physician, tutor, program director and supervisor. "Physician" is the role of any individual who registers with an evaluation program. A copy of the template portfolio related to the program is then assigned to the physician. The physician has to complete all the actions and to give proof of their completion by uploading documents. Physicians registered with the same program are grouped together (ten per group). "Tutors" are assigned to a program to track the work of the physicians and advise them if necessary. The tutor's main task is to validate (or not) each element of proof uploaded by the physician. When each activity of an action has been validated, the tutor validates the action. Each program is supervised by a "director", who also manages the group of experts engaged to prepare the standards of the program. The program director provides content for the program toolbox and collects all the notifications of action validations. The role of the "supervisor" is to prepare and coordinate the scheduling of tasks for the other roles. The supervisor registers physicians with a program, groups the physicians of the same program and assigns tutors to programs and groups.

The itinerary of a physician through the evaluation system, the platform and a program comprises eight main steps (table 1). Registration is divided into two stages. First, when choosing the evaluation program, the physician completes a web-based form. This form is e-mailed to the supervisor, who checks the request (relevance of the choice of program according to the physician's specialty). After approval, the physician is assigned to the program and a group. This "assignment" event automatically generates a message to the physician notifying him/her that the request has been accepted. The message contains a login and password to connect to the platform as well as a reminder of the date and time of the initial meeting. The initial meeting provides the physician with: 1) an overview of the evaluation program; 2) a lesson in how to use the web application; and 3) the user guide. After the first meeting, the physician is expected to gradually document his/her own program. Under the supervision of the tutor (an expert in the field), the physician chooses a set of actions (continuous and transverse). For each action, the physician proposes several dates to schedule its completion. The tutor validates the proposed or selected actions and their scheduling. For each action, the physician may post a comment and must upload a document proving that he/she has partially or fully performed the action. Each deposit generates an e-mail to the tutor, who logs on to the platform to examine the elements of proof and exchange comments with the physician (a sketch of this loop is given after table 1).
Table 1: Main steps of the certification process and the corresponding functions supplied by the platform

Step 1: Registration
Platform support: on-line registration; management of agreements by e-mails exchanged between the physician's hospital and INTS.

Step 2: Initial meeting (groups of ten physicians; explanation of methods and tools; meeting with the tutor)
Platform support: learning session with practical exercises on how to use the platform; deposit of an attendance certificate as an example of interaction; pedagogical documents available for download through a dedicated toolbox.

Step 3: Self-assessment, based on a manual comparison between the physician's practice and the standards, aimed at identifying gaps or criteria to be addressed
Platform support: messages exchanged between tutor and physicians; deposit of documents as proof of completion of the self-assessment step; validation by the tutor.

Step 4: Choice of the actions to manage over the period, according to the results of the self-assessment
Platform support: list of actions chosen by the physician; validation of the choice and of the time for completion by the tutor.

Step 5: Performing actions over an 18-month period
Platform support: messages exchanged between tutor and physician; alerts generated according to time points and sent to the relevant users.

Step 6: Providing proof of results
Platform support: deposit of documents as proof of actions completed or in process; validation of the documents by the tutor.

Step 7: Overall validation of the assessment program completed by a physician
Platform support: validation of each activity and each action of the program by the tutor.

Step 8: Delivery of the certification
Platform support: print-out of a document to be sent to the physician.
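The deposit-and-validation loop underlying steps 5 to 7 can be sketched as follows. This is a hypothetical simplification, reusing the illustrative classes defined after figure 1, with a placeholder send_email function standing in for the platform's real mail delivery.

# Hypothetical sketch of the deposit -> tutor notification -> validation loop.
# Assumes the Portfolio, Action, Activity, ElementOfProof and ItemStatus classes
# from the earlier sketch; this is not the actual platform code.
def send_email(recipient: str, subject: str, body: str) -> None:
    # Placeholder for the platform's real mail delivery.
    print(f"to: {recipient} | {subject}\n{body}\n")


def deposit_proof(portfolio: Portfolio, activity: Activity,
                  proof: ElementOfProof, tutor_email: str) -> None:
    """The physician uploads a document; the tutor is alerted by e-mail."""
    activity.proofs.append(proof)
    send_email(tutor_email,
               f"New element of proof in '{portfolio.program_name}'",
               f"{portfolio.physician} uploaded '{proof.filename}' "
               f"for activity '{activity.title}'.")


def validate_activity(activity: Activity, approved: bool) -> None:
    """The tutor reviews the uploaded proofs and sets the activity status."""
    for proof in activity.proofs:
        proof.approved = approved
    activity.status = ItemStatus.VALIDATED if approved else ItemStatus.NOT_VALIDATED


def maybe_validate_action(action: Action) -> None:
    """An action is validated only when every one of its activities has been validated."""
    if action.activities and all(a.status is ItemStatus.VALIDATED
                                 for a in action.activities):
        action.status = ItemStatus.VALIDATED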
A tracking system works in the background, recording the complete history of a program and its components, including the creation steps as well as the use made of it by the physician interacting with the tutor through the portfolio. All read and write accesses to a portfolio are tracked (records are managed by the relational database). Qualifiers have been attached to programs and to all of their components for analysis by a workflow engine integrated into the platform. Thus, each role can know who is using the system, when and for what purpose. Filters allow specific tracked datasets to be returned according to the user's role.
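A minimal sketch of such a tracking record and of role-based filtering is given below; the names are ours, and the real platform persists these records in PostgreSQL and qualifies them for the Zope workflow engine rather than keeping them in memory.

# Minimal, illustrative sketch of the tracking/audit trail.
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class TrackedEvent:
    timestamp: datetime
    user: str      # login of the physician, tutor, program director or supervisor
    role: str      # "physician", "tutor", "program director" or "supervisor"
    access: str    # "read" or "write"
    target: str    # portfolio component concerned (action, activity, element of proof...)
    detail: str = ""


class AuditTrail:
    def __init__(self) -> None:
        self.events: List[TrackedEvent] = []

    def record(self, user: str, role: str, access: str,
               target: str, detail: str = "") -> None:
        """Store one tracked access to a portfolio component."""
        self.events.append(TrackedEvent(datetime.now(), user, role, access, target, detail))

    def filter_by_role(self, role: str) -> List[TrackedEvent]:
        """Return only the tracked events relevant to a given role."""
        return [e for e in self.events if e.role == role]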
Figure 2: Details of an action in an evaluation program (1), shown as a set of activities (2). For each activity, the physician can read the details and download additional documents (3), see the previously uploaded elements of proof (4) and add new ones. Each element of proof, activity and action can be validated or not by the tutor (5).

By combining real-time and expected-time attributes with workflow states, alerts can be generated to provide physicians, their tutors, the program director and the supervisor of the system with information concerning problems occurring during the completion of activities and actions (table 2). These tracking features are a crucial part of the system, making the entire evaluation process as transparent as possible.

Table 2: List of events and alert messages

Messages e-mailed to the tutor after:
• addition of general news
• addition of a new activity by a physician
• the deposit of a new element of proof
• modification of the duration for completion of an activity
• addition of a new comment for an activity
• time remaining before a deposit
• expiry of the deposit date

Messages e-mailed to the physician after:
• addition of general news
• addition of group-dedicated news
• change of state of a proposed activity by the tutor
• change of state of an element of proof by the tutor
• change of state of an activity or an action by the tutor
• addition of a new comment for an activity by the tutor
• time remaining before a deposit
• expiry of the deposit date
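Alert routing can be pictured as a mapping from monitored events to recipient roles, as in the following hypothetical sketch. The event names are paraphrased from table 2, and the real mechanism is driven by workflow states and expected-time attributes rather than a static dictionary.

# Hypothetical sketch of alert routing (event names paraphrased from table 2).
ALERT_RECIPIENTS = {
    "general_news_added": ["tutor", "physician"],
    "group_news_added": ["physician"],
    "new_activity_added_by_physician": ["tutor"],
    "proof_deposited": ["tutor"],
    "activity_duration_modified": ["tutor"],
    "comment_added_by_physician": ["tutor"],
    "proof_state_changed_by_tutor": ["physician"],
    "activity_or_action_state_changed_by_tutor": ["physician"],
    "comment_added_by_tutor": ["physician"],
    "deposit_deadline_approaching": ["tutor", "physician"],
    "deposit_deadline_expired": ["tutor", "physician"],
}


def route_alert(event: str, addresses: dict) -> list:
    """Return the e-mail addresses to notify for a monitored event."""
    return [addresses[role] for role in ALERT_RECIPIENTS.get(event, [])]


# Example: a deposit close to the deadline notifies the tutor only,
# while an expired deadline notifies both users.
addresses = {"tutor": "tutor@example.org", "physician": "md@example.org"}
print(route_alert("proof_deposited", addresses))            # ['tutor@example.org']
print(route_alert("deposit_deadline_expired", addresses))   # both addresses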
DISCUSSION

As Matillon wrote in a government mission report entitled "Modalities and conditions of the evaluation of competences for health care professionals in France" [11], the most advanced systems for crediting and validating professional competences are operational in the United States. Information technologies are used there to support self-assessment sessions as well as to manage the follow-up of accreditation programs [12]. Nowadays, in France, the development of distance learning projects is coupled with thinking about the evaluation of knowledge and practice by means of Internet applications. These projects aim to use the ISO normative framework to structure resources and describe the semantics of learning contents, learning courses and assessment systems [13]. In parallel with this normative approach, other projects have arisen, based on competence portfolios. Resulting from a personal commitment, these portfolios are designed to list and attest to levels of knowledge, skills or competences [14]. Using and maintaining the portfolio are functions under the control of its owner. The digital transposition of "paper" portfolios is nowadays an international challenge, as identified by the Key-PAL project, and guidelines have been published in order to promote the setting up of such electronic portfolio (ePortfolio) applications [15]. Several typologies of electronic portfolios have been published. One is based on the chronology between the production and publishing processes [16]: 1) the showcase portfolio (the organization of content is free and arranged retrospectively); 2) the structured portfolio (content production and publishing processes are strictly framed); 3) the learning portfolio (the arrangement of content follows the acquisition of skills). Another typology is based on the purpose of the portfolio: personal development reporting, learning, evaluation, and collaborative work [14]. In order to describe the characteristics of the electronic portfolio that we developed, we merged these classifications. From a canonical point of view, six attributes can be used to describe an ePortfolio (the SAGACE model): Structure, Actor, Goal, Audience, Content, Evidence (table 3).
Table 3: The SAGACE model of an ePortfolio

Structure: free arrangement, or structure as a matrix of questions/answers
Actor: individual, paired (interaction between two persons) or collaborative group
Goal: knowledge assessment, self-learning, listing of works or demonstration of skills
Audience: private, restricted audience or public
Content: digital documents, texts, scores or marks
Evidence: no evidence of what is shown, evidence given before deposit, or peer review after deposit
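For illustration, our own reading of the portfolio described in this paper in terms of these six attributes could be coded as follows; the value labels are paraphrased, and this summary is not part of the published model.

# Illustrative SAGACE characterization of the INTS portfolio (paraphrased values).
SAGACE_PROFILE = {
    "Structure": "matrix of questions/answers (structured portfolio)",
    "Actor":     "paired (physician interacting with a tutor)",
    "Goal":      "assessment of knowledge and practice",
    "Audience":  "restricted (physician, tutor, program director, supervisor)",
    "Content":   "digital documents uploaded as elements of proof",
    "Evidence":  "peer review by the tutor after deposit",
}

for attribute, value in SAGACE_PROFILE.items():
    print(f"{attribute}: {value}")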
In the area of web services for assessment, reports detail the challenges and describe tools for several healthcare professionals, such as nurses [17] and residents, with a focus on specialties such as emergency medicine. Globally speaking, an evaluation system is based on: 1) a methodology (standards, actions, a matrix of answers and levels of evidence); 2) a schedule (registration, follow-up and validation); 3) roles (assessed person, tutor/peer reviewer, program director and supervisor); and 4) tools (tracking system). The main purpose of such an organization is to guarantee the credibility and authenticity of the evaluation process. Transparency and accuracy of the steps and the results are supported by the application we developed: history of a program template, history of user actions, history of exchanges between users, and history of the deposit of documents. A preliminary survey of the first 45 enrolled physicians (http://epp-ints.fr) shows that 50% of them rate the system above 4 on a Likert scale from 0 to 5. Links between the evaluation process and continuing education also have to be established. Thus, the electronic portfolio has been integrated into our learning content management system. As competence is a lifelong process encompassing personal and professional development, the assessment of professional competence needs to take into account not only physicians' current specialties and settings but also where they are in the evolution of their career. The web-based and structured portfolio assists physicians in the assessment, maintenance and improvement of their overall clinical and professional competence. By means of tutoring and exchange tools, we believe that physicians can better identify the knowledge, skills, abilities, attitudes and behaviors, or core competences, that are essential to ensure continuous and adaptive competence in medical practice.

References

1. Décret n° 2005-346 du 14 avril 2005 relatif à l'évaluation des pratiques professionnelles. J.O. n° 88 du 15 avril 2005, page 6730.
2. Ordonnance n° 96-346 du 24 avril 1996 portant réforme de l'hospitalisation publique et privée. J.O. n° 98 du 25 avril 1996, page 6301.
3. Loi n° 2002-303 du 4 mars 2002 relative aux droits des malades et à la qualité du système de santé. J.O. n° 54 du 5 mars 2002, page 4118.
4. Haute Autorité de Santé. La politique d'évaluation des pratiques de la HAS, http://www.has-sante.fr/portail/display.jsp?id=c_434225 (last visit: 02/03/08).
5. IMS ePortfolio Best Practice and Implementation Guide, http://www.imsglobal.org/ep/epv1p0/imsep_bestv1p0.html (last visit: 02/03/08).
6. Staccini P, Bordonado C, Alet J, Joubert M, Dufour JC, Fieschi M. A Customized Open Source Content Management System to Support Collaborative Distance Learning: The J@LON Platform. In: Computer-Based Medical Systems, 2007. CBMS '07, pp. 651-656.
7. StarUML, The Open Source UML/MDA Platform, http://www.staruml.com/ (last visit: 02/03/08).
8. Latteier A, Pelletier M, McDonough C, Sabaini P. The Zope Book (2.6 Edition), http://www.zope.org/Documentation/Books/ZopeBook/2_6Edition/ZopeBook-2_6.pdf (last visit: 02/03/08).
9. Plone: a user friendly and powerful Content Management System, http://plone.org (last visit: 02/03/08).
10. PostgreSQL: the world's most advanced open source database, http://www.postgresql.org (last visit: 02/03/08).
11. Matillon Y. Modalités et conditions d'évaluation des pratiques professionnelles médicales. Rapport de mission (mars 2006), http://www.sante.gouv.fr/htm/actu/matillon/rapport_matillon.pdf (last visit: 02/03/08).
12. National Board of Medical Examiners (NBME), http://www.nbme.org (last visit: 02/03/08).
13. ISO/IEC JTC1/SC36, International Standards and guidance in information technology for learning, education, and training, http://jtc1sc36.org/ (last visit: 02/03/08).
14. Greenberg G. The digital convergence: extending the portfolio model. Educause Review, 2004;39(4):28-37.
15. Eifel, Keypal project: ePortfolio guides now available, http://sd1065.sivit.org:9691/eifel/activities/projects/keypal/deliverables (last visit: 02/03/08).
16. Lorenzo G, Ittelson J. An overview of e-portfolios, http://www.educause.edu/ir/library/pdf/ELI3001.pdf (last visit: 02/03/08).
17. Sherrod D. Professional portfolio: a snapshot of your career. Nursing, 2007;37:18.