Original Research
Measuring Participants' Attitudes Toward Mobile Device Conference Applications in Continuing Medical Education: Validation of an Instrument

Christopher M. Wittich, MD, PharmD; Amy T. Wang, MD; Justin A. Fiala, MD; Karen F. Mauck, MD; Jayawant N. Mandrekar, PhD; John T. Ratelle, MD; Thomas J. Beckman, MD

Introduction: Mobile device applications (apps) may enhance live CME courses. We aimed to (1) validate a measure of participant attitudes toward using a conference app and (2) determine associations of participant characteristics and attitudes toward CME apps with conference app usage.

Methods: We conducted a cross-sectional validation study of participants at the Mayo Clinic Selected Topics in Internal Medicine Course. A conference app was developed that included presentation slides, note-taking features, search functions, social networking with other attendees, and access to presenter information. The CME app attitudes survey instrument (CMEAPP-10) was designed to determine participant attitudes toward conference apps.

Results: Of the 602 participants, 498 (82.7%) returned surveys. Factor analysis revealed a two-dimensional model for CMEAPP-10 scores (Cronbach α, 0.97). Mean (SD) CMEAPP-10 scores (maximum possible score of 5) were higher for women than for men (4.06 [0.91] versus 3.85 [0.92]; P = .04). CMEAPP-10 scores (mean [SD]) were significantly associated (P = .02) with frequency of previous app usage as follows: less than once per month, 3.73 (1.05); monthly, 3.41 (1.16); weekly, 4.03 (0.69); and daily or more, 4.06 (0.89). Scores were unrelated to participant age, specialty, practice characteristics, or previous downloading or purchase of a medical app.

Discussion: This is the first validated measure of attitudes toward CME apps among course participants. App usage was higher among younger participants who had previously used educational or professional apps. Additionally, attitudes were more favorable among women and those who had previously used apps.
These findings have important implications regarding efforts to engage participants with portable and accessible technology.

Keywords: continuous professional development, e-learning, validation study, online/computer-based education

DOI: 10.1097/CEH.0000000000000031
JCEHP • Winter 2016 • Volume 36 • Number 1 • www.jcehp.org

Disclosures: The authors declare no conflict of interest.

Dr. Wittich: Clinical Practice Chair and Associate Professor of Medicine, Division of General Internal Medicine, Mayo Clinic, Rochester, MN. Dr. Wang: Assistant Professor of Medicine, Division of General Internal Medicine, Harbor-University of California Los Angeles Medical Center, Torrance, CA. Dr. Fiala: Resident in Internal Medicine, Department of Internal Medicine, Mayo Clinic, Rochester, MN. Dr. Mauck: Associate Professor of Medicine, Division of General Internal Medicine, Mayo Clinic, Rochester, MN. Dr. Mandrekar: Professor of Biostatistics and Neurology, Division of Biomedical Statistics and Informatics, Mayo Clinic, Rochester, MN. Dr. Ratelle: Instructor in Medicine, Division of Hospital Internal Medicine, Mayo Clinic, Rochester, MN. Dr. Beckman: Education Chair and Professor of Medicine and Medical Education, Division of General Internal Medicine, Mayo Clinic, Rochester, MN.

Correspondence: Christopher M. Wittich, MD, Division of General Internal Medicine, Mayo Clinic, 200 First Street SW, Rochester, MN 55905; e-mail: wittich.[email protected].

Copyright © 2016 The Alliance for Continuing Education in the Health Professions, the Association for Hospital Medical Education, and the Society for Academic Continuing Medical Education

Smartphones, which combine mobile telephones with handheld computers, are pervasively used by physicians.1–5 Arguably, the success of smartphones is partly due to the availability of device-specific software applications, or apps,1 defined as specialized software programs that are downloaded onto a mobile device and made for a specific use.2,3 Mobile device apps that have become common in clinical practice include those for textbooks, practice guidelines, drug references, medical calculators, and institution therapy standards, as well as those for potential use by patients.1,6 Apps provide point-of-care information for physicians by conveying up-to-date information in an accessible, user-friendly medium. Mobile device apps can potentially reach physicians engaged in continuous professional development in ways that live courses cannot.

In 2001, the Institute of Medicine addressed the need for ongoing education beyond the traditional paradigm of conference-based continuing medical education (CME), citing the perils of practicing in a field with an ever-expanding knowledge base.7,8 CME addresses this need by periodically giving physicians updated information. For the most part, CME has been delivered through traditional lectures,9 yet CME courses that use multiple media and educational techniques seem to have the greatest influence on improving physician performance.10,11 However, CME courses have generally failed to adapt to modern advances in electronic knowledge dispersal and retrieval.

Smartphone and mobile tablet apps have begun to make their way into the medical education arena. In undergraduate medical education, apps have been developed to enhance core medical school courses,12 replace print textbooks,13–15 supplement general practice experiences,16 and create a virtual hospital for problem-based learning.17 In graduate medical education, apps are frequently used as a clinical resource.2,3,5,18–21 In CME, apps have been used to improve performance in simulated clinical settings,22 offer CME credits through podcasts or social media,1 and provide point-of-care information for self-directed learning.23 One study showed that CME participants had
favorable attitudes toward the use of social media in CME, especially among those participants who were younger and used social media frequently.24 However, it is not known whether the use of apps with traditional CME courses improves learner satisfaction, knowledge, or application of CME learning to clinical practice.

The goals of this study were to (1) validate an instrument that measures participant attitudes about the value of a conference app, (2) determine associations between participant characteristics and conference app usage, and (3) determine associations between participant attitudes toward CME apps and app usage. On the basis of previous research, we hypothesized that attitudes toward a CME course app would be associated with demographic characteristics such as age and previous app use.

METHODS

Study Design and Participants
We conducted a cross-sectional survey and validation study of all participants at the 26th Annual Selected Topics in Internal Medicine Course. This annual, week-long course is accredited by the Mayo School of Continuous Professional Development and qualifies for 24.5 hours of CME credit. The course consists of podium presentations (30–45 minutes long) and small-group breakout sessions. All attendees were invited to participate in the study. This study was deemed exempt by the Mayo Clinic Institutional Review Board.
Conference App Development

A conference app was developed for the course using a platform created by a commercial app designer. The app was available free of charge for all attendees to download onto devices running Apple or Android operating systems. The app could be used before, during, and after the conclusion of the on-site course. Precourse emails and reminders during the course encouraged attendees to download and use the app. On-site technical support was available to all attendees. Features of the course app included the ability to download all presentation slides, take notes, and add highlights; search functions; social networking and texting with other attendees; and access to presenter information, including email addresses.

Survey Instrument Development

A survey instrument was designed to determine CME participant attitudes about conference apps. Content for the instrument was derived from the existing literature on social media24 and medical apps.25 Mayo investigators (C.M.W., A.T.W., and T.J.B.) who were experienced in scale design, validation, and CME assessment created the instrument items and iteratively revised them after repeated review and discussion.26 The group ultimately selected 10 items that were structured on 5-point scales (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, and 5 = strongly agree). The final version included categorical response options regarding demographic characteristics and app usage.

Data Collection and Analysis

The CME app attitudes survey instrument (CMEAPP-10) was given to all conference attendees through the registration packet. Attendees were asked to return the completed surveys at the end of the conference. As an incentive for participation, a pen with the Mayo Clinic logo was given to attendees who returned the survey.

Attendee demographics and app usage characteristics were treated as categorical variables. Demographic variables included gender (male and female), age (20–30, 31–40, 41–50, 51–60, and ≥61 years), specialty (internal medicine, family medicine, medical specialty, and nonmedical specialty), practice type (academic, industry, solo, group, and other), and practice location (US–Northeast, US–Southeast, US–Midwest, US–Southwest, US–West, and other). App usage characteristics included professional or educational app use frequency (never, less than once per month, monthly, weekly, and daily or more), previous download of a medical app (yes and no), and previous paid purchase of a medical app (yes and no).

Factor analysis was completed on the CMEAPP-10. Factors were extracted using the minimal proportion criteria. Items with factor loadings of 0.60 or more were retained.27 Internal consistency reliabilities for items comprising each factor and overall were determined using the Cronbach coefficient α, where α > 0.7 was considered acceptable.27 Differences in demographic and app usage characteristics of the conference app users versus nonusers were reported as frequencies and percentages and were compared using the χ² or Fisher exact test. App attitude survey scores were reported as mean (SD) for each of the 10 items and overall. Among participants who used the conference app, associations between conference app attitude scores and participant demographic and app usage characteristics were compared using the Kruskal–Wallis test or Wilcoxon rank-sum test. The threshold for statistical significance was set at P < .05. Statistical analyses were conducted using SAS version 9.3 (SAS Institute, Inc., Cary, NC).

RESULTS

CMEAPP-10 Validation

Factor analysis of the CMEAPP-10 items showed a two-dimensional model of measuring attitudes toward CME conference apps (TABLE 1). The first factor included seven items involving the educational value of the app. The second factor included three items involving app appeal and usability. Internal consistency reliability (Cronbach α) was 0.96 for factor 1 (app educational value), 0.95 for factor 2 (app appeal and usability), and 0.97 overall (all 10 items). Mean (SD) scores for individual items ranged from 3.53 (1.17) to 4.27 (1.02) on a 5-point scale; the overall mean (SD) score was 3.96 (0.92).

Participant Characteristics

Of the 602 conference attendees, 498 returned surveys (response rate, 82.7%), and of these, 466 provided demographic and app survey data. The app was used by 293 attendees (293/466 respondents [62.9%]) (TABLE 2). Conference app users were younger (P = .003), more commonly used professional or educational apps (P < .001), more commonly had downloaded a medical app previously (P < .001), and more commonly had purchased a medical app previously (P < .001). We observed no differences between conference app users and nonusers regarding gender, specialty, practice type, or practice location.

Associations Between CMEAPP-10 Scores and Conference App User Characteristics

The associations between CMEAPP-10 scores and app user characteristics are shown in TABLE 3. Among course participants who used the conference app, we noted an association between CMEAPP-10 scores and gender, with women scoring
higher than men (mean [SD], 4.06 [0.91] versus 3.85 [0.92]; P = .04). Additionally, mean (SD) CMEAPP-10 scores were significantly associated (P = .02) with previous app usage as follows: less than once per month, 3.73 (1.05); monthly, 3.41 (1.16); weekly, 4.03 (0.69); and daily or more, 4.06 (0.89). CMEAPP-10 scores were unrelated to participant age, specialty, practice type, practice location, previous downloading of a medical app, or previous purchase of a medical app.

DISCUSSION

To our knowledge, this is the first study to validate an instrument that measures attitudes toward the use of apps for CME and to investigate app usage among participants at a traditional, live CME course. We showed that app usage was higher among younger participants and those who had previously used apps for educational or professional purposes. Additionally, we showed that women and those who had previously used apps reported more favorable attitudes toward using the course app. These findings have important implications for CME course directors, especially with the increasing numbers of course participants who use electronic learning and are adept with mobile computing.

This study builds on previous literature describing the use of apps for learning in medical education.16,25,28–31 Our finding of conference app use being associated with younger age and previous experience with apps bears similarity to a study that showed that medical students in clinical clerkships and residents used apps

TABLE 1. CMEAPP-10 Mean Scores, Factor Loadings, and Internal Consistency Reliability

Item | Score, Mean (SD)* | Factor 1 Loading | Factor 2 Loading | Cronbach α
Factor 1: app educational value | | | | 0.957
Using the course app improved my learning experience | 3.94 (1.06) | 0.7503 | 0.5565 |
Using the course app helped me to stay more engaged | 3.95 (1.09) | 0.7565 | 0.5300 |
Using the course app enabled me to gain more knowledge | 3.79 (1.05) | 0.8509 | 0.3538 |
Using the app will help me apply what I have learned to clinical practice | 3.92 (1.04) | 0.7219 | 0.5009 |
Using the course app enhanced my education | 3.97 (1.03) | 0.7866 | 0.4936 |
I would be more likely to attend a CME course if it has an app | 3.53 (1.17) | 0.6506 | 0.3098 |
I am likely to use the app after the conference is over | 4.00 (1.08) | 0.6384 | 0.5469 |
Factor 2: app appeal and usability | | | | 0.953
The course app was easy to use | 4.14 (0.97) | 0.4344 | 0.8495 |
The course app was intuitive to use | 4.11 (1.00) | 0.4020 | 0.8559 |
I would recommend a similar app for other CME courses | 4.27 (1.02) | 0.5986 | 0.7037 |
Total score | 3.96 (0.92) | | | 0.968

*Items were scored using 5-point scales (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, and 5 = strongly agree). CMEAPP-10 indicates CME App Attitudes Survey Instrument; app, mobile device application.

TABLE 2. Demographic and Usage Characteristics of Mobile Conference App Users and Nonusers*

Characteristic | Used the App, No. (%) (n = 293) | Did Not Use the App, No. (%) (n = 173) | P
Demographics | | |
Male gender | 138 (48.1) | 89 (52.4) | .38
Age, y | | | .003
— 20–30 | 5 (1.7) | 4 (2.3) |
— 31–40 | 94 (32.4) | 32 (18.7) |
— 41–50 | 78 (26.9) | 34 (19.9) |
— 51–60 | 75 (25.9) | 60 (35.1) |
— ≥61 | 38 (13.1) | 41 (24.0) |
Specialty | | | .94
— Internal medicine | 121 (42.3) | 75 (44.1) |
— Family medicine | 117 (40.9) | 68 (40.0) |
— Medical specialty | 38 (13.3) | 20 (11.8) |
— Nonmedical specialty | 10 (3.5) | 7 (4.1) |
Practice type | | | .28
— Academic | 50 (17.8) | 21 (12.4) |
— Industry | 3 (1.1) | 4 (2.4) |
— Solo | 12 (4.3) | 12 (7.1) |
— Group | 196 (69.8) | 122 (72.2) |
— Other | 20 (7.1) | 10 (5.9) |
Practice location | | | .91
— US–Northeast | 11 (3.8) | 9 (5.3) |
— US–Southeast | 11 (3.8) | 8 (4.7) |
— US–Midwest | 136 (46.9) | 72 (42.6) |
— US–Southwest | 16 (5.5) | 9 (5.3) |
— US–West | 73 (25.2) | 47 (27.8) |
— Other | 43 (14.8) | 24 (14.2) |
App usage | | |
Previous professional or educational app usage | | | <.001
— Less than once per month | | |
— Monthly | | |
— Weekly | | |
— Daily or more | | |
Previously downloaded a medical app | | | <.001
Previously paid for a medical app | | | <.001
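As a worked illustration of the internal consistency analysis described in the Methods, the Cronbach coefficient α reported for the CMEAPP-10 (0.957 for factor 1, 0.953 for factor 2, 0.968 overall) can be computed as k/(k−1) × (1 − Σ item variances / total-score variance). The sketch below is not the authors' SAS code, and the Likert responses in it are hypothetical, chosen only to show the calculation.

```python
# Minimal sketch of Cronbach's coefficient alpha for a multi-item
# Likert instrument. Responses below are hypothetical, NOT study data.

def cronbach_alpha(items):
    """items: one list per survey item, each holding one 1-5 response
    per respondent (all lists the same length)."""
    k = len(items)            # number of items on the scale
    n = len(items[0])         # number of respondents

    def var(xs):              # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(item) for item in items)
    # Each respondent's total score across all items.
    totals = [sum(item[r] for item in items) for r in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Three hypothetical items answered by five respondents; responses that
# rise and fall together across respondents give a high alpha.
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 3, 4, 1],
]
print(round(cronbach_alpha(items), 3))  # -> 0.93
```

Items whose responses track one another (as the CMEAPP-10's factor 1 items evidently did) inflate the total-score variance relative to the summed item variances, which is what pushes α toward 1.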