Development of Survey Instruments to Guide the Design of Health Status Monitoring Systems for the Elderly: Content Validity Evaluation

Majd Alwan1, Beverely Turner1, Steve Kell1, Kim J. Penberthy2, Wendy Cohn3, and Robin Felder1
1 Medical Automation Research Center (MARC), Department of Pathology, 2 Department of Psychiatric Medicine, 3 Department of Health Evaluation Sciences, University of Virginia, PO Box 800403, Charlottesville VA 22908, USA
{alwan, bt2h, swk3f, jkp2n, wfc2r, rfelder}@virginia.edu

Abstract

The development of technologies for monitoring the health status of older adults in their living settings is a long and expensive process that involves multiple stakeholders. If the technology is to be mass produced, widely deployed, and utilized by different user groups, it has to meet certain feasibility criteria, including being acceptable, useful, and potentially beneficial to all the different users. This paper describes the development, content, and content validity evaluation results of three custom survey instruments designed to assess the feasibility of using in-home health status monitoring technologies. The instruments were designed to solicit input from three primary stakeholders in the care process: older adult monitoring candidates, professional caregivers, and informal caregivers. The validity of the instruments' content was evaluated by ten field experts, who were asked to score each question in each of the three survey instruments on a 4-point Likert scale for relevance, clarity, and simplicity. All three survey instruments received high overall mean content validity scores, significantly above the midpoint of the scale. The content validity evaluation results indicate that these instruments are ready to be used to solicit user requirements from the three user groups to guide the refinement of monitoring technologies.
1. Introduction

Nearly three quarters of older adults suffer from one or more chronic diseases and require some degree of formal and/or informal care due to loss of function, according to the Centers for Disease Control and Prevention (CDC). Older adults currently account for 60% of overall healthcare spending in the US, a share that is expected to increase as the older adult population expands to 108 million (45% of the U.S. adult population) over the next 15 years [1]. The cost and burden of appropriate care for older Americans is steadily increasing. The issue is further exacerbated by the decrease in the number of
professionals specializing in providing care for the geriatric population, including physicians, nurses, and nurse aides.

However, recent advances in sensor, communication, and information technologies have created opportunities to develop novel tools enabling remote diagnosis and management of chronic diseases and emergency conditions, and the delivery of care. In-home monitoring has the added benefit of evaluating individualized health status and reporting it to care providers and caregivers alike, allowing timelier and individually targeted preventive interventions [2]. Hence, in-home health status monitoring may be one of the key solutions to the problem of care delivery to the world's growing elder population, and it will provide hospitals and health systems, as well as eldercare service providers, with a unique opportunity to extend the delivery of health care into the community through their home health divisions. This opportunity has stimulated significant research and product development by academic institutions as well as technology companies in the recent past.

Over the past few years, academia and industry alike have conducted studies to understand older adults' attitudes towards and interest in technologies that could allow them to receive better care in their homes. Such studies inform emerging design solutions that take the users' requirements into account, thus encouraging adoption of the resulting solutions. Examples include research conducted at the University of Missouri that explored the perceptions and expectations of seniors regarding home health status monitoring technology through audio-recorded focus groups [3]. Another example is a study conducted by the Center for Aging Services Technology (CAST), a consortium of eldercare providers, academia, and technology providers. That study gauged interest in the use of technology for the delivery of health and care services among 76 "Baby Boomer" adults between the ages of 50 and 65 years; willingness to pay for the monitoring service out of pocket was also a point of interest in this study, which likewise employed a videotaped focus group methodology [4]. However, most of these studies focus on one user group
(older adults in the above two examples) rather than attempting to understand the needs and concerns of all affected user groups. One exception was a study conducted in the UK involving both older adults and care workers to solicit user requirements for an Interactive Domestic Alarm System (IDAS) aimed at enabling older adults who adopt and use the system to live independently in their own homes for longer than those who do not. This study revealed difficulties encountered when using the focus group method with older adults, and highlighted the importance of educating the older adults in preparation for the focus group [5].

Unlike survey instruments, focus groups are used for gathering qualitative data through group interaction guided by a moderator [6]. Accordingly, focus groups are not suitable for soliciting user requirements from larger groups of potential users. Moreover, the outcome of focus groups is difficult to quantify. Thus, the survey instrument methodology was favored over focus groups.

The ability of any technology to proliferate in society relies directly on its acceptance, its utility to the different users, and the benefits it brings to all the user groups, or stakeholders, and to society in general through its consistent use. If a health, wellness, or safety monitoring technology is to be mass produced, widely deployed, and utilized by different user groups, it has to meet certain feasibility criteria, including being acceptable to both older adults and their informal caregivers (family members and/or friends who provide assistance or care to older adults), useful to informal caregivers as well as professional caregivers (nurses, nurse aides, home health nurses, social workers, service coordinators, and eldercare facility managers), and potentially beneficial to older adults as well as their informal and professional caregivers.

In this paper, we describe the history, development, content, and content validity evaluation of survey instruments designed to gauge the feasibility, utility, and acceptance of technology being developed to monitor selected activities of older adults in their preferred living setting. Moreover, we describe the general categories and nature of the questions incorporated in the survey instruments, and the rationale for including them. The surveys will be used to elicit user requirements from three distinct user groups significantly invested in the care process, taking a broader stakeholders' approach to guide the design of health status monitoring systems for older adults.
2. Method

2.1. Survey Design

Through previous research conducted by the Medical Automation Research Center and its collaborators, various methods of user requirement solicitation were attempted, including focus groups [7]. A custom monitoring survey, targeting older adults and adult children of cognitively impaired older adults, was designed to gauge opinions about, acceptance of, and concerns regarding monitoring technologies. The categorical-response survey was designed by a psychologist and was pilot tested with cognitively intact residents and adult children of memory care residents in an assisted living facility [8]; its content validity and reliability were not formally evaluated. This original survey instrument was adapted for the three user groups of interest: older adult monitoring candidates, professional caregivers, and informal caregivers. The surveys were later reviewed for content relevance, completeness, clarity, and cross-cultural readability of the questions.
Figure 1. A Venn Diagram Representing Activities that should be Monitored (circles: Acceptable and Useful Activities for Older Adults, Acceptable and Useful Activities for Informal Caregivers, and Useful Activities for Professional Caregivers).

The survey instruments start by asking the respondents whether they believe that continuously monitoring activities and health status in the residence is a good idea or not and, if not, to identify why they think it is not, thus identifying potential barriers to embracing the proposed technology. The respondents are then asked to identify activities and health status parameters that are acceptable and useful to monitor. The list included Pulse, Breathing Rate, Body Temperature, Restlessness while in bed, Falls, Bathing/Showering, Meal Preparation, Movement around the residence, Stove/Microwave use, Walking Abilities and Balance, Taking Medications, Other Activities of Daily Living (ADLs)/Instrumental Activities of Daily Living (IADLs), and space for other activities/health parameters to be identified by the respondents.
The intersection between the activities and health status parameters that are acceptable and useful to both older adults and informal caregivers, and those that are useful to professional caregivers, answers the question: What activities should be monitored? The answer lies within the intersection of all the circles presented in Figure 1, which illustrates the concept using a Venn diagram.

The following portion of the surveys asks respondents to rate the acceptability of various technologies (including cameras) that could be exploited to monitor these parameters. The list of technologies included Cameras that record activities, Cameras that produce body outline images without details, Sound Recorders, Motion Detectors, In-Home Fall Detectors, Wearable Fall Detectors, Automatic Emergency Response System/Device, Wireless object use detectors (e.g., wirelessly tagged dishes), and space for other technologies to be identified by the respondents. In this manner the survey identifies acceptable technologies that may be used to monitor the identified activities.

The survey instruments also ask the respondents to identify those parties with whom they would share their information. The list included Adult children of the Older Adult, Older Adult's spouse/significant other, Older Adult's sibling (brother/sister), Any other relative of the Older Adult, Friend, Neighbor, Volunteer Informal Caregiver, Health Care Provider, 911 Emergency Services, Home Health Agency, Hospital, Medicare agency, Medicaid agency, Health Insurance company, The Older Adult, and Others to be identified by the respondent.

The instruments were also designed to gauge the willingness of older adult monitoring candidates and informal caregivers to pay directly for the technology and, further, the acceptable range of such monthly monitoring fees. This is important because the technology may not be embraced if it is not appropriately priced, even if it is otherwise both acceptable and useful.

The surveys ask the respondents to identify potential benefits of in-home monitoring to all three user groups. Additionally, the respondents are asked whether they personally believe that they would see specific benefits arising from implemented in-home monitoring technologies. The surveys further ask the respondents to identify the conditions for accepting or recommending such technologies. The conditions included: if the system or monitoring was provided at no cost to the Older Adult or family; if the Older Adult could select what is to be monitored (e.g., select from bed, gait, fall, and/or ADL monitors); and if the Older Adult could control when they are monitored (e.g., turn all or part of the monitoring system on/off as they desire).

Finally, each cohort's survey also includes demographics questions; these are intended to reveal any cultural, ethnic, and racial differences in attitudes towards monitoring technologies, as well as the influence of education on such attitudes.
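To make the intersection concept of Figure 1 concrete, the following sketch combines group-level responses to identify activities that fall within all of the circles. The activity names are taken from the survey list above; the response sets themselves are hypothetical illustrations, not collected study data.

```python
# Sketch: activities that should be monitored are those in the intersection of
# what is acceptable/useful to each stakeholder group (see Figure 1).
# The response sets below are hypothetical examples, not study data.

acceptable_older_adults = {"Falls", "Taking Medications", "Pulse", "Movement around the residence"}
useful_older_adults = {"Falls", "Taking Medications", "Walking Abilities and Balance", "Pulse"}
acceptable_informal = {"Falls", "Taking Medications", "Stove/Microwave use", "Pulse"}
useful_informal = {"Falls", "Taking Medications", "Movement around the residence", "Pulse"}
useful_professional = {"Falls", "Taking Medications", "Pulse", "Breathing Rate"}

# Activities acceptable and useful to older adults and informal caregivers,
# and useful to professional caregivers.
should_monitor = (acceptable_older_adults & useful_older_adults
                  & acceptable_informal & useful_informal
                  & useful_professional)

print(sorted(should_monitor))  # ['Falls', 'Pulse', 'Taking Medications']
```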
A team of technology designers, a psychologist, and an expert in survey development initially designed the survey instruments, and the research team iteratively refined them. Attention was given to soliciting valid responses through the careful selection of the vocabulary, phrasing, and tone of all questions so as to reduce the potential to bias the responses unintentionally. Moreover, a short introduction was prepared that provides a brief overview of the concepts related to the survey (monitoring technology) and the definitions and roles of the stakeholders; the introduction was included at the beginning of the survey instruments. Finally, a brief explanation of the objectives of the study and of what was specifically asked of participants was included. Hence, the instruments could be flexibly employed either as self-administered surveys or administered by staff or research personnel.

A survey design and evaluation expert independent of the development team further reviewed the initial instruments. The development team then revised the instruments based on the comments received from the independent expert and developed the content validity evaluation methodology and forms. A short introduction was written to provide the instrument evaluators with a brief education on the monitoring technology concepts, the definitions and roles of the target stakeholders, a brief explanation of the objectives of the study, and what was specifically asked of the evaluators.
2.2. Content Validity Evaluation

2.2.1. Evaluators. Ten eldercare field experts were asked to evaluate the content validity of the revised survey instruments; these experts included professors in the field of geriatric nursing, graduate nursing students, home health nurses, social workers, and managers of eldercare facilities (including assisted living facilities and home health agencies).

2.2.2. Evaluation Method. The experts were provided with information about the aim of the instruments and their evaluation and were asked to score every question in each of the three survey instruments on a 4-point Likert scale from 1 to 4 on each of the following evaluation qualities: relevance, clarity, and simplicity. The evaluation forms provided a field for each question where the evaluator could write in comments or suggest changes.

2.2.3. Statistics. Descriptive statistics, including means and variances, were used in this study. In addition, the one-sample t-test and the Wilcoxon rank sum test were used to compare the evaluation scores for the instruments to 2.5, the middle of the score range.
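As an illustration of the analysis just described, the sketch below scores a hypothetical set of 4-point Likert ratings and compares the resulting means to the scale midpoint of 2.5. The one-sample t-test mirrors the test named above; for the per-evaluator comparison the paper names a Wilcoxon rank sum test, and the sketch substitutes the one-sample (signed-rank) form of the Wilcoxon test, a common choice when comparing a single sample against a constant. The ratings array, its dimensions, and the random seed are assumptions made for illustration only.

```python
# Sketch of the content validity analysis: ratings are hypothetical, arranged as
# (evaluators x questions x qualities), each on a 1-4 Likert scale.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ratings = rng.choice([3, 4], size=(10, 24, 3), p=[0.2, 0.8]).astype(float)

MIDPOINT = 2.5  # middle of the 1-4 score range

# Overall instrument score: mean rating per question across evaluators and qualities.
question_means = ratings.mean(axis=(0, 2))
print(f"mean={question_means.mean():.3f}, SD={question_means.std(ddof=1):.3f}, "
      f"min={question_means.min():.3f}, max={question_means.max():.3f}, "
      f"median={np.median(question_means):.3f}")

# One-sample t-test of the question-level means against the midpoint.
t_stat, p_val = stats.ttest_1samp(question_means, MIDPOINT)
print(f"t={t_stat:.2f}, p={p_val:.2e}")

# Per-evaluator mean scores compared to the midpoint with a one-sample
# Wilcoxon (signed-rank) test, used when scores are not normally distributed.
evaluator_means = ratings.mean(axis=(1, 2))
w_stat, w_p = stats.wilcoxon(evaluator_means - MIDPOINT)
print(f"Wilcoxon W={w_stat:.1f}, p={w_p:.4f}")
```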
3. Results

The older adults' survey received an overall mean content validity score of 3.775 (SD = 0.34, min = 2.875, max = 4, median = 3.907) across all three evaluation qualities; the mean score was significantly different from the mid-range score of 2.5 (p < 0.0001, two-tailed, one-sample t-test). Similarly, the informal caregivers' survey received an overall mean content validity score of 3.898 (SD = 0.17, min = 3.45, max = 4, median = 3.975) across all three evaluation qualities; the mean score was significantly different from the mid-range score of 2.5 (p < 0.0001, two-tailed, one-sample t-test). The professional caregivers' survey received an overall mean content validity score of 3.942 (SD = 0.102, min = 3.667, max = 4, median = 3.978) across all three evaluation qualities; the mean score was significantly different from the mid-range score of 2.5 (p < 0.0001, two-tailed, one-sample t-test). Finally, the overall mean score from each evaluator across all three evaluation qualities was significantly different from the mid-range score of 2.5 for all three survey instruments (maximum p = 0.0003, two-tailed, Wilcoxon rank sum test). This nonparametric test was applied because the individual evaluators' scores did not have Gaussian distributions.

All the suggestions and comments of the evaluating field experts were incorporated into the finalized survey instruments, which can be obtained from the authors. The final survey instruments contained 24, 30, and 22 questions (older adult, informal caregiver, and professional caregiver versions, respectively), including questions about the respondent's demographics and living and caregiving setting.
4. Conclusions and Future Work

The content validity evaluation results, even before taking the evaluators' comments and suggested changes into consideration, indicate that the surveys' questions were highly relevant, clear, and simple. The suggestions from the eldercare expert evaluators allowed the team to improve the instruments further. The content validity evaluation results indicate that these instruments are ready to be used to solicit user requirements from the three user groups to guide the refinement of monitoring technologies.

These survey instruments will be administered twice to 25 older adults, 25 informal caregivers, and 25 professional caregivers in the near future. One week between the two administration rounds will allow for the assessment of the test-retest reliability of the survey instruments. Responses will be analyzed to evaluate the test-retest reliability of the instruments and to guide enhancements of MARC's In-Home Monitoring System and future development of related in-home monitoring technologies. The ultimate aim is to use the survey results not only to guide the design of monitoring technologies that are acceptable to older adults and informal caregivers, useful to informal and professional caregivers, and beneficial to all user groups, but also to identify favorable operational conditions, including system privacy features, with whom to share the monitoring information, and how to price the monitoring service.
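The planned test-retest analysis is not specified in detail above; the following minimal sketch shows one common way such an analysis could look, assuming numerically coded responses from the two administration rounds one week apart. The data, the use of Pearson correlation, and the exact-agreement measure are illustrative assumptions, not the authors' stated analysis plan.

```python
# Sketch of a possible test-retest reliability check, assuming each respondent's
# answers from the two administrations are coded numerically.
# The response vectors below are hypothetical examples.
import numpy as np
from scipy import stats

round_1 = np.array([4, 3, 4, 2, 4, 3, 4, 4, 3, 4], dtype=float)
round_2 = np.array([4, 3, 3, 2, 4, 3, 4, 4, 3, 4], dtype=float)

# Pearson correlation between the two rounds as a simple stability estimate.
r, p = stats.pearsonr(round_1, round_2)
print(f"test-retest r={r:.2f} (p={p:.3f})")

# Percent exact agreement is often reported alongside for categorical items.
agreement = np.mean(round_1 == round_2)
print(f"exact agreement: {agreement:.0%}")
```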
5. Acknowledgements

This research was funded in part by a grant from the National Institutes of Health (NIH), National Institute on Aging (NIA). The authors would like to thank Wendy Novicoff, the survey design and evaluation expert, for reviewing the survey instruments and providing constructive comments, and the Department of Health Evaluation Sciences at the University of Virginia Health System. The authors would also like to express their gratitude to all the field expert evaluators who graciously contributed their time, effort, and domain knowledge to evaluate the content of the instruments and provide suggested changes, and to their departments/organizations. The authors are grateful to the Department of Acute & Specialty Care, University of Virginia School of Nursing, the University of Virginia Medical Center, the Jefferson Area Board for the Aging (JABA), Volunteers of America National Services, the Jewish Council on Aging, and the Evangelical Lutheran Good Samaritan Society and their staff for participating in this effort.
6. References

[1] "A Profile of Older Americans: 2004", Administration on Aging, available online at: http://www.aoa.gov/prof/Statistics/profile/2004/2004profile.pdf, accessed March 3, 2006.
[2] Celler BG, Lovell NH, Chan DK. "The Potential Impact of Home Telecare on Clinical Practice", MJA, 1999; 171:518-521.
[3] Demiris G, Rantz M, Aud M, Marek K, Tyrer H, Skubic M, Hussam A. "Older adults' attitudes towards and perceptions of 'smart home' technologies: a pilot study", Med Inform Internet Med, 2004; 29(2):87-94.
[4] Higgins H, Miner-Olson K, et al. "'Baby Boomer' Interest in the Use of Technology for the Delivery of Aging Services and Healthcare", available online at: http://www.agingtech.org/documents/NR_exec_summary.pdf, accessed March 3, 2006.
[5] Lines L, Hone KS. "Eliciting user requirements with older adults: lessons from the design of an interactive domestic alarm system", Journal of Universal Access in the Information Society, Vol. 3, No. 2, June 2004, pp. 141-148.
[6] Morgan DL. "Focus Groups", Annual Review of Sociology, 1996; Vol. 22: 129-152.
[7] Williges R, Smith-Jackson TL, Kwahk J, Ryu YS, Narayan SJ. "Adaptive Telemedical Support System Used with Smart House Technology", Research Report Submitted to the Medical Automation Research Center, June 2002.
[8] Alwan M, Dalal S, Mack D, Kell S, Turner B, Leachtenauer J, Felder R. "Impact of Monitoring Technology in Assisted Living: Outcome Pilot", IEEE Transactions on Information Technology in Biomedicine, Vol. 10, No. 1, Jan 2006, pp. 192-198.