Integration of Wireless Wearable Sensors and Mobile Computing with Cloud-based Service for Patient Rehabilitation Monitoring

Albert Causo, Lili Liu, Ganesh Krishnasamy, Song Huat Yeo and I-Ming Chen
School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore
[email protected], [email protected], [email protected], [email protected], [email protected]

Abstract—In this paper, we propose a novel system that uses wearable wireless sensors and handheld devices for body motion tracking. The sensors are inertial measurement units (IMUs), and the handheld devices are smartphones and tablets. The system enables a stroke patient to perform rehabilitation exercises at home, eliminating frequent visits to the hospital. Furthermore, healthcare providers such as therapists and doctors are able to keep track of their patients’ progress by remotely monitoring their rehabilitation activities. Cloud services are integrated into our system to allow remote monitoring and to replace the expensive hardware and software licenses needed for data storage. Initial user tests show the feasibility of the system.

Keywords: remote rehabilitation monitoring, telerehabilitation, wireless body area network, inertial measurement unit, motion capture, cloud computing

I. INTRODUCTION

Stroke remains one of the major diseases and is Singapore’s fourth leading cause of death [1]. After a stroke, patients usually develop impairments in cognitive function, language perception and motor function. Studies have shown that these impairments can be improved with rehabilitation therapy through intensive practice [2]. Rehabilitation usually takes place in specialized facilities such as the rehabilitation center or stroke unit of a hospital. These centers are well equipped with therapeutic equipment and well-trained medical staff for the patient’s speedy recovery from these impairments. However, many patients are unable to travel frequently to these centers due to physical and financial factors. Remote rehabilitation using integrated wireless wearable sensors and mobile computing addresses this issue. A Wireless Body Area Network (WBAN) allows vital bio-signal data such as ECG, body temperature and human motion to be gathered, processed and sent wirelessly to handheld devices, creating ubiquitous healthcare services. Generally, a WBAN comprises wearable devices attached to or implanted in the body that are capable of establishing wireless communication [3]. By integrating these wearable sensors with mobile computing devices such as smartphones and tablets, it is

Fig. 1. WBAN Architecture for remote monitoring [3]

possible to remotely monitor a person’s physiological indicators such as health status and motion activities, as shown in Fig. 1. By utilizing such a system, therapy or medical consultation can be delivered to patients who cannot travel to the hospital frequently due to their disabilities. According to [4], the cloud is a large pool of easily usable and accessible virtualized resources such as hardware, development platforms and/or services. These resources can be dynamically re-configured to adjust to a variable load (scale), allowing for optimum resource utilization. Cloud computing services provide virtual data storage for hospitals, which can replace existing expensive server hardware, software and maintenance, lowering the implementation cost of a remote monitoring system. This also translates into cost savings for the patients. Data can be stored in the database and accessed remotely by medical staff, enabling ubiquitous healthcare services. For example, in [5], vital patient bio-signals such as ECG, blood sugar level and blood pressure are collected by wearable sensors and transferred via WLAN or a cellular network to a medical cloud for storage, processing and distribution. The stored data can be accessed and retrieved from the cloud by medical staff using mobile computing devices and desktop workstations. In addition, if any abnormality occurs during monitoring, an alert message is sent to the medical staff through SMS or email, prompting quick action that could save the patient’s life. In [6, 7, 8] the authors proposed a

Fig. 2. System architecture overview

wireless sensor network integrated with cloud computing to monitor human health and activities and to share the information among doctors, caretakers, clinics and pharmacies. Cloud services are also exploited in [9] and [10], where sleep activity patterns of patients, captured by wearable sensors, were stored on cloud servers to be processed offline later. The authors of [11] proposed using cloud computing and mobile computing to manage healthcare information such as patient medical records and images. The HipGuard system was developed to monitor patient movement during recovery from hip replacement surgery [12]. Patients must follow the strict motion and load limits defined by the surgeon after the surgery. The system gives a warning through an audio signal or haptic feedback whenever the posture or the load put on the operated hip approaches the predefined limits. It has seven wireless sensor nodes attached to a pair of tight-fitting pants to measure the orientation of the hip and the legs. In addition, a capacitive load sensor in the insole of a shoe measures the load on a leg. In [13], a Wii controller was used to provide an interactive and simple way for the patient to do rehabilitation exercises. That system was designed so that patients can perform upper limb rehabilitation exercises and monitor their performance through feedback from a software interface on a laptop. Furthermore, therapists are also able to remotely monitor the progress and performance of their patients by using the software as a web service through the Internet. The Bio-WWS system was developed by a team of researchers from the Universita di Bologna for rehabilitation bio-feedback [14]. This system helps users correct their posture in both static and dynamic conditions. Accelerometer-based sensors are placed on the trunk and both legs to capture the posture of the body. The data are then transmitted wirelessly to mobile devices for processing.
After processing, the mobile devices provide audio feedback to the user for posture correction. This paper presents a novel system that uses wearable wireless sensors and handheld devices for body motion tracking. There are three main components to the system: the hardware, the database and the cloud server. The sensor, an Inertial Measurement Unit (IMU), has nine integrated sensors to give orientation data and uses Bluetooth for connection. The mobile device can be either a smartphone or a tablet running the Android operating system, and it connects to the cloud using TCP/IP and HTTP. The cloud server is configured to

Fig. 3. Hardware component of the system

communicate with and accept data from the mobile device and store it in a database. Fig. 2 shows the overall system architecture. This system can be applied to the upper limb rehabilitation of stroke patients. Patients wear the IMU sensors on their arm. A mobile device with an application (app) installed gathers 3D arm motion and connects to the cloud server. Using the app, patients can choose to do an exercise, send the exercise data to the cloud server, or download and review previously uploaded data on their mobile devices. On the other end, therapists and doctors can use their own mobile devices with the installed app to connect to the cloud server and download and analyze the uploaded exercise data of their patients. Our proposed system is highly mobile and can be used anywhere and anytime, since it only requires small, light IMUs, a smartphone and a wireless connection (cellular network or the Internet). Furthermore, our system utilizes cloud computing services to replace expensive servers at hospitals.

II. SYSTEM ARCHITECTURE

A. Hardware Component

Our system consists of two IMUs and a smartphone that runs the Android operating system, as shown in Fig. 3. The two IMUs are worn on the user’s upper and lower arm and are connected to the smartphone through the Bluetooth SPP (RFCOMM) protocol. The IMU data transferred to the smartphone are then mapped onto a rigid body model and rendered using OpenGL ES to display body motion in real time. The IMUs are InertiaCube BT sensors (InterSense Inc.). Each IMU combines nine discrete MEMS sensing elements with advanced Kalman filtering algorithms to produce 3-DOF yaw, pitch and roll measurements. The complete sensor package measures 60 mm x 54 mm x 32 mm and weighs 67 grams including the battery. InterSenser, a server application provided by InterSense Inc., connects to and processes data from up to two InertiaCube BT sensors. Data is provided to the client application over a local socket connection.
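Mapping the IMU readings onto the rigid body model amounts to converting each sensor’s yaw, pitch and roll into a rotation and applying it to a “bone” vector for that arm segment. The following is a minimal sketch of that step in Python, assuming a ZYX Euler convention; the InertiaCube BT’s actual angle convention and axes should be checked against its documentation, and the function names here are illustrative, not from the system’s code.

```python
import math

def euler_zyx_to_matrix(yaw, pitch, roll):
    """Rotation matrix from yaw (about Z), pitch (about Y), roll (about X),
    all in radians. The ZYX order is an assumption for illustration."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def rotate(matrix, v):
    """Apply a 3x3 rotation matrix to a 3-vector."""
    return [sum(matrix[i][j] * v[j] for j in range(3)) for i in range(3)]

# Each arm segment is modelled as a unit "bone" vector along +X;
# the IMU orientation rotates it into world coordinates for rendering.
bone = [1.0, 0.0, 0.0]
upper_arm_dir = rotate(euler_zyx_to_matrix(math.radians(90), 0.0, 0.0), bone)
# A pure 90-degree yaw turns the +X bone toward +Y.
```

The lower arm segment would be handled the same way with the second IMU’s angles, with the segment anchored at the rendered elbow position.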


Fig. 4. Calibration of the sensors before the measuring activities

The IMUs need to be calibrated before starting any measurement. To calibrate, the patient faces north and clicks Reset Heading in the InterSenser application, as shown in Fig. 4.

B. Cloud Server

The cloud server, a virtual server, is hosted by a commercial provider. The server runs the CentOS operating system, Apache, MySQL, PHP and Python. For the initial testing of the system, a different set-up was used: the server ran the Microsoft Windows 7 operating system, Java and MS SQL. In our system, the server set-up is easily convertible between the Windows and Linux-based operating systems. The mobile devices connect to the cloud server via the HTTP protocol. As shown in Fig. 2, the cloud service provides storage and processing for the collected data. Data sent by patient devices is accessible to therapists using their own devices. The data collected from the sensors is formatted as a *.json file and saved on both the smartphone and the cloud server. This data format is well supported in Java and by the Android OS.

C. Database

An MS SQL server is used to organize, store and manage the data collected from patients, which must be available when requested, up-to-date and easily retrievable. Fig. 5 shows the relational model for the database. The database has tables for User Type, User Account Relationship, Doctor Patient Relationship and Patient Exercise Result. The links between these tables represent the relationships between the data. A relationship works by matching data in key fields. In most cases, these matching fields are the primary key of one table, which provides a unique identifier for each record, and a foreign key in the other table. Adding a column or columns that hold one table’s primary key values to the other table creates the link. For example, patients and doctors are identified and linked by their User IDs.
The constraint enforces referential integrity by ensuring that changes cannot be made to data in the primary key table if those changes would invalidate the link to data in the foreign key table.
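The primary-key/foreign-key linkage described above can be sketched concretely. The following uses SQLite as a lightweight stand-in for the MS SQL server; the table and column names are illustrative and not the paper’s actual schema, but the referential-integrity behavior is the same idea: the database rejects a relationship row that points at a non-existent user.

```python
import sqlite3

# In-memory SQLite stand-in for the MS SQL database; names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enable referential integrity
conn.executescript("""
CREATE TABLE user_account (
    user_id   INTEGER PRIMARY KEY,   -- unique identifier for each record
    user_name TEXT NOT NULL,
    user_type TEXT NOT NULL          -- e.g. 'patient' or 'doctor'
);
CREATE TABLE doctor_patient (
    doctor_id  INTEGER NOT NULL REFERENCES user_account(user_id),
    patient_id INTEGER NOT NULL REFERENCES user_account(user_id)
);
CREATE TABLE exercise_result (
    result_id   INTEGER PRIMARY KEY,
    patient_id  INTEGER NOT NULL REFERENCES user_account(user_id),
    recorded_at TEXT NOT NULL,
    data_file   TEXT NOT NULL        -- name of the uploaded *.json file
);
""")
conn.execute("INSERT INTO user_account VALUES (1, 'Dr. Tan', 'doctor')")
conn.execute("INSERT INTO user_account VALUES (2, 'Patient A', 'patient')")
conn.execute("INSERT INTO doctor_patient VALUES (1, 2)")

# The foreign-key constraint rejects a link to a non-existent user ID.
try:
    conn.execute("INSERT INTO doctor_patient VALUES (1, 99)")
    fk_enforced = False
except sqlite3.IntegrityError:
    fk_enforced = True
```

In the real schema the same mechanism links patients to doctors and exercise results to patients through their User IDs.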

III. CLIENT – SERVER COMMUNICATION PROTOCOL

In our system, the Java language and Android OS are used for the client development, while the C++ language and the Windows platform are used for the server and database; in the final version of the system, a Linux-flavored OS (CentOS) is used. To cater to the communication requirements between these different platforms, we chose Protocol Buffers as the communication protocol and file format. All sensor frames are wrapped in one sensor frame list; the client continues to receive full sensor frames one by one until the sensor frame list is empty. As shown in Fig. 6, the client sends a RequestLogin message to connect to the server. The server then replies with a ResponseLogin acknowledgement packet to set up the link. Once the link is ready, the client requests the sensor data by sending a RequestSensor message, which consists of a message ID, message length and message content. After the server has received and parsed the message, it sends the corresponding SensorFileData message to the client. Fig. 7 depicts the structure of the message packets. Fig. 7(a) and (b) show the packet structures for establishing the client-server connection. The UserAccount field contains the user name and password of the client, while LoginResult contains the result of the connection, i.e., whether the login was successful or unsuccessful. As for the RequestSensorFile packet, the FileName field identifies the file requested by the client. The SensorFrameList field in the SendSensorFile packet contains the roll, pitch and yaw orientation measurements.
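The ID/length/content layout of a message can be sketched as a simple framing routine. The actual system serializes payloads with Protocol Buffers; the sketch below only illustrates the framing idea in Python, and the numeric message IDs are hypothetical.

```python
import struct

# Hypothetical message IDs; the real system's Protocol Buffers schema
# defines its own message types.
MSG_REQUEST_LOGIN = 1
MSG_REQUEST_SENSOR = 3

def pack_message(msg_id, content):
    """Frame a message as [id: 1 byte][length: 4 bytes, big-endian][content]."""
    return struct.pack(">BI", msg_id, len(content)) + content

def unpack_message(frame):
    """Parse one frame back into (msg_id, content)."""
    msg_id, length = struct.unpack(">BI", frame[:5])
    return msg_id, frame[5:5 + length]

# The client would frame a sensor-file request like this:
frame = pack_message(MSG_REQUEST_SENSOR, b"exercise_record.json")
msg_id, content = unpack_message(frame)
```

The server parses the ID first, then reads exactly `length` bytes of content, which is what lets it stream sensor frames one by one until the list is exhausted.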

IV. IMPLEMENTATION

Our proposed system supports two applications: the Patient Client App and the Doctor Client App. The Patient Client App allows patients to monitor their rehabilitation exercises in real time on the smartphone, send the exercise data to the cloud server and retrieve exercise data from the cloud server. The Doctor Client App allows the doctor to receive a notification whenever new patient data is uploaded to the cloud server. The doctor can retrieve the patient’s exercise data from the cloud using the same app.

A. Patient Client App

There are three application modes in the Patient Client App, as shown in Fig. 8:
• Exercise Only
• Exercise and Send to Server
• Retrieve record from the Server

If the patient chooses the “Exercise Only” mode, the sensors first need to be connected to the Patient Client application. After that, the sensor manager reads the motion data from the sensors, processes it, and displays it in 3D on the smartphone, as depicted in Fig. 9(a).
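The exercise data exchanged in these modes is stored as a *.json file, as described earlier. The sketch below shows one plausible shape for such a record; every field name here is an assumption for illustration, since the paper does not publish its actual JSON schema.

```python
import json

# Illustrative shape of one recorded exercise session; all field names
# are assumptions, not the system's actual schema.
record = {
    "patient_id": 2,
    "exercise": "arm_raise",
    "timestamp": "2013-05-01T10:30:00",
    "frames": [
        # one yaw/pitch/roll triple per IMU per sample
        {"upper_arm": [10.0, 5.0, 0.0], "lower_arm": [25.0, 8.0, 1.0]},
        {"upper_arm": [12.0, 6.0, 0.0], "lower_arm": [30.0, 9.0, 1.0]},
    ],
}

# Serialize for upload, then parse the file as the server (or a
# reviewing device) would before playback.
payload = json.dumps(record)
restored = json.loads(payload)
```

Because JSON round-trips losslessly through text, the same file can be saved on the smartphone, uploaded to the cloud server and replayed later on any client.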

Fig. 5. The relational model of the proposed database

 

Fig. 7. (a) RequestLogin packet; (b) ResponseLogin packet; (c) RequestSensorFile packet; (d) SensorFileData packet

Fig. 6. Protocol of the client retrieving a data file from the server

In the “Exercise and Send to Server” mode, the patient needs to log in and get access permission through the Internet, as illustrated in Fig. 9(b). If the login is successful, the sensors are connected to the Patient Client application, and the sensor manager reads the motion data, processes it and displays it on the smartphone’s screen. Simultaneously, the sensor manager sends the captured data to the cloud server for storage.

The “Retrieve record from the Server” mode enables the patient to download recorded motion data from the cloud server and play back the motion on the smartphone. In this mode, the user is also required to log in to the cloud server. The user may then choose the file that the sensor manager will retrieve from the cloud server and play back on the smartphone’s screen using the 3D arm model. An error message prompts the user whenever playback is attempted without a file being chosen.

B. Doctor Client App

The Doctor Client App has the same interface and functions as the Patient Client App. The main difference between them is that the Doctor Client App allows the doctor to receive a push notification in real time whenever new patient data is uploaded to the cloud server. For example, as shown in Fig. 10(b), the number “8” indicates that eight new data files have been uploaded to the cloud server. In the Doctor Client App, the therapist or doctor has two ways to retrieve a patient’s exercise data. In the first method, whenever there is a push notification, the doctor clicks on the Doctor Client widget on the smartphone’s screen to retrieve the new data file. The doctor is then able to play back the captured exercise motion on the smartphone. In the second method, the doctor can choose a particular patient by specifying the patient’s name and a time range, a start time followed by an end time, as illustrated in Fig. 10(a).
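The second retrieval method is essentially a filter over the uploaded records by patient and time window. The real system performs this selection in the SQL database; the in-memory Python sketch below shows the same logic, with all record fields and names hypothetical.

```python
from datetime import datetime

# In-memory stand-in for the uploads table; in the real system this
# query runs against the cloud server's SQL database.
uploads = [
    {"patient": "Patient A", "recorded_at": "2013-05-01T10:30:00"},
    {"patient": "Patient A", "recorded_at": "2013-05-03T09:00:00"},
    {"patient": "Patient B", "recorded_at": "2013-05-02T14:00:00"},
]

def query_uploads(records, patient, start, end):
    """Return a patient's uploads with start <= recorded_at <= end."""
    lo, hi = datetime.fromisoformat(start), datetime.fromisoformat(end)
    return [r for r in records
            if r["patient"] == patient
            and lo <= datetime.fromisoformat(r["recorded_at"]) <= hi]

# The doctor asks for Patient A's uploads on 1 May only.
hits = query_uploads(uploads, "Patient A",
                     "2013-05-01T00:00:00", "2013-05-02T00:00:00")
```

Only the first of Patient A’s two uploads falls inside the window, so a single record is returned.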

V. EVALUATION

To demonstrate the feasibility of the system, a user was asked to wear the two IMUs on the upper and lower arm and to perform certain postures. We chose the “Exercise and Send to Server” mode, as it allowed us to test the functionality of the system comprehensively, from capturing the motion and rendering the 3D animation on the screen of the phone to sending the captured motion data to the cloud server in real time. Fig. 11 shows the comparison between the actual user postures and the motion data rendered in 3D animation on the smartphone. The figure shows that the actual arm motion can be accurately mapped by our system in the 3D rendering. In addition, the Doctor Client App received a push notification when the new posture data was uploaded to the cloud server. Furthermore, using the Doctor Client App, the user was able to play back the uploaded arm motion data on the smartphone.

VI. CONCLUSION

Our proposed system integrates wearable wireless sensors, handheld computing devices and cloud computing to deliver a new, interactive method of arm motion tracking. The system can be easily adapted for stroke rehabilitation, giving patients easy access to rehabilitation without frequent travel to the hospital. Our system could also allow healthcare providers, doctors and therapists alike, to monitor their patients without being in the same room with them. We hope our system can help address issues in healthcare delivery and truly leverage technology for the benefit of both patients and healthcare providers. As part of future work, a multi-user study is necessary to test the feasibility of rolling out our system.

REFERENCES

[1] N. Venketasubramanian and C. L. H. Chen, “Burden of stroke in Singapore,” Int. J. Stroke, vol. 3, pp. 51-54, 2008.
[2] M. McLaughlin, A. A. Rizzo, Y. Jung, W. Peng, S. Yeh, W. Zhu, and the USC/UT Consortium for Interdisciplinary Research, “Haptics-Enhanced Virtual Environments for Stroke Rehabilitation,” Proceedings of the IPSI, Cambridge, MA, 2005.
[3] B. Latre, B. Braem, I. Moerman, C. Blondia, and P. Demeester, “A survey on wireless body area networks,” Wirel. Netw., vol. 17, pp. 1-18, January 2011.
[4] L. M. Vaquero, L. Rodero-Merino, J. Caceres, and M. Lindner, “A break in the clouds: towards a cloud definition,” SIGCOMM Comput. Commun. Rev., vol. 39, pp. 50-55, January 2008.
[5] C. O. Rolim, F. L. Koch, C. B. Westphall, J. Werner, A. Fracalossi, and G. S. Salvador, “A Cloud Computing Solution for Patient's Data Collection in Health Care Institutions,” Proceedings of the Second International Conference on eHealth, Telemedicine, and Social Medicine, 2010.
[6] X. H. Le, S. Lee, P. Truc, L. T. Vinh, A. M. Khattak, M. Han, D. V. Hung, M. M. Hassan, M. Kim, K.-H. Koo, Y.-K. Lee, and E.-N. Huh, “Secured WSN-Integrated Cloud Computing for u-Life Care,” Proceedings of the 7th IEEE Consumer Communications and Networking Conference (CCNC), pp. 1-2, January 2010.
[7] R. Shahriyar, M. F. Bari, G. Kundu, S. I. Ahamed, and M. M. Akbar, “Intelligent Mobile Health Monitoring System (IMHMS),” Electronic Healthcare, vol. 27, pp. 5-12, Springer Berlin Heidelberg, 2010.
[8] A. M. Khattak, Z. Pervez, K. K. Ho, S. Lee, and Y.-K. Lee, “Intelligent Manipulation of Human Activities Using Cloud Computing for u-Life Care,” IEEE/IPSJ International Symposium on Applications and the Internet, pp. 141-144, 2010.
[9] J. Biswas, J. Maniyeri, K. Gopalakrishnan, L. Shue, P. J. Eugene, H. N. Palit, F. Y. Siang, L. L. Seng, and L. Xiaorong, “Processing of wearable sensor data on the cloud - a step towards scaling of continuous monitoring of health and well-being,” 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS), pp. 3860-3863, August 2010.
[10] J. Biswas, M. Jayachandran, L. Shue, K. Gopalakrishnan, and P. Yap, “Design and trial deployment of a practical sleep activity pattern monitoring system,” ICOST 2009, LNCS 5597, pp. 190-200, 2009.
[11] C. Doukas, T. Pliakas, and I. Maglogiannis, “Mobile healthcare information management utilizing Cloud Computing and Android OS,” 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS), pp. 1037-1040, August 2010.
[12] P. Iso-Ketola, T. Karinsalo, and J. Vanhala, “HipGuard: A wearable measurement system for patients recovering from a hip operation,” Second International Conference on Pervasive Computing Technologies for Healthcare, pp. 196-199, January 2008.
[13] J. Martin-Moreno, D. Ruiz-Fernandez, A. Soriano-Paya, and V. J. Berenguer-Miralles, “Monitoring 3D movements for the rehabilitation of joints in physiotherapy,” 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 4836-4839, August 2008.
[14] D. Brunelli, E. Farella, L. Rocchi, M. Dozza, L. Chiari, and L. Benini, “Bio-feedback System for Rehabilitation Based on a Wireless Body Area Network,” Proceedings of the 4th Annual IEEE International Conference on Pervasive Computing and Communications Workshops, 2006.

Fig. 8. Work flow for the Patient Client App

Fig. 9. (a) Application modes in the Patient Client App; (b) Login UI for the Exercise and Send to Server mode

Fig. 10. (a) Push notification when new data is uploaded to the cloud server; (b) UI for querying captured exercise data from the cloud server.

Fig. 11. Different arm postures (top right and bottom right) and their 3D renderings (top left and bottom left) on the smartphone’s screen
