T. Zhang, D. A. Klein, T. Walsh, J. Lu, and E. S. Sazonov, “Android TWEETY: A wireless activity monitoring and biofeedback system designed for people with Anorexia Nervosa,” in 2014 IEEE International Symposium on Medical Measurements and Applications (MeMeA), 2014.
Android TWEETY—A Wireless Activity Monitoring And Biofeedback System Designed For People With Anorexia Nervosa
Ting Zhang, Diane A. Klein, Timothy Walsh, Jiang Lu, and Edward S. Sazonov, Senior Member, IEEE
Abstract—Anorexia Nervosa is a life-threatening psychiatric condition that is typically associated with food restriction and may also be associated with excessive Physical Activity (PA), including fidgeting. Patients may therefore benefit from interventions aimed at reducing excessive physical activity. We developed a prototype of a shoe-based wireless activity monitoring and biofeedback system (TWEETY) that provides audiovisual feedback on an Android phone. TWEETY is based on the SmartShoe, with built-in pressure sensors and accelerometers. The system monitors daily activities, characterized as sitting, standing, or walking, and provides multiple levels of feedback when the subject is fidgeting while sitting. The sensor data are wirelessly delivered to a remote server that runs a pattern recognition algorithm and sends the feedback response to the user’s phone. The user interface uses a sprite of a canary bird to provide the multiple levels of feedback. The system was tested on 2 healthy subjects. Preliminary results show that the activity monitoring and biofeedback system has a high level of accuracy and may be suitable for supporting the rehabilitation of patients with Anorexia Nervosa.
I. INTRODUCTION

Anorexia Nervosa is a severe psychiatric illness characterized by an irrational fear of gaining weight, distorted body self-perception, and food intake inadequate for energy requirements [1]. People with Anorexia Nervosa may also demonstrate excessive physical activity that can contribute to energy imbalance [2] [3] [4]. Anorexia Nervosa affects up to 1% of young women in Western cultures, is associated with 4% mortality, and is often resistant to treatment, including psychotherapy, family therapy, and medication [5] [6] [7] [8]. Non-exercise activity, such as fidgeting movements, termed Non-Exercise Activity Thermogenesis (NEAT) or Spontaneous Physical Activity (SPA), plays an important role in energy expenditure and weight regulation [9]. The level of NEAT may be elevated in patients with Anorexia Nervosa and may also benefit from specific intervention.
Modern sensor technologies and pattern recognition techniques make it possible to monitor activities without the need to stay in a particular environment [10] [11] [12]. Harris et al. validated a system, called the Physical Activity Monitoring System (PAMS), for patients with restrictive-type Anorexia Nervosa. PAMS records the individual’s body acceleration along with walking velocity and walking energy expenditure [13]. However, PAMS does not monitor other daily activities such as sitting and standing. The Intelligent Device for Energy Expenditure and Activity (IDEEA), with 5 bi-axial accelerometers, is able to classify an individual’s daily postures such as walking, sitting, and standing. It has sensors attached to the feet, thighs, and chest that transmit position and acceleration data via small wires to a microcomputer worn on the waistband [14]. While it records non-exercise postures such as sitting and standing, it does not measure the physical activity within those postures, such as sitting with fidgeting. To the best of our knowledge, none of the currently available systems can provide biofeedback for reducing non-exercise activity in anorexia patients. In this study, we describe a prototype of an unobtrusive wearable biofeedback system (TWEETY). The SmartShoe [15] sensor system is used in combination with real-time posture and activity classification performed over a wireless link on a remote server. Whenever the user begins to fidget while sitting, multiple-level audiovisual feedback is delivered on the user’s smartphone. On the phone’s screen, different emotional canary bird images, related to the duration of continuous movement, are shown as warnings. The phone also plays warning sounds along with the images. In this paper we describe the details of the development of the system and provide initial testing of the device.

Ting Zhang, Jiang Lu, and Edward S. Sazonov are with the Department of Electrical and Computer Engineering, The University of Alabama, Tuscaloosa, AL 35487 USA (Phone: 205-348-1981; Fax: 205-348-6959; e-mail: [email protected]). Diane A. Klein is with the Department of Psychiatry, NYU Langone Medical Center, New York, NY 10016. Timothy Walsh is with Columbia University College of Physicians and Surgeons, New York, NY 10032 and the New York State Psychiatric Institute, New York, NY 10032.

II. HARDWARE AND SOFTWARE ARCHITECTURE

A. System Overview
Figure 1. System architecture
The TWEETY system includes two sets of components. The first set includes the SmartShoe and the phone, which the user keeps with them during daily activities such as staying at home, visiting friends, or going shopping. The second set is kept in a rehabilitation center and includes a server and the corresponding software. The SmartShoe collects and aggregates data and sends them to the phone over a short-range Bluetooth wireless link. The phone transfers the data to the remote server through the cellular data link or WiFi. The server processes the data and sends a feedback trigger back to the phone to trigger the display of the feedback. The whole system operates in near real time, with a delay of no more than several seconds. Figure 1 illustrates the system architecture.

B. Wearable Shoe Sensors

The SmartShoe [11] is the primary source of data in the TWEETY system. A SmartShoe has an insole with built-in pressure sensors positioned at the toe, metatarsal heads, and heel. The insole connects to a small clip-on module that can be attached to the side of a shoe. The battery, power switch, a three-dimensional accelerometer, and a Bluetooth 2.0 wireless module are installed on a rigid circuit board in the clip-on. The readings of the pressure sensors and accelerometer are sampled at 25 Hz, digitized by a 12-bit A/D converter, and then sent over Bluetooth to the smartphone. Each single sample of data from the shoe is represented by a vector S = {AAP, AML, ASI, PH, PM, PHX}. Table I shows the meanings of the sensor readings and their corresponding representations in the pattern recognition algorithm.
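As a concrete illustration of the sample stream, the sketch below (a hypothetical Python rendering, not the authors' Matlab code) buffers the 25 Hz sample vectors into two-second windows of 50 samples and computes simple per-channel window statistics of the kind described in Section II-C (mean, standard deviation, variance, maximum, mean crossings, and mean absolute deviation; the entropy feature is omitted here for brevity):

```python
from collections import deque
import math

# Hypothetical sketch: 25 Hz SmartShoe samples S = {AAP, AML, ASI, PH, PM, PHX}
# are buffered into two-second windows of 50 samples each.
SAMPLE_RATE_HZ = 25
WINDOW_SECONDS = 2
WINDOW_SIZE = SAMPLE_RATE_HZ * WINDOW_SECONDS  # 50 samples

CHANNELS = ["AAP", "AML", "ASI", "PH", "PM", "PHX"]

def channel_features(values):
    """Mean, std, variance, max, mean crossings (NMC), and MAD for one channel."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = math.sqrt(var)
    # NMC: count sign changes of the mean-removed signal between adjacent samples
    nmc = sum(1 for a, b in zip(values, values[1:]) if (a - mean) * (b - mean) < 0)
    mad = sum(abs(v - mean) for v in values) / n
    return {"mean": mean, "std": std, "var": var,
            "max": max(values), "nmc": nmc, "mad": mad}

class WindowBuffer:
    """Collects incoming samples; returns a feature vector once per full window."""
    def __init__(self):
        self.samples = deque()

    def add(self, sample):
        self.samples.append(sample)
        if len(self.samples) < WINDOW_SIZE:
            return None
        window = list(self.samples)
        self.samples.clear()
        feats = {}
        for ch in CHANNELS:
            for name, val in channel_features([s[ch] for s in window]).items():
                feats[f"{ch}_{name}"] = val
        return feats
```

Feeding 50 samples into `WindowBuffer.add` yields one flat feature dictionary per two-second window, which is the granularity at which the classifier and the biofeedback scheduler operate.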
C. Feature Computation, the Classifier, and the Four-Level Biofeedback Scheduler Algorithms

To accommodate remote monitoring of the patient’s activity and to unload the smartphone’s CPU from feature computation and pattern recognition, sensor data from the SmartShoe are sent to the remote server, which processes the data and sends a biofeedback trigger back to the phone. The algorithms running on the server are the feature computation algorithms, the decision tree classifier, and the four-level biofeedback scheduler. All of these algorithms are written in Matlab.

Feature computation is performed on the sensor signals corresponding to two seconds of observation. The features are the mean, standard deviation, entropy [16], variance, maximum value, Number of Mean Crossings (NMC), and Mean Absolute Deviation (MAD). The number of features computed every two seconds is therefore 7 features/sensor * 8 sensors = 56 features. The computed features are given to a Decision Tree for classification. The Decision Tree is a hierarchical model that recursively separates the input vectors into different classes. The result is a tree-like structure composed of decision nodes and leaves. Each node has a test function that determines which branch is taken next; the process continues recursively until a leaf is reached and the class of a particular vector is determined [17]. In this study, we utilize the C5.0 algorithm. The classifier was trained on data from 12 subjects performing different physical activities in our previous study [18], with an activity classification accuracy of 91.5% for the group (subject-independent, not requiring individual calibration) model. Figure 2 shows the decision tree structure: the branches are the computed features from the sensor readings, the thresholds are obtained from the training process, and the tree leaves are the classification results.

Once sitting is recognized, we utilize an algorithm to detect fidgeting. In a two-second time interval there are 50 sample vectors. First, we compute the sum of the squares of the 3D accelerometer readings for each sample; then we take the maximum of these values. If the computed maximum acceleration is above a certain threshold, we consider that fidgeting occurred during the time interval. The threshold was determined by repeated experiments with multiple subjects.

A four-level biofeedback scheduler algorithm determines the timing and the particular type of biofeedback to be sent to the phone. As the feature computation, activity classification, and fidgeting detection algorithms all operate on two-second time intervals, the biofeedback scheduler also makes its decisions on a two-second basis. We utilize a threshold X = 6 seconds to compute the time interval after which a feedback trigger is generated. The system generates biofeedback when the subject is sitting while fidgeting, with different levels depending on whether the subject starts, continues, or stops fidgeting. Simplified pseudocode is shown in Table II.

TABLE II. PSEUDOCODE FOR THE BIOFEEDBACK SCHEDULER

1. if walking, no feedback.
2. else if standing, no feedback.
3. else if sitting:
4.    if no fidgeting, no feedback.
5.    else if fidgeting for T > 3*X seconds, generate feedback trigger 3.
6.    else if fidgeting for T > 2*X seconds, generate feedback trigger 2.
7.    else if fidgeting for T > X seconds, generate feedback trigger 1.
8.    else if the fidgeting has stopped, generate feedback trigger 4.

D. Communication Protocols and User-Friendly Biofeedback Interface on the Phone

The Android program on the phone is responsible for background wireless communication with the SmartShoe and the remote server, and for the user interface. Figure 3 shows the components of the Android program running on the phone. Three software components implement the communication protocol. The first is the Bluetooth data receiver: the shoe clip-on communicates with the phone via the Bluetooth Serial Port Profile (SPP), feeding the receiver thread in the Android program. The second is the data transfer service, which sends aggregated data to the remote server using both TCP/IP and UDP; connection messages, which indicate whether sensor data are available and test whether the connection is established, are sent over TCP/IP, while the sensor readings are sent over UDP. The third is the biofeedback trigger receiver: after the biofeedback type is determined by the scheduler running on the remote server, the biofeedback trigger is sent over TCP/IP to the phone to deliver the feedback.

Figure 3. Components of the Android Program

The Android program has a user-friendly interface, specifically designed for the target population of young women with Anorexia Nervosa. The initial screen allows the user to start the application (Figure 4). On the starting screen there are a few buttons and text entry fields for setting up the initial environment: the user can choose the detected sensors for which to record data, enter a file name for the data stored on the phone, and set the server’s TCP/IP address.
These settings may be removed in a later version, but at this time they allow easy configuration during pilot testing of the device. The main screen of the user interface displays the multiple levels of biofeedback. When the phone receives a biofeedback trigger, the corresponding image is displayed on the phone. Each biofeedback event is delivered as an Android notification, consisting of an image of a canary bird sprite (Tweety, Figure 5) with a particular emotion, a text message, and a warning sound resembling a bird song. As shown in Figure 5, the first level of biofeedback shows the bird sleeping: the user’s fidgeting is about to wake up the Tweety bird, which responds with the text message ‘Hello’. The second-level biofeedback shows an awakened Tweety with the text ‘You are moving!’. The third-level biofeedback is a frustrated Tweety with the text ‘You are still moving’. There is also a fourth-level biofeedback, with the same picture as the first (the sleeping bird) but with the text ‘Thank you’; it is displayed when the user stops moving.
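The fidgeting detector and the multi-level scheduling logic described in Section II-C can be sketched as follows. This is an illustrative Python rendering, not the authors' Matlab implementation; the fidgeting threshold value here is an invented example (the paper's threshold was tuned experimentally). Note that the dispatch checks the largest time limit first, so that the higher trigger levels remain reachable:

```python
# Illustrative sketch of the fidgeting detector and four-level scheduler.
FIDGET_THRESHOLD = 2.5   # example value only; the real threshold was tuned experimentally
X = 6                    # seconds of continuous fidgeting before trigger 1

def window_is_fidgeting(accel_samples, threshold=FIDGET_THRESHOLD):
    """A 2-s window counts as fidgeting if the maximum sum of squared
    3D accelerometer readings over its samples exceeds the threshold."""
    peak = max(ax * ax + ay * ay + az * az for (ax, ay, az) in accel_samples)
    return peak > threshold

class FeedbackScheduler:
    """Tracks continuous sitting-with-fidgeting time and emits triggers 1-4."""
    def __init__(self, x_seconds=X, window_seconds=2):
        self.x = x_seconds
        self.dt = window_seconds   # decisions are made once per 2-s window
        self.fidget_time = 0
        self.level = 0             # highest trigger already issued

    def update(self, posture, fidgeting):
        """Call once per window; returns a trigger number (1-4) or None."""
        if posture != "sitting" or not fidgeting:
            # trigger 4 ('Thank you') fires once when fidgeting stops
            trigger = 4 if self.level > 0 else None
            self.fidget_time = 0
            self.level = 0
            return trigger
        self.fidget_time += self.dt
        # check the largest limit first so higher levels are reachable
        for lvl, limit in ((3, 3 * self.x), (2, 2 * self.x), (1, self.x)):
            if self.fidget_time > limit and self.level < lvl:
                self.level = lvl
                return lvl
        return None
```

For example, ten consecutive fidgeting-while-sitting windows (20 s) produce triggers 1, 2, and 3 in turn, and the first non-fidgeting window afterwards produces trigger 4.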
Figure 4. Starting screen with text boxes and buttons
Figure 5. Emotional sprites of the canary bird displayed as feedback
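The mapping from trigger levels to notification content described above can be summarized as a small lookup table; the sketch below is hypothetical (the asset file names are invented), but the messages follow the text:

```python
# Hypothetical mapping of the four feedback triggers to notification content.
# Sprite and sound file names are invented for illustration.
FEEDBACK_LEVELS = {
    1: {"sprite": "tweety_sleeping.png",   "text": "Hello"},
    2: {"sprite": "tweety_awake.png",      "text": "You are moving!"},
    3: {"sprite": "tweety_frustrated.png", "text": "You are still moving"},
    4: {"sprite": "tweety_sleeping.png",   "text": "Thank you"},
}

def build_notification(trigger):
    """Return the notification payload for a trigger; the phone also
    plays a warning sound resembling a bird song."""
    level = FEEDBACK_LEVELS[trigger]
    return {"image": level["sprite"], "message": level["text"],
            "sound": "birdsong.wav"}
```

Levels 1 and 4 deliberately share the sleeping-bird sprite: the same image signals both "about to wake Tweety" and "Tweety is asleep again, thank you."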
III. PRELIMINARY TEST AND VALIDATION

A. Participants and activities

Data collection was performed on two human subjects, one male and one female. A summary of the subjects’ characteristics is shown in Table III. During data collection, each individual wore the SmartShoe (US size 10 for the man, 7 for the woman) for 1 hour per day, on two consecutive days. A total of 4 hours of data were recorded for two sets of activities. The first set of activities included walking, standing, and sitting without fidgeting for 5 minutes each, and sitting with fidgeting for 3 minutes. The second set of activities included sitting with fidgeting for time intervals ranging from 5 seconds to 1 minute. Data were collected in self-selected sitting postures to better mimic real-life conditions. Participants performed each activity 6 times: 3 times on each day. The order in which the activities were performed was randomized using a random number generator. The description of the activities is shown in Table IV.

B. Accuracy measures

Accuracy of the posture and activity classification was established by comparing the class labels produced by the decision tree classifier with the known activities performed by the subjects. After the experiments, based on the labels of the actual activities and the classifier outputs, we computed the confusion matrix of the classification algorithm.
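The accuracy evaluation reduces to tallying predicted against actual class labels; a minimal sketch (not the authors' code) is:

```python
# Minimal sketch: build a confusion matrix from actual vs. predicted
# activity labels and compute overall classification accuracy.
CLASSES = ["sitting", "standing", "walking"]

def confusion_matrix(actual, predicted, classes=CLASSES):
    idx = {c: i for i, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for a, p in zip(actual, predicted):
        m[idx[a]][idx[p]] += 1   # rows: actual class, columns: predicted class
    return m

def accuracy(matrix):
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    total = sum(sum(row) for row in matrix)
    return correct / total

# Applying this to the cumulative matrix later reported in Table V
# reproduces the paper's overall accuracy:
table_v = [[3356, 64, 180],   # actual sitting
           [37, 1721, 42],    # actual standing
           [16, 22, 1762]]    # actual walking
print(f"accuracy = {accuracy(table_v):.2%}")  # 94.99%
```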
The biofeedback was validated through manual observation of the Android phone, recording each instance of biofeedback. Specifically, feedback events were counted as true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN), depending on whether feedback was expected for a given activity. For example, a true negative (TN) was counted for each six-second interval in which TWEETY produced no feedback and no feedback was expected from the system. The counts of TP, TN, FP, and FN were used to estimate the recall and precision of the feedback mechanism:

Recall = TP / (TP + FN)        (1)

Precision = TP / (TP + FP)     (2)
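As a sanity check on Eqs. (1) and (2), summing the cumulative event counts later reported in Tables VI and VII reproduces the precision and recall figures given in the Results section:

```python
# Recall = TP / (TP + FN), Precision = TP / (TP + FP), per Eqs. (1)-(2).
def recall(tp, fn):
    return tp / (tp + fn)

def precision(tp, fp):
    return tp / (tp + fp)

# Cumulative counts summed from Tables VI and VII of this paper:
TP = 182 + (0 + 12 + 17 + 36 + 65)   # = 312
FP = 2 + (6 + 2 + 3 + 5 + 3)         # = 21
FN = 4 + (0 + 0 + 1 + 0 + 1)         # = 6

print(f"precision = {precision(TP, FP):.2%}")  # 93.69%
print(f"recall    = {recall(TP, FN):.2%}")     # 98.11%
```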
IV. RESULTS Figure 6 shows the readings of accelerometers and pressure sensors of the SmartShoe. The three sets of plots show the sensor readings of the SmartShoe in different types of activities. Each of them shows 8 seconds of data. On the left side, each plot shows the readings from the three axes of the accelerometer. On the right side, each plot shows the readings from the pressure sensors. Plot (a) shows the difference in sensor readings when the subject is sitting with and without fidgeting, indicating noticeable
difference both in acceleration and pressure sensor readings.

Figure 6. Sensor readings in different postures: (a) sitting, where the first 4 seconds are with fidgeting and the last 4 seconds are without; (b) standing; (c) walking.

Table V shows the confusion matrix of the Decision Tree model. The overall classification accuracy is 94.99%, similar to the accuracy obtained for the group model in our previous study. However, in this experiment a higher proportion of sitting was confused with walking. A possible reason is that the participants performed fidgeting for long periods of time; the accelerometer readings vary while fidgeting is performed, which increases the chance of misclassification as walking.

TABLE V. CUMULATIVE CONFUSION MATRIX FOR THE DECISION TREE

                         Predicted
  Actual       Sitting   Standing   Walking
  Sitting        3356        64        180
  Standing         37      1721         42
  Walking          16        22       1762

After classification, if the activity is recognized as walking or standing, the biofeedback scheduler generates no biofeedback triggers. When the result is sitting while fidgeting, a four-level notification strategy applies, based on how long the subject continues fidgeting. After the scheduler generates the biofeedback triggers, they are sent to the Android phone, where the biofeedback is delivered as TWEETY notifications that include pictures, text messages, and a warning sound; the pictures and text messages are shown in Figure 5. The biofeedback event counts are shown in Table VI and Table VII. Table VI shows that virtually no biofeedback events are generated in any activity from the first set except sitting with fidgeting, which is the desired mode of operation. Table VII shows the counts of biofeedback events while sitting with fidgeting in the second set of activities.

TABLE VI. CUMULATIVE COUNTS OF TP, TN, FP, AND FN IN 6 EXPERIMENTS OF DIFFERENT ACTIVITIES

  Activities                           TP    FP    TN    FN
  Walking (5 min)                       0     2   298     0
  Standing (5 min)                      0     0   300     0
  Sitting without fidgeting (5 min)     0     0   300     0
  Sitting with fidgeting (3 min)      182     0     0     4

TABLE VII. CUMULATIVE COUNTS OF BIOFEEDBACK EVENTS IN 6 EXPERIMENTS OF SITTING WITH FIDGETING

  Time in sitting with fidgeting    TP    FP    FN
  5 s                                0     6     0
  10 s                              12     2     0
  15 s                              17     3     1
  30 s                              36     5     0
  60 s                              65     3     1

As Table VII demonstrates, most of the biofeedback is
delivered when algorithmically expected. In two cases, the feedback was not delivered when expected. These false negatives are likely caused by incorrect classification of posture and activity, which creates a break in the sitting sequence and resets the feedback trigger. A majority voting scheme may be used to prevent such false negatives in the future. More false positive events (19 in total) were observed. The false positives in the 5-second activities arose because, in each experiment, the system generated one ‘Thank you’ message even before the bird was woken, due to a bug in the code. The remaining experiments did not have this problem, as a ‘Thank you’ message was expected after continuous fidgeting of more than 6 seconds. A likely reason for the false positives in these experiments is that when the person with the stopwatch says “stop,” the participant stops shortly afterwards; the gap in time between the two actions may generate one extra feedback event. There is no column of true negatives in this table because that scenario does not exist in these activities. We further computed precision and recall as 93.69% and 98.11%, respectively. This shows that the proposed monitoring and biofeedback system has very high accuracy.

V. CONCLUSION AND DISCUSSION
In this study, we designed an activity monitoring and biofeedback system relying on two wireless links for remote processing of the sensor data on a server that generates biofeedback triggers. The whole system operates in near real time. The interface is user friendly, using emotional sprites, text messages, and sounds to alert the user to excessive fidgeting. We performed pilot validation of the system with healthy participants performing different activities. The validation results show that the system is feasible for use by real patients. Further experimental work includes recruiting more subjects to validate the system using image-based methods; for example, we will recruit people with Anorexia Nervosa and validate the system against video recordings that monitor system performance and provide the ground truth for comparison. Further technical work will include experiments to determine the best time interval for triggering the biofeedback. In this work the threshold parameter X was set at 6 seconds; an alternative would be to allow users to set this parameter on the phone according to their preference, or to conduct a study on the usability of different thresholds. Another potential future direction is implementation of the classification algorithm solely on the phone. While this would make the system less dependent on communication, the benefit of real-time remote monitoring might be lost. In conclusion, the proposed TWEETY biofeedback system designed for Anorexia Nervosa patients is convenient for everyday use and achieves adequate accuracy. Future work includes validating the system with Anorexia Nervosa patients in a clinical environment.
REFERENCES [1] N. R. Carlson, W. Buskist, C. D. Heth, and R. Schmaltz, Psychology: The Science of Behaviour, Fourth Canadian Edition with MyPsychLab. Pearson Education Canada, 2009. [2] D. A. Klein, L. E. S. Mayer, J. E. Schebendach, and B. T. Walsh, “Physical activity and cortisol in anorexia nervosa,” Psychoneuroendocrinology, vol. 32, no. 5, pp. 539–547, Jun. 2007. [3] J. C. Rosen, J. Reiter, and P. Orosan, “Assessment of body image in eating disorders with the body dysmorphic disorder examination,” Behav Res Ther, vol. 33, no. 1, pp. 77–84, Jan. 1995. [4] M. J. Cooper, “Cognitive theory in anorexia nervosa and bulimia nervosa: progress, development and future directions,” Clin Psychol Rev, vol. 25, no. 4, pp. 511–531, Jun. 2005. [5] A. Signorini, E. De Filippo, S. Panico, C. De Caprio, F. Pasanisi, and F. Contaldo, “Long-term mortality in anorexia nervosa: a report after an 8-year follow-up and a review of the most recent literature,” Eur J Clin Nutr, vol. 61, no. 1, pp. 119–122, Jan. 2007. [6] J. R. Falk, K. A. Halmi, and W. W. Tryon, “Activity measures in anorexia nervosa,” Arch. Gen. Psychiatry, vol. 42, no. 8, pp. 811–814, Aug. 1985. [7] R. C. Casper, “Behavioral activation and lack of concern, core symptoms of anorexia nervosa?,” Int J Eat Disord, vol. 24, no. 4, pp. 381– 393, Dec. 1998. [8] C. Davis, D. K. Katzman, S. Kaptein, C. Kirsh, H. Brewer, K. Kalmbach, M. P. Olmsted, D. B. Woodside, and A. S. Kaplan, “The prevalence of high-level exercise in the eating disorders: etiological implications,” Compr Psychiatry, vol. 38, no. 6, pp. 321–326, Dec. 1997. [9] J. Hebebrand, C. Exner, K. Hebebrand, C. Holtkamp, R. C. Casper, H. Remschmidt, B. Herpertz-Dahlmann, and M. Klingenspor, “Hyperactivity in patients with anorexia nervosa and in semistarved rats: evidence for a pivotal role of hypoleptinemia,” Physiol. Behav., vol. 79, no. 1, pp. 25–37, Jun. 2003. [10] J. Lu, J. Gong, Q. Hao, and F. 
Hu, “Multi-agent based wireless pyroelectric infrared sensor networks for multi-human tracking and self-calibration,” in 2013 IEEE Sensors, 2013, pp. 1–4. [11] Q. Hao, F. Hu, and J. Lu, “Distributed multiple human tracking with wireless binary pyroelectric infrared (PIR) sensor networks,” in 2010 IEEE Sensors, 2010, pp. 946–950. [12] J. Lu, J. Gong, Q. Hao, and F. Hu, “Space encoding based compressive multiple human tracking with distributed binary pyroelectric infrared sensor networks,” in 2012 IEEE Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), 2012, pp. 180–185. [13] A. M. Harris, D. E. McAlpine, R. Shirbhate, C. U. Manohar, and J. A. Levine, “Measurement of Daily Activity in Restrictive Type Anorexia Nervosa,” Int J Eat Disord, vol. 41, no. 3, pp. 280–283, Apr. 2008. [14] S. Kwon, M. Jamal, G. K. D. Zamba, P. Stumbo, and I. Samuel, “Validation of a Novel Physical Activity Assessment Device in Morbidly Obese Females,” Journal of Obesity, vol. 2010, Feb. 2010. [15] F. Hu, Cyber-Physical Systems: Integrated Computing and Engineering Design. CRC Press, 2013. [16] H. Zheng-you, C. Xiaoqing, and L. Guoming, “Wavelet Entropy Measure Definition and Its Application for Transmission Line Fault Detection and Identification; (Part I: Definition and Methodology),” in International Conference on Power System Technology, 2006. PowerCon 2006, 2006, pp. 1–6. [17] T. Zhang, W. Tang, and E. S. Sazonov, “Classification of posture and activities by using decision trees,” in 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2012, pp. 4353 –4356. [18] T. Zhang, G. D. Fulk, W. Tang, and E. S. Sazonov, “Using decision trees to measure activities in people with stroke,” in 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2013, pp. 6337–6340.