Detecting Motion Intention in Stroke Survivors Using Autonomic Nervous System Responses

Laura Marchal-Crespo, Domen Novak, Raphael Zimmermann, Olivier Lambercy, Roger Gassert, and Robert Riener

Abstract— Individuals with severe neurologic injuries often cannot participate in robotic rehabilitation because they do not retain sufficient residual motor control to initiate the robotic assistance. In these situations, brain- and body-computer interfaces have emerged as promising solutions to control robotic devices. In a previous experiment conducted with healthy subjects, we showed that motor execution could be detected accurately using only the autonomic nervous system (ANS) response. In this paper, we investigate the feasibility of such a body–machine interface to detect motion intention by monitoring the ANS response in stroke survivors. Four physiological signals were measured (blood pressure, breathing rate, skin conductance response and heart rate) while participants executed and imagined a grasping task with their impaired hand. The physiological signals were then used to train a classifier based on hidden Markov models. We performed an experiment with four chronic stroke survivors to test the effectiveness of the classifier in detecting rest, motor execution and motor imagery periods. We found that motor execution can be classified based only on peripheral autonomic signals with an accuracy of 72.4%. The accuracy of classifying motor imagery was 62.4%. Therefore, attempting to move was a more effective strategy than imagining the movement. These results encourage further research on the use of the ANS response in body–machine interfaces.

Keywords— Body–computer interface; autonomic nervous system; physiological measurements; hidden Markov model; robot-assisted rehabilitation; stroke survivors

I. INTRODUCTION

Using robotic devices to actively assist during movement training provides novel somatosensory stimulation that can help induce brain plasticity in neurologic patients [1].

* This work was supported by the Swiss National Science Foundation (SNF) through the National Centres of Competence in Research (NCCR) Robotics and Neural Plasticity and Repair, and by the CHIRP1 ETH Research grant (CHIRP1-Application CH1-02 09-3, Cortically-Driven Assistance Adaptation during Sensorimotor Training).
L. Marchal-Crespo, D. Novak and R. Riener are with the Sensory-Motor Systems (SMS) Lab, Institute of Robotics and Intelligent Systems (IRIS), ETH Zurich, Switzerland and the Medical Faculty, Balgrist University Hospital, University of Zurich, Switzerland. E-mail: [email protected]
D. Novak is also with the Department of Electrical & Computer Engineering, University of Wyoming, WY 82071, USA.
R. Zimmermann, O. Lambercy and R. Gassert are with the Rehabilitation Engineering Laboratory (RELab), ETH Zurich, Switzerland.
R. Zimmermann is also with the Biomedical Optics Research Laboratory, University Hospital Zurich, University of Zurich, Switzerland.

Neurologic patients' active participation is also thought to be essential for provoking motor plasticity [2], [3]. However, sufficient residual motor ability is required to initiate active robotic assistance, preventing its application to individuals who have no functional motor ability left as a result of a severe neurologic injury. Brain–computer interfaces (BCIs) have emerged as promising solutions to control robotic devices [4]. BCIs could be used to trigger robotic assistance when an intention to move is detected from cortical activity. Electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) are among the most widely used noninvasive portable techniques employed in BCIs [5]. However, the burden of connecting sensors on patients' scalps and the relatively long training period required for the user to produce classifiable brain signals can be time consuming and, thereby, frustrating.

Recent studies have introduced the concept of body–machine interfaces (BMI), where peripheral autonomic signals are used to detect behavioral intent [6]. Responses of the autonomic nervous system (ANS), such as cardiorespiratory and electrodermal responses, can be measured with economical off-the-shelf instrumentation and are relatively fast to set up. We recently conducted an experiment with healthy subjects and showed that it was possible to accurately detect motor execution using only the ANS response, as assessed through measurements of blood pressure, respiration rate, heart rate, and skin conductance response [7]. However, this approach has not yet been tested on neurologic patients, despite their being the target population for the envisioned BMI application. Neurologic injuries, such as stroke, affect the autonomic system, which may introduce variations in the peripheral signals. For example, traumatic brain injury survivors are known to show abnormalities in the autonomic system (hypo- or hyperfunction) and asymmetric sweating with colder hemiplegic limbs that can affect the skin conductance response (SCR) signal [8]. On the other hand, some recent studies have shown the feasibility of using physiological signals (e.g. heart rate and SCR) to assess stroke survivors' cognitive load during rehabilitation [9], [10], and to classify activity engagement in individuals with severe physical disabilities, such as subjects suffering from cerebral palsy and muscular dystrophy [11].

In this study, we analyze whether the ANS response in stroke survivors with severe upper limb impairment can be consistently employed to control a BMI without direct measurement of brain activation.

In particular, we investigated the feasibility of a BMI to detect motor execution and motor imagery of a grasping task by monitoring only changes in peripheral autonomic signals (i.e. mean blood pressure, breathing rate, skin conductance response and heart rate) in stroke survivors. We hypothesize that such a BMI can achieve satisfactory classifier accuracy, and that accuracy can be enhanced if weak or absent physiological responses are discarded by means of feature reduction algorithms.

II. METHODS

A. Participants

All experiments were approved by the Ethics Committee of the Canton of Zurich and by the institutional committee of ETH Zurich. Four chronic stroke survivors with severely impaired hand function (see Table I for participants' demographics [12]) provided informed consent. Participants did not have previous experience with BMIs or BCIs.

TABLE I. PARTICIPANT DESCRIPTION

      Sex   Age   ToI   LOC                       AL   DH   FMA            MI
s1    F     78    I     Left MCA & ICA            R    R    4/66 (0/24)    26
s2    F     60    I     Left MCA                  R    R    12/66 (1/24)   19
s3    F     28    H     Right frontal parietal    L    R    n.a.           12
s4    F     55    I     Left MCA                  R    R    30/66 (9/24)   25

ToI: type of infarct (I: ischemic, H: hemorrhagic). LOC: infarct location (MCA: middle cerebral artery, ICA: internal carotid artery). AL: affected limb side. DH: dominant hand. FMA: Fugl-Meyer motor assessment score for upper limb function (distal section in brackets). MI: months since incident [12].

B. Measurements of Physiological Responses

Based on our previous work with healthy subjects [7], four peripheral autonomic signals were recorded online: electrocardiogram (ECG), respiration, blood pressure, and skin conductance response (SCR). All physiological signals were acquired at 600 Hz using a biosignal amplifier (g.USBamp, g.tec, Austria).

ECG was measured using the g®.GAMMAsys active electrode system from g.tec. The ECG signal was filtered with a fourth-order Butterworth bandpass filter with the frequency band 0.01–40 Hz. The heart rate (HR) was calculated online using an adaptive threshold algorithm [7].

The respiration signal was acquired using a thermistor respiration flow sensor (SleepSense®, Scientific Laboratory Products, USA) placed at the entrance of the nostrils. The respiration signal was filtered with an eighth-order Butterworth bandpass filter with the frequency band 0.1–2.1 Hz. The breathing rate (BR) was calculated using an adaptive threshold algorithm [7].

The blood pressure was measured with a continuous noninvasive arterial pressure system (CNAP™ monitor 500, CNSystems, Austria). The blood pressure signal was detrended by subtracting the running mean from the raw signal. It was further low-pass filtered with a first-order Butterworth filter with a cutoff frequency of 0.1 Hz, leaving only the low and very low frequency spectra of the BP signal.

Skin conductance was measured via two electrodes (g®.GSRsensor, g.tec, Austria) attached to the distal phalanges of the index and middle fingers of the unaffected arm. The skin conductance response was calculated by removing the tonic level by means of an eighth-order Butterworth high-pass filter with a cutoff frequency of 0.1 Hz.

All ANS signals were then down-sampled to 5 Hz, centered around zero in real time by subtracting the median of the last 60 s, and normalized by dividing the signals by the interquartile distance of the past 60 s.
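The per-signal filtering and the real-time normalization described above can be summarized in a short sketch. The following Python code is illustrative only: the filter orders and cutoff frequencies are taken from the text, whereas the function names, the running-mean window used for blood-pressure detrending, and the use of zero-phase (offline) filtering are assumptions not specified in the paper.

```python
import numpy as np
from scipy import signal

FS_RAW = 600.0   # acquisition rate of the biosignal amplifier (Hz)
FS_OUT = 5.0     # rate after down-sampling (Hz)

def butter_filter(x, order, cutoff, btype, fs=FS_RAW):
    """Butterworth filter; applied zero-phase here, whereas the online setup would be causal."""
    sos = signal.butter(order, cutoff, btype=btype, fs=fs, output="sos")
    return signal.sosfiltfilt(sos, x)

def preprocess_raw(ecg, resp, bp, skin_cond):
    # ECG: fourth-order band-pass 0.01-40 Hz (HR is then derived from R-peaks
    # with an adaptive-threshold detector as in [7], not shown here).
    ecg_f = butter_filter(ecg, 4, [0.01, 40.0], "bandpass")
    # Respiration: eighth-order band-pass 0.1-2.1 Hz (BR derived analogously).
    resp_f = butter_filter(resp, 8, [0.1, 2.1], "bandpass")
    # Blood pressure: subtract a running mean (window length assumed here: 10 s),
    # then first-order low-pass at 0.1 Hz.
    win = int(10 * FS_RAW)
    bp_detr = bp - np.convolve(bp, np.ones(win) / win, mode="same")
    bp_f = butter_filter(bp_detr, 1, 0.1, "lowpass")
    # Skin conductance: remove the tonic level with an eighth-order high-pass at 0.1 Hz.
    scr = butter_filter(skin_cond, 8, 0.1, "highpass")
    return ecg_f, resp_f, bp_f, scr

def downsample_and_normalize(x, window_s=60):
    """Down-sample to 5 Hz, then center by the median and scale by the
    interquartile distance of the past 60 s (causal, as in the online setup)."""
    for q in (10, 12):                        # 600 Hz -> 60 Hz -> 5 Hz in two stages
        x = signal.decimate(x, q, ftype="fir", zero_phase=True)
    win = int(window_s * FS_OUT)
    out = np.empty_like(x)
    for i in range(len(x)):
        past = x[max(0, i - win + 1):i + 1]
        q1, med, q3 = np.percentile(past, [25, 50, 75])
        iqr = q3 - q1 if q3 > q1 else 1.0     # guard against a zero interquartile distance
        out[i] = (x[i] - med) / iqr
    return out
```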

C. Experimental Protocol

The measurements were conducted in a silent room. fNIRS was used to simultaneously record brain activity in motor areas; however, these signals were not used in the present analysis and are beyond the scope of this paper. Subjects sat on a comfortable padded chair. The task consisted of attempting to perform or imagine a grasping task with the affected hand using a robotic device (Fig. 1a). The device, named "ReHapticKnob", is a robot with two degrees of freedom (DOF) usually employed for hand rehabilitation [13]. The participants' thumb and index fingers were fastened using Velcro® straps to two handles actuated by a single DC motor and instrumented with force sensors.

Fig. 1. (a) Robotic hand device ReHapticKnob employed in the experiment. Visual cues shown to the participants during baseline (b), ME (c), MI (d), and rest (e) [12].

Participants were requested to follow the instructions presented on a computer screen located in front of them, while remaining concentrated on a fixation cross displayed at the center of the screen (Fig. 1 b-e). During experimental trials, participants were asked to either i) actively (attempt to) grasp the robot handles with their impaired hand (motor execution, ME), or ii) perform kinesthetic imagery of the grasping action (motor imagery, MI). Participants were furthermore asked to rest between trials of ME and MI.

During rest periods, participants were requested not to move and to relax. The protocol consisted of two experimental runs. Each run began with a baseline rest of 90 s. During the initial training run, the subjects were requested to perform 20 ME and 20 MI trials, randomly presented, each lasting 15 s with the robot locked in position (i.e. with the hand open). The duration of the rest periods between activity trials was randomized (from 15 to 20 s) to keep the autonomic system from synchronizing with the activity periods. Those rest and activity trials were used to train a classifier that was employed in the second run for online classification of the measured signals (physiological and fNIRS signals).

In the second run, participants were asked to perform trials of ME and MI (15 trials of each), similar to the training run, with the implemented online classifier running in the background. After 15 s of action task (ME or MI), the online classifier was sampled. If the classifier result was 'active', confirmatory haptic feedback was given to the user in the form of a closing and opening movement of the hand that lasted 6 s. If the classifier result was 'rest', no feedback was provided (the robot remained locked). The online classifier was also sampled after 15 s of rest trials. If a rest trial was misclassified, the robot provided 'negative' haptic feedback by vibrating the handles for 6 s. A 10 to 25 s idle phase followed all haptic stimuli (confirmatory or negative), during which the robot remained locked in position and no information was provided on the monitor. The objective of this idle phase was to remove possible effects of the haptic feedback on the physiological signals and, therefore, not to influence the next decision of the online classifier. Removing the potential influence of the haptic feedback further allows for post-hoc analysis of the collected dataset, for example using only physiological responses as presented in the rest of this paper.

The total time required to complete a run was approximately 20 min, and participants paused for 10 min between runs. The protocol was implemented in Simulink® (The MathWorks Inc., Natick, MA, USA), and the visual display was programmed using Unity (Unity Technologies, San Francisco, CA, USA).
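As an illustration of how such a trial schedule could be generated, the following sketch builds a randomized sequence of ME and MI trials separated by rest periods of random duration. The actual protocol was implemented in Simulink; the names and the random-number handling below are assumptions.

```python
import random

def make_training_schedule(n_me=20, n_mi=20, trial_s=15, rest_range=(15, 20),
                           baseline_s=90, seed=None):
    """Return a list of (event, duration_s) pairs for one run: a 90 s baseline,
    then randomly ordered ME/MI trials of 15 s, each followed by a rest period
    whose duration is drawn uniformly from 15-20 s so that the autonomic system
    cannot synchronize with the activity periods."""
    rng = random.Random(seed)
    trials = ["ME"] * n_me + ["MI"] * n_mi
    rng.shuffle(trials)                       # random presentation order
    schedule = [("baseline", baseline_s)]
    for trial in trials:
        schedule.append((trial, trial_s))
        schedule.append(("rest", rng.uniform(*rest_range)))
    return schedule

# The test run would use n_me=15, n_mi=15 and additionally sample the online
# classifier after each 15 s trial to decide whether to give haptic feedback.
```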

D. Classifier

In this paper, we performed an offline data analysis to study the feasibility of a classifier that could detect motion intention (in particular motor execution and motor imagery) of a grasping task based only on the ANS responses (without signals from the brain), using the physiological responses measured from the four chronic stroke survivors. Based on our previous experiment with healthy subjects [7], we employed a hidden Markov model (HMM) classifier which consisted of 5 states and whose emission probabilities in all states were modeled as a mixture of 2 Gaussians, in order to avoid overfitting.

The training and test data sets were extracted offline as vectors containing the time series of the physiological data. These vectors contained the sample points from 15 s of data acquisition, corresponding to the duration of the ME and MI trials in the second experimental run. Two HMM classifiers were trained to detect motor execution and motor imagery using ME and MI trials, respectively, together with rest trials from the training run. An 8-fold cross-validation was employed. In order to reduce the effect on performance variability due to random initial model parameters, we ran the evaluation procedure a total of 30 times for each subject and partition. The HMM parameters from each of the 8-fold cross-validations of a training iteration were then applied to the test data.

The metrics used to quantify the classifier performance were accuracy and specificity, for ME, MI and overall trials [7]. The chance level is not exactly 50%, but 50% plus a confidence interval at a certain level (95%) that depends on the number of trials [14]. During the test run we had 15 trials per class and, therefore, based on [14], the upper confidence limit of the chance level was set to 66.5%.
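A minimal sketch of this classification scheme is given below, using the third-party hmmlearn package. It assumes the common class-conditional formulation in which one 5-state HMM with 2-Gaussian mixture emissions is trained per class (active vs. rest) and a trial is assigned to the model with the higher log-likelihood; the exact decision rule, training options and helper names are assumptions, as the paper defers these details to [7].

```python
import numpy as np
from hmmlearn.hmm import GMMHMM      # third-party package: pip install hmmlearn
from scipy.stats import binom

# Each trial is a (75, 4) array: 15 s of BP, BR, SCR and HR sampled at 5 Hz.

def fit_hmm(trials, random_state=0):
    """Fit a 5-state HMM whose emissions are mixtures of 2 Gaussians."""
    X = np.concatenate(trials)               # hmmlearn expects stacked sequences
    lengths = [len(t) for t in trials]
    model = GMMHMM(n_components=5, n_mix=2, covariance_type="diag",
                   n_iter=50, random_state=random_state)
    model.fit(X, lengths)
    return model

def classify(trial, active_hmm, rest_hmm):
    """Label one 15 s trial by the class-conditional model with the higher log-likelihood."""
    return "active" if active_hmm.score(trial) > rest_hmm.score(trial) else "rest"

def accuracy_specificity(y_true, y_pred):
    """Accuracy over all trials; specificity = proportion of rest trials labeled rest."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    accuracy = float(np.mean(y_true == y_pred))
    rest = y_true == "rest"
    specificity = float(np.mean(y_pred[rest] == "rest")) if rest.any() else float("nan")
    return accuracy, specificity

def chance_upper_bound(n_trials, alpha=0.05):
    """Upper (1 - alpha) confidence limit of the chance level for a two-class problem
    (cf. [14]); the 66.5% quoted in the text may follow a slightly different interval."""
    return binom.ppf(1 - alpha, n_trials, 0.5) / n_trials

# In the offline analysis, fitting would be repeated 30 times per subject with
# different random_state values inside an 8-fold cross-validation, and results averaged.
```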

III. RESULTS

The mean accuracy obtained across all participants was 67.4% (72.4% for ME and 62.4% for MI). Fig. 2 reports the per-participant classifier accuracy for the ME and MI trials, and the overall accuracy. The accuracy of classifying motor execution was considerably higher than the accuracy of classifying motor imagery. Due to fatigue, subject s1 only performed 10 trials of MI and 10 trials of ME during the second run. This particular stroke survivor was the only one who performed below the chance level (66.5%).

Fig. 2. Mean and SD of classification accuracy across thirty complete cross-validation runs for each subject, for ME, MI and overall trials. The dashed line indicates the upper confidence limit of the chance level (66.5%).

In order to investigate which physiological signals were more meaningful for the classifier, and whether there was an optimal number of features that would maximize the classifier accuracy, we applied a sequential forward feature selection algorithm to find the subsets of features (of 1, 2 and 3 features) that optimize classification accuracy for each subject. The overall accuracy and specificity for the number of features that optimizes the overall accuracy are reported in Table II, together with the optimal subset of features. Even with only one feature, three of the four participants performed significantly better than chance.

In fact, all subjects showed a slight decrease in accuracy as more features were employed in the classifier. From Table II, it is clear that the most reliable physiological response was the skin conductance response (SCR), followed by the heart rate (HR) and blood pressure (BP). The breathing rate added little information to the classifier and, in fact, seems to have had a negative impact.

TABLE II. CLASSIFICATION RESULTS

subject        1 feature       2 features      3 features       4 features
s1             SCR             BP SCR          BP HR SCR        BP HR SCR BR
   acc/spe     51.5 / 50.2     56.7 / 58.8*    55.5 / 62.7      53.6 / 64.8
s2             SCR             BP SCR          BP SCR BR        BP HR SCR BR
   acc/spe     73.6 / 80.4*    72.7 / 74.8     72.9 / 71.1      70.3 / 69.2
s3             SCR             HR SCR          BP HR SCR        BP HR SCR BR
   acc/spe     71.8 / 72.1     78.2 / 77.8*    76.7 / 81.4      68.5 / 74.3
s4             SCR             HR SCR          HR SCR BR        BP HR SCR BR
   acc/spe     79.1 / 76.5     81.8 / 71.4*    81.1 / 69.5      77.4 / 66.2

Classification accuracy (acc) and specificity (spe) in % for each subject and optimal subgroup of features. The features that optimize each subset size are also reported. SCR: skin conductance response, BP: low and very low blood pressure components, HR: heart rate, BR: breathing rate. The maximal accuracy achieved per subject is marked with an asterisk.
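For reference, the sequential forward feature selection used to produce Table II can be sketched as a simple greedy loop. Here `evaluate_subset` is a hypothetical stand-in for the cross-validated HMM accuracy obtained with a given subset of physiological signals; it is not part of the paper.

```python
FEATURES = ("BP", "BR", "SCR", "HR")

def sequential_forward_selection(features, evaluate_subset, max_size=None):
    """Greedily add, at each step, the feature that most improves accuracy.
    Returns the best subset found for each subset size (1, 2, 3, ... features)."""
    selected, remaining = [], list(features)
    best_per_size = {}
    while remaining and (max_size is None or len(selected) < max_size):
        # evaluate every candidate extension of the current subset
        scores = {feat: evaluate_subset(tuple(selected + [feat])) for feat in remaining}
        best_feat = max(scores, key=scores.get)
        selected.append(best_feat)
        remaining.remove(best_feat)
        best_per_size[len(selected)] = (tuple(selected), scores[best_feat])
    return best_per_size

# Per subject: best_per_size = sequential_forward_selection(FEATURES, evaluate_subset),
# where evaluate_subset would run the 8-fold cross-validated HMM classifier
# restricted to the chosen signals and return its mean accuracy.
```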

We also searched for the feature subsets that optimize the accuracy across all participants (i.e. the subset of physiological responses that optimizes the mean accuracy across all participants), and found that the combination of SCR and HR yielded an optimal mean accuracy of 70.5%, slightly higher than the accuracy achieved with all four features (67.4%). The same subset of features optimized the accuracy of the MI classifier (67.0%, therefore above chance). The accuracy of the ME classifier was highest when only the SCR was employed (74.2%).

IV. DISCUSSION

We performed an experiment with four stroke survivors, the results of which showed that motor intention (hand grasping) can be classified based only on peripheral physiological signals (mean blood pressure, breathing rate, skin conductance response and heart rate) with a mean accuracy across participants of 67.4%. In particular, motor execution was classified with an accuracy of 72.4%, and motor imagery with 62.4%. Therefore, attempting to move was a more effective strategy to control the BMI than imagining the movement, despite severe hand impairments and the inability to produce overt movement.

Although there is some evidence that MI causes significant changes in the ANS (i.e. alterations in heart rate, blood pressure and respiration rate) [15], MI is also known to induce large variability across participants, as some individuals have only limited capability to perform kinesthetic imagery (or imagery at all) [16], which could account for the lower MI accuracy reached. In fact, motor imagery is a skill that has to be learned and, therefore, BCIs based on motor imagery usually do not work very well during the first session [17].

Motor intention could be classified above chance level in three of the four participants. We failed to successfully detect motion intention in one stroke survivor, who, due to fatigue, voluntarily withdrew from the experiment after performing only half of the test trials during the second recording run. We suspect that this participant, who was the oldest and most severely impaired, was not sufficiently engaged during the test run and, therefore, the ANS responses were not reliable enough to be correctly classified.

The mean ME accuracy achieved by the three stroke survivors who completed the second test run was 78.4%, above the identified threshold of 70% for proper device control and user-friendly BCI operation [18]. This mean accuracy is comparable to the outcome of our previous experiment with healthy participants, where we achieved an accuracy of 84.5% when using the same physiological signals to detect motor execution of an isometric pinching task [7]. In our previous experiment we speculated that, based on some recent studies that showed the feasibility of using physiological signals in stroke rehabilitation [9], [10], a similar classifier could be used to successfully detect motion intention using the ANS responses in neurologic patients. We hypothesized that the injured ANS could be consistently employed to control a BMI if weak or absent physiological responses were discarded by means of feature reduction algorithms [11]. Indeed, in this study we found that SCR and HR were the most reliable physiological responses, which in combination yielded a mean accuracy of 70.5% (74.2% for ME and 67.0% for MI, both above the chance level). Therefore, in order to further simplify the measurement setup in future work, the breathing and blood pressure measurements could be excluded.

A major limitation of this study is the reduced number of participants involved. However, the objective of this paper was to study the feasibility of a BMI to detect motor intention by monitoring only changes of the ANS in neurologic patients, which was successfully demonstrated. Another limitation is that only offline classification was studied in this paper. However, in order to employ BMIs in useful, real applications, the decoding has to be done online. Here, we employed signal processing steps and classifiers that can be run online, and thereby the proposed method has the potential to be implemented in real time.

Finally, physiological signals might be affected by various environmental disturbances (e.g. auditory or visual stimuli) and by the amount of physical activity. In this study, these disturbances were minimized by conducting the experiment in a silent room and by adding an idle phase to remove possible effects of the haptic feedback on the physiological signals. However, such a setup is not realistic in a standard therapeutic environment. Ideally, the use of a wide range of different physiological features could account for these undesirable disturbances.


V. CONCLUSION AND OUTLOOK

This study showed the feasibility of a BMI to detect motor intention by monitoring only changes of the ANS in neurologic patients. Motor execution was classified using HMM classifiers based only on peripheral physiological signals with an accuracy of 72.4%. However, the accuracy of classifying motor imagery was considerably lower (62.4%, below the upper confidence limit of the chance level). Therefore, attempting to move was a more effective strategy than imagining the movement, despite the inability to produce overt movement. These results encourage further research on the use of the ANS response in BMIs to promote neurologic patients' active participation during robot-assisted rehabilitation.

The long-term goal of this project is to develop novel human-oriented strategies that enhance robot-assisted rehabilitation. In particular, the robotic system should estimate intention so that it can optimally assist a human in the anticipated reaching, grasping or manipulation movement. The use of physiological signals and binary classifiers may not be enough to achieve this goal. Hence, in order to classify in higher dimensions (e.g. "move right" vs. "move left" vs. "rest"), we plan to incorporate measurements of cortical activation acquired through fNIRS.

ACKNOWLEDGMENTS

The authors gratefully acknowledge the contribution of Bertrand Pouymayou, Allen Kim, Jean-Claude Metzger and Johannes Brand for their valuable support.

REFERENCES

[1] P.M. Rossini and G. Dal Forno, "Integrated technology for evaluation of brain function and neural plasticity", Phys. Med. Rehabil. Clin. North Am., vol. 15, pp. 263–306, 2004.
[2] M.A. Perez, B.K. Lungholt, K. Nyborg, and J.B. Nielsen, "Motor skill training induces changes in the excitability of the leg cortical area in healthy humans", Exp. Brain Res., vol. 159, pp. 197–205, 2004.
[3] M. Lotze, C. Braun, N. Birbaumer, S. Anders, and L.G. Cohen, "Motor learning elicited by voluntary drive", Brain, vol. 126, pp. 866–872, 2003.
[4] J. del R. Millán et al., "Combining brain–computer interfaces and assistive technologies: state-of-the-art and challenges", Front. Neurosci., vol. 4, p. 161, 2010.
[5] L.F. Nicolas-Alonso and J. Gomez-Gil, "Brain computer interfaces, a review", Sensors, vol. 12(2), pp. 1211–1279, 2012.

[6] S. Blain, T. Chau, and A. Mihailidis, "Peripheral autonomic signals as access pathways for individuals with severe disabilities: a literature appraisal", Open Rehabil. J., vol. 1, pp. 27–37, 2008.
[7] L. Marchal-Crespo, R. Zimmermann, O. Lambercy, J. Edelmann, M.-C. Fluet, M. Wolf, R. Gassert, and R. Riener, "Motor execution detection based on autonomic nervous system responses", Physiol. Meas., vol. 34(1), 2013.
[8] J. Korpelainen, K. Sotaniemi, and V. Myllyla, "Autonomic nervous system disorders in stroke", Clin. Auton. Res., vol. 9, pp. 325–333, 1999.
[9] A. Koenig, D. Novak, X. Omlin, M. Pulfer, E. Perreault, L. Zimmerli, M. Mihelj, and R. Riener, "Real-time closed-loop control of cognitive load in neurological patients during robot-assisted gait training", IEEE Trans. Neural Syst. Rehabil. Eng., vol. 19, pp. 453–464, 2011.
[10] D. Novak, M. Mihelj, J. Ziherl, A. Olensek, and M. Munih, "Psychophysiological measurements in a biocooperative feedback loop for upper extremity rehabilitation", IEEE Trans. Neural Syst. Rehabil. Eng., vol. 19, pp. 400–410, 2011.
[11] A. Kushki, A.J. Andrews, S.D. Power, G. King, and T. Chau, "Classification of activity engagement in individuals with severe physical disabilities using signals of the peripheral nervous system", PLoS ONE, vol. 7, e30373, 2012.
[12] R. Zimmermann, "A brain-body-robot interface for sensorimotor rehabilitation following neurological injury", PhD dissertation, Eidgenössische Technische Hochschule ETH Zürich, 2013.
[13] J.-C. Metzger, O. Lambercy, D. Chapuis, and R. Gassert, "Design and characterization of the ReHapticKnob, a robot for assessment and therapy of hand function", in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3074–3080, Sept. 2011.
[14] G. Muller-Putz, R. Scherer, C. Brunner, R. Leeb, and G. Pfurtscheller, "Better than random? A closer look on BCI results", Int. J. Bioelectromagn., vol. 10, pp. 52–55, 2008.
[15] J. Decety, M. Jeannerod, M. Germain, and J. Pastene, "Vegetative response during imagined movement is proportional to mental effort", Behav. Brain Res., vol. 42(1), pp. 1–5, 1991.
[16] N. Sharma, V.M. Pomeroy, and J.C. Baron, "Motor imagery: a backdoor to the motor system after stroke?", Stroke, vol. 37(7), pp. 1941–1952, 2006.

[17] G.E. Fabiani, D.J. McFarland, J.R. Wolpaw, and G. Pfurtscheller, "Conversion of EEG activity into cursor movement by a brain–computer interface (BCI)", IEEE Trans. Neural Syst. Rehabil. Eng., vol. 12(3), pp. 331–338, 2004.
[18] A. Kubler, V. Mushahwar, L. Hochberg, and J. Donoghue, "BCI meeting 2005 - workshop on clinical issues and applications", IEEE Trans. Neural Syst. Rehabil. Eng., vol. 14(2), pp. 131–134, 2006.
