
interaction support tool for rehabilitation systems. The Kinect sensor gives three-dimensional information about the user body, recognizing skeleton and joint ...
2012 14th Symposium on Virtual and Augmented Reality

Guidance and Movement Correction Based on Therapeutics Movements for Motor Rehabilitation Support Systems

Alana Da Gama, Thiago Chaves, Lucas Figueiredo, Veronica Teichrieb
Voxar Labs, Informatics Center, Federal University of Pernambuco, Recife, Brazil
{aefg, tmc2, lsf, vt}@cin.ufpe.br

conventional therapies, in order to achieve rehabilitation goals. The use of technology can motivate the patient and provide entertainment, making the repetitive motor control practice more pleasant, as well as diverting the patient's attention from their pain. As a result, patient adherence to treatment has been shown to increase significantly [2-3]. In both VR and AR applications, interaction is a fundamental characteristic. These systems enable the user to interact with real and/or virtual environments, and through these interactive tools it is possible to control the encouragement through feedback and to perform measurements on the movement. This way, the rehabilitation process can be optimized and patient adherence improved [7]. Despite the benefits provided by these systems, it is important to monitor the movement execution the entire time it is being performed [4, 8]. During motor rehabilitation, a wrong exercise can undermine the effectiveness of the therapy, or even be harmful to the patient. According to Rainville and partners [9], the use of postural compensations during therapy can promote pain and also reduce motor ability. Natural interaction is a key concept in these systems. During the treatment, the patient must not be concerned about how the interface works; rather, the interface should interpret the user's intention. This way, the user can focus their attention on the given task (e.g., pursuing a virtual object with the arm) [10]. Moreover, different tools can be used in VR and AR applications, including sensors [6, 11-12], with the intent of improving interaction.
Microsoft recently launched the Kinect, which enables three-dimensional perception from captured images. The Kinect was developed for the Xbox 360 game console [13]; however, since it has proven to be a programmable tool, many works are being developed with the purpose of amplifying its uses, targeting other interaction demands [14-15]. In order to create a rehabilitation system, this study proposes the use of a natural interface, enabling body motion interaction with freedom of movement, aiming to improve the exercise execution by the patient by providing movement guidance and correction according to the therapeutic movement description. Moreover, the use of the proposed tool extends the supervision of the exercise and the assistance to correct it, providing orientation and stimuli. The shoulder abduction movement, due to its frequent

Abstract - The use of Virtual and Augmented Reality systems for motor rehabilitation is increasing, mainly due to new interaction tools that enable interaction through body movements. To improve these systems it is important to provide movement guidance and correction according to the therapeutic description of the movement. The use of natural interaction can help due to the understanding of the user's intentions; e.g., analyzing their body movements allows the application to provide information and encouragement in real time, improving the interaction and the treatment process as a whole. Thus, the aim of this research is to provide guidance and correction for performed movements based on their therapeutic definition. This is applied to motor rehabilitation through the use of the Kinect sensor as an interaction support tool for rehabilitation systems. The Kinect sensor gives three-dimensional information about the user's body, recognizing skeleton and joint positions; however, it does not provide detection of specific body movements. Therefore, the correct description of therapeutic movements (upper limbs, for instance) was implemented in a prototype. A scoring mechanism was also developed in order to measure patient performance, as well as to encourage patients to improve it by displaying positive feedback whenever a correct movement is performed.

Keywords: Rehabilitation, Natural Interaction, Body Tracking, Kinect.
I. INTRODUCTION
Motor rehabilitation is a continuous, repetitive and sometimes slow process which can become tiring and discouraging. Usually, this situation makes the treatment unattractive, promoting patient evasion. A patient satisfaction study showed that satisfaction is strongly correlated with the quality of the patient-therapist relationship, i.e., the amount of time the professional spends with the patient and the clarity of the treatment explanation [1]. Patient participation and interaction with the treatment are essential to the rehabilitation process. The efficacy and intensity of the treatment are considerably improved if the patient is motivated and aware of their activities [2-3], which can be supported through guidance and feedback. Different kinds of rehabilitation methods can be improved through interactive systems such as Virtual Reality (VR) or Augmented Reality (AR) based systems. These methods include orthopedic [4], neurologic [5-6] and cognitive [3] treatments. The use of VR/AR platforms has brought revealing advantages in relation to

978-0-7695-4725-1/12 $26.00 © 2012 IEEE  DOI 10.1109/SVR.2012.15


use in rehabilitation exercises, is used as a case study to evaluate the first system prototype. This paper is organized as follows. First, in Section 2, the major related work regarding AR and VR systems for rehabilitation support is presented. Section 3 describes the proposed rehabilitation support system, including implementation details, an explanation of the movement mapping procedure, the user feedback interface, the system pre-evaluation and user tests. Section 4 shows the case study results and discussions about the system's sensibility, accuracy, movement recognition, feedback and usability. Lastly, some conclusions are drawn and future work is discussed.

gloves [5]. However, such attached equipment is a drawback for natural interaction systems; on the contrary, the user should feel free when experiencing an interface, not being tied to additional objects [10]. Another AR based system was developed for cognitive rehabilitation of children using markers. This study compared the interaction of healthy and disabled children with the system, and observed that the latter were very enthusiastic while using it. In general, all subjects showed themselves to be more motivated, but mainly the autistic and trisomic children [3]. As previously mentioned, the use of markers makes interaction less natural, due to the need for attached equipment, and reduces tracking efficiency due to the high incidence of motion blur. Trying to improve rehabilitation system interaction, a full body motion capture (MoCap) system was developed by Shönauer and partners [12]. The system uses an infrared optical tracker, a passive marker based motion tracking approach in which the user wears a motion suit with retro-reflective markers attached to it, tracked by cameras. This kind of technology had not been used for this application for numerous reasons, such as cost, inflexibility and complexity of operation [12]. This technique improves motion freedom and tracking, but there is the necessity of wearing special clothing, which again runs counter to the concept of freedom promoted by natural interaction. A study focused on unsupervised patient rehabilitation is described by Rado and partners [4]. Using a passive marker based infrared optical motion tracking system, commonly used for gait analysis, knee movements were tracked. This work performs a dynamic performance evaluation, detecting errors and showing the user how to perform the movement correctly [4].
Despite the advantages of movement supervision, the system provides only a local movement analysis, limiting correction to this region. Moreover, as presented before, the use of markers limits interaction with the system. Thus, the work presented here aims to analyze the use of the Kinect as an interaction tool, in order to improve the motor rehabilitation process through a natural interaction based system. Through it, the proposed study plans to enhance therapies by providing body motion interaction with freedom of movement, enabling better movement execution by the patient, with supervision to correct the movement and stimuli to motivate it.

II. RELATED WORK
The application of VR and AR based systems has shown itself, over time, to be a successful tool for the optimization of the most varied treatment procedures. These systems provide multisensory and multidimensional real time interaction [7] and the individualization (and standardization) of the treatment or environment [7, 16], which can be graduated and adapted according to the needs of the rehabilitation program [2, 16]. Furthermore, they also provide patient safety and entertainment through interactivity, as an option to distract patients from their pain [2]. In addition, AR allows interaction with real objects, improving social communication, enabling uses for specific deficiencies, and promoting user motivation [3]. However, the main interaction tools used in these systems are restricted to some body parts, which limits the diversity and control of treatments and also patient freedom during therapy execution. In general, the interaction methods of AR devices applied to rehabilitation are predominantly based on marker techniques [6]. These markers are used as reference to gather information about the scene orientation and the positioning of the virtual object, through which the treatment is directed. It is also common to apply these techniques in association with haptic sensors to improve the interaction [5]. Nevertheless, no body reference is used for interaction in them, making it difficult to analyze the movement carefully, which is a powerful tool not only for the current patient evaluation but also for the storage and future analysis of the patient's progress in the rehabilitation treatment. The benefits of AR systems have already been tested on different populations, including children [3] and stroke survivors [5-6, 11]. A study developed with stroke subjects experimented with a desktop AR system for rehabilitation using markers attached to real objects to add virtual information [6].
The scenario added using AR can orient the treatment, simulating daily activities or inducing a movement that the user should perform during the rehabilitation program. Even so, the use of markers limited the therapy and the user's mobility in the system, mainly due to the limited viewing angle between marker and camera and to tracking failures. The development of useful tools for stroke survivors was also tackled by Luo and partners [5], demonstrating a training environment which integrates AR with assistive devices. In this case, in order to interact with virtual objects, the system required the use of a head mounted display and

III. REHABILITATION SUPPORT SYSTEM
The system introduced in this work consists of a method for movement recognition, focused on the guidance and correction of the movement execution according to the therapeutic movement description, during motor rehabilitation therapies. This orientation can be useful as a complementary therapy and especially for home care therapies, where the physiotherapist cannot supervise the patient, who has to execute the rehabilitation program alone. Through the skeleton tracking provided by the Kinect sensor it is possible to perform the needed analysis. The Kinect is capable of detecting the user's body position without


1) Therapeutic Movements
A standard to categorize the movements is provided by the International Society of Biomechanics (ISB), which describes the movements of each joint according to its anatomy [17]. If the joint is able to move through only one plane and axis, it is a monoaxial joint; if it has mobility in two or three planes, it is a biaxial or triaxial joint, respectively. The shoulder is anatomically a spheroid joint, which enables movement in all three planes and axes [18]. Due to this, it is necessary to have a movement recognition scheme that can differentiate these movements and categorize them correctly. The first reference for movement description is the anatomic position, which is the starting human body position used in movement classification. The anatomic position is the standing position with the palms facing forward and the toes pointing forward (Figure 1). Another important reference is the median line, which crosses the body from head to feet, passing through the center of gravity and dividing the body into right and left sides [19].

any wearable device or calibration, and few computational resources are necessary. Moreover, the sensor works well in most lighting conditions (bright or dark), with the exception of outdoor environments in daylight, due to its infrared tracking method [15]. Since this work focuses on common treatment scenarios, which are typically indoor places, this issue is not relevant. Thus, the sensor showed itself to be an efficient tool towards a more natural interaction, respecting values like user freedom, low latency response, accurate movement tracking, AR feedback support, and so forth. Specific movement recognition was implemented on top of the skeleton tracking provided by the sensor. This way, only movements which are correctly executed result in points for the patient, increasing their score in the rehabilitation session. During the rehabilitation process, each exercise must be executed properly to improve treatment efficacy. Therefore, a score mechanism was developed in order to motivate the patient to achieve a better result each time. Moreover, the point measurement can be used by both the patient and the physiotherapist to analyze the patient's progress in each treatment session, as well as to visualize it from a broader perspective for further decisions.

A. Implementation
The implementation was made in the C++ language, using Visual Studio 2010 Professional. Tests were executed using an Intel Core i3 M460 CPU, 4 GB of RAM, with an Intel Graphics Media Accelerator HD VGA. The Kinect sensor is composed of an RGB camera, an infrared pattern projector as well as an IR camera (giving depth information), and a USB output that enables the connection with a computer [14]. From the combined information of the color and depth images it is possible to develop techniques for skeleton recognition. Tracking is based on training, where a skeleton model is fitted to the depth image using the OpenNI library [15].
The two-dimensional information captured from the environment is given by the RGB image. For three-dimensional space information, a depth map is generated by the Kinect sensor. Through this information, the Z axis is established (growing from proximal to distal). Then, the user's body is tracked and registered by the system. The tracking is maintained continuously; therefore, in case of partial or total occlusion, the system does not lose track of the user if the occlusion lasts less than 10 seconds [15]. In order to perform user detection, a calibration pose (psi pose) is requested.

Figure 1. Tridimensional planes overlapped to body to classify biomechanical movements.

Human body movements are tridimensional, which means the body can move through the three spatial planes: XY, XZ and YZ. These planes receive specific names when applied to the body system, centered on it, representing the plane in which a bone can move in relation to a given joint. The XY plane, named the frontal or coronal plane, divides the body into front and back. The YZ plane, called the sagittal or vertical plane, splits the body into right and left sides. The last one, XZ, the horizontal or transversal plane, divides the body into upper and lower portions [17]. The planes overlaid on the body are presented in Figure 1. The movement descriptions are based on these biomechanical references and also on the bone direction. Movements performed in the frontal plane are analyzed in relation to the median line: if the bone moves toward the center, i.e., toward the median line, the movement is named adduction;

B. Therapeutic Movements Recognition
The needed movement mapping was based on the therapeutic description. In motor rehabilitation, exercises are prescribed according to joint mobility, which is categorized by the planes and axes in which the joint's anatomy enables it to move. This section presents the concepts of therapeutic movements and how to classify them. Thereafter, the movement mapping procedures used to recognize them are described.


if it moves in the opposite direction, it is called abduction (Figure 2a). Movements performed in the sagittal plane are described in relation to the interrelation of the bones connected by the joint: if the bones approach each other, it is a flexion, and if they move away from each other, it is an extension (Figure 2b). The last one is the horizontal plane, where rotation movements occur [19]. The elbow's anatomy provides movement in only a single plane. Due to this, the plane reference described before is not necessary for this joint; only the angle computation is needed.
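The adduction/abduction naming described above can be sketched in C++. This is an illustrative sketch, not the authors' code: it assumes the median line is the vertical through the trunk center, with X as the lateral axis, and the joint names are hypothetical.

```cpp
#include <cmath>
#include <string>

// Hypothetical 3D joint position (axes as in the paper: X lateral, Y vertical, Z depth).
struct Vec3 { double x, y, z; };

// Classify a frontal-plane arm movement as adduction (toward the median line)
// or abduction (away from it) by comparing the wrist's lateral distance to the
// median line at two successive frames. The median line is approximated here
// by the vertical through the trunk center.
std::string classifyFrontal(const Vec3& trunkCenter,
                            const Vec3& wristBefore, const Vec3& wristAfter) {
    double distBefore = std::fabs(wristBefore.x - trunkCenter.x);
    double distAfter  = std::fabs(wristAfter.x  - trunkCenter.x);
    return (distAfter < distBefore) ? "adduction" : "abduction";
}
```

A real classifier would first confirm the bone is moving in the frontal plane (via the plane-normal test described later in the paper) before applying this naming.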

θ_shoulder = arccos( (v_trunk · v_arm) / (|v_trunk| |v_arm|) )        (1)

θ_elbow = arccos( (v_arm · v_forearm) / (|v_arm| |v_forearm|) )       (2)

It is important to notice that the angle measurement for a joint is the same independently of the plane in which the bone is moving. Whether the user raises his/her arm in front of the body (in the sagittal plane) or laterally (in the frontal plane), the angle measure will be the same; however, according to the ISB, these are a shoulder flexion and an abduction, respectively. This situation can be visualized in Figure 2, where the two different movements, shoulder abduction and flexion, have the same angle measure because the same bones are involved in both. In order to describe and classify the movements according to the ISB standards, some information was necessary to define and guarantee that the movement is being executed in a given plane. For this, the normal vector of each plane was determined and the angle between it and the moving bone (for example, the arm for shoulder abduction) was computed. The normal vector of the frontal plane, equivalent to the z axis presented in Figure 2, was obtained from the cross product between two vectors of the trunk: shoulder to shoulder and shoulder to center. For the horizontal plane, a unit vector on the y axis was used to represent its normal (y axis in Figure 2). Finally, the normal of the sagittal plane was computed through the cross product between the other two normals (x axis in Figure 2). For a movement to be classified as belonging to some plane, the moving bone has to be at 90 degrees to the plane's normal, within a range of acceptability which can be configured by the therapist according to the precision needed in the therapy. The use of normal vectors computed from body parts, besides allowing the movement analysis according to the ISB standards, makes it possible to center the tridimensional coordinates on the user's position. This makes user mobility with respect to the sensor possible, so the movement can be recognized independently of the user's position, frontal or lateral to the sensor. During motor rehabilitation it is common for patients to perform postural compensations to make the exercise easier.
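The frontal-plane normal and the 90-degree plane-membership test described above can be sketched as follows. This is a minimal sketch under stated assumptions (the vector-math helpers and joint names are illustrative, not the paper's code):

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 sub(const Vec3& a, const Vec3& b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
double norm(const Vec3& a)               { return std::sqrt(dot(a, a)); }

// Angle in degrees between two vectors.
double angleDeg(const Vec3& a, const Vec3& b) {
    const double PI = std::acos(-1.0);
    return std::acos(dot(a, b) / (norm(a) * norm(b))) * 180.0 / PI;
}

// Frontal-plane normal from two trunk vectors, as described in the text:
// shoulder-to-shoulder crossed with shoulder-to-trunk-center.
Vec3 frontalNormal(const Vec3& leftShoulder, const Vec3& rightShoulder,
                   const Vec3& trunkCenter) {
    return cross(sub(rightShoulder, leftShoulder), sub(trunkCenter, leftShoulder));
}

// A bone is moving in a plane when its angle to the plane normal stays near
// 90 degrees, within a therapist-configurable tolerance.
bool inPlane(const Vec3& bone, const Vec3& planeNormal, double toleranceDeg) {
    return std::fabs(angleDeg(bone, planeNormal) - 90.0) <= toleranceDeg;
}
```

Because the normal is derived from the user's own trunk joints, the test stays valid when the user stands frontally or laterally to the sensor, which is the mobility property claimed in the text.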
This practice can reduce motor ability and, if continuously executed, can promote pain [9]. Due to this, an additional postural analysis was performed, aiming to avoid this kind of compensation. For this, the shoulders' heights (Y coordinates) were compared and must remain similar during the exercise. A height variation is accepted up to a tolerance value for trunk inclination, which can be controlled through a range of acceptability. This range should be configured according to the rehabilitation needs or patient limitations (e.g., healthy subjects can use a range of 10% while scoliosis patients, whose trunk is naturally inclined, will need a larger range, like 20%, depending on the scoliosis degree).
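The shoulder-height comparison above can be sketched as a small predicate. Treating the tolerance as a fraction of the larger shoulder height is an assumption of this sketch; the paper only says the range is configurable (e.g., 10% or 20%):

```cpp
#include <algorithm>
#include <cmath>

// Postural-compensation check: the two shoulders' heights (Y coordinates)
// must stay similar during the exercise. The tolerance is a fraction of the
// larger height (e.g. 0.10 for healthy subjects, 0.20 for scoliosis patients,
// per the text); expressing it as a relative range is an assumption here.
bool shouldersLevel(double leftY, double rightY, double tolerance) {
    double reference = std::max(std::fabs(leftY), std::fabs(rightY));
    if (reference == 0.0) return true;  // degenerate case: both at origin
    return std::fabs(leftY - rightY) <= tolerance * reference;
}
```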

2) Movement Mapping Procedures
For the rehabilitation support system to work properly, it is useful to recognize a therapeutic movement, according to the ISB standards, which should be executed correctly during therapy. This project was undertaken to recognize arm and forearm movements. Among them, shoulder frontal abduction, in which the patient's arm moves from a position attached to the body (with the hand touching the hip) to a position away from it, creating an angle between trunk and arm in the frontal plane (Figure 2a), was chosen as the case study. This movement is widely used in shoulder rehabilitation processes for patients with pathologies like traumas, dislocations, impingement syndrome and post-surgery issues [20].

Figure 2. Shoulder movements: a) Abduction b) Flexion

As previously mentioned, the Kinect provides a depth image for each captured frame, which can be overlaid with the skeleton information provided by the OpenNI library [15]. Using this tool, 3D points can be accessed and used to compute angles and other needed information. The tridimensional position of each joint was extracted from the Kinect and used to compute vectors between successive joints representing body segments, e.g., the elbow-to-wrist segment to represent the forearm. Range of Motion (ROM) is the clinical measure of therapeutic movements, computing the angle of individual joint movement in each plane. Due to this, to complete the movement description it was necessary to compute the ROM of each movement through angle measurement. Each angle was obtained from the arc cosine of the normalized dot product between two consecutive bones. For example, the shoulder and elbow angle computations are presented in Equation 1 and Equation 2, respectively.
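The ROM computation described above (arc cosine of the normalized dot product between adjacent segment vectors) can be sketched as follows; the helper names are illustrative, not the authors' code:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
double norm(const Vec3& a) { return std::sqrt(dot(a, a)); }

// Joint angle (ROM, degrees) as the arc cosine of the normalized dot product
// between two adjacent body-segment vectors.
double jointAngleDeg(const Vec3& segA, const Vec3& segB) {
    const double PI = std::acos(-1.0);
    double c = dot(segA, segB) / (norm(segA) * norm(segB));
    if (c > 1.0) c = 1.0; else if (c < -1.0) c = -1.0;  // guard against rounding
    return std::acos(c) * 180.0 / PI;
}

// Elbow angle from the shoulder, elbow and wrist joint positions: the two
// segments are the upper arm (elbow to shoulder) and forearm (elbow to wrist).
double elbowAngleDeg(const Vec3& shoulder, const Vec3& elbow, const Vec3& wrist) {
    return jointAngleDeg(sub(shoulder, elbow), sub(wrist, elbow));
}
```

A fully stretched arm gives an elbow angle near 180 degrees, which is why the recognition criteria later in the paper require at least 160 degrees there.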


C. Guidance and Movement Correction
The current movement mapping was developed to recognize shoulder and elbow movements. However, the case study presented here works with shoulder frontal abduction to show how this recognition according to the ISB standards can improve rehabilitation systems. In the shoulder abduction movement the arm stays aligned with the body, forming an angle between arm and trunk, and must remain parallel to the frontal plane of the user's body. The correctness of the movement execution in a rehabilitation process is essential for the treatment's efficacy. Due to this, the system is programmed to score a point whenever the user executes the movement correctly. Angle measurements, as well as arm and trunk alignment, are used as criteria to describe the movement. Postural analysis and user compensations during the movement can also be controlled through the system. Aiming to recognize correct shoulder abduction execution, the following descriptors and requisites were used:
i. the shoulder abduction angle must be equal to or greater than 90 degrees at the end of the movement;
ii. the elbow angle must be equal to or higher than 160 degrees (to ensure that the arm is well stretched);
iii. the angle between the arm and the normal vector of the frontal plane must be within the range of 80 to 100 degrees, in order to guarantee the lateral alignment of the arm;
iv. the right and left shoulder heights (Y coordinates) must be similar, within a range of 10%;
v. the current abduction angle must be higher than the previous one;
vi. in order to keep scoring, the user needs to lower the arm (by 30 degrees of shoulder abduction) and perform the complete movement again.
Rehabilitation devices supported with feedback functionalities enable the user to be aware of their goals, i.e., what is important for the treatment program.
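The per-frame decision that combines requisites i-vi can be sketched as a small state machine. This is a simplified reconstruction under stated assumptions (the struct layout and member names are hypothetical, and requisite vi is interpreted as "drop to 60 degrees, i.e. 30 below the 90-degree target, before scoring again"):

```cpp
#include <cmath>

// Per-frame measures assumed to come from the skeleton tracker; the struct
// and names are illustrative, not the authors' code.
struct FrameMeasures {
    double abductionDeg;     // shoulder abduction angle (requisites i, v, vi)
    double elbowDeg;         // elbow angle (requisite ii)
    double armToNormalDeg;   // angle between arm and frontal-plane normal (iii)
    double leftShoulderY;    // shoulder heights (requisite iv)
    double rightShoulderY;
};

struct AbductionScorer {
    int score = 0;
    double lastAngle = 0.0;
    bool armLowered = true;  // the arm must drop 30 deg below target to rescore (vi)

    void update(const FrameMeasures& m) {
        bool correct =
            m.abductionDeg >= 90.0 &&                                  // (i)
            m.elbowDeg >= 160.0 &&                                     // (ii)
            m.armToNormalDeg >= 80.0 && m.armToNormalDeg <= 100.0 &&   // (iii)
            std::fabs(m.leftShoulderY - m.rightShoulderY) <=
                0.10 * std::fabs(m.leftShoulderY) &&                   // (iv)
            m.abductionDeg > lastAngle;                                // (v)
        if (correct && armLowered) {
            ++score;
            armLowered = false;
        }
        if (m.abductionDeg <= 60.0)  // 30 deg below the 90-deg target (vi)
            armLowered = true;
        lastAngle = m.abductionDeg;
    }
};
```

The `armLowered` flag is what forces the user to perform the complete movement again between points rather than holding the arm up.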
During the rehabilitation process, body awareness of the correct movement is important and can be facilitated by feedback directing the treatment according to the goals [6]. This way, aiming to provide a more efficient tool, a feedback system was implemented, taking into account the following considerations. Visual feedback is shown in Figure 3 to Figure 5. The score is increased each time the movement is executed correctly (Figure 3). When the movement is finished correctly, a message indicates to return the arm to the initial (resting) position (Figure 3b). For every five points a congratulation message is given (Figure 3c); this number of points can be chosen by the user. Movement correction is also enabled, describing what is wrong with the movement, marking it on the body, and indicating how to perform it correctly (Figure 3d to Figure 3f).

Figure 3. Visual feedback messages: a) score counter; b) orientation message; c) congratulation message; d-f) correctness message: d) align the arm; e) straighten the elbows; f) align shoulders

It has been shown that the efficacy of physiotherapy treatments is improved with the visual feedback offered by AR and VR systems [21], mainly because better guidance is given for the movement execution, along with a stimulus for doing it. To guide the movement, an additional feedback is provided through a target showing where the arm should reach to complete the movement (Figure 4). Additionally, a movement status bar is presented, which fills gradually according to the movement route (0 to 90 degrees) (Figure 5).

Figure 4. Target guiding user movement

Figure 5. Movement status feedback bar

D. Pre-Evaluation
The pre-evaluation was executed aiming to test the prototype's performance before applying it to patients. To use the system, the user has to stand in front of the Kinect sensor, which is connected to a computer (Figure 6).


Figure 6. User interacting with the system
Figure 7. Correctly executed movement

With the purpose of analyzing whether the angle measurement was equivalent to clinical usage, a comparison with a goniometer was performed by a physiotherapist [22]. Goniometry is a technique for measuring ROM in degrees, mainly dedicated to the amplitudes of human body articulations. The goniometer has two movable arms connected by one axis, which is provided with an angle measurement device. Each goniometer arm is directed to one body part of the studied articulation, composing the angle in question, and is positioned according to existing protocols elaborated to standardize the measure. Using a 41 cm plastic universal goniometer, the active movements were measured. The shoulder abduction goniometry is performed by aligning the goniometer with the lateral epicondyle of the humerus, the middle of the posterior glenohumeral joint line, and a vertical line in the sagittal plane [22]. The Kinect sensor presents a limited horizontal field of view, dependent on the user's distance from it [15]. In order to evaluate the robustness to occlusion, tests were executed with the user alternating between being inside and outside the field of view. Moreover, positions with the user inclined in front of the sensor and laterally positioned in relation to the sensor were tested, aiming to verify the user's freedom of movement when using the prototype, which is important during rehabilitation. The seated position was also tested, simulating patients who are unable to remain standing during the whole therapy, and also to attend to paraplegic patients. Finally, the recognition and score system was evaluated by a physiotherapist, since professional practice enables anticipating the compensations and deviations patients make during the rehabilitation process. Prototype successes and failures were computed from movements executed correctly (Figure 7) and wrongly: 50 and 60 movements, respectively.
Ten repetitions of each kind of wrong movement were performed: anterior and posterior elevation (out of the frontal plane) (Figure 8a and Figure 8b), shoulder abduction with flexed elbow (Figure 8c), reverse movement (up to down) (Figure 8d), course deviation (Figure 8e to Figure 8h) and trunk lateral inclination (postural compensation) for each side (Figure 8i and Figure 8j).

Figure 8. Wrong movements executed: a) anterior elevation; b) posterior elevation; c) shoulder abduction with flexed elbow; d) reverse movement (up to down); e-h) course deviation; i-j) postural compensation (trunk lateral inclination)


The tests and results of the present study were executed after a first evaluation of the prototype, in which some wrong movements were detected as correct. Based on this evaluation, improvements were made, including the normal vector reference and the route analysis. The resulting requirements list was implemented in order to improve the system, and these advancements are already reflected in this article.

varying from few to a lot), task execution (from easy to hard), instruction clarity (confusing to clear) and environment configuration (boring to interesting) [25]. Lastly, there were some questions to identify user learning and interest. The questions asked were: "Would you like to play it again?", "Does the prototype help you to learn the correctness of the movement?" and "How could the system be improved? Suggestions?".

E. User Tests
Motor rehabilitation systems can be widely applied to different kinds of pathologies and rehabilitation programs, including traumatic, neurologic and geriatric therapies. In order to evaluate the system's applicability, user tests were conducted with three different populations: three physiotherapy professionals, four adults and three elderly subjects who are members of geriatric physiotherapy groups and potential users of the system. The physiotherapists were asked to make a technical analysis of the prototype, including application benefits and movement correction capability. The adults, on the other hand, represent the general user group, introduced mainly to evaluate the system's entertainment and ease of use, and to analyze the movement learning process through the system. Finally, the elderly group, who are already undergoing motor rehabilitation therapy and are a potential user group for the system, participated in this study. Firstly, individuals used the prototype; then a questionnaire was applied, based on the VRUSE questionnaire made for VR-based systems [23], a website usability questionnaire [23-24], and a questionnaire for an AR rehabilitation system proposed by Alamri and partners [25]. The questionnaire is detailed below.

IV. RESULTS AND DISCUSSION
The developed prototype, using the Kinect as a natural interaction tool for rehabilitation purposes, showed itself to be responsive to users' movements (including small ones) and effective in evaluating movement correctness and indicating how to adjust it. These characteristics can be used to improve AR and VR technologies for motor rehabilitation, optimizing the treatment process. Evaluating system performance, the mean execution time of the total processing of each frame was 71.56 milliseconds, varying from a minimum of 53 to a maximum of 171. Of this mean, 71.20 milliseconds correspond to OpenNI skeleton extraction and display routines, and 0.46 to mapping and movement analysis (Table 1). These results show good processing times, since performance is an important characteristic in order to achieve natural interaction. In general, it is recommended not to exceed an execution time of 150 ms [10]; thus, the presented system fulfills this requisite with a substantial margin.

TABLE 1: PERFORMANCE TEST: EXECUTION TIME.

1) Usability Questionnaire
The usability questionnaire used in the tests is composed of three parts in which the user should evaluate each defined criterion, assigning a score from one to five, followed by some complementary questions. The first section asked about the user's reaction to the prototype, what they felt while using it, on a scale from 1 to 5 with the following options: from terrible to wonderful, frustrating to satisfactory, discouraging to stimulating, hard to easy, and rigid to flexible [23]. The second part was dedicated to evaluating the interface. The size of the interface lettering was evaluated (from barely to highly legible), as well as the stimulus (few to a lot), the information organization, the terms used, the clarity of information (this last varying from confusing to clear) and the uniform distribution of information over the display area (never to always) [24]. The last part addressed the technical characteristics, analyzing the user's perception of fun, depth perception, recognition of the real environment as part of the exercise, motivation to complete the exercise, exercise comfort, ease of understanding how to perform the correct movement, and orientation assistance for movement comprehension (these

Process                          Average      Minimum    Maximum
Total Execution                  71.56 ms     53 ms      171 ms
OpenNI                           71.20 ms     53 ms      171 ms
Mapping and Movement Analysis     0.46 ms     ≈ 0 ms      65 ms
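The per-frame breakdown in Table 1 can be gathered with a simple timing harness. The sketch below is illustrative only: `extract_skeleton` and `analyze_movement` are hypothetical stand-ins for the OpenNI and mapping/analysis stages, not the prototype's actual code.

```python
import time

# Hypothetical stand-ins for the two pipeline stages timed in Table 1:
# skeleton extraction/display (OpenNI in the prototype) and movement analysis.
def extract_skeleton(frame):
    return {"shoulder": (0.0, 1.4, 2.0), "elbow": (0.3, 1.4, 2.0)}

def analyze_movement(skeleton):
    return "correct"

def time_frame(frame, budget_ms=150.0):
    """Time one frame and report whether it stays under the interaction budget."""
    t0 = time.perf_counter()
    skeleton = extract_skeleton(frame)
    t1 = time.perf_counter()
    analyze_movement(skeleton)
    t2 = time.perf_counter()
    openni_ms = (t1 - t0) * 1000.0
    analysis_ms = (t2 - t1) * 1000.0
    total_ms = openni_ms + analysis_ms
    return total_ms, total_ms <= budget_ms

total_ms, within_budget = time_frame(frame=None)
print(f"total: {total_ms:.2f} ms, within 150 ms budget: {within_budget}")
```

The 150 ms budget follows the recommendation cited in the text [10]; in practice the maximum, not only the mean, should be checked against it.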

A. Sensibility and Angle Accuracy
In order to evaluate whether the angle measured by the system can serve as a therapy measure, it was compared with the data registered by goniometry, which, as said before, is a clinical tool for this purpose. Both angles were computed simultaneously ten times at different shoulder abduction angles. The data showed similar results, with a mean variation of four degrees between the Kinect and goniometry measurements (Figure 9). To evaluate this difference, a one-way ANOVA test was executed [26], analyzing this mean variance and obtaining a probability value (p) of 0.848. Goniometry was chosen as the reference due to its practical and clinical use [22]. As a manual angular measurement it presents good reproducibility, with a variation of 2 to 7 degrees; however, this accuracy depends on the examiner's ability [27]. Furthermore, the small difference found can be explained by the prototype's reference points for angle computation: the articulations are detected by computational methods, while goniometry makes use of anatomical points.
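The statistical comparison above can be sketched in pure Python: a one-way ANOVA F statistic for two groups, applied to paired angle readings. The numbers below are illustrative values, not the raw data collected in the tests.

```python
def one_way_anova_f(group_a, group_b):
    """F statistic of a one-way ANOVA with two groups (pure-Python sketch)."""
    groups = [group_a, group_b]
    all_vals = group_a + group_b
    grand_mean = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Illustrative readings in degrees -- NOT the paper's measured data.
kinect = [30.0, 45.0, 60.0, 75.0, 90.0, 105.0, 120.0, 135.0, 150.0, 165.0]
gonio  = [33.0, 49.0, 63.0, 80.0, 95.0, 109.0, 125.0, 138.0, 154.0, 170.0]

mean_diff = sum(abs(k - g) for k, g in zip(kinect, gonio)) / len(kinect)
print(f"mean absolute difference: {mean_diff:.1f} degrees")
print(f"F statistic: {one_way_anova_f(kinect, gonio):.3f}")
```

Converting the F statistic into a p-value (such as the reported p = 0.848) additionally requires the F-distribution CDF, which in practice comes from a statistics library rather than the standard library.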


As recommended by Valli [10], the user must not feel attached to the interface; to achieve a more comfortable interaction, the user should remain as unconstrained as possible throughout the session.

Figure 9. Mean angles measured with the Kinect and with goniometry (p = 0.848)

B. Mapping and Movement Recognition
Movement recognition was based on angle measurements. The presented prototype aims mainly to make therapy execution more precise while keeping it interactive for the user. To this end, the proposed system recognizes only correct movements, preventing the execution of wrong exercises and compensations. All correct movements, performed following all the determined criteria, were recognized and scored, as shown in Table 2. None of the wrong exercises was recognized as correct, meaning that the system achieved a 100% success rate, with no false positives or false negatives during the recognition process (Table 2). This rate was achieved because the prototype enables a complete movement description, including arm and shoulder position and alignment, evaluated along the whole trajectory.
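The angle-based recognition described above can be sketched as follows. The joint coordinates, the 90-degree target, and the 10-degree tolerance are illustrative assumptions; the prototype's full criteria also check alignment along the whole trajectory.

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def shoulder_abduction(shoulder, elbow, neck, torso):
    """Upper-arm elevation measured against the trunk's downward axis."""
    arm = tuple(e - s for e, s in zip(elbow, shoulder))
    trunk_down = tuple(t - n for t, n in zip(torso, neck))
    return angle_between(arm, trunk_down)

def is_correct(angle_deg, target_deg=90.0, tolerance_deg=10.0):
    """Hypothetical correctness rule: within tolerance of the target angle."""
    return abs(angle_deg - target_deg) <= tolerance_deg

# Arm held horizontally out to the side: abduction should read 90 degrees.
angle = shoulder_abduction((0.0, 1.4, 2.0), (0.3, 1.4, 2.0),
                           (0.0, 1.4, 2.0), (0.0, 1.0, 2.0))
print(f"abduction: {angle:.1f} degrees, correct: {is_correct(angle)}")
```

Clamping the cosine before `acos` guards against floating-point values slightly outside [-1, 1], which otherwise raise a domain error on nearly parallel vectors.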

Figure 10. Correct movement executed with: a) anterior inclination; b) posterior inclination; c) and d) seated position

Patients with different degrees and kinds of limitations and pathologies undergo rehabilitation. For this reason, a simulation with a wheelchair and a seated patient (unable to stand up) was also executed (Figure 10c and Figure 10d). The system succeeded in these cases as well, obtaining the same result as in the standing position, mainly because the precision of the upper body tracking was not affected in the tested sequences. However, for future research involving the lower limbs, the seated position may be a limitation.

Of the 20 postural compensations executed during the tests, none was detected as a correct movement. Postural compensations (Figure 8i and Figure 8j) are commonly performed by patients to make arm elevation easier. This misuse was recognized by the system as a wrong movement and was not scored, showing the prototype's efficacy in avoiding it. Indeed, this is the most common compensation found in rehabilitation processes and, according to Rainville and colleagues [9], it can promote pain and reduce motor ability. Thus, trunk compensation control is extremely important to prevent both problems.
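Trunk compensation of this kind can be detected by thresholding the trunk's inclination from the vertical. This is a minimal sketch: the 15-degree threshold and the joint coordinates are assumed values for illustration, not the prototype's parameters.

```python
import math

def trunk_lean_deg(neck, torso):
    """Inclination of the torso-to-neck segment from the vertical (y) axis."""
    vx, vy, vz = (n - t for n, t in zip(neck, torso))
    norm = math.sqrt(vx * vx + vy * vy + vz * vz)
    return math.degrees(math.acos(max(-1.0, min(1.0, vy / norm))))

def is_compensating(neck, torso, max_lean_deg=15.0):
    """Flag trunk inclination beyond a tolerance; the threshold is an assumption."""
    return trunk_lean_deg(neck, torso) > max_lean_deg

# Upright trunk versus a sideways lean of 45 degrees.
print(is_compensating((0.0, 1.4, 2.0), (0.0, 1.0, 2.0)))  # upright trunk
print(is_compensating((0.4, 1.4, 2.0), (0.0, 1.0, 2.0)))  # leaning trunk
```

A movement frame flagged this way would be rejected rather than scored, which is how wrong repetitions are kept out of the exercise count.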

TABLE 2: CORRECT MOVEMENT RECOGNITION STATISTICS.

Movement    Executed    Recognized as correct    Successes (%)
Correct     50          50                       100
Wrong       60          00                       100

The movement angles and their relation to the thorax normal vector enabled analysis of the user's body at different positions relative to the sensor, including inclined and lateral positions. This gives the user greater mobility when using the system, moving one more requisite towards natural interaction. A correctly performed movement executed at different angles from the sensor can be seen in Figure 10a and Figure 10b; these movements were also included in the correct performance test described in Table 2. Additionally, robustness to occlusions presented favorable results: in all ten executed tests, the system recovered the user after a total occlusion (caused by the user walking out of the sensor's sight and then returning to his last valid position). It is thus possible for the user to walk away from the scene for up to ten seconds without the need for recalibration [15]. Once more, it is important to note the user's freedom during use of the system, whose attention is not required uninterruptedly throughout the session.
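The orientation-invariant measurement described above can be sketched by building an orthonormal trunk frame (shoulder line, spine, and their cross product as the thorax normal) and projecting sensor-space vectors onto it; the joint positions below are illustrative.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def thorax_frame(l_shoulder, r_shoulder, torso, neck):
    """Trunk axes: right (shoulder line), up (spine), normal (facing direction)."""
    right = normalize(tuple(r - l for r, l in zip(r_shoulder, l_shoulder)))
    up = normalize(tuple(n - t for n, t in zip(neck, torso)))
    normal = normalize(cross(right, up))
    return right, up, normal

def to_trunk_coords(v, frame):
    """Express a sensor-space vector in trunk coordinates (dot with each axis)."""
    return tuple(sum(a * b for a, b in zip(v, axis)) for axis in frame)

# User facing the sensor squarely: trunk axes align with the sensor axes.
frame = thorax_frame((-0.2, 1.3, 2.0), (0.2, 1.3, 2.0),
                     (0.0, 1.0, 2.0), (0.0, 1.3, 2.0))
print(tuple(round(c, 6) for c in to_trunk_coords((0.0, 0.0, 1.0), frame)))
```

Because angles are then computed in trunk coordinates rather than sensor coordinates, the same movement yields the same measurement whether the user stands frontal, inclined, or lateral to the sensor.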

C. Usability
The prototype was used by three different populations, a total of ten subjects, to test its usability and its efficacy as a rehabilitation support. Table 3 presents the mean usability score for each category of the questionnaire, on a scale from 1 (weak) to 5 (strong).

The first test was applied with adults, in order to evaluate general usability and interaction. Of the four subjects, three learned to execute the correct movement with some help from the prototype, while the remaining one got it right from the first execution, without any need for external guidance. This group gave a low score to the letter size of the presented interface (3 points, which means low readability) and a high score to playability and depth perception (4.5 points for both).

TABLE 3: USABILITY QUESTIONNAIRE SCORES FOR EACH USER GROUP.

Criterion                                 Physiotherapist   Adults   Elderly
Subjects (n)                              03                04       03
Average age (years)                       26.00             18.75    72.66
Playability                               4.33              4.50     4.00
Satisfaction                              5.00              4.00     5.00
Easy                                      4.66              4.25     5.00
Fun                                       4.33              4.25     4.00
Motivation                                5.00              4.00     5.00
Environment                               4.00              4.25     4.33
Guidance help for movement execution      5.00              3.75     4.33
Legible                                   1.66              3.00     3.66
Stimulus                                  2.66              3.75     4.00
Information clarity                       5.00              4.25     2.00
Depth perception                          4.00              4.50     4.00
Real environment recognition as part
of the exercise                           5.00              4.25     3.66
Instructions                              4.33              3.50     4.00

The following tests were performed with three elderly subjects, given the wide application of motor rehabilitation in this population. Similarly to the adults group, one subject executed the movement correctly on the first try and the other two learned it through the system guidance. The criterion with the lowest score for this group was the clarity of the information (only 2 points), and the highest scores were related to the satisfaction and motivation provided by the prototype (5 points for both).

The last evaluation was performed with a group of three physiotherapy professionals, in order to gather technical opinions about the system and the application itself. The movement execution was aided by the prototype guidance in all three subjects' tests. The negative aspects pointed out were mainly related to the letter size and the stimuli (1.66 and 2.66 points, respectively), while the positive evaluations mostly concerned user satisfaction, the system guidance (towards the improvement of movement execution), the recognition of the real environment as part of the exercise, and the clarity of the information provided by the prototype (5 points each).

The user evaluation showed that the correction and guidance provided by the system were performed with efficacy. However, some interface improvements are needed to achieve better usability, particularly regarding the message letter size and the clarity of information in the exercise guidance. Therewith, postural and biomechanical correction and orientation during treatment execution are possible with good patient acceptability, and patients should be more motivated to execute the therapy correctly.

V. CONCLUSION
This work introduced a movement recognition method based on therapeutic movements, developed using Kinect sensor information to guide and correct their execution. The implemented prototype showed efficacy in detecting correct therapeutic exercises and avoiding wrong movements during the rehabilitation process, thereby preventing lesions and optimizing the treatment. It demonstrated levels of precision and sensibility that enable its adaptation for physically limited subjects. Moreover, the visual feedback supplied by the system favored interaction and promoted better execution of the exercise; future research comparing the prototype on patients with and without feedback can analyze this efficacy further. Based on the positive reports from users and on the prototype's precision and efficacy as a natural interaction tool for rehabilitation purposes, we intend to apply this technology to the development of a complete AR rehabilitation system, taking users' opinions into account to improve aspects such as the legibility and clarity of information.

ACKNOWLEDGMENT
The authors would like to thank FACEPE (APQ-08681.03/10; PBPG-0024-1.03/11; PBPG-0547-1.03/11; BIC0876-1.03/11) for the financial support. They also thank Daniel Freitas, Voxar Labs designer, for the help with the figures.

REFERENCES
[1] P. F. Beattie, et al., "Patient satisfaction with outpatient physical therapy: instrument validation," Physical Therapy, vol. 82, 2002, pp. 557-565.
[2] E. D. De Bruin, D. Schoene, G. Pichierri, and S. T. Smith, "Use of virtual reality technique for the training of motor control in the elderly," Zeitschrift für Gerontologie und Geriatrie, vol. 43, 2010, pp. 229-234.
[3] E. Richard, V. Billaudeau, P. Richard, and G. Gaudin, "Augmented Reality for Rehabilitation of Cognitive Disabled Children: A Preliminary Study," in Virtual Rehabilitation 2007, 2007, pp. 102-108.
[4] D. Rado, et al., "A Real-Time Physical Therapy Visualization Strategy to Improve Unsupervised Patient Rehabilitation," IEEE Visualization, 2009.
[5] X. Luo, et al., "Integration of Augmented Reality and Assistive Devices for Post-Stroke Hand Opening Rehabilitation," in Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, 2005, pp. 6855-6858.
[6] J. W. Burke, et al., "Augmented Reality Games for Upper-Limb Stroke Rehabilitation," in 2nd International Conference on Games and Virtual Worlds for Serious Applications, 2010.
[7] H. Sveistrup, "Motor rehabilitation using virtual reality," Journal of NeuroEngineering and Rehabilitation, vol. 10, 2004, pp. 1-10.
[8] D. Sparks, et al., "Wii have a problem: a review of self-reported Wii related injuries," Informatics in Primary Care, vol. 17, 2009, pp. 55-57.
[9] J. Rainville, J. B. Sobel, C. Hartigan, and A. Wright, "The Effect of Compensation Involvement on the Reporting of Pain and Disability by Patients Referred for Rehabilitation of Chronic Low Back Pain," Spine, vol. 22, 1997, pp. 2016-2024.
[10] A. Valli, "Notes on Natural Interaction," www.naturalinteraction.org, 2005.
[11] L. Yong Joo, T. Soon Yin, D. Xu, E. Thia, C. Pei Fen, C. W. K. Kuah, and K.-H. Kong, "A feasibility study using interactive commercial off-the-shelf computer gaming in upper limb rehabilitation in patients after stroke," Journal of Rehabilitation Medicine, vol. 42, 2010, pp. 437-441.
[12] C. Schönauer, et al., "Full body interaction for serious games in motor rehabilitation," in AH '11: Proceedings of the 2nd Augmented Human International Conference, 2011.
[13] Microsoft, Kinect for Windows, 2011. Available: http://kinectforwindows.org/
[14] J. Shotton, et al., "Real-time human pose recognition in parts from single depth images," in IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2011, pp. 1297-1304.
[15] PrimeSense, OpenNI: User Guide, 2011. Available: http://www.openni.org/Documentation.aspx
[16] R. Kizony, et al., "Video-capture virtual reality system for patients with paraplegic spinal cord injury," Journal of Rehabilitation Research and Development, vol. 42, 2005, pp. 595-609.
[17] G. Wu and P. R. Cavanagh, "ISB recommendations for standardization in the reporting of kinematic data," Journal of Biomechanics, vol. 28, 1995, pp. 1257-1261.
[18] G. Wu, et al., "ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion, Part II: shoulder, elbow, wrist and hand," Journal of Biomechanics, vol. 38, 2005, pp. 981-992.
[19] H. M. Clarkson, Joint Motion and Function Assessment: A Research-Based Practical Guide. Philadelphia: Lippincott Williams & Wilkins, 2005.
[20] C. Kisner and L. A. Colby, Therapeutic Exercise: Foundations and Techniques, 5th ed., F. A. Davis, 2007.
[21] A. J. Espay, et al., "At-home training with closed-loop augmented-reality cueing device for improving gait in patients with Parkinson disease," Journal of Rehabilitation Research and Development, vol. 47, 2010, pp. 571-582.
[22] K. Hayes, J. R. Walton, Z. L. Szomor, and G. A. C. Murrell, "Reliability of five methods for assessing shoulder range of motion," Australian Journal of Physiotherapy, vol. 47, 2001, pp. 289-294.
[23] R. S. Kalawsky, "VRUSE: a computerised diagnostic tool for usability evaluation of virtual/synthetic environment systems," Applied Ergonomics, vol. 30, 1999, pp. 11-25.
[24] NUPE, "Questionário," 2006. Available: https://www.unisinos.br/nupe/_dados/Questionario.htm
[25] A. Alamri, et al., "AR-REHAB: An Augmented Reality Framework for Poststroke-Patient Rehabilitation," IEEE Transactions on Instrumentation and Measurement, vol. 59, 2010, pp. 2554-2563.
[26] W. S. Gosset, et al., "Two students of science," Pediatrics, vol. 116, 2005, pp. 732-735.
[27] A. M. Bovens, et al., "Variability and reliability of joint measurements," American Journal of Sports Medicine, vol. 18, 1990, pp. 58-63.

