A Real Time, Distributed System with Haptic Interfaces for Fine Motor Skill Rehabilitation and its Quality of Experience

Shanthi Vellingiri, Yuan Tian, Balakrishnan Prabhakaran
School of Computer Science, University of Texas at Dallas, Richardson, Texas 75080-3021
Email: [email protected], [email protected], [email protected]
Abstract—The use of haptics in rehabilitation is expanding, and this growth requires an efficient architecture for stable and synchronized operation. In this paper, we propose a distributed architecture with multiple haptic interfaces to help improve fine motor skill disability. Prior work on haptics for rehabilitation and motor recovery relies on stand-alone infrastructures and time-consuming training with recorded exercises. Compared with these existing methods, the proposed system provides three significant contributions: (i) a real-time, 3-phase operation that avoids invariable, time-consuming practice with recorded exercises; (ii) a distributed environment that supports one-to-many remote accessibility for training and practice; (iii) a novel force rendering algorithm that maintains synchronization among the haptic interfaces in the network and provides smooth force feedback in real time. The effectiveness of our system is demonstrated through a user study that also includes an evaluation by a specialist.
I. INTRODUCTION
Haptic frameworks have shown immense potential in various applications, including teaching [1], [2], [3], [6], training the visually impaired [4], and rehabilitation [9], [13], [14], [17]. However, most existing haptic frameworks exhibit common limitations such as stand-alone operation, time-bounded accessibility, and practice restricted to built-in recorded exercises. Moreover, systems with a stand-alone architecture fail to provide the collaborative features required to enhance user experience. These limitations motivate the use of a distributed architecture for a real-time, interactive, stable, flexible, reliable, and synchronized system to improve fine motor skill disability. In this paper, we propose a distributed architecture with haptic interfaces to improve fine motor skill disability in patients. In our proposed system, the haptic interfaces coordinate to provide an interactive framework in which force feedback guides patients through the real-time design and rendering of training exercises. We consider the following aspects when designing our distributed system: real-time design of exercises, error-free rendering, smooth force feedback, and meaningful coordination between the force feedback and the rendered exercise. The contributions of the work presented in this paper are: (i) a new 3-phase framework that helps improve
fine motor skill disability, (ii) real-time design and rendering of training exercises, (iii) smooth force rendering for coordinated operation among the distributed haptic interfaces, and (iv) analysis of the user's quality of experience (QoE). To evaluate our system and the approach, we conducted a survey with novice (5 children in the age group of 4 to 6 years), expert (7 experienced professionals working with haptic technology), and specialist (a physical medicine and rehabilitation doctor) user groups, collecting their opinions and feedback about their experience with the proposed system and its suitability and flexibility for improving fine motor skill disability. Despite some limitations of the approach, the results of the user survey highlight its potential for enhancing the training experience of patients. Currently, the proposed system does not consider the effect of the force exerted by the user on his/her side of the haptic interface. Also, the haptic interfaces deployed in the distributed system are homogeneous in nature; the challenges related to heterogeneity and workspace incompatibility are not evaluated in the proposed approach. The rest of the paper is organized as follows. Section II provides an overview of various existing stand-alone solutions. In Section III, we provide a detailed discussion of the design and architecture of the proposed system. Section IV presents the results of the experimental evaluation and the user study analysis. Section V concludes the paper and mentions the limitations of our approach.
II. RELATED WORK
The potential of haptic technology has mostly been explored through approaches in rehabilitation and teaching. The work discussed in [14] investigates the use of handwriting and calligraphy exercises with a stand-alone haptic interface to train patients with motor disorders and stroke patients. In [12], the authors discuss the benefits of using a 2D bidirectional haptic device with calligraphy exercises to train people to write with their other hand. In [13], the authors propose a stand-alone haptic interface and a control algorithm with 2D exercises to train stroke-affected patients. An approach using haptic interfaces with
an audio guidance system is illustrated in [4], [5] to train the visually impaired to write and recognize characters. A common observation among these applications in rehabilitation, motor skill recovery, and teaching is the use of haptic-enabled calligraphy training exercises in a 2D or 3D environment with force rendering; this prompted us to also look at research approaches that apply haptic technology to teaching. Stand-alone teaching systems that use a haptic interface with a recorded set of Persian and Chinese calligraphy exercises in a 3D simulation environment are discussed in [1], [2], [3], [6], [10], [15]: the teacher inputs a model character, and the student later learns the character with guidance from the haptic interface. The authors of [7], [16] introduced a stand-alone Japanese handwriting learning system using a haptic interface and visual sensors, with predefined training exercises generated from an XML-based schema. In [8], the researchers propose that haptic force feedback and audio guidance benefit the learning of alphabets; this is tested on a stand-alone haptic interface system with a recorded set of Arabic characters, and the results of training are evaluated using a dynamic time warping (DTW) algorithm to recognize the trained character. The majority of these works have concentrated on using a single haptic interface to illustrate the benefits of force feedback for teaching and rehabilitation applications, thereby restricting the force calculations to one device. They also do not address the need for system stability and scalability. Human-computer interfaces for rehabilitation will be beneficial only if the system is distributed across locations and can coordinate effectively among the distributed interfaces. Moreover, the training exercises are usually recorded and replayed, whereas training approaches, particularly in rehabilitation, need real-time design and rendering based on the patient's disability condition and progressive improvement. In our approach, we describe the methodology used to establish real-time design, smooth force rendering, and coordinated operation among the distributed haptic devices to improve fine motor skill disability.
III. PROPOSED SYSTEM
In the proposed system, the haptic interfaces are distributed over the network and coordinate in real time to create an interactive framework that can be used to improve fine motor skill disability. The patient, at one end of the network, experiences force feedback while being trained with exercises rendered in real time by the therapist at the other end of the network. The system, shown in Figure 1, consists of Geomagic Touch haptic devices [18]. The haptic interfaces communicate over the local area network (LAN) using a traditional server-centric, master-slave communication paradigm. The following key design considerations must be addressed to establish consistent operation.
Fig. 1. System Overview.
Coordination and smooth force rendering: Communicating the position and the associated force components for every sample at the 1 kHz haptic rate over the network would make the system slow and uncoordinated. Therefore, efficient algorithms to calculate force feedback must be incorporated to obtain coordination among the haptic interfaces in the network.

Stability: Any unstable operation of the haptic interface due to network impairments or uncoordinated operation can be unsafe. The system must therefore remain stable during force rendering and coordination, with no jitter or vibrations at any time during its operation.

User experience: At any time during the training, the patient must experience the association between the force feedback, the hand movement, and the visual simulation state. A mismatch in this association would not only result in tracing the exercise in a different order, making the system ineffective, but could also worsen the patient's disability condition.

Based on the above considerations, we designed a stable distributed system with real-time design and smooth force rendering of the training exercises that coordinates with the other haptic devices in the network, to showcase its potential to improve fine motor skill disability in patients.

A. Framework

A system to improve fine motor skill disability must follow predefined steps. In our approach, shown in Figure 2, we recommend a framework that defines three phases of operation (Design, Train, and Practice) for an effective haptic-interface-enabled rehabilitation system. The exercise to train and practice is designed on the master haptic interface and rendered to the slave haptic interface in real time. The training phase of our proposed system is important for two reasons: first, it is in this phase that the master haptic interface initiates the rendering of exercises and starts to guide the slave haptic interface; second, this phase involves the challenges of synchronized and coordinated operation. Once training of the exercise is established, the master haptic interface can issue the practice command, allowing the user to practice the exercise as often as advised by the therapist. More details about the phases are provided below:

1) Design Phase: Once communication among the haptic interfaces in the network is established, the training exercise is designed and rendered by the master haptic interface to the slave haptic interface in real time.

2) Train Phase: In this phase, the master haptic interface guides the slave haptic interface from the start to the end of the rendered exercise. The master haptic interface calculates the next position to trace in the alphabet and communicates that position to the slave haptic interface (a minimal sketch of this exchange is given below). On receiving the position, the slave haptic interface calculates the force and exerts it toward the relevant position in the exercise. Through this smooth force rendering, synchronized operation and coordination among the multiple haptic interfaces in the network are achieved.

3) Practice Phase: The intent of this phase is to let the user practice the rendered exercise as many times as advised by the therapist. Force is calculated locally at the slave haptic interface; the master haptic interface is only a silent observer in this phase of operation.

Fig. 2. Phases of the proposed system.
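As one concrete illustration of the train-phase exchange in item 2 above, the sketch below shows the kind of message the master could stream to a slave over TCP: only the next calculated trajectory position is transmitted, never the force components (assuming doubles, roughly 24 bytes of payload per update instead of 48 bytes for position plus force). The packet layout, helper names, and the use of raw POSIX sockets are our own illustrative assumptions; the paper does not specify the wire format.

    #include <cstdint>
    #include <cstring>
    #include <sys/types.h>
    #include <sys/socket.h>   // POSIX sockets; the actual system may use a different transport API

    // Hypothetical train-phase packet: the master sends only the next calculated
    // position of the exercise trajectory, not the force to render.
    enum class Phase : std::uint8_t { Design = 0, Train = 1, Practice = 2 };

    struct TrainPacket {
        Phase  phase;        // current phase of operation
        double position[3];  // next calculated position (x, y, z) in device coordinates
    };

    // Send the next trajectory position from the master to one connected slave.
    bool sendNextPosition(int sock, const double next[3]) {
        TrainPacket pkt{};
        pkt.phase = Phase::Train;
        std::memcpy(pkt.position, next, sizeof(pkt.position));
        // A real implementation would serialize explicitly (byte order, packing);
        // sending the raw struct is kept here only for brevity.
        return send(sock, &pkt, sizeof(pkt), 0) == static_cast<ssize_t>(sizeof(pkt));
    }

Since no force components cross the network, the same packet stream can drive several slave devices, which is consistent with the one-to-many accessibility mentioned earlier.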
B. Haptic Rendering

In this section, we describe the proposed algorithm to render smooth force feedback at the haptic interfaces. Coordination among the haptic interfaces in the distributed infrastructure is essential for synchronized operation between them, and this is achieved through smooth force rendering. A slight mismatch in coordination can collapse the system and increase the severity of the patient's fine motor disability. Certain approaches in the literature exchange both position and force data among the devices to keep the system synchronized in its operation. This is not efficient because, depending on network conditions, the data from the master haptic interface reaches the slave haptic interface late, making the operation intermittent. This eventually leads to a lack of coordination between the visual simulation and the force rendering, and can create unstable operation of the system (oscillations of the haptic interface stylus). In our smooth force rendering algorithm, shown in Figure 3, at every instance of the train phase the next calculated position of the training exercise, not the force to render, is communicated from the master haptic interface to the slave haptic interface. In this way, scalability and coordination are obtained easily, since the master only conveys its next calculated position to the slave devices.

Fig. 3. Algorithm for smooth force feedback.

On receiving the next calculated position from the master haptic interface, each slave haptic interface calculates the force locally, based on its local position and the received position, and renders it to its haptic stylus; synchronized operation between the master and slave haptic interfaces is thereby obtained. Thus, we avoid transmitting the force components for each instance of the training exercise through the network and achieve stable, oscillation-free operation among the haptic interfaces deployed in the network. This approach is used to establish coordination in the Train phase, whereas in the Practice phase the force to render on the exercise is calculated locally at the slave haptic interface and the master haptic interface is just a silent observer.
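The paper does not list the exact force law used in Figure 3, so the sketch below shows one plausible realization of the slave-side step just described: a clamped spring-damper pull toward the position received from the master, computed locally at the slave. The gains, the clamp value, and the type and function names are assumptions made for illustration only.

    #include <cmath>

    struct Vec3 { double x, y, z; };

    // Illustrative slave-side force computation: pull the local stylus toward the
    // master's next calculated position with a clamped spring-damper force.
    Vec3 computeGuidanceForce(const Vec3& target,     // position received from the master
                              const Vec3& stylus,     // current local stylus position
                              const Vec3& stylusVel,  // current local stylus velocity
                              double k = 0.25,        // spring stiffness, assumed value
                              double b = 0.001,       // damping, assumed value
                              double fMax = 3.0)      // maximum force magnitude (N), assumed
    {
        Vec3 f{ k * (target.x - stylus.x) - b * stylusVel.x,
                k * (target.y - stylus.y) - b * stylusVel.y,
                k * (target.z - stylus.z) - b * stylusVel.z };

        // Clamp the magnitude so a large position error (e.g., when the stylus is
        // pulled away from the trajectory) cannot produce an unsafe or oscillatory force.
        double mag = std::sqrt(f.x * f.x + f.y * f.y + f.z * f.z);
        if (mag > fMax) {
            double s = fMax / mag;
            f.x *= s; f.y *= s; f.z *= s;
        }
        return f;
    }

The clamp addresses the stability and safety consideration from the beginning of this section: no matter how far the received target is from the local stylus, the commanded force stays bounded.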
IV. RESULTS AND USER ANALYSIS
The interactive application is developed in C++; OpenGL is used to design and render the exercises. Haptic Device Application Programming Interface (HDAPI) function calls are used to communicate between the interactive fine motor skill rehabilitation application and the haptic interface. The Transmission Control Protocol (TCP) is used for communication between the haptic interfaces in the network. Since calligraphy and handwriting exercises are generally used to train patients with fine motor skill disability, English alphabet characters with linear, cursive, and combined strokes were chosen and designed as experiments, as shown in Figure 4. In addition, simple patterns with up-and-down motions were designed and used as training exercises. In all the exercises mentioned below, the yellow marker is the position of the haptic interface and the magenta marker is the simulation position of the rendered training exercise. Once
rendering is done and the train phase is activated, the smooth force rendering algorithm is initiated, and the calculated force is rendered to the haptic stylus so that it follows the simulation position of the exercise at that particular time frame. The user can observe a stable coordination between the force rendered on the training exercise, the movement of the markers on the training exercise, and the movement of their hand along the training exercise.
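To make the HDAPI integration mentioned above concrete, the sketch below shows how a slave-side servo-loop callback could read the stylus state and command the locally computed guidance force on each tick. The callback structure and the listed OpenHaptics calls follow the standard HDAPI pattern, but the paper does not show its actual callback code, so treat this as an assumed outline; g_target and computeGuidanceForce refer to the earlier sketches.

    #include <HD/hd.h>   // OpenHaptics HDAPI header, as used with Geomagic Touch devices

    struct Vec3 { double x, y, z; };                // as in the earlier sketch
    extern Vec3 g_target;                           // latest position received from the master (assumed global)
    Vec3 computeGuidanceForce(const Vec3& target, const Vec3& stylus, const Vec3& stylusVel,
                              double k, double b, double fMax);   // from the earlier sketch

    // Assumed slave-side servo-loop callback for the train phase, scheduled with
    // hdScheduleAsynchronous(): read the stylus state, compute the guidance force
    // toward the master's latest position, and command it to the device.
    HDCallbackCode HDCALLBACK trainServoLoop(void* /*userData*/)
    {
        hdBeginFrame(hdGetCurrentDevice());

        HDdouble pos[3], vel[3];
        hdGetDoublev(HD_CURRENT_POSITION, pos);     // current stylus position
        hdGetDoublev(HD_CURRENT_VELOCITY, vel);     // current stylus velocity

        Vec3 f = computeGuidanceForce(g_target,
                                      {pos[0], pos[1], pos[2]},
                                      {vel[0], vel[1], vel[2]},
                                      0.25, 0.001, 3.0);

        HDdouble force[3] = { f.x, f.y, f.z };
        hdSetDoublev(HD_CURRENT_FORCE, force);      // apply the guidance force

        hdEndFrame(hdGetCurrentDevice());
        return HD_CALLBACK_CONTINUE;                // keep running at the ~1 kHz servo rate
    }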
Fig. 4. Sample exercises chosen for force rendering.
The result of the smooth force rendering algorithm when the user experiences the train phase of the 2D sample exercises with the alphabets 'A' and 'C', on the X, Y, and Z axes, is captured in Figure 5. The initial oscillations in Figures 5 and 6 are due to the starting position of the haptic interface stylus: as the training exercise is rendered and the force algorithm is executed based on the haptic stylus position and the start position of the training exercise, the force stabilizes once the haptic stylus position comes very close to the position of the training exercise. In some of the sample exercises shown in Figure 4, the haptic stylus was forcefully pulled away from the trajectory of the training exercise to demonstrate that the simulation waits for the haptic stylus position to move closer to its boundary before force rendering becomes effective. Figure 6 captures the force exerted on the X, Y, and Z axes for the sample exercises with the alphabet 'U' and a simple wave pattern. The results of the proposed three phases of our system are depicted in the experiment shown in Figure 7: real-time haptic rendering (Design) of the exercise is the first phase, where the client's haptic stylus position (yellow marker) is at the origin of the figure. The second snapshot is captured when the user is in the Train phase; the haptic position marker is near the trace, allowing the user to learn and be trained with the exercise. The third snapshot is captured when the user practices the exercise; it can be noted that the width of the alphabet increases because the training exercise is practiced several times.
Fig. 5. Force on the X, Y, and Z axes for sample exercises with alphabets 'A' and 'C'.
Fig. 6. Force on the X, Y, and Z axes for sample exercises with the alphabet 'U' and a pattern.

Fig. 7. Results of the design, train, and practice phases at the client's side of the proposed system.

To evaluate our system, the approach, and its flexibility and safe applicability to rehabilitation, a user study was performed with three different user groups; the analysis and inferences are presented in the following sections.

A. Inference from the novice user group

In general, when just shown an exercise, the observed behaviour of a user is to attempt to write the character in the exercise with the haptic stylus in whatever order they already know. One of the reasons behind this ambiguity is not knowing the rules to follow when learning the exercises of that language. Our approach takes care of these issues: the traces of the exercises are guided in order (left to right) so that the user feels, through force feedback, the alphabet or pattern rendered in the exercise. A user study with the novice user group (5 children in the age group of 4 to 6 years) was carried out. The intention of this user study was to identify the users' interest in using a device to practice exercises that they already know, their involvement in the 3-stage operation phases, their way of holding the haptic interface stylus, and their reaction to the force feedback. The evaluation was based on the following metrics:

• Interest to learn with a device
• Acceptance of a new device
• Ease in holding the stylus
• Feel the force feedback
• Differentiate between phases

The observations from the training and practice experiments were recorded and analyzed. It was found that, while the approach generated interest in the exercise, it took some time for users of this group to get used to holding the haptic interface stylus in a proper way. We used a server-centric, wired infrastructure of the system setup for this evaluation. While the training exercise is rendered, the user is briefed about the system and the force feedback that the haptic interface will exert on their hand. The observations of the proposed system from this user group are captured in Figure 8.

Fig. 8. User study observations from the novice group.

Though the users of this group experienced smooth force rendering, some pointed out that the force rendering was fast. A few users preferred guidance throughout the three-stage training process; audio guidance could make them more comfortable with the system and the phases of operation during the training.

B. Inference from the expert user group

Figure 9 captures a user experimenting with one of the training exercises of the proposed system. In this user study, 7 experienced professionals working with haptic technology were considered. Since the experts are familiar with haptic technology and force rendering, we introduced a slight change in the infrastructure of the system: we created a hybrid infrastructure by integrating the slave haptic interface with a wireless client and evaluated the performance of our proposed approach.

Fig. 9. Users experimenting with the distributed haptic interface system.

It was also observed that, due to network behavioral issues, rendering was slow, and eventually the order in which the alphabet was traced differed in some of the experiments. The observations of the proposed system from this user group are captured in Figure 10. The experts concluded that the system is stable in operation, with no oscillations or vibrations on the haptic stylus during the three phases of operation. However, a few suggested focusing on approaches that initiate slow and steady force rendering on the exercise to make the train and practice phases more effective.

Fig. 10. User study observations from the expert group.
C. Opinion from a specialist

The physical medicine and rehabilitation specialist pointed out that, though the system is synchronized in operation with smooth force rendering, the training exercises have to be designed according to the patient's level of fine motor skill disability. Also, in the train phase, the haptic interface at the patient's side was guided so that it follows the trajectory of the rendered training exercise. Instead of strictly making the patient experience the force feedback in the train phase, the specialist suggested providing a bounding box around the alphabet in the exercise and allowing the patient to move the stylus within that boundary. This would provide flexibility in training and would involve the master haptic interface only when the patient tends to move the haptic interface stylus away from the prescribed boundary; a minimal sketch of such a check is given below.
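The sketch below is one illustrative way to realize that suggestion: the guidance force is engaged only when the stylus leaves an axis-aligned bounding box around the character, pulling the stylus back toward the nearest point of the box. The box representation and the reuse of the spring-damper guidance force from Section III-B are our own assumptions, not part of the evaluated system.

    #include <algorithm>

    struct Vec3 { double x, y, z; };     // as in the earlier sketches
    struct AABB { Vec3 min, max; };      // bounding box around the rendered character

    bool insideBox(const AABB& box, const Vec3& p) {
        return p.x >= box.min.x && p.x <= box.max.x &&
               p.y >= box.min.y && p.y <= box.max.y &&
               p.z >= box.min.z && p.z <= box.max.z;
    }

    // If the stylus is inside the box, let the patient move freely (zero force);
    // otherwise pull it toward the nearest point of the box using the guidance
    // force from the earlier sketch (declared here for completeness).
    Vec3 computeGuidanceForce(const Vec3&, const Vec3&, const Vec3&, double, double, double);

    Vec3 boundedGuidanceForce(const AABB& box, const Vec3& stylus, const Vec3& stylusVel) {
        if (insideBox(box, stylus))
            return {0.0, 0.0, 0.0};
        Vec3 target{ std::clamp(stylus.x, box.min.x, box.max.x),
                     std::clamp(stylus.y, box.min.y, box.max.y),
                     std::clamp(stylus.z, box.min.z, box.max.z) };
        return computeGuidanceForce(target, stylus, stylusVel, 0.25, 0.001, 3.0);
    }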
V. CONCLUSION

Most of the existing haptic-interface solutions are stand-alone systems that highlight the importance of force feedback but have not explored the potential of using the devices in a real-time distributed environment to benefit human-computer interaction and the future requirements of tele-rehabilitation. Efficient use of these systems is not practical unless their operational drawbacks and architectural limitations are solved.

In this paper, we have proposed a novel approach that introduces an interactive 3-stage (Design, Train, Practice) framework with real-time design and rendering of exercises for patients to train with and improve fine motor skill disability. We discussed the key challenges in extending the stand-alone approach to a real-time distributed infrastructure and proposed a smooth force rendering algorithm for stable, synchronized, and coordinated operation among the haptic interfaces deployed in the network.

The outcome of the user study highlights that the proposed approach will be beneficial, since future architectures that support rehabilitation and the improvement of fine motor skill disability depend on scalable and flexible infrastructures with stable force rendering during all phases of operation.

The proposed system has some limitations: its performance is inconsistent in hybrid distributed architectures, i.e., when a haptic interface integrated with a wireless client was introduced, network latency affected the performance of the system. We plan to analyze the impact of these network impairments in future work and propose algorithms to make the system stable and coordinated in hybrid distributed architectures as well. We also plan to survey the type and nature of the training exercises that will be needed for patients based on the severity of their disability; this will provide practical exercises that can be used to evaluate our system on patients. In addition, the potential of the proposed approach and the effectiveness of the system can be monitored when tested in real time with suitable training exercises on patients under a therapist's supervision.

ACKNOWLEDGEMENT

This material is based upon work supported by the National Science Foundation under Grant No. 1012975. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

REFERENCES

[1] M. Boroujeni and M. Misagh Daly, Haptic Device Application in Persian Calligraphy, Proceedings of the 2009 International Conference on Computer and Automation Engineering, 2009.
[2] M. Xiong and M. Pennel, Comparing Haptic and Visual Training Method of Learning Chinese Handwriting with a Haptic Guidance, JCP, 2013.
[3] F. Cao and Z. Wu, A Learning System of Qi Gong Calligraphy, 3GCCCE, 2010.
[4] M. Clark, Kanji Writing Training with Haptic Interface for the Visually Impaired, TeX90 Conference Proceedings, 1991.
[5] Q. Francis and Quek, Supporting Learning for Individuals with Visual Impairment.
[6] D. Wang and Y. Zhang, Stroke Based Modeling and Haptic Skill Display for Chinese Calligraphy Simulation System, 2006.
[7] E. Mohamad and A. Mansour, A Haptic Multimedia Handwriting Learning System, Proceedings of the International Workshop on Educational Multimedia and Multimedia Education, 2007.
[8] B. Abeer and Al. Khalifa, Towards the Development of Haptic-based Interface for Teaching Visually Impaired Arabic Handwriting, Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, 2013.
[9] N. Takeuchi and S. Ichi, Rehabilitation with Poststroke Motor Recovery: A Review with a Focus on Neural Plasticity, Hindawi Publishing Corporation, Stroke Research and Treatment, 2013.
[10] Teo and C. Leong, A Robotic Teacher of Chinese Handwriting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2002.
[11] M. Dan and Tan, Haptic Feedback Enhances Force Skill Learning, Proceedings of the Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2007.
[12] P. Nicolo and L. Thierry, A Bidirectional Haptic Device for the Training and Assessment of Handwriting Capabilities, World Haptics, 2013.
[13] M. James and M. Christopher, Haptic Handwriting Aid for Training and Rehabilitation, SMC, 2005.
[14] M. Jatin and P. Mani, Design and Development of Virtual Objects to be Used with Haptic Device for Motor Rehabilitation, JSEA, 2010.
[15] Y. Wua and Z. Yuana, A Mobile Chinese Calligraphic Training System Using Virtual Reality Technology, AASRI Conference on Parallel and Distributed Computing and Systems, 2013.
[16] S. Jorge and A. Alberto, Teaching to Write Japanese Characters Using a Haptic Interface, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2002.
[17] R. Sooraj and N. Akshay, Design and Analysis of a Parallel Haptic Orthosis for Upper Limb Rehabilitation, International Journal of Engineering and Technology, 2013.
[18] Geomagic Touch, http://geomagic.com/en/products/phantom-omni/overview