User-friendly interaction/interface control of intelligent home for movement-disabled people

Z. Zenn Bien, Jun-Hyeong Do, Jung-Bae Kim and Dimitar Stefanov
Division of EE, Department of EECS, 373-1 Guseong-dong, Yuseong-gu, Daejeon, 305-701, Republic of Korea
[email protected], {jhdo, stefanov}@ctrsys.kaist.ac.kr

Kwang-Hyun Park
Human-friendly Welfare Robot System Engineering Research Center, 373-1 Guseong-dong, Yuseong-gu, Daejeon, 305-701, Republic of Korea
[email protected]
Abstract

This paper discusses some aspects of the project "Intelligent Sweet Home", which is primarily focused on the development of control strategies and human-machine interaction in a smart house for assisting aged people and persons with disabilities. In our study, we propose a concept for controlling the robot and home appliances by predefined hand gestures, remotely sensed by ceiling-mounted CCD cameras. By pointing at a piece of equipment with his/her hand, the user selects the device to be controlled. Then, by gesturing and changing the hand posture, the user sets the operation to be executed. The approach overcomes the inconvenience of conventional remote controllers, giving additional freedom to persons with movement deficits as well as to people without disabilities.
1 Introduction
The intelligent house for physically impaired people integrates devices for movement assistance of the resident and devices for continuous monitoring of his/her health status. Such a solution can have a strong positive emotional impact on patients, improving their quality of life, giving them privacy and the feeling of living in an ordinary house rather than in a hospital. The same approach can significantly reduce medical care costs per person. Apart from its social significance, the problem of developing new welfare models is highly challenging to researchers in two main respects:

1. Development of highly efficient, reliable devices for personal assistance, health monitoring and entertainment;
2. Development of a user-friendly interface for easy and efficient operation of home-installed devices.

The effect of the smart house on its inhabitant depends strongly on the set of devices that build the home environment and on the efficiency of the automatic system that synchronizes their operation. The way the inhabitant interacts with the home-installed devices is also a significant aspect of smart house design. The conventional way of controlling home appliances by switches and remote controllers is often not suitable for some categories of people with movement disabilities. In order to offer an alternative for such people, several projects on voice control of home appliances have recently been developed (Jiang, Han, Scuccess, Robidoux & Sun, 2000) (Lee & Keating, 1994). However, acceptable performance of voice-operated systems can be achieved only with a sensitive microphone placed near the user's mouth (Pentland, 1998). In addition, command recognition in a noisy environment becomes difficult and unstable. During the last decade, some studies on gesture recognition for controlling home appliances have been attempted. Among the various human-machine interfaces, hand gesture control is considered a promising alternative because of the natural way of communication it offers. No additional devices, such as remote controllers or head-mounted microphones, are required to control multiple devices from various standing points in the user's house. Currently, most research is focused on the recognition of hand orientation and posture in restricted environments to control a single system (Sato & Sakane, 2000) (Jojic, Brumitt, Meyers, Harris & Huang, 2000). In this paper, we propose a soft remote control system that controls home appliances by recognizing the user's pointing gestures in our Intelligent Sweet Home (Bien, Park, Bang & Stefanov, 2002).
2 Overall structure of the Intelligent Sweet Home
The Intelligent Sweet Home for assisting people with disabilities and aged persons, developed at the Human-friendly Welfare Robot System Engineering Research Center at KAIST, aims at developing and testing new ideas for future smart houses and their control. Our work is based on the idea that the technologies and solutions for such a smart house should be human friendly, i.e. smart houses should possess a high level of intelligence in their control, actions and interactions with users, offering them a high level of comfort and functionality. The Intelligent Sweet Home consists of four main parts: an automatic bed, a bed-mounted rehabilitation robot, the soft remocon, and a network. An overall view of the current scenario of the Intelligent Sweet Home is shown in Fig. 1. The automatic bed, especially designed for the project, is an important component of the home, since people with movement limitations usually spend much time in their beds. The bed can change its pose in various ways. The upper part can be folded from 0 to 90 degrees, and the folded curve approximates the curve from pelvis to waist of the human body by adopting a 4-bar mechanism. The lower part can be folded from -70 to 70 degrees so that the user can keep a comfortable pose. A MANUS robotic arm is attached to the bed in order to perform basic everyday tasks such as fetching a book from the bookshelf and putting it back, bringing a newspaper, delivering a message, and pulling a quilt over the user and putting it away. A 3D position sensor device supplies the position of objects online in real time. The network performs information exchange between the robotic arm, the position sensor device, the camera system, and the hand gesture recognition system.
Figure 1: Overall System of Intelligent Sweet Home
3 The Soft Remote Control System and its structure
A significant part of our project on the Intelligent Sweet Home for assisting the elderly and the handicapped is devoted to the development of innovative solutions for a more natural, human-oriented interface for controlling home-installed devices. Our understanding is that frail disabled users would feel much more comfortable if the applied human-machine interface did not require any sensors attached to the user. In our study we propose a concept for controlling the robot and home appliances by predefined hand gestures, remotely sensed by ceiling-mounted CCD cameras. This part of the project was named the "soft remocon". There are two modes of control of the home environment:

1. Simple mode - The user selects a home appliance by pointing at it. Dwelling or voice is used to confirm the selection and activate the selected device. Pointing at the same device again turns it off. This mode is applied to "on-off" control of the TV, lamps, curtain opening/closing, etc.

2. Extended mode - Using a hand gesture, the user first activates a mode in which a list of tasks and services appears on the TV screen. Then, pointing at the TV and moving his/her hand, the user selects from the menu a command to be executed. Finally, by taking a certain hand posture or using a voice command, the user confirms the selected command and initiates its execution. This mode can be used for changing TV channels; for setting home environmental parameters such as indoor temperature, light intensity and the loudness of audio devices; and for selecting pre-programmed tasks that will be executed automatically by the robot or other home-installed devices.

Figure 2 shows the whole configuration of the soft remote control system.
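The two control modes can be summarized in a small controller sketch. The code below is a minimal illustration; the class and method names (SoftRemocon, simple_mode, extended_mode) are illustrative assumptions, not the project's actual software:

```python
class Appliance:
    """Hypothetical stand-in for one controllable device (TV, lamp, curtain, ...)."""
    def __init__(self, name):
        self.name = name
        self.is_on = False

    def toggle(self):
        self.is_on = not self.is_on
        return self.is_on


class SoftRemocon:
    """Minimal sketch of the two control modes (illustrative names only)."""
    def __init__(self, appliances):
        self.appliances = {a.name: a for a in appliances}

    def simple_mode(self, pointed_device, confirmed):
        # Simple mode: pointing plus dwell/voice confirmation toggles the device.
        if confirmed and pointed_device in self.appliances:
            return self.appliances[pointed_device].toggle()
        return None

    def extended_mode(self, menu, selected_index, confirmed):
        # Extended mode: a command is chosen from the on-screen menu and then
        # confirmed by a hand posture or voice command before execution.
        if confirmed and 0 <= selected_index < len(menu):
            return menu[selected_index]
        return None


remocon = SoftRemocon([Appliance("TV"), Appliance("lamp")])
remocon.simple_mode("lamp", confirmed=True)   # lamp toggled on
```

A second confirmed pointing at the same device toggles it back off, matching the on-off behaviour of the simple mode.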
Figure 2: Soft remote control system

Three ceiling-mounted CCD color cameras with pan/tilt motion are used to acquire images of the room. For simple identification of the commanding hand against the complex background, it is assumed that the user wears a colored (red & blue) hand band. The colored hand band is tracked by means of the CONDENSATION algorithm (Isard & Blake, 1998). Next, image segmentation is applied to extract the hand color region from the neighborhood of the colored hand band region. For the representation of the raw data, a feature extraction procedure is also included. It is followed by a pointing recognition procedure that recognizes the pointing gesture and calculates the orientation angle and pointing direction of the hand. The control procedure ends with sending the appropriate IR signal to control the home appliance. In the development of the soft remote control system, two assumptions were made:

- Assumption 1: The user points at an object with his/her forefinger.
- Assumption 2: The user wears a colored (red & blue) hand band.
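The CONDENSATION tracker is a sampling-based (particle) filter: resample according to the current weights, propagate each sample with a motion model, and re-weight by the measurement likelihood. The following minimal sketch tracks a 2D marker position with a random-walk motion model; the parameter values and the simple Gaussian likelihood are illustrative assumptions, not those of the actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

def condensation_step(particles, weights, observation, motion_std=5.0, obs_std=10.0):
    """One resample-predict-measure cycle of a CONDENSATION-style particle filter.

    particles: (N, 2) candidate image positions of the hand-band marker.
    observation: (2,) measured marker position in the current frame.
    """
    n = len(particles)
    # 1. Resample: draw samples proportionally to the previous weights.
    idx = rng.choice(n, size=n, p=weights)
    particles = particles[idx]
    # 2. Predict: propagate each sample with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # 3. Measure: weight each sample by a Gaussian likelihood of the observation.
    d2 = np.sum((particles - observation) ** 2, axis=1)
    weights = np.exp(-d2 / (2 * obs_std ** 2))
    weights /= weights.sum()
    estimate = weights @ particles  # weighted mean = tracked position
    return particles, weights, estimate

# Demo: track a marker moving along a line in a 320x240-style image.
particles = rng.uniform(0, 320, size=(500, 2))
weights = np.full(500, 1.0 / 500)
for t in range(20):
    obs = np.array([100.0 + 3 * t, 120.0])
    particles, weights, est = condensation_step(particles, weights, obs)
```

After a few frames the sample set concentrates around the marker, which is what makes the tracker robust against clutter in the background.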
The hand recognition procedure involves two problems: 1. Detection of the pointing hand against the complex background; 2. Detection of the hand orientation and the pointing direction. The approaches used to solve these problems are briefly described in the next two subsections.
3.1 Detection of the pointing hand in the complex background
The 640x480 image from each camera is reduced to a 320x240 image using bilinear interpolation for fast computation. In order to reduce the effect of luminance and shadows, we apply a normalized rgb transformation. Since the colored hand band marker has high red and blue components, we extract it from the complex background by applying suitable thresholds to the normalized r and b images separately. For tracking the colored hand band, we use the CONDENSATION algorithm, a sampling-based tracking method. Next, the commanding hand area is extracted from the neighborhood of the colored hand band region by thresholding the normalized r, g and b images and applying a logical AND operation. After removing the remaining small noise using morphological open/close operations, we obtain the hand color region shown in the lower image of Figure 3.
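The color processing steps above can be sketched as follows. The threshold values are illustrative assumptions; in practice they are tuned to the cameras and lighting:

```python
import numpy as np

def normalized_rgb(img):
    # r = R/(R+G+B), g = G/(R+G+B), b = B/(R+G+B): removes much of the
    # luminance variation caused by lighting and shadows.
    s = img.sum(axis=2, keepdims=True).astype(float)
    s[s == 0] = 1.0  # avoid division by zero on black pixels
    return img.astype(float) / s

def band_masks(img, r_thr=0.5, b_thr=0.45):
    # Threshold the normalized r and b channels separately to find the
    # red and blue parts of the hand band (thresholds are assumptions).
    n = normalized_rgb(img)
    return n[..., 0] > r_thr, n[..., 2] > b_thr

def skin_mask(img, r_lo=0.35, r_hi=0.6, g_lo=0.25, g_hi=0.4, b_lo=0.1, b_hi=0.35):
    # Hand color region: per-channel thresholds on normalized r, g and b,
    # combined with a logical AND (ranges are illustrative, not the authors').
    n = normalized_rgb(img)
    r, g, b = n[..., 0], n[..., 1], n[..., 2]
    return (r > r_lo) & (r < r_hi) & (g > g_lo) & (g < g_hi) & (b > b_lo) & (b < b_hi)
```

The remaining small blobs would then be removed with morphological opening/closing (e.g. scipy.ndimage.binary_opening / binary_closing), leaving the hand color region.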
Figure 3: Detection of hand region
3.2 Detection of the hand orientation and the pointing direction
In order to calculate the 3D position of the hand, we use the images from two cameras, as shown in Figure 4. We find the spatial orientation of the lines that connect the center point of each camera with the two end points of the hand region in the sensed image. Then we calculate the global coordinates of the intersection points A and B. The hand orientation vector is obtained from the direction of the line AB.
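This triangulation step can be sketched as follows: each hand end point defines one ray per camera, and its 3D position is recovered as the point closest to the two (possibly skew) rays, a small least-squares problem in the two line parameters. The function names below are illustrative:

```python
import numpy as np

def ray_intersection(c1, d1, c2, d2):
    """Point closest to two 3D rays c_i + t_i * d_i (camera center plus
    viewing direction toward a hand end point). Fails if the rays are parallel."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Minimizing |c1 + t1*d1 - c2 - t2*d2|^2 gives the 2x2 linear system:
    # t1*(d1.d1) - t2*(d1.d2) = (c2-c1).d1
    # t1*(d1.d2) - t2*(d2.d2) = (c2-c1).d2
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    p1 = c1 + t1 * d1
    p2 = c2 + t2 * d2
    return (p1 + p2) / 2.0  # midpoint of the shortest connecting segment

def hand_orientation(A, B):
    """Unit vector along the line AB between the two recovered hand end points."""
    v = B - A
    return v / np.linalg.norm(v)
```

Applying ray_intersection once for each hand end point yields A and B, and hand_orientation(A, B) gives the pointing direction in global coordinates.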
3.3 Experimental Results
The soft remote control system was tested in the Intelligent Sweet Home for control of a TV, an automatic curtain, a lamp and a personal robot. Figure 5 shows some tests with the soft remote control system. The experiments show that the average recognition rate for pointed objects was 95%. In the extended mode, the desired item from the menu tree was selected successfully.
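Selection of the pointed object can be sketched as follows: given the recovered hand position and orientation vector, the system can pick the registered appliance whose stored position makes the smallest angle with the pointing ray. The angular threshold and function name below are illustrative assumptions:

```python
import numpy as np

def pointed_appliance(hand_pos, direction, appliances, max_angle_deg=15.0):
    """Return the name of the registered appliance closest to the pointing ray.

    appliances: dict mapping name -> stored 3D position (saved when the
    appliance is installed or moved). Returns None if no appliance lies
    within the angular cone around the pointing direction.
    """
    hand_pos = np.asarray(hand_pos, dtype=float)
    direction = direction / np.linalg.norm(direction)
    best, best_angle = None, np.radians(max_angle_deg)
    for name, pos in appliances.items():
        to_obj = np.asarray(pos, dtype=float) - hand_pos
        cosang = np.clip(to_obj @ direction / np.linalg.norm(to_obj), -1.0, 1.0)
        angle = np.arccos(cosang)
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```

Storing one 3D position per appliance is also what allows a moved or newly installed appliance to be controlled after its position has been registered.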
Figure 4: Hand orientation vector in global coordinates
Figure 5: Three images from each camera & detected hand region

4 Conclusion
We have developed an intelligent interaction/interface system that uses hand gesture recognition to control various home appliances in the Intelligent Sweet Home. Users can control appliances or robots naturally and easily without any conventional remote controller. Since the system overcomes the inconvenience of conventional remote controllers, it can be useful to people without disabilities as well as to aged people and persons with disabilities. In particular, even if the position of an appliance changes or a new appliance is installed, the user can control the appliance once its position has been registered. In further study, we will investigate automatic tracking of the user's hand without the hand band, as well as the recognition of a greater variety of hand postures.
5 Acknowledgement
This research was fully supported by the Human-friendly Welfare Robot System Engineering Research Center (sponsored by KOSEF) at KAIST.
6 References
Jiang, H., Han, Z., Scuccess, P., Robidoux, S., & Sun, Y. (2000). Voice-activated environmental control system for persons with disabilities. IEEE 26th Annual Northeast Bioengineering Conference, 167-169.

Lee, N. C., & Keating, D. (1994). Controllers for use by disabled people. Computing & Control Engineering Journal, 5(3), 121-124.

Pentland, A. P. (1998). Smart rooms: Machine understanding of human behavior. In R. Cipolla & A. Pentland (Eds.), Computer Vision for Human-Machine Interaction (pp. 3-21). Cambridge University Press.

Sato, S., & Sakane, S. (2000). A human-robot interface using an interactive hand pointer that projects a mark in the real work space. IEEE International Conference on Robotics and Automation (ICRA), 589-595.

Jojic, N., Brumitt, B., Meyers, B., Harris, S., & Huang, T. (2000). Detection and estimation of pointing gestures in dense disparity maps. IEEE International Conference on Automatic Face and Gesture Recognition, 468-475.

Bien, Z. Z., Park, K.-H., Bang, W.-C., & Stefanov, D. H. (2002). LARES: An intelligent sweet home for assisting the elderly and the handicapped. 1st Cambridge Workshop on Universal Access and Assistive Technology (CWUAAT 2002), 43-46.

Isard, M., & Blake, A. (1998). CONDENSATION - Conditional density propagation for visual tracking. International Journal of Computer Vision, 29(1), 5-28.