Pointing-Based User Interface in 3D Space

1P1-A06

Title: Pointing-Based User Interface in 3D Space

Nur Safwati MOHD NOR* and Makoto MIZUKAWA**
*Shibaura Institute of Technology, [email protected]
**Shibaura Institute of Technology, [email protected]

This paper introduces a user interface system design for human-robot interaction in Kukanchi (Interactive Human Space Design and Intelligence). This work addresses interaction issues in real 3D space by developing a natural user interface, focusing on a system design prototype. A hand-gesture-based interface using multiple Kinect sensors as the sensory device is applied in this research. The aim of this study is to analyze and design the system requirements, as well as the feedback, so that gestures can improve the task delivery process of a service robot in daily life.

Key Words: Kukanchi, Natural User Interface, Service Robot, Human Robot Interaction

1. Introduction

Recently, gestures have been widely adopted to provide natural interfaces in both Human Computer Interaction (HCI) and Human Robot Interaction (HRI). Although the application areas differ, the idea of adopting gestures for interaction rests on the same advantages that a gesture-based interface can offer. Through a gesture-based interface, humans can directly and visibly manipulate objects as they wish. Besides, common human interpretation is easier to achieve by symbolizing certain actions with head or hand gestures. A gesture is significant whenever the motion of a human body part carries information [1]. Among the many types of gestures, pointing and symbolic (sign) gestures are the most widely applied in user interface research. While symbolic gestures are very common in human-human interaction, applying them to HRI requires careful consideration of their social interpretation, since the same sign gesture can carry different meanings in different societies. Pointing, by contrast, is a more direct and general gesture for conveying information, which can later invoke task or service delivery in daily life. Much research so far has focused on pointing gestures toward a projection screen or a virtual environment [2][3][4][5].

In this paper, we propose a pointing-based user interface in real 3D space for task or service delivery by a robot in daily life activities. The advantage of hand pointing in a real 3D environment is that the user can act more naturally, as in human-human interaction. Without paying attention to a screen or an attached device, the user can point directly at any object in the space, and the system interprets what type of task the robot should perform, such as 'bring' or 'move' an object. Moreover, this technique avoids the system vulnerability associated with a projection screen or a device showing a virtual environment.

2. System Description and Configuration

2.1 Kukanchi

In the proposed user interface system, there are three main actors: the human as the user who requests a service, the Kukanchi middleware as the bridge between the system input and output, and the robot that delivers the required task. In addition, for future usability evaluation and experimentation, we have limited this user interface system to our developed intelligent room known as Kukanchi (Interactive Human Space Design and Intelligence). In Kukanchi, object information is distributed; this is also known as an environment information sharing system. This distributed object system has been constructed using RT modules [6]. RT modules are composed of RT elements and RT components, each with its own function, which are later integrated by the Kukanchi middleware. In short, the Kukanchi middleware acts as a platform that processes the service request from the user and the data from the sensors into a task request for the service robot. Therefore, in our user interface system the user does not interact directly with the service robot. Instead, the user employs a direct manipulation technique on the Kukanchi middleware to retrieve multiple pieces of information about the objects in daily life.

2.2 System Design

As shown in Fig.1, the focus of this work is on analyzing the system requirements of the user interface and on its initial design prototype. Later, the design prototype will be improved based on usability evaluation and room experiments. In the figure, a service application only selects and uses the necessary functions from the HRI engine. The HRI engine then identifies and recognizes hardware-related matters and automatically performs HRI component switching. Therefore, any service application may exchange information with the HRI engine without concern for the interaction that occurs between the HRI engine and its components. The HRI components are divided into three main components: person detection, gesture recognition and goods localization. The person detection component has sub-components for person identification, face detection and finger detection. Meanwhile, the gesture recognition component and the goods localization component provide the gesture recognition algorithm and the location of the target object, respectively. The gesture recognition process includes tracking of the hand center, the stretched-finger direction and the fingertip positions [7]. Besides, the goods management system is capable of notifying of goods location changes and providing the latest goods position [8].
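The layered design described above can be sketched as follows. This is a minimal illustrative sketch, not the actual Kukanchi implementation: all class and method names, and the placeholder return values, are hypothetical, and the real components wrap sensor drivers and recognition algorithms rather than constants.

```python
# Illustrative sketch of the layered design: a service application talks
# only to the HRI engine, which hides the three HRI components.
# All names and return values here are hypothetical placeholders.

class PersonDetection:
    """Sub-components: person identification, face detection, finger detection."""
    def detect(self, depth_frame):
        # A real component would run on the Kinect depth frame.
        return {"person_id": 1, "face": (120, 80), "fingertips": [(130, 200, 950)]}

class GestureRecognition:
    """Tracks the hand center, stretched-finger direction and fingertip positions."""
    def recognize(self, fingertips):
        return {"gesture": "point", "direction": (0.0, 0.3, 1.0)}

class GoodsLocalization:
    """Provides the latest position of a tagged object (e.g. via RFID)."""
    def locate(self, goods_id):
        return {"goods_id": goods_id, "position": (2.1, 0.4, 0.8)}

class HRIEngine:
    """Hides hardware details and component switching from the application."""
    def __init__(self):
        self.person = PersonDetection()
        self.gesture = GestureRecognition()
        self.goods = GoodsLocalization()

    def interpret_pointing(self, depth_frame, goods_id):
        """Combine the three components into one task-level result."""
        p = self.person.detect(depth_frame)
        g = self.gesture.recognize(p["fingertips"])
        target = self.goods.locate(goods_id)
        return {"gesture": g["gesture"], "target": target["position"]}
```

The point of the layering is that a service application calls only `HRIEngine.interpret_pointing`; swapping a sensor changes a component, not the application.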

No.12-3 Proceedings of the 2012 JSME Conference on Robotics and Mechatronics, Hamamatsu, Japan, May 27-29, 2012. 1P1-A06(1)

Fig.1: General description of overall user interface system.

User initialization also plays an important part in this work since, by using the Kinect as the sensory device, we need to deal with one of its limitations: slow user initialization [7]. Initially, the system tracks the body and head of the user from the depth sensor of the Kinect. Next, our recognition system identifies the dynamic hand gesture of finger pointing and translates it into the necessary information. The essence of our research is that we want to process and analyze the user's finger pointing in real 3D space using multiple Kinect sensors in a room environment. The crucial part is therefore how to synchronize the data from these multiple Kinect sensors for pointing translation and task description. To solve this issue, we propose to activate only the data from the one Kinect sensor that tracks the user facing it. The idea is that, at first, all the Kinect sensors detect the presence of the user in the room (user initialization). Once the user makes a pointing gesture toward any object in the room, the system interface uses the data generated from the Kinect sensor that gives the most accurate or acceptable fingertip and head direction. This is reasonable since, in real life, the user naturally faces the direction of the object they want to point out. Next, our user interface system allows the user to view, control and manipulate the pointed object directly in 3D space. After that, the system gives feedback on the user's pointing action by showing a virtual cursor on the object, confirming that the system understands the user's intention before passing the information to the application middleware (service robot).
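The single-sensor activation strategy described above can be sketched with a simple facing score: among several Kinect sensors, keep the one the user faces most directly, since that view should yield the most reliable fingertip and head-direction data. The scoring function below is a hypothetical illustration; the paper does not specify the exact selection criterion.

```python
import math

def facing_score(head_dir, head_pos, sensor_pos):
    """Cosine between the user's head direction and the vector to a sensor.
    A score of 1.0 means the user looks straight at the sensor."""
    to_sensor = tuple(s - h for s, h in zip(sensor_pos, head_pos))
    dot = sum(a * b for a, b in zip(head_dir, to_sensor))
    norm = math.dist(sensor_pos, head_pos) * math.hypot(*head_dir)
    return dot / norm if norm else -1.0

def select_active_kinect(head_pos, head_dir, sensors):
    """Return the id of the sensor the user is facing most directly.

    sensors: mapping of sensor id -> 3-D sensor position.
    """
    return max(sensors, key=lambda sid: facing_score(head_dir, head_pos, sensors[sid]))
```

For example, a user at the origin looking along the z-axis selects a sensor mounted ahead of them rather than one off to the side; only that sensor's skeleton and fingertip stream would then be passed on for pointing translation.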

Fig.2 to Fig.4 show the HRI components and their implementation layers in detail. For each component, the system specifies its own hardware or sensor to retrieve the required information. As mentioned before, the Kinect is used to track the user and interpret the gestures made by the user. In addition, an RFID tag reader is used for goods localization, providing goods position data together with a wide variety of RT functional modules [8].

Fig.2: First hardware layer.

Fig.3: Second hardware layer.

Fig.4: Third hardware layer.
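Once the active sensor has produced a fingertip position and a stretched-finger direction, and the goods management system has supplied stored object positions, the two must be matched to decide which object is being pointed at. One plausible way, sketched below, is to pick the registered object whose position lies closest to the pointing ray; this matching step is an assumption for illustration, as the paper does not specify the algorithm.

```python
import math

def distance_to_ray(point, origin, direction):
    """Perpendicular distance from a 3-D point to the pointing ray
    starting at `origin` and running along `direction`."""
    d = math.hypot(*direction)
    u = tuple(c / d for c in direction)           # unit direction
    v = tuple(p - o for p, o in zip(point, origin))
    t = max(0.0, sum(a * b for a, b in zip(v, u)))  # clamp: ignore points behind the hand
    closest = tuple(o + t * c for o, c in zip(origin, u))
    return math.dist(point, closest)

def pointed_object(goods, fingertip, direction):
    """Return the id of the goods whose stored position best fits the ray.

    goods: mapping of goods id -> 3-D position from the goods management system.
    """
    return min(goods, key=lambda gid: distance_to_ray(goods[gid], fingertip, direction))
```

The selected id is what the system would confirm with the virtual cursor before forwarding the task request to the Kukanchi middleware.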


3. Conclusion

Our proposed pointing-based user interface is introduced to provide an interactive and natural method of interaction between humans and a service robot. Since human-robot interaction in Kukanchi is designed as a distributed object system, there is a need for a natural user interface system that interacts directly in real 3D space. This method is targeted at improving the interaction time and simplifying the service applications adopted in Kukanchi.

References
[1] Cristina Suemay, M.Y., Francisco, P.L., Javier, V.G., "Advanced and Natural Interaction System for Motion-Impaired User", Universitat de les Illes Balears, Doctoral Thesis, 2009.
[2] Kexi, L., Daisuke, S., Masahiko, I., Takeo, I., "Roboshop: Multi-Layered Sketching Interface for Robot Housework and Management", CHI 2011, pp. 647-656.
[3] Kentaro, I., Yoshiki, T., Masahiko, I., Takeo, I., "Drag and Drop Interface for Registration-Free Object Delivery", RO-MAN 2010.
[4] Sato, E., Yamaguchi, T., Harashima, M., "Natural Interface Using Pointing Behavior for Human-Robot Gestural Interaction", IEEE Transactions on Industrial Electronics, Vol.54, No.2, pp. 1105-1112, 2007.
[5] Bolt, R.A., "Put That There: Voice and Gesture at The Graphics Interface", MIT, Architecture Machine Group, Cambridge, 1980.
[6] Ishiguro, Y. et al., "Architecture of Kukanchi Middleware", URAI 2011.
[7] Dung, L. and Mizukawa, M., "A Fast Hand Feature Extraction Based on Connected Component Labeling, Distance Transform and Hough Transform", Journal of Robotics and Mechatronics, Vol.21, No.6, 2009.
[8] Mayama, K. and Mizukawa, M., "Goods Management Framework and Verification", Shibaura Institute of Technology, Human-Robot Interaction Laboratory, 2012.
[9] OMG Robotics-11-05-01, Robotic Interaction Service (RoIS) Framework, Revised Submission, http://www.omg.org/cgi-bin/doc?robotics/2011-05-01
