CREATING AN INTERACTIVE SUPPLY STUDENT

Deborah I. Fels†, Laurel A. Williams†*, Graham Smith‡, Jutta Treviranus§ and Roy Eagleson*

† Ryerson Polytechnic University, 350 Victoria St., Toronto, ON, CANADA, M5B 2K3; phone (416)-979-5000 ext. 7619; fax (416)-979-5249; email [email protected]
* University of Western Ontario, 1151 Richmond St., London, ON, CANADA, N2L 3G1
‡ Telbotics Inc., 317 Adelaide St. W., Suite 504, Toronto, ON, CANADA, M5V 1P9; phone (416)-586-9434; fax (416)-586-0341; email [email protected]
§ ATRC, University of Toronto, 130 St. George St., Toronto, ON, CANADA, M4L 3P3; phone (416)-978-5240; fax (416)-971-2629; email [email protected]
ABSTRACT

When an elementary or secondary school student is away from school for an extended period of time due to illness, the student is provided with a tutor or access to in-hospital classrooms to keep up with his/her studies. This arrangement is not only expensive but isolates the child from normal, everyday classroom experiences. PEBBLES (Providing Education By Bringing Learning Environments to Students), a remote-controlled communication system, was developed to give a student access to his/her regular classroom from the hospital or home and to provide the student with a classroom presence. PEBBLES consists of a modified video conferencing system which allows two-way visual and audio communication between the class/teacher and the remote student using ISDN (Integrated Services Digital Network). Four studies were carried out with children using various versions of PEBBLES. Each study has had an impact on the design of the PEBBLES system. This paper reports the design process and the human factors involved in creating this very innovative technology.

1. INTRODUCTION

A multidisciplinary research team at Ryerson Polytechnic University, the University of Toronto and Telbotics Inc. has developed a modified video conferencing system, PEBBLES (Providing Education By Bringing Learning Environments to Students). One of the goals of our research is to investigate and construct a communication system which will allow a child to interact with his/her regular school while in hospital. The student is represented by, and controls, a "robot"-like mobile unit which is in the classroom.
Social interactions are reported to make an important contribution to the development and well-being of children [1]. Most of these interactions take place at school. When children are in the hospital, they are isolated from their normal peer group and from the benefits of social interaction. This isolation can result in increased stress during a hospital stay as well as during the re-integration process [2].

PEBBLES' design presents issues which are distinct from research into remote control, video conferencing or user interface design for children. Providing effective control of remote systems is a complicated problem which is exacerbated by the use of video conferencing. In addition, research into telelearning normally focuses on distance education with adult participants and a remote lecturer, and typically takes place in a static and structured classroom environment. In contrast, PEBBLES integrates a remote participant into a dynamic and unstructured classroom environment. Finally, considerably less research has been done into human-computer interaction involving children, particularly with a complex integrated system such as PEBBLES. Thus, a combination of approaches must be used to produce an effective strategy.

Remotely controlled robotic arm systems have been developed and investigated for use by people with disabilities (including children) [3], [4], [5], [6]. Researchers suggest that the relationship between these systems and their users should be supported by integrating human factors principles such as task functions, user needs and interface design with the technical requirements and limitations of the robot. In order to promote effective interaction between the child in hospital and the classroom participants, the classroom component of PEBBLES must also support the needs of the support staff, teachers, parents and other students in the classroom.
The problem of providing effective control of remote systems has been well documented. A number of different strategies have been employed for controlling remote systems, including: 1) direct manipulation using video monitoring and a joystick, programmable switches, etc. [3]; 2) manipulation of a graphic representation of the environment [7]; 3) voice control of robots [8], [9]; 4) robot navigation using control languages [3], [10]; and 5) multidimensional (3-D, 6-D) manipulation with 3-D or stereoscopic displays [11], [10], [7]. Many of these control strategies have been investigated for robotic aids performing specific manipulation tasks, such as moving physical objects from one place to another as directed by a remote user. Remote control of PEBBLES should facilitate two-way communication between the student and the individuals in the classroom, and should support tasks such as gaining attention to ask or answer a question, moving to face an appointed group, and turning toward a speaker. PEBBLES must move as directed by the remote student, allowing that student to have a presence in the classroom.
Less research and development has been done on designing interfaces for children than for adults [17]. Alloway [18] suggests that interface design for children, specifically input device design, should be based on children's stated preferences rather than "adult logic or reasons." Commercial computer products specifically oriented toward children are available in the entertainment sector (e.g., Nintendo and Brøderbund). The popularity of these products with children seems to indicate that the interface designs are largely successful [19]. Children are the users of PEBBLES, and thus their needs and knowledge levels are considered in the design of the system.
Research in video conferencing has focused primarily on interactions between adults [12]. Generally these interactions take the form of meetings with specific, goal-oriented agendas (e.g., the Montage system developed by Tang and Rua [13], and the Hydra/Brady Bunch systems developed by Buxton, Sellen and Sheasby [14]), or distance learning activities [15] in which the teacher lectures to students via a video conferencing system.
Buxton [16] suggests that a crucial element in the success of video conferencing is the ability of users to have a social presence. The concept of video-mediated presence has been studied in structured, adult environments where participants are following known and practiced social protocols and roles. Researchers suggest that the interactions between users of a video conferencing session are improved through a more realistic sense of presence and awareness of each other [13]. In this project, we are attempting to provide students with a video-mediated presence in an unstructured and dynamic learning environment through the communication systems described in this paper. Presence is provided by three representations: physical (a "robot"-like device, with a "head" and "body", located in the classroom); an audio/visual interface (a video conferencing system); and remote control of the system by the user.

This paper provides a description of the process of construction and evaluation of two prototype communication systems and the remote control interface. Four studies were carried out with children using various versions of PEBBLES. Each study has had an impact on the system design. This paper reports the design process and the human factors involved in creating this very innovative technology.

2. SYSTEM DESIGN, DEVELOPMENT AND EVALUATION
The design and development of PEBBLES has progressed through three design and modification iterations and four pilot studies. While there is insufficient data to report any statistically significant results, the design process is insightful. Typical desktop video conferencing uses two static computer-based systems that have cameras, microphones and computer screens. People tend to react to the images on the screen much as they do to television: they "sit and watch". There is little movement, and interactions are usually limited to verbal transactions. One of the goals in the design of PEBBLES was to avoid the television metaphor and encourage people to interact and participate. Toward this goal there have been several design milestones. First, the McLuhan Programme in Culture and Technology at the University of Toronto introduced a mobile robotic video conferencing system, named Senator Pobot, onto the streets of Washington, D.C. Senator Pobot was a "robot-like" device that could be moved around on the street from a remote location. Telephone tones and a telephone keypad were used to remotely control the system; an FM signal from the remote location was broadcast to the device on the street.
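As a speculative sketch of this style of control, the fragment below maps decoded touch-tone (DTMF) digits to movement commands for the street unit. The digit assignments, command names and drive callback are assumptions for illustration only; the original Senator Pobot mapping is not documented in this paper.

KEYPAD_TO_COMMAND = {
    "2": "forward",      # assumed digit assignments
    "8": "backward",
    "4": "turn_left",
    "6": "turn_right",
    "5": "stop",
}

def on_dtmf_digit(digit: str, drive) -> None:
    """Dispatch a decoded touch-tone digit to the mobile unit's drive system."""
    command = KEYPAD_TO_COMMAND.get(digit)
    if command is not None:
        drive(command)   # e.g., relayed over the FM link to the street unit

# Example: print the commands instead of driving real hardware
for tone in "2645":
    on_dtmf_digit(tone, drive=print)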
Senator Pobot was designed to introduce "people on the street" to video conferencing and robot technologies, and to gather observational data about people's reactions to the technology in an unstructured environment. Insight was gained into some of the technical and social problems and issues surrounding this technology. There was a positive reaction from people on the street to the unstructured environment. The PEBBLES project was born as an extension of the Senator Pobot project. We want to examine the issues of promoting social presence through the use of video conferencing combined with simple robotics technologies (i.e., combining virtual and physical technologies). In addition, there is an opportunity to develop highly sophisticated technologies for use by children and to study their interactions with them.

2.1 PEBBLES I

The initial system, PEBBLES I, was a proof-of-concept model. It was built for approximately $1,500, not including the video conferencing systems (which were donated). PEBBLES I uses video conferencing to provide two-way audio and video via ISDN (Integrated Services Digital Network). An IBM-compatible 80486 computer equipped with a PictureTel PCS100 video conferencing system provides communications from the remote end of PEBBLES (in hospital). On the classroom end, a Mitsubishi Diamond Series 9000 system is used. PEBBLES I uses the user interfaces provided by the manufacturers of the video conferencing systems (no modifications were made). The remote user's image (head and shoulders) and voice are captured by an ordinary video camera and a hands-free headset microphone, respectively. The video and audio are transmitted to the classroom end and output on a television and its internal speakers. In the classroom, images and sounds are gathered using a Canon VC-C1 communications camera and room microphones. The classroom video and audio are transmitted to the remote end of PEBBLES I and output on the computer screen and through external speakers. Both ends of the video conferencing system allow local video feedback so that the user can see him/herself on the computer screen and the classroom participants can see themselves on the classroom television monitor. Figure 1 provides a schematic view of PEBBLES I.
Figure 1: Schematic representation of PEBBLES
The classroom end of PEBBLES is on wheels so that the remote student can be pulled around the classroom, allowing him/her to participate in a variety of activities (see Figures 2 and 3). The classroom television monitor is mounted on a pedestal with the centre of the monitor at 107 cm, approximately eye level for a child (age 7-13). The pedestal allows the classroom camera (mounted on top of the television) and the television to pan together left and right in response to left and right control signals from the user. The up, down, zoom in and zoom out controls tilt and zoom the classroom camera. The attention control signal activates a red light on top of the classroom television in order to gain the attention of the teacher or classmates without interrupting.
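As a minimal illustration of the behaviour described above, the sketch below routes the seven control signals to the pedestal, camera and attention light. The class and method names, step sizes and hardware interfaces are assumptions for illustration, not the actual PEBBLES implementation.

PAN_STEP_DEG = 5    # pedestal rotation per left/right press (assumed value)
TILT_STEP_DEG = 5   # camera tilt per up/down press (assumed value)
ZOOM_STEP = 1       # zoom increment per zoom press (assumed value)

class ClassroomUnit:
    """Routes the seven remote control signals to the pedestal, camera and light."""

    def __init__(self, pedestal, camera, attention_light):
        self.pedestal = pedestal                # rotates the TV monitor and camera together
        self.camera = camera                    # tilts and zooms independently of the pedestal
        self.attention_light = attention_light

    def handle(self, command: str) -> None:
        if command == "left":
            self.pedestal.rotate(-PAN_STEP_DEG)
        elif command == "right":
            self.pedestal.rotate(+PAN_STEP_DEG)
        elif command == "up":
            self.camera.tilt(+TILT_STEP_DEG)
        elif command == "down":
            self.camera.tilt(-TILT_STEP_DEG)
        elif command == "zoom_in":
            self.camera.zoom(+ZOOM_STEP)
        elif command == "zoom_out":
            self.camera.zoom(-ZOOM_STEP)
        elif command == "attention":
            self.attention_light.on()           # red light on top of the classroom television
        else:
            raise ValueError(f"unknown command: {command}")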
Figure 2: Image of the remote end of PEBBLES I designed for a single user.
Performance of the system on computer-oriented activities was poor; however, this type of activity may be uncommon in classrooms. Thus, support for this activity is not a priority. Further evaluation is required to determine the effectiveness of the system in other classroom settings with a variety of activities and users, and over an extended time.
Figure 3: Classroom end of PEBBLES I, designed to represent the remote user in class.

Children's control preferences for this type of system were gathered in an informal study performed early in the life-cycle of this research, resulting in the specification of a Nintendo control pad as an input device [20]. A Nintendo controller is used as the interface to the video conferencing system to perform the seven control actions associated with PEBBLES (left, right, up, down, zoom in, zoom out, attention); one possible button mapping is sketched below.

PEBBLES I was tested in two pilot studies. In the first pilot study, six 8- to 10-year-old Cub Scouts participated in a 2-hour computer session to obtain their computer badge. One of the Cubs participated from a remote location. In the second study, a senior with a disability participated in a two-week multimedia workshop, using the system from home. In these studies we collected data on the types of interactions, the success rate of those interactions, and the error types for the control actions. Subjective attitudes of the participants, their classmates and teachers toward use of the system were also measured.

In general, the results indicated very positive subjective attitudes toward PEBBLES by all people who participated. PEBBLES seems to be successful in allowing a student to participate in some classroom activities, including communicating one-on-one and as part of a group. Success was indicated by a positive subjective attitude and a low number of control errors. The Nintendo interface appears to be an effective control method for children but may not be as effective for seniors. The results for use of the attention device are more negative.
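The fragment below is a rough sketch of how the game pad might be wired to the seven control actions and forward them to the classroom end. The button assignments, names and send_command callback are assumptions for illustration; the actual PEBBLES button layout is not documented here.

BUTTON_TO_COMMAND = {
    "dpad_left": "left",        # assumed button assignments
    "dpad_right": "right",
    "dpad_up": "up",
    "dpad_down": "down",
    "button_a": "zoom_in",
    "button_b": "zoom_out",
    "button_start": "attention",
}

def on_button_press(button: str, send_command) -> None:
    """Translate a game-pad button press into one of the seven control actions
    and forward it over the video conferencing link to the classroom end."""
    command = BUTTON_TO_COMMAND.get(button)
    if command is not None:
        send_command(command)   # e.g., classroom_unit.handle(command) on the far end

# Example: log the resulting control action instead of sending it
on_button_press("dpad_left", send_command=print)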
Although the Nintendo controller was originally specified as an interface for children, it may be appropriate for other users (adults, people with disabilities). The controller is physically small and may suit a person with limited movement in his/her hands. It has a fairly simple button interface, so that control movements are made by pressing a button rather than manipulating a pointing device such as a mouse or a trackball. Older adults may find button controls easier to understand and use than pointing devices.

In the first pilot study the attention device was a red light. It was only activated while the child was pressing the attention control button (normally for less than 2 seconds). Upon analysing the success rate of this device, we found that approximately 56% of the attempts to gain attention were missed by the instructors (unsuccessful). In an independent investigation conducted by the researchers comparing various attention-getting devices, 100% of the attempts to gain attention by activating a light which remained on for 10 seconds were unsuccessful. Hence the attention-getting device needs to be more intrusive so that it will be noticed. Based on our observations of attention-getting behaviour in face-to-face situations, movement is required in order to attract the attention of the teacher through his/her peripheral vision. In the second pilot study, the attention light flashed on and off for approximately five seconds after the attention control button was pressed. Again, the attempts to gain attention were highly unsuccessful (66% failure rate). Based on further analysis of attention-getting behaviour in face-to-face situations, an audio component was deemed necessary. We decided to completely redesign the attention-getting device and add an audio component.

2.2 PEBBLES II

The results of the first two pilot studies provided us with proof of concept. A second version was constructed in an attempt to resolve some of the issues identified in the first two studies. Specifically, a new user interface was designed for use by children (grade 1 through grade 8), the physical appearance was redesigned to be more child-friendly and safe, and the attention light was replaced by a hand that waved. We also wanted to evaluate the new system with children in the hospital.
The custom-designed software interface consists of a schoolhouse image as the introductory screen. A child clicks on this picture to initiate the video conference with his/her school (the child does not have to enter any phone numbers). Once connected, the child in the hospital has a full-screen view of her classroom as well as a small window showing the local view (a view of herself).

The control actions are left, right, up, down, and zoom in and out. These are activated by the game pad input device. Feedback is provided through the video conferencing system (a change in the video image) as well as a visual change in the interface. When the left/right button is pressed, the left/right side of a blue frame surrounding the video image flashes yellow. The same feedback is provided for the up/down control actions, only the top/bottom part of the frame flashes. When the zoom control is activated, an icon appears on the screen matching the direction of the zoom (in/out). A hand icon appears on the screen upon activation of the attention button. There are two other controls that the user can perform; however, these require a mouse input device. These controls are: 1) stop the conference, and 2) place the system in a pause state (and re-start it). If the student in the hospital is interrupted for medical procedures or by visitors, he can place the system in a pause state instead of shutting it down completely. The "stop the conference" function appears as a stop sign icon, and the pause state appears as a yield sign. These feedback rules are sketched below.

The physical appearance of the classroom component changed drastically, as seen in Figure 4. The head is egg-shaped with smooth, "huggable" contours. The base is rounded and just large enough to house all of the hardware components (e.g., computer case, audio mixer, ISDN switch, and power connections). The "neck" is a rotating metal structure that contains the servo-motor and electronic control box. The head structure covers most of the neck so that the rotating and power mechanisms are not accessible to children's hands. The colour scheme is carefully designed to be fun and friendly: the head is bright yellow, and the base and face plate are turquoise blue.
Figure 4: Classroom end of PEBBLES II.
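A minimal sketch of the on-screen feedback rules described above follows. The function name and the returned descriptions are hypothetical; they simply restate which visual cue accompanies each control action.

def feedback_for(command: str) -> dict:
    """Describe the visual feedback shown on the remote user's screen for each action."""
    if command in ("left", "right", "up", "down"):
        # flash the matching edge of the blue frame around the video window yellow
        return {"flash_frame_edge": command, "colour": "yellow"}
    if command in ("zoom_in", "zoom_out"):
        return {"show_icon": command}        # icon matching the zoom direction
    if command == "attention":
        return {"show_icon": "hand"}         # hand icon while the attention device is active
    if command == "pause":
        return {"show_icon": "yield_sign"}   # mouse-only control
    if command == "stop":
        return {"show_icon": "stop_sign"}    # mouse-only control
    return {}

# Example: feedback_for("left") -> {"flash_frame_edge": "left", "colour": "yellow"}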
Two studies have been completed with this system, with more planned for the next year. A group of high school students performed an initial test of PEBBLES II; they also provided a high-school-age perspective on the design of the system. In the second pilot study, an eight-year-old girl who had been in the hospital for five months used the system to attend her normal classroom. We used the same experimental protocol as in the evaluation of PEBBLES I.

Results from the two studies with PEBBLES II confirmed some of our initial results with PEBBLES I as well as providing new insights. All users, including parents, teachers, children and medical personnel, had a primarily positive subjective attitude toward the system. The success rates of the face-to-face and group communication tasks were similar to those in the PEBBLES I evaluation. The system seems to support communication, and it allows a child to have a virtual social presence in the classroom. The game pad control interface remained an effective control for children. The hand attention device is much more effective in gaining the attention of the instructor, but it is too distracting for the students in the classroom. When the hand is activated, it protrudes out from the side of PEBBLES in a waving motion. While the hand is effective, the remote student cannot determine if someone is standing in the way and may accidentally hit someone with the device. For safety purposes the hand was removed. The attention device must be replaced by a more appropriate mechanism; formal experiments to evaluate a variety of devices and determine which is most effective are currently being carried out.

Some new insights from these two studies include: 1) the effectiveness of the system for high-school-aged students; 2) the logistics of trying to integrate hospital-based curriculum with public school curriculum; 3) misunderstandings between the game pad and mouse functions and the interface elements; and 4) the remote student's lack of a sense of physical self.

Feedback from the high-school-level students indicated that the colour scheme and shape of PEBBLES were too "toy-like" and more appropriate for younger children. They wanted a sleeker, "older" version of the system. A third version, PEBBLES III, is planned to address some of the concerns expressed by the high school students; it will have a sleeker design and a different colour scheme.

Communication between teachers in the hospital and teachers in the regular school must be facilitated for the system to be most effective. However, this is a challenge since lessons and materials are usually not prepared much in advance. One possible solution is to provide basic school supplies (a portable writing surface, writing implements, etc.) with the hospital-based system so that the hospital teacher can facilitate academic transactions.

Misunderstandings occurred between the button functions on the game pad and the functions of the icons on the screen. Also, using both the mouse and the game pad to control the interface was confusing.

The child in the hospital has little understanding or knowledge of the physical appearance and the physical limits of PEBBLES, as indicated by verbalisations and control errors during the studies. For example, the head can only turn approximately 270°, and as the system reached its physical turning limit the child in the hospital would continue to press the control button, attempting to turn further. Better feedback of the physical state of the system is required in the interface. Suggestions include an animated icon of the robot displayed on the remote user's screen, which would illustrate the directions in which the head and camera are moving as well as when the head has reached its physical limit.
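The following sketch illustrates the kind of limit feedback suggested above: the remote interface keeps a simple model of the head angle, clamps it to the roughly 270-degree range reported in the studies, and notifies the display (e.g., an animated robot icon) of the direction of movement and whether the limit has been reached. All names and the exact angle split are assumptions for illustration.

PAN_MIN_DEG = -135.0
PAN_MAX_DEG = +135.0    # roughly 270 degrees of total head rotation, as reported above

class HeadModel:
    """Tracks the head angle for the remote interface and reports when a limit is hit."""

    def __init__(self, notify):
        self.angle = 0.0
        self.notify = notify    # e.g., updates an animated robot icon on screen

    def rotate(self, delta_deg: float) -> None:
        target = self.angle + delta_deg
        clamped = max(PAN_MIN_DEG, min(PAN_MAX_DEG, target))
        self.angle = clamped
        # tell the interface which way the head is turning and whether it is at its limit
        self.notify(direction="right" if delta_deg > 0 else "left",
                    angle=self.angle,
                    at_limit=(clamped != target))

# Example: print the feedback instead of animating an icon
head = HeadModel(notify=lambda **state: print(state))
head.rotate(+140)   # clamps at +135 and reports at_limit=True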
3. CONCLUSION

The knowledge base for this type of technology with children is limited. Each iteration provides the research team with many new insights and much new information. While some results might seem obvious, they were difficult to predict. Other input devices must also be considered because the functionality is rapidly outgrowing a button/game pad style interface. It is expected that at least two more prototypes will be constructed based on further national and international studies, where cultural implications are unknown.

4. ACKNOWLEDGMENTS

The authors would like to thank Ryerson Polytechnic University, Cinematronics, Telbotics Inc., The Royal Bank of Canada, The Bloorview MacMillan Centre, Bell Canada, PictureTel Canada, PictureTel Inc., and PicTech Inc. for their generous contributions and support of this project. Funding for this project was provided by NSERC grant #OGP0184220, The Royal Bank of Canada, The Ontario Ministry of Education and Training - TIPPSII, and Ryerson Polytechnic University. David Spargo was instrumental in all aspects of the project and its name. We would also like to gratefully acknowledge the 44th Toronto Cub Pack, and the multimedia workshop participants and instructors at the University of Toronto, for their time and participation in the pilot studies.

5. REFERENCES

1. Larcombe, I. (1995). Reintegration to school after hospital treatment: Needs and services. Aldershot, UK: Avebury.

2. Sandler, I.N., Miller, P., Short, J., & Wolchick, S.A. (1989). Social support as a protective factor for children in stress. In D. Belle (Ed.), Children's social networks and social supports (pp. 277-307). New York, NY: John Wiley & Sons.

3. Masanic, C., Milner, M., Goldenberg, A.A., & Apkarian, J. (1990). Task command language development for the UT/HMMC robotic aid. Proc. of RESNA 13th Annual Conference, Washington, D.C., 301-302.
4. Hammel, J., Hall, K., Lees, D., Leifer, L., Van der Loos, M., Perkash, I., & Crigler, R. (1989). Clinical evaluation of a desktop robotic assistant. Journal of Rehabilitation Research and Development, 26(3), 1-16.

5. Harwin, W.S., Rahman, T., & Foulds, R.A. (1995). A review of design issues in rehabilitation robotics with reference to North American research. IEEE Transactions on Rehabilitation Engineering, 3(1), 3-13.

6. Dallaway, J.L., Jackson, R.D., & Timmers, P.H.A. (1995). Rehabilitation robotics in Europe. IEEE Transactions on Rehabilitation Engineering, 3(1), 35-45.

7. Zhai, S., Buxton, W., & Milgram, P. (1994). The "silk cursor": Investigating transparency for 3D target acquisition. Proc. of Human Factors in Computing Systems - CHI'94, Boston, MA, 459-464.

8. Hackenberg, R.G. (1986). Using natural language and voice to control high level tasks in a robotic environment. Intelligent Robots and Computer Vision: Fifth in a Series, SPIE, 726, 524-529.

9. Cammoun, R., Detriche, J.M., Lauture, F., & Lesigne, B. (1993). Improvements of the MASTER man-machine interface. Proc. of European Conference on the Advancement of Rehabilitation Technology - ECART 2, Stockholm, Sweden, 24.2.

10. Kameyama, K., & Ohtomi, K. (1993). A shape modeling system with a volume scanning display and multisensory input device. Presence, 2(2), 104-111.

11. Halpern-Hamu, C.D. (1993). Direct manipulation, through robots, by the physically disabled. PhD thesis, Department of Computer Science, University of Toronto.

12. Gowan, J.A., & Downs, J.M. (1994). Video conferencing human-machine interface: A field study. Information and Management, 27(6), 341-356.

13. Tang, J.C., & Rua, M. (1994). Montage: Providing teleproximity for distributed groups. Proc. of Human Factors in Computing Systems - CHI'94, Boston, MA, 459-464.

14. Buxton, W., Sellen, A., & Sheasby, M. (in press). Interfaces for multiparty videoconferencing. To appear in K. Finn, A. Sellen & S. Wilber (Eds.), Video Mediated Communication. Hillsdale, NJ: Erlbaum. WWW address: http://www.dgp.toronto.edu/OPT/papers/bill.buxton/multiparty.html

15. Isaacs, E.A., Morris, T., Rodrigues, T.K., & Tang, J.C. (1995). A comparison of face-to-face and distributed presentations. Proc. of Human Factors in Computing Systems - CHI'95, 354-361.

16. Buxton, W. (1992). Telepresence: Integrating shared task and person spaces. Proceedings of Graphics Interface '92, 123-129.

17. Robertson, J.W. (1994). Usability and children's software: A user-centered design methodology. Journal of Research on Computing in Education, 5(3/4), 257-271.

18. Alloway, N. (1994). Young children's preferred option and efficiency of use of input devices. Journal of Research on Computing in Education, 24(1), 104-109.

19. Rimalovski, I. (1996). The children's market. Interactivity, June, 30-39.

20. Treviranus, J., & Smith, G. (1995). The Adaptive Technology Resource Centre. Augmentative and Alternative Communication News, August.

Trademarks

Canon VC-C1 is a trademark of Canon Inc.
Diamond Series 9000 is a trademark of Mitsubishi Inc.
Nintendo is a trademark of Nintendo Inc.
PCS100/Live50 is a trademark of PictureTel Corporation.