Personalized Socially Assistive Robotics
The Interaction Lab
More information: http://robotics.usc.edu/interaction/assistive
Contact: Prof. Maja Matarić,
[email protected]
Socially assistive robotics focuses on non-contact human-robot interaction (HRI) that provides assistance through social rather than physical interaction.
Aim: We develop assistive human-machine interaction methods for robot systems that can aid people with special needs in daily life. Our premise is that intelligent, personalized robots can provide individualized care through monitoring, coaching, encouragement, and motivation in the contexts of convalescence, rehabilitation, training, and therapeutic aid.
Project Areas:
Socialization of children with Autism Spectrum Disorders
Developing methods for using robots as therapeutic social partners for children with Autism Spectrum Disorders (ASD) to encourage, facilitate, and train social and communicative behavior through embodied social interaction.
Robot-assisted post-stroke rehabilitation
Developing methods for real-time modeling of user motivation, personality, empathy, and adaptation, aimed at personalizing the post-stroke therapy process through intelligent human-machine interaction.
Special care of the elderly undergoing cognitive changes
Developing methods for integrating robots and non-invasive, lightweight sensors to evaluate user state and perform real-time adaptive assessment and feedback during cognitive and physical exercises.
Special education
Developing robots capable of acting as social and pedagogical partners to aid in the education and learning processes of children with cognitive, attentional, and social disabilities.
Research Goals:
Leveraging Embodiment: Understanding and utilizing the role and impact of physical embodiment on assistive human-machine interaction in different contexts (school, hospital, rehabilitation center, elder care home, etc.) and with real-world user populations.
Proxemics: Ensuring that an embodied robot makes appropriate use of social space so the human user feels safe and comfortable, in accordance with his/her personality type and cultural and contextual constraints.
Multi-Modal Communication: Expanding our understanding of the relationship between synthetic expression (including vocal communication, facial expression, and body movement) and user interpretation and behavior in task-oriented scenarios.
Expression: Determining a subset of synthetic behaviors, created through multi-modal expression and communication, that result in reliably interpreted emotions.
Modeling Personality and Empathy: Effectively modeling and employing personality and empathy in ways that leverage the robot's physical embodiment to provide individualized and engaging assistive interaction.
Engagement and Learning Through Imitation: Modeling mimicry, mirroring, and learning from demonstration and by imitation as key components of user engagement and of task teaching and training.
Adaptation and Learning: Dynamically adapting interaction parameters such as distance/proxemics, expressiveness, and vocal style, matching the robot's personality and behavior to the user's, so as to improve the user's task performance and sustain engagement (see the sketch after this list).
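To make the adaptation goal concrete, below is a minimal sketch of one simple approach: epsilon-greedy selection over discretized proxemic distances, using observed task performance as the reward. The candidate distances, exploration rate, and the surrounding trial loop are illustrative assumptions, not the lab's implementation.

```python
# Minimal sketch (not the lab's method): epsilon-greedy adaptation of one
# interaction parameter -- the robot's interpersonal distance -- driven by
# observed task performance as a reward signal.
import random

CANDIDATE_DISTANCES_M = [0.6, 0.9, 1.2, 1.5]   # assumed proxemic settings
EPSILON = 0.1                                   # exploration rate (assumed)

value = {d: 0.0 for d in CANDIDATE_DISTANCES_M}  # running mean reward
count = {d: 0 for d in CANDIDATE_DISTANCES_M}

def choose_distance():
    """Mostly exploit the best-performing distance; occasionally explore."""
    if random.random() < EPSILON:
        return random.choice(CANDIDATE_DISTANCES_M)
    return max(CANDIDATE_DISTANCES_M, key=value.get)

def update(distance, reward):
    """Incrementally update the mean reward for the chosen distance."""
    count[distance] += 1
    value[distance] += (reward - value[distance]) / count[distance]

# One adaptation step per exercise trial (pseudocode):
#   d = choose_distance(); position robot at d; run the trial;
#   update(d, observed_task_performance)
```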
User state and activity tracking
Using physiological sensors and motion-tracking devices to obtain user state and activity. These wearable devices allow the robot to track and adapt to the user's internal and external state and task performance in real time.
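A minimal sketch of what such real-time tracking can look like in code, assuming streamed wearable samples; the window size, feature weights, baselines, and field names are hypothetical placeholders, not the lab's model:

```python
# Minimal sketch: a sliding-window engagement score from streamed wearable
# samples. All constants and field names below are illustrative assumptions.
from collections import deque

WINDOW = 30  # samples (e.g., ~30 s at an assumed 1 Hz rate)
hr_win, gsr_win = deque(maxlen=WINDOW), deque(maxlen=WINDOW)

def update_engagement(sample):
    """sample: dict with 'heart_rate' (bpm) and 'gsr' (microsiemens)."""
    hr_win.append(sample["heart_rate"])
    gsr_win.append(sample["gsr"])
    if len(hr_win) < WINDOW:
        return None  # not enough data yet
    hr_mean = sum(hr_win) / WINDOW
    gsr_mean = sum(gsr_win) / WINDOW
    # Crude linear score: elevated arousal read as higher engagement.
    score = 0.01 * (hr_mean - 70.0) + 0.1 * gsr_mean
    return max(0.0, min(1.0, score))
```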
Technologies:
Navigation
The robots are equipped with ultrasound and eye-safe laser sensors and software providing safe, collision-free navigation in indoor environments.
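For illustration, a minimal sketch of reactive collision avoidance from a planar range scan; the field of view, distance thresholds, and (speed, turn) command interface are assumptions, not the robots' actual navigation software:

```python
# Minimal sketch: steer away from the nearest obstacle in a laser scan,
# slowing as it approaches and stopping when too close. Parameters assumed.
import math

def avoid(ranges, fov_rad=math.pi, stop_dist=0.4, slow_dist=1.0):
    """ranges: range readings (m), evenly spaced across fov_rad.
    Returns (linear_velocity, angular_velocity)."""
    n = len(ranges)
    angles = [(-fov_rad / 2) + i * fov_rad / (n - 1) for i in range(n)]
    d_min = min(ranges)
    if d_min < stop_dist:
        return 0.0, 0.0                      # too close: stop
    # Turn away from the nearest obstacle once inside the slow-down zone.
    a_min = angles[ranges.index(d_min)]
    turn = -0.8 * math.copysign(1.0, a_min) if d_min < slow_dist else 0.0
    speed = 0.5 * min(1.0, d_min / slow_dist)
    return speed, turn
```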
Speech and Sound
The robots communicate through pre-recorded or synthesized speech. We use basic speech recognition and emotional-state recognition (via sound and other cues).
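As a hedged illustration of sound-based cues, the following sketch computes two classic low-level features often used as rough proxies for vocal arousal; it is an illustration only, not the lab's recognizer:

```python
# Minimal sketch: frame energy (loudness proxy) and zero-crossing rate
# (pitch/noisiness proxy) over one mono audio frame. Sample format assumed.
import numpy as np

def arousal_cues(frame: np.ndarray):
    """frame: 1-D float array of audio samples in [-1, 1]."""
    rms = float(np.sqrt(np.mean(frame ** 2)))          # loudness proxy
    signs = np.sign(frame)
    zcr = float(np.mean(signs[:-1] != signs[1:]))      # zero-crossing rate
    return rms, zcr

# Louder, higher-ZCR speech is a (very rough) cue of heightened arousal.
```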
Activity Tracking
We developed a system composed of small, lightweight inertial measurement units. Each sensor provides its own global orientation; this information is communicated wirelessly to provide real-time activity tracking.
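Since each unit reports its own global orientation, the relative rotation between adjacent body segments yields a joint angle. A minimal sketch, assuming unit quaternions in (w, x, y, z) order (an assumption, not the system's actual convention):

```python
# Minimal sketch: joint angle from two segment orientations reported as
# global unit quaternions (e.g., upper arm vs. lower arm).
import numpy as np

def quat_conj(q):
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def joint_angle(q_upper, q_lower):
    """Angle (rad) of the relative rotation between two segments."""
    q_rel = quat_mul(quat_conj(q_upper), q_lower)
    w = np.clip(abs(q_rel[0]), -1.0, 1.0)
    return 2.0 * np.arccos(w)
```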
Physiological State Estimation
We have used real-time physiological data (heart rate, galvanic skin response, body temperature) to determine the user's state, including emotion, boredom, and engagement.
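For illustration only, a minimal nearest-centroid sketch of mapping such physiological features to a coarse user state; the centroids, scales, and two-state label set are hypothetical, not the published model:

```python
# Minimal sketch: nearest-centroid classification of user state from
# physiological features. Centroid values and scales are invented.
import math

CENTROIDS = {  # illustrative per-state feature means: (bpm, uS, deg C)
    "engaged": (85.0, 6.0, 36.8),
    "bored":   (68.0, 2.5, 36.5),
}

def classify(hr, gsr, temp, scales=(15.0, 3.0, 0.5)):
    """Return the state whose (scaled) centroid is nearest to the sample."""
    def dist(c):
        return math.sqrt(sum(((v - m) / s) ** 2
                             for v, m, s in zip((hr, gsr, temp), c, scales)))
    return min(CENTROIDS, key=lambda k: dist(CENTROIDS[k]))
```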
Simulation
We have created physics-based simulated software agents that accurately model our physical robots, to be used as controls in experiments.

Publications:
Maja J Matarić, Jon Eriksson, David Feil-Seifer, and Carolee Winstein, "Socially Assistive Robotics for Post-Stroke Rehabilitation", Journal of NeuroEngineering and Rehabilitation, 4(5), Feb 19, 2007.
Emily K. Mower, David J. Feil-Seifer, Maja J Matarić, and Shrikanth Narayanan, "Investigating Implicit Cues for User State Estimation in Human-Robot Interaction Using Physiological Measurements", Proceedings, 16th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2007), Jeju Island, South Korea, 2007.
Adriana Tapus, Cristian Tapus, and Maja J Matarić, "Hands-Off Therapist Robot Behavior Adaptation to User Personality for Post-Stroke Rehabilitation Therapy", Proceedings, IEEE International Conference on Robotics and Automation (ICRA-07), April 2007.
Joshua Wainer, David Feil-Seifer, Dylan A. Shell, and Maja J Matarić, "Embodiment and Human-Robot Interaction: A Task-Based Perspective", Proceedings, 16th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2007), Best Poster Presentation Award, Jeju Island, South Korea, Aug 26-29, 2007.
Adriana Tapus, Maja J Matarić, and Brian Scassellati, "The Grand Challenges in Socially Assistive Robotics", IEEE Robotics and Automation Magazine, 14(1), Mar 2007.
David Feil-Seifer, Kristine Skinner, and Maja J Matarić, "Benchmarks for Evaluating Socially Assistive Robotics", Interaction Studies, 8(3), 2007, 423-439.
Maja J Matarić, "Socially Assistive Robotics", IEEE Intelligent Systems, 21(4), Jul/Aug 2006, 81-83.