The Future of AI
Symbiotic Robotic Systems: Humans, Robots, and Smart Environments

Silvia Coradeschi and Alessandro Saffiotti, Örebro University
Pippi, the housekeeping robot, is preparing breakfast. She moves to the refrigerator, elegantly opens the
door with one of her grippers, searches the shelf with her video camera connected to a powerful vision system, finds the milk, and picks it up with her other gripper. Then she closes the refrigerator door, moves toward the table, and puts down the milk.

This vignette illustrates how we've imagined we could use an autonomous robot in everyday life and in industry: an almost totally autonomous agent that relies on its own sensors and actuators to interact with the world and has limited cooperation with other devices and humans. Such robots are well in line with the initial dream of AI and can be considered its concrete implementation. As we well know, this dream has been difficult, if not impossible, to realize, despite the success of AI research in many specialized fields and applications. In robotics, the initial dream of a fully autonomous robot that can operate in an unknown, unmodified environment is still alive and still drives robotics research. However, we're far from realizing the simple scenario with Pippi; users in factories, hospitals, and homes are still waiting for the intelligent robots that AI researchers promised to deliver long ago.
How robotics is changing

So, what's the future of intelligent robotics? We believe it lies in a paradigm shift: from fully autonomous, solitary robots working in an unmodified and unknown environment, to pervasive robotic systems working in symbiosis with people and their (smart) environments.

Let's first consider how a smart environment can help the robot achieve its tasks. The aim of the smart environment is to reduce the requirements on the robot's sensors and actuators. For instance, a smart environment can include

• distributed Web cameras, RFID tags, or motes embedded in everyday objects and storing information about those objects;
• gas sensors that are activated by smells;
• devices for opening and closing doors and for moving objects; and
• touch sensors.

In our scenario, instead of using her own vision system and grippers to find the fridge handle and open the door, Pippi would cooperate with the fridge, which would have a mechanism to open and close the door on request. The milk package would have an RFID tag that informs the robot of its shape, material, size, and weight, thereby greatly simplifying the gripping process. The tag could also store information about the package contents (milk) and its last date for consumption. Interestingly, the fact that this information is in a symbolic form would allow symbolic reasoning to be smoothly integrated into robotic systems, bypassing the pixel-to-predicate problem.

A second important aspect that can help the robot achieve its task is cooperation with humans. The human and the robot can share sensing, cognitive, and physical abilities and thus complement each other. Human-robot systems are more capable than either humans or robots alone. The robot can ask the human to remove an obstacle or to provide additional information for disambiguating a situation. Natural language communication will help determine the success of human-robot cooperation, and especially of the connection between symbols and sensory experience.1
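To make the idea of symbolic tag data concrete, the following sketch shows a hypothetical RFID payload and how a robot could parameterize a grasp from it instead of from vision. All names here (TagRecord, plan_grasp, the field layout) are illustrative assumptions, not an established tag format.

```python
from dataclasses import dataclass

# Hypothetical payload a smart object's RFID tag might store.
# Because the fields are symbolic, a planner can reason about them
# directly, bypassing the pixel-to-predicate problem.
@dataclass
class TagRecord:
    contents: str           # e.g., "milk"
    shape: str              # e.g., "box", "bottle"
    size_cm: tuple          # (width, depth, height)
    weight_g: int
    best_before: str        # ISO date, e.g., "2006-06-15"

def plan_grasp(tag: TagRecord) -> dict:
    """Choose gripper parameters from symbolic tag data instead of vision."""
    aperture = min(tag.size_cm[0], tag.size_cm[1]) + 1.0  # small clearance
    force = "light" if tag.weight_g < 500 else "firm"
    return {"aperture_cm": aperture, "grip_force": force}

milk = TagRecord("milk", "box", (7.0, 7.0, 19.5), 1050, "2006-06-15")
print(plan_grasp(milk))  # {'aperture_cm': 8.0, 'grip_force': 'firm'}
```

The point of the sketch is that everything the gripper controller needs arrives already in predicate form, so no vision pipeline is required for this step.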
Challenges of symbiotic robotic systems

A symbiotic robotic system consists of a robot, a human, and a (smart) environment that cooperate symbiotically to perform a task. While the possibility of cooperating with the smart environment and with humans might greatly simplify the execution of robotic tasks, it also presents new challenges that open new research directions. For instance, the robot must be aware of the smart environment's capabilities and know how to use them. Moreover, people's presence in the robot's workspace raises important safety concerns. In our scenario, Pippi must take into account that children might be playing on the floor when she's moving in the kitchen and make sure she won't harm them.

1541-1672/06/$20.00 © 2006 IEEE. Published by the IEEE Computer Society. IEEE Intelligent Systems, May/June 2006.

Figure 1. A symbiotic robotic system consisting of a robot, a human, and a (smart) environment that cooperate symbiotically to perform a task.

To better visualize the possibilities and new research challenges that symbiotic robotic systems offer, consider the system illustrated in figure 1. Compared to traditional robotic systems, symbiotic systems have some peculiar aspects, highlighted in yellow in the figure.

First, the smart objects in the environment possess capabilities for information storage, processing, and communication, so they have a presence in both the physical and the digital world. The robot has two complementary ways to interact with these objects: physical interaction, by observing or manipulating them, and digital interaction, by querying them about their properties (for example, the milk bottle's weight) or asking them to perform particular actions (such as asking the refrigerator to open). Coordinating these two forms of interaction is a new, challenging issue.

Second, the robot's world model must include knowledge about the capabilities and the state of the smart objects in the environment (the yellow box labeled "E" in figure 1). It must also include knowledge about the humans' position, intentions, and possible future actions (the yellow box labeled "H"). The robot can acquire these models through a combination of direct communication with the objects, linguistic communication with the humans, and perceptual observations.

Third, the robot must explicitly consider the actions performed by the smart environment (the tree labeled "E" in figure 1) and by humans (the tree labeled "H") when deciding its own actions during task execution. For instance, the robot might ask a person which of two similar bottles it should grasp. Even when the robot and the human perform independent tasks, the robot must take into account the human's (expected) course of action to guarantee safe cohabitation. That is, the robot must adapt its own task execution to the human's expected actions to eliminate any possibility of interference or collision.
A new paradigm

In general, symbiotic robotic systems can be considered three-agent systems in which the agents perform interdependent tasks. In such systems, the robot's actions are generally controllable and observable, the environment's actions are partially controllable and partially observable, and the human's actions are not (directly) controllable but are partially observable.

This new paradigm will enable us to incorporate techniques from AI and robotics into everyday, smart environments. Symbiotic robotic systems will shift our focus from the development of fully autonomous, almighty robot servants to the development of an open, extensible environment in which humans will cohabitate with many cooperating robotic devices. The ingredients for this shift are already present in the fields of safe human-robot interaction,2 ambient intelligence,3 and smart objects,4 as well as in the integration of these fields with autonomous robotics.5

Symbiotic robotic systems will also shift our research priorities in intelligent robotics. This shift will simplify some of the current big challenges, such as localization, object recognition, and manipulation. At the same time, it will open several new and exciting research fields, such as information sharing and coordination among objects, robots, and humans. This paradigm will likely become dominant in robotics in the coming years, eventually making these technologies as pervasive in our houses and workplaces as computer technologies are today.
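The three-agent characterization above can be written down as a small taxonomy that an action-selection layer could branch on: command what is controllable, request and monitor what is partially controllable, and merely predict what is not. The structure and names below are an illustrative sketch, not an established formalism.

```python
from enum import Enum

class Control(Enum):
    FULL = "controllable"
    PARTIAL = "partially controllable"
    NONE = "not controllable"

# The three agents of a symbiotic robotic system, following the
# article's characterization; the encoding itself is illustrative.
AGENTS = {
    "robot":       {"control": Control.FULL,    "observable": "fully"},
    "environment": {"control": Control.PARTIAL, "observable": "partially"},
    "human":       {"control": Control.NONE,    "observable": "partially"},
}

def strategy(agent: str) -> str:
    """Pick how the robot should treat each agent's actions."""
    c = AGENTS[agent]["control"]
    if c is Control.FULL:
        return "command directly"
    if c is Control.PARTIAL:
        return "request and monitor"
    return "predict and adapt"

print(strategy("human"))  # predict and adapt
```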
Silvia Coradeschi is an associate professor in the Department of Technology at Örebro University, Sweden. Contact her at [email protected].
Alessandro Saffiotti is a professor in the Department of Technology and head of the Center for Applied Autonomous Sensor Systems' Mobile Robotics Lab at Örebro University, Sweden. Contact him at [email protected].

References

1. S. Coradeschi and A. Saffiotti, "An Introduction to the Anchoring Problem," Robotics and Autonomous Systems, vol. 43, nos. 2–3, 2003, pp. 85–96.
2. B. Graf, M. Hans, and R.D. Schraft, "Mobile Robot Assistants," IEEE Robotics and Automation Magazine, vol. 11, no. 2, 2004, pp. 66–77.
3. N. Shadbolt, "Ambient Intelligence," IEEE Intelligent Systems, vol. 18, no. 4, 2003, pp. 2–3.
4. M. Weiser and J.S. Brown, "Designing Calm Technology," PowerGrid J., vol. 1, no. 1, 1996.
5. A. Saffiotti and M. Broxvall, "PEIS Ecologies: Ambient Intelligence Meets Autonomous Robotics," Proc. Int'l Conf. Smart Objects and Ambient Intelligence, 2005, pp. 277–282; www.aass.oru.se/~peis.