The 5th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI 2008)

Extending Animator Tool Sets for Humanoid Robotics
Paul Diefenbach1, Daniel Letarte1, Christopher Redmann1, Robert Ellenberg2, Paul Oh2
Drexel University
{pjdief, daniel.letarte, cpr25, rwe24, pyo22}@drexel.edu

Abstract - There has historically been a disconnect between the engineering of robotic behaviors and the artistic, simple creation of fluid, human-like motions. This work discusses an approach which leverages 3D tools originally created for computer graphics animators and repurposes them for humanoid robotics. Initial proof-of-concept work on a simple off-the-shelf humanoid built from simple servos and brackets provides a testbed prior to migration of the techniques to a larger, more complex, reactive biped robot, the HUBO.

1. Introduction
While there are many similarities between the fields of robotics and CGI, there is a disconnect between the engineering of robotics and the artistic tools of computer animation. Robotics has expanded to include modular programming structures, 3D simulation environments, physics engines, and dedicated programming environments, yet most control falls into purely canned routines of meticulously pre-calculated forward-kinematic sequences, real-time IK linkages driven by set end-effector goals, or some sensor-driven combination. Most systems lack the tools for adding the artistry of motion that modern humanoid robots are now supporting. For complex robotic platforms such as humanoid robots, specialized architectures support a means for simple IK control, dynamics, and testing controllers; yet this provides an engineer's view of the problem. Existing animation tools have long provided the artist a means to control simulated humanoids and even simulated humans, but these tools have never been adequately explored as a possible platform for driving physical robots.
We propose a system built around a powerful, extensible, commercial modeling and animation system to study the feasibility of leveraging a mature artist tool for the engineering task of controlling both a virtual and a physical humanoid robot. For our project, we chose Autodesk's Maya® software. Maya's scripting support offers full access to any Maya software feature and is frequently used by artists and technical directors to create custom windows and scripts or to reconfigure the Maya user interface into a completely custom application.

2. Related Work

2.1 Simulation Platforms

1 Digital Media Program and Drexel RePlay Lab (www.replay.drexel.edu); 2 Mechanical Engineering Department

Virtual testing of robots in simulated environments has long offered the advantage of providing a simple, safe environment where real-world complexities can be discounted. There are numerous free and commercial robotic software platforms available today which provide (to varying degrees) unified programming and service execution environments, reusable components, a simulation environment, drivers for robotics hardware, and other modules for vision, navigation, etc. Microsoft Robotics Studio, for example, provides the Microsoft Visual Programming Language to build robotics applications using a graphical data-flow-based programming model on top of a 3D simulation engine. This simulation engine is based on game engine technologies and includes hardware-accelerated physics. OpenHRP [1] (Open Architecture Humanoid Robotics Platform) furthers the robotics platform concept by creating an architecture specifically for humanoid robotics, and consists of a dynamics simulator, a view simulator, motion controllers, and motion planners. Yet despite improvements in software fidelity, the use of a virtual simulation as a testbed has limited application by itself due to the lack of noisy data, the use of incomplete or inaccurate models, and the assumption of perfect systems such as vision or absolute rigid-body dynamics. Beyond use as a testbed, other simulations have tried to provide a hardware-in-the-loop solution by decoupling various subsystems such as vision. Stilman et al. presented an approach for fully decoupling these systems to promote development and testing through an augmented reality environment [2]. Because much of the focus of these simulators has been on debugging tasks prior to a physical implementation, they are designed around the concept of the run-time environment. They do not readily support the creation of realistic humanoid animations.
In the remainder of this paper, we discuss an author-centric approach that supports the artistry found in animated virtual characters but traditionally lacking in animated physical characters.

2.2 Author-centric Approach
One project that leveraged an artistic approach is Interbots [3], a commercial spin-off of a Carnegie Mellon University Entertainment Technology Center project. Interbots used Maya to build a virtual prototype of their robotic character, mill the actual shell based on this prototype, and author a series of animations in Maya which were then exported for playback on the
physical hardware. A limit tester parses the file and warns of any animation segments that may be too extreme for the hardware to handle. Interbots' Maya user interface includes an Animation Editor Tool (AET) which provides a simple 2D slider and keyframe interface for non-animators. This Maya plug-in outputs a file containing DMX signal values which are then sequentially sent to the servos. While the Interbots project served as an early justification that our approach is valid, the project failed to address numerous aspects that our research is evaluating. First, the Interbots robot is rigidly mounted (non-biped), and therefore balance (and dynamics in general) is not an issue other than for some primitive limit testing. Second, this approach cannot use existing servo signal sequences; no import is supported. Furthermore, even animation composition of sequences generated by Maya is done through an external high-level authoring system. Lastly, the animation generation support does not exploit Maya's advanced capabilities. While the AET creates a user-friendly interface for animation generation, it fails to support user-generated libraries of animations, multi-track blending of animations, FK-IK blending, etc.

3. Project Description

3.1 Goal
Our goal is to create a software architecture for producing high-level animations and simulations that apply equally to a virtual humanoid robot, a half-scale version, and a full-scale production version of the robot. Our animation software architecture is intended to integrate a variety of animation/simulation sources. One component currently under development is a dance generator which analyzes music and dynamically calls a series of choreographed animation sequences to correspond to the beat. Our authoring environment must work in conjunction with these other modules. Our eventual target is the next-generation HUBO robot based on the prior-generation KHR-3 [4], and prototyping work is being performed on the Robonova-1, both shown in Figure 1.

Fig. 1. Hitec's Robonova-1 Humanoid Robot (left) (source: www.robonova.com) and HUBO (right). Not to scale.

3.2 HUBO
KHR-3 (HUBO) is a 125 cm tall, 55 kg biped humanoid robot which has 41 DOF (12-DOF in the legs, 8-DOF in the arms, 6-DOF in the head, 14-DOF in the hands, and 1-DOF in the trunk). It has a Pentium III-933MHz embedded PC as a main controller running Windows XP and RTX (Real Time Extension). A 400W servo controller controls each joint motor. There are two CCD cameras in the head, F/T (Force and Torque) sensors on the ankles and wrists, accelerometers on the soles, and an inertial sensor system on the torso. A distributed controller architecture permits communication between servo controllers, sensors, and the main controller. DC motors and a harmonic drive reduction gear mechanism are attached as actuators for the joints. Our next-generation HUBO will extend the KHR-3 functionality. HUBO supports controllers such as ZMP (Zero Moment Point), vibration reduction, landing orientation, damping, landing timing, and landing position controllers according to its objectives. The F/T sensors at the ankles of the robot and the accelerometers at the soles can compensate the input position profiles to maintain dynamic balance. This permits reduction of unexpected external forces such as landing shock and vibration induced by compliances of the F/T sensor structures, link frames, and reduction gears.

3.3 Robonova
As we are not yet in receipt of the HUBO, and in order to test different approaches, we are currently experimenting with an off-the-shelf humanoid, Hitec's Robonova-1. The Robonova is constructed of analog/digital servos joined with simple stamped brackets connecting the joints. The Robonova supports the RoboBasic language and development environment. This language permits posing the robot under power and capturing that position as a servo movement command. The low-level software can interpolate between two such positions. The relative slowness of the Robonova's 7.81MHz CR3024 microcontroller dictates that external processing is required for complex simulations.
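The capture-and-interpolate workflow described above amounts to a linear sweep between captured servo poses; a pure-Python sketch of that behavior (the function name and servo values are hypothetical illustrations, not RoboBasic code):

```python
def interpolate_poses(start, end, steps):
    """Linearly interpolate between two captured servo poses.

    `start` and `end` are lists of servo angles in degrees, one per
    joint, as captured with RoboBasic's pose-capture feature. Returns
    `steps` intermediate poses, mimicking the firmware's linear sweep
    between a stored pose and a target pose.
    """
    if len(start) != len(end):
        raise ValueError("pose vectors must have the same length")
    poses = []
    for i in range(1, steps + 1):
        t = i / steps  # interpolation parameter in (0, 1]
        poses.append([s + (e - s) * t for s, e in zip(start, end)])
    return poses

# Hypothetical 3-servo example: sweep from a neutral pose to a raised-arm pose.
frames = interpolate_poses([90, 90, 90], [90, 45, 120], steps=2)
```

The heavy lifting (beat tracking, IK, dynamics) stays on the external host; only short pose sweeps like this run on the slow CR3024.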
The supported RoboBasic language has limited features, but it supports reading commands from a serial port to produce more complex motion sequences. One example of this is the previously mentioned music interpretation software, which performs the processor-intensive calculations of music beat tracking and gesture commands externally, yet drives the Robonova's relatively unsophisticated motion processor to produce more complex, dynamic sequences. In addition to the simpler kinematic structure of the Robonova versus the HUBO, an existing simulator, Micono Utilities' RZ1Action 1, provides a robust virtual metric and testbed for us to compare and contrast our approach. This is important so that we can address fundamental

1 http://web.mac.com/micono/RZE/Robomic.html


issues in bipedal robotic control before moving to the more complex and more capable platform of the HUBO.
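Driving the Robonova externally, as described above, reduces to formatting command strings and writing them down the serial link. A minimal sketch follows; the wire format shown is an assumption about the RoboBasic listener on the robot, not a documented protocol:

```python
def format_move_command(angles):
    """Build a MOVE-style command line from a list of servo angles.

    The exact wire format expected by the RoboBasic program listening
    on the Robonova's serial port is an assumption here; adjust it to
    match the command parser actually running on the CR3024.
    """
    return "MOVE G24," + ",".join(str(int(a)) for a in angles) + "\r\n"

def send_pose(write, angles):
    """Stream one pose to the robot.

    `write` is any byte sink, e.g. the bound `write` method of a
    pyserial `serial.Serial` instance opened on the robot's port.
    """
    write(format_move_command(angles).encode("ascii"))
```

Taking a `write` callable rather than opening the port directly keeps the formatting logic testable without hardware attached.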

4. Maya as a Platform
Autodesk's Maya 2 is a powerful, integrated 3D modeling, animation, and rendering platform that provides two embedded scripting languages and a comprehensive API to access its modular, extensible architecture. This modularity and access permit full customization of the appearance of the application by removing the standard user interface and remaking the software into a custom application. While the core Maya kernel is written in C++, much of the functionality is implemented in one of its scripting languages, the Maya Embedded Language (MEL). Many production and effects studios rely on this feature to create custom tools built around the kernel. Maya also supports Python scripting, as well as commands that can be issued from an HTML page via the Maya web browser plug-in. Any functionality not accessible through MEL or Python can be reached via the OpenMaya API/SDK, which provides direct access to Maya's scene hierarchy. Plug-ins and standalone applications that run from the Maya command line can be written in C++. These extensions can query and modify existing Maya objects such as geometry, transforms, scene hierarchy, and dependency graph nodes, as well as extend Maya with new types of objects such as file translators and MEL commands. Maya has an advanced modeler which supports polygons, NURBS, and subdivision surfaces, and it also supports all major CAD and exchange formats. In addition to advanced modeling capabilities, Maya supports keyframe, path, non-linear, mo-cap, and skeletal animation using both forward and inverse kinematics. It provides access to animations through the Graph and Dope Sheet Editors for advanced timing control and the manipulation of dense mo-cap keyframe data. Blend channels and the Trax nonlinear animation editor permit nondestructive mixing and editing of poses and animation clips, each of which can be stored in libraries. Maya also supports procedural and expression-driven animations.
Maya’s IK system consists of seven built-in IK solvers and supports a comprehensive assortment of constraints, including parent, point, aim, scale, geometry, normal, tangent, etc., as well as joint properties such as joint limits, preferred angles, and joint mirroring. A spline IK solver allows for the easy animation of skeletal chains, like a character’s spine, and includes easy-to-use twist and roll controls while the single chain and lightweight 2-bone solvers are optimized for real-time interactivity. The spring IK solver allows for precise control over multi-jointed appendages, and the Full Body IK System is intended for articulation of biped and quadruped models. The IK system supports smooth blending between IK and FK animations.

2 www.autodesk.com

Maya’s dynamics system supports the dynamic interaction of geometry, including collisions between rigid and soft bodies. Its Rigid-Body Dynamics provides realistic, high-speed simulation of multiple rigid objects and includes dynamic constraints such as nails, hinges, barriers, pins, and springs. Field forces—such as gravity, vortex, air, and turbulence—can be applied to rigid bodies, soft bodies, or particle objects.

5. Animation Generation
Animation generation and export required creating a rigid-body model with the correct dimensions, joint placement, joint limits, and IK chains. While a Robonova model file was readily found on the web, it was unsuitable for use because the joints were not correctly specified. Therefore, the joint hierarchy was completely rebuilt with correct specifications and a cleaner joint naming convention for scripting. Correction of the joint types and limits led to problems with Maya's IK solver. Common issues associated with inverse kinematics, such as flipping, were made more prominent by the strict requirements of our skeleton. The majority of these problems were resolved through combinations of Maya's animation constraints, which were used to help guarantee that each joint was only capable of rotating about a single axis even in combination with the IK. Maya's built-in joint limits were used to lock the total rotation of each joint to 180 degrees. The exporter itself, written in MEL, had to be capable of translating the joint angles from Maya to their RZ1 Action counterparts. The initial values of all the Maya joints are set to zero, while the default RZ1 angles in the same position range anywhere from 10 to 190 degrees, depending on the particular joint. With the possibility of a one-to-one relation between these joints ruled out, four custom attributes were added to each joint in the Robonova skeleton. The first attribute holds the initial value of the corresponding joint in the RZ1 simulator. This locked and un-keyable attribute is used during the import process to convert RZ1 angles to Maya angles through simple subtraction. Through the use of Maya's driven keys, the second attribute's value is generated based on the rotation of the joint it resides upon. The remaining two attributes hold similar but "flipped" values for use when the motor directions are inverted.
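The offset-and-flip conversion described above can be modeled in pure Python (the real exporter is a MEL script driving custom Maya attributes; the function and parameter names here are hypothetical):

```python
def maya_to_rz1(maya_angle, rz1_initial, inverted=False):
    """Convert a Maya joint rotation to its RZ1 Action servo angle.

    `rz1_initial` models the locked custom attribute holding the
    joint's default RZ1 angle (Maya's default pose is all zeros, while
    RZ1 defaults range from 10 to 190 degrees per joint). When the
    motor direction is inverted, the "flipped" variant negates the
    rotation before adding the offset.
    """
    delta = -maya_angle if inverted else maya_angle
    return rz1_initial + delta

def rz1_to_maya(rz1_angle, rz1_initial, inverted=False):
    """Inverse mapping used on import: simple subtraction of the offset."""
    delta = rz1_angle - rz1_initial
    return -delta if inverted else delta
```

The two functions are exact inverses, which is what lets the importer and exporter round-trip animations through the same per-joint attributes.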
Using this method conveniently allows for automated calculation of the new RZ1 angle during the Maya animation process instead of in the exporter itself. Other challenges with the exporter involved time conversions to output the adjusted speed of the Robonova motors. The following equation was used:

Speed = (1 / elapsed_time) * speed_multiplier    (1)

Elapsed time is evaluated as time units between keyframes, and the speed multiplier is adjusted based on the Maya user's current unit of time measurement. Based on keyframe locations and joint information, the exporter generates MOVE G24 and SPEED commands
recognizable by the RZ1 Action simulator. This data is then formatted properly and written to a file for Robonova interpretation.
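Equation (1) and the command generation can be illustrated with a small sketch (pure Python; the SPEED calibration and exact command layout are assumptions, not the exporter's verified output format):

```python
def rz1_speed(elapsed_frames, fps):
    """Equation (1): Speed = (1 / elapsed_time) * speed_multiplier.

    `elapsed_frames` is the gap between keyframes in Maya time units;
    the multiplier is derived from the scene's time unit, assumed here
    to be frames per second. The mapping onto RZ1 SPEED values is a
    sketch, not the exporter's exact calibration.
    """
    return (1.0 / elapsed_frames) * fps

def emit_commands(angles, elapsed_frames, fps=30):
    """Produce the SPEED and MOVE G24 command pair for one keyframe."""
    return [
        "SPEED %d" % round(rz1_speed(elapsed_frames, fps)),
        "MOVE G24," + ",".join(str(int(a)) for a in angles),
    ]
```

A half-second gap at 30 fps (15 frames) thus yields a proportionally faster SPEED value than a full-second gap, matching the inverse relationship in Equation (1).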

Fig. 2. Maya generation of animation sequence.

Fig. 3. Import of Maya sequence for validation in RZ1Action.

6. Animation Import
The animation importer, also written in MEL, is capable of reading a RoboBasic script and parsing it accordingly. Maya reads through the file on a per-line basis, checking the first word of each line against a library of known commands. If the word is determined to be a recognized command, the script reads the remainder of the line as its associated parameters. Once a string of parameters is caught, it is tokenized, attached to the command from which it came, and interpreted into a change in time and/or the placement of keyframe data. An unrecognized command is still caught and read, but exported separately so that we can determine what went uninterpreted. Currently there are actions associated with all of the primary and many of the secondary movement commands. The following keywords have Maya actions assigned to them: MOVE G24, MOVE G6A, MOVE G6B, MOVE G6C, MOVE G6D, DELAY, DIR, and SPEED. The importer is also able to determine when a new subfunction is declared, and which commands fall within that subfunction. While this method proved acceptable and straightforward in early forward kinematic test rigs, the introduction of IK to the system resulted in a few challenges. Because the goal is that the animation should be completely customizable (appended to, edited, blended, etc.), additional steps must be taken in order to preserve this flexibility. For instance, where in an FK assembly you would simply read the joint angles from the file and assign these values to the corresponding joints, in an IK rig you must first disable all IK solvers, use the new values to position the joints without keying them, translate/rotate the IK control object into its precise new position, key the control object, and then re-enable the IK solvers. This ensures that when the user goes to edit the scene, all keyframes will be located on the expected objects, with nothing broken due to constraint conflicts.
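The importer's per-line dispatch can be modeled in a few lines of Python (a sketch only: the real importer is MEL and also writes keyframes into the Maya scene; here lines are merely classified and tokenized):

```python
# Keywords the paper lists as having Maya actions assigned to them.
KNOWN = {"MOVE G24", "MOVE G6A", "MOVE G6B", "MOVE G6C", "MOVE G6D",
         "DELAY", "DIR", "SPEED"}

def parse_script(text):
    """Split a RoboBasic script into recognized and unrecognized lines.

    Recognized lines become (keyword, [params]) pairs; unrecognized
    lines are collected separately, mirroring the importer's behavior
    of exporting them so we can see what went uninterpreted.
    """
    recognized, unrecognized = [], []
    # Longest keywords first so "MOVE G24" wins over any shorter prefix.
    keywords = sorted(KNOWN, key=len, reverse=True)
    for raw in text.splitlines():
        line = raw.strip()
        if not line:
            continue
        # Simple prefix match; a production parser would also check
        # token boundaries.
        key = next((k for k in keywords if line.startswith(k)), None)
        if key is None:
            unrecognized.append(line)
            continue
        params = line[len(key):].strip().lstrip(",")
        recognized.append(
            (key, [p.strip() for p in params.split(",") if p.strip()]))
    return recognized, unrecognized
```

Dispatching on a keyword table like this is what makes it cheap to add actions for further secondary commands later.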
Currently at approximately 1,300 total lines of code, the Maya RZ1 Importer/Exporter script provides nearly the same functionality as the RZ1 Action simulator, and does so with a more natural way of controlling the Robonova. In addition, Digital Media student Joyce Tong has produced a plug-in version of the importer using the Maya C++ API, which we will use to compare the two approaches.

7. Future Work
While the initial work has demonstrated the feasibility of this approach, the true benefit lies in creating complex motions which exploit the full mobility of the HUBO platform. This requires several complex extensions which we will be investigating.

7.1 Motion Capture

Fig. 4. Robonova playback of animation sequence.


Motion retargeting is crucial for the generation of animations based on motion capture. Maya has native support for retargeting via its FBX extension in addition to the algorithms described above, and we will be testing these various approaches using our Vicon motion capture studio to generate animations and import them into Maya.

7.2 Balance
Balance is obviously a key consideration for any controller approach. Although HUBO does have controllers to maintain balance, these rely on small adjustments to motion curves. In addition, the Robonova has no such feedback system. Therefore, our goal is to simulate the physics the real robot will undergo as closely as possible. Maya does not have native support for bipedal balance, but there are several approaches that we plan to investigate. In addition to Maya's native support of rigid- and soft-body physics, an open-source plug-in is available for Maya that integrates the PhysX physics engine from AGEIA and supports COLLADA physics. This plug-in, called Nima, can simulate a ragdoll directly inside Maya and enables the user to interact with the model through the standard Maya manipulators. The reliance on Zero Moment Point methods for balance is well-established in bipedal robots [5] and is used by robots such as Honda's ASIMO. Using this approach, the robot's onboard computer attempts to keep the total inertial forces (gravity plus the accelerations and decelerations of walking and movement) exactly opposed to the contact reaction force (the force of the floor pushing back on the robot's foot) so that the two forces cancel out, leaving no moment to cause the robot to rotate and fall over [6]. As the ZMP is affected by the mass and inertia of the robot's entire structure, including the torso, its motion typically would require large ankle torques or the use of small trunk motions for balance. If the ZMP is confined to a stability region through the use of motion planning, the motion of the robot is more balanced and natural.
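The ZMP criterion described above is commonly evaluated with the cart-table simplification (a point-mass center of mass at roughly constant height); a minimal sketch under that assumption:

```python
G = 9.81  # gravitational acceleration, m/s^2

def zmp_x(com_x, com_z, com_x_accel):
    """Zero Moment Point along one axis under the cart-table model.

    A standard simplification for a point-mass CoM at height com_z:
        x_zmp = x_com - (z_com / g) * x_com_accel
    The motion remains balanced while x_zmp stays inside the support
    polygon of the feet; accelerating the CoM shifts the ZMP in the
    opposite direction.
    """
    return com_x - (com_z / G) * com_x_accel

def inside_support(x_zmp, x_min, x_max):
    """Stability check against a 1-D slice of the support polygon."""
    return x_min <= x_zmp <= x_max
```

A CoM at rest projects the ZMP straight down onto the floor, which is why static poses are easy to keep stable while dynamic gestures require the motion-planning constraints discussed above.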
In computer graphics, the notion of balance is often found in the problem of motion retargeting, and solutions have often followed variations of spacetime constraints [7]. Tak and Ko [8] implemented motion retargeting which addresses dynamic balance based on the ZMP as well as real-world torque limits; notably, this work was actually implemented as a Maya plug-in. More recently, Abe et al. [9] extended such control to include the frictional properties, the mass and posture of the figure, and the actions being performed. Maya also supports a plug-in for NaturalMotion's Endorphin software, which uses the company's Dynamic Motion Synthesis (DMS) to create bipedal characters that will balance and respond to their environment. While not capable of any derived logic, Endorphin characters can respond to impacts and other physical forces. The plug-in support allows for a cohesive workflow between animation and simulation. This technique takes into account the physical properties of the character, such as mass, and is composable with keyframed animated sequences. In addition, NaturalMotion has released euphoria, the real-time incarnation of DMS for use in next-gen gaming consoles.

7.3 Artificial Intelligence
The field of artificial intelligence and adaptive behavior, as applied to pre-rendered animations, has evolved rapidly over the last several years. One leader in the field is Massive Software, which initially developed its Massive Prime software for simulating large-scale autonomous agents for the Lord of the Rings trilogy. Recently, the software was extended by Hanson Robotics for use in their "Zeno" robot that debuted at Wired NextFest in the Fall of 2007. The combination produced a robot that is capable of learning from its environment and adapting its behavior accordingly; this is possible due to the inherent vision, sound, and physics inputs already available in Massive Prime. We will investigate this approach as support for our animation authoring, to provide feedback during the creation process while still permitting the artistic freedom currently lacking in most approaches.

References
[1] F. Kanehiro, H. Hirukawa, and S. Kajita, "OpenHRP: Open Architecture Humanoid Robotics Platform," The International Journal of Robotics Research, Vol. 23, No. 2, pp. 155-165, 2004.
[2] M. Stilman, P. Michel, J. Chestnutt, K. Nishiwaki, S. Kagami, and J. Kuffner, "Augmented Reality for Robot Development and Experimentation," Tech. Report CMU-RI-TR-05-55, Robotics Institute, Carnegie Mellon University, November 2005.
[3] Interbots: http://www.etc.cmu.edu/projects/ibi/
[4] I.W. Park, J.Y. Kim, and J.H. Oh, "Online Biped Walking Pattern Generation for Humanoid Robot KHR-3 (KAIST Humanoid Robot - 3: HUBO)," 6th IEEE-RAS International Conference on Humanoid Robots, pp. 398-403, 2006.
[5] Q. Li, A. Takanishi, and I. Kato, "Learning Control of Compensative Trunk Motion for Biped Walking Robot Based on ZMP Stability Criterion," Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 1, pp. 597-603, 1992.
[6] K. Hirai, M. Hirose, Y. Haikawa, and T. Takenaka, "The Development of Honda Humanoid Robot," IEEE International Conference on Robotics & Automation, pp. 1321-1326, 1998.
[7] A. Witkin and M. Kass, "Spacetime Constraints," Computer Graphics (Proc. SIGGRAPH '88), Vol. 22, pp. 159-168, 1988.
[8] S. Tak and H. Ko, "A Physically-Based Motion Retargeting Filter," ACM Transactions on Graphics, Vol. 24, No. 1, pp. 98-117, 2005.
[9] Y. Abe, M. da Silva, and J. Popović, "Multiobjective Control with Frictional Contacts," Eurographics/ACM SIGGRAPH Symposium on Computer Animation, 2007.