ARTICLE International Journal of Advanced Robotic Systems
Design and Implementation of a Remote Control System for a Bio-inspired Jumping Robot Regular Paper
Jun Zhang*, Guifang Qiao, Guangming Song, and Aimin Wang School of Instrument Science and Engineering, Southeast University, Nanjing, China * Corresponding author E-mail:
[email protected]
Received 26 Jun 2012; Accepted 27 Jul 2012 DOI: 10.5772/51931 © 2012 Zhang et al.; licensee InTech. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract This paper presents the design and implementation of a remote control system for a bio-inspired jumping robot. The system is composed of a server, a gateway, and a jumping robot. The proposed remote control system is used to monitor the posture of the jumping robot and to control it from remote places. A three-axis accelerometer is used to detect the tilts of the robot, and a compass is used to sense its azimuth. Calibrations of the accelerometer and the compass are conducted. The sensor data of the robot are sent to the server through a ZigBee wireless sensor network (WSN). An algorithm is designed to calculate the posture of the robot from the sensor data. The posture of the robot is displayed on the human-computer interface of the server using the virtual reality technology of OpenGL, and the robot can be controlled by the operator through the interface. Two experiments have been done to verify the posture detection method and test the performance of the system.

Keywords Jumping Robot; Remote Control; Self-Righting; Steering; Posture Detection; Accelerometer; Compass; OpenGL
1. Introduction

Mobile robots have been widely used to execute difficult tasks in working environments that are dangerous or even unreachable by humans. Most traditional mobile robots use wheeled locomotion. In [1, 2], wheeled robots are used in WSN systems to detect intruders in indoor environments. In [3], the authors use a wheeled mobile robot as a sink node that cooperates with other static sensor nodes. Wheeled robots have always been confronted with the problem of locomotion on uneven terrain: if obstacles are higher than the radius of their wheels, wheeled robots cannot overcome them efficiently. Bio-inspired robots have efficient locomotion capabilities and have received increasing attention from researchers worldwide. Many bio-inspired robots have been designed: for example, the bipedal robot in [4] based on studying human locomotion, the quadruped walking robot LittleDog in [5] based on studying four-legged animals, and the hexapod robot RHex in [6] based on studying six-legged insects.
Creatures with jumping locomotion capabilities, such as frogs, locusts, and kangaroos, provide inspiration for researchers, who have designed jumping robots such as the Jollbot in [7], the JPL Hopper in [8], the Mini-Whegs in [9], the Grillo in [10], the EPFL jumping robot in [11], and the MSU robot in [12]. With their ability to quickly overcome obstacles and avoid risks, jumping robots can be applied in many fields, such as planetary exploration [8], search and rescue [13], surveillance, and scouting [14]. Mounted with sensors and wireless communication devices, these robots are able to enter dangerous and unfriendly environments to execute their missions. Inspired by the jumping locomotion of the locust, this paper presents the design of a bio-inspired jumping robot.

Robots designed for dangerous environments should work autonomously or be controlled by humans from remote places. Remote control of robots has been studied by many researchers. A feedback control strategy is proposed in [15] to control mobile robots in remote places, with the control commands sent to the robot via the Internet. To achieve high availability, performance, scalability, fault tolerance, and security, a server-decentralized Internet architecture for robot remote control is proposed in [16], in which the control commands and robot state data are delivered through XML streams. Wireless LAN is selected for tele-operation of a mending robot in [17]. A 3D graphical model of the robot and the environment is combined with web-based control software to develop a robot remote control system in [18]. Ethernet is chosen for long-distance data transmission between the robot system and the control station in [19].

Virtual reality technology is adopted by many researchers to realize the simulation and remote control of robots. In [17], OpenGL and 3DS MAX are used to establish the virtual environment and model the virtual robot. A three-dimensional real-time kinematic simulation system for a four-legged robot using OpenGL is presented in [20]. Visual C++ and OpenGL are used to build simulation systems for robots in [21-23]. The authors in [24] use OpenGL to simulate the movement of a team of microrobots. OpenGL is used to render the 3D graphics in a visual information processing system in [25]. In this paper, we use Visual C++ and OpenGL to design a human-computer interface that shows the posture and state of the jumping robot. A three-axis accelerometer is used to detect the posture of the robot, while a compass is adopted to sense its movement direction. The data from these two kinds of sensors are received by the server through a WSN and processed there, and the posture and state of the robot are shown in the interface. Operators in remote places can monitor the visualized 3D model of the robot and control the robot effectively.

Accelerometers are used in many robot systems to detect tilt angles. The authors of [26] propose a tilt sensing scheme with three-axis accelerometers. In [27], the tilt of a mobile robot is obtained from accelerometers. In [28], a single-axis accelerometer is used to sense the angle of a probe for surface identification. MEMS accelerometers are used to sense tilt in [29], where an accuracy of 0.3° over the full measurement range of pitch and roll is reported. In [30], a static error detection system for an acceleration sensor is designed for inclination measurement. Compasses are used in many systems to sense azimuth. An anisotropic magnetoresistive (AMR) sensor is used in the navigation systems in [31]. Various errors in the orientation detection of compass attitude information are investigated and quantified in [32]. Several compass applications in vehicle detection and navigation systems based on magnetic field sensing are presented in [33]. In [34], the application of electronic compasses in mobile robots is discussed; different error sources are considered, and solutions are proposed to correct them. In [35], the absolute heading of a robot with respect to magnetic North is detected using an electronic compass, and the authors advise placing the compass as far as possible from the motors to reduce interference. In [36], an efficient tilt compensation algorithm for a low-cost two-axis magnetic compass is presented.

Jumping robots need to detect their postures and locomotion directions so that operators can control them remotely. This paper presents the design of a remote control system for a jumping robot. The system architecture is introduced in Section 2. The system design and working principles are presented in Section 3.
The testbed setup and two experiments are given in Section 4. Section 5 concludes the presented work and introduces future work.

2. System overview

The architecture of the remote control system is shown in Figure 1. It is composed of a server, a gateway, and a jumping robot. The server runs software for saving the sensor data and displaying the postures and states of the robot. The gateway connects to the server through a serial interface and builds a ZigBee-protocol WSN for collecting data from the jumping robot and sending control commands to it. The jumping robot joins the WSN and communicates with the gateway, and it is controlled by the operator in a remote place through the WSN. The postures and the states of the robot
Int J Adv Robotic Sy, 2012, Vol. 9, 117:2012
Figure 1. Architecture of the proposed robot control system. It is composed of a server, a gateway, and a jumping robot.
are detected by a three-axis accelerometer and a digital compass. The sensor data are collected and sent to the server, which saves them into a database. The sensor data are processed to get the posture of the robot, and the 3D model of the robot is displayed on the human-computer interface of the server in real time. The operator can monitor the states of the robot and control it remotely.

3. System design and working principles

3.1 Jumping Robot

The CAD model of the proposed jumping robot is shown in Figure 2. The robot is composed of a body frame, a jumping mechanism, an adjusting mechanism, a three-axis accelerometer, a digital compass, two infrared sensors, a control board, and a lithium battery. The jumping mechanism consists of a DC motor, a reduction gear box, a cam, a main leg, and four torsion springs. The detailed mechanical design is presented in [37]. The adjusting mechanism is composed of a DC motor, a pole leg, and an additional weight. The robot always falls on its left, right, or front side after landing. The pole leg is used to prop the robot up and to steer it on level ground. The detailed self-righting and steering principles are presented in [38, 39].
Figure 3. Prototype of the proposed jumping robot.
Figure 3 shows the prototype of the proposed jumping robot. The code wheel on the surface of the cam, combined with an infrared sensor, provides feedback control for the jumping motor. The compass is fixed on the foot of the robot to sense its locomotion azimuth.

3.2 Sensing of the Jumping Robot

3.2.1 Tilt Sensing with the Accelerometer

The three-axis accelerometer is used to detect the tilts of the jumping robot. The tilts are determined from the components of the gravitational acceleration. As shown in Figure 4, when the sensor rotates by angle α about axis x0 and by angle β about axis y0, the gravitational acceleration g projects onto the three sensitive axes as Ax, Ay, and Az. The angle between Ax and g is θx, the angle between Ay and g is θy, and the angle between Az and g is θz. From Figure 5 we can see that β is the complementary angle of θx; similarly, α is the complementary angle of θy. The rotation matrix equation is shown in expression (1).
Figure 2. The 3D model of the jumping robot.
Figure 4. Components of the gravitational acceleration after tilts of α and β.
Figure 5. The relationship between θx and β.

Rxy(α, β) = R(x, α)R(y, β)  (1)

The relations between Ax, Ay, Az, and g can be expressed as follows:

Ax² + Ay² + Az² = g²  (2)

Ay = g sin α  (3)

Ax = g sin β  (4)

Az = g cos θz  (5)

Figure 6. The 3D model of the calibration platform for the tilt angle test.

From (3) and (4), we can get

α = arcsin(Ay/g)  (6)

β = arcsin(Ax/g)  (7)

Ax and Ay can be detected by the three-axis accelerometer, so α and β can be calculated from expressions (6) and (7). The sensor used in this robot is the LSM303DLH, which combines a 3-axis accelerometer and a 3-axis magnetometer. The platform used to test the performance of the accelerometer is shown in Figure 6. The base is put on the desk and leveled using four levelling screws. The degree code board is fixed on the side face of the base. The rotation board can rotate in the vertical plane. The sensor board is fixed on the rotation board in position A to test the tilt when axis x is rotating, and in position B to test the tilt when axis y is rotating. The calibration platform used for the tilt testing is shown in Figure 7. The rotation board rotates step by step, 5 degrees every step. The acceleration data are collected by the test board and sent to the server. The results of the tilt test when axis x is rotating are shown in Figure 8. Ax, Ay, and Az are the actual testing values, while AxT, AyT, and AzT are the theoretical values. AxT is zero all the time during the rotation about axis x, AyT is a sine function of angle α, and AzT is a cosine function of angle α. Ax, Ay, and Az are close to AxT, AyT, and AzT, respectively. The error of Ay is shown in Figure 9. The maximal error is −53 mg at 50°. Using the testing value of 713.03 mg at 50° in (6) gives α = 45.48°, so the error of the α test is 4.52°. As can be seen in Figure 8, the resolution of Ay is low near 90°, so the angle detection is not precise there. Using Ay = 952.69 mg at 90° in expression (6) gives α = 72.31°, an error of 17.69°, which is unacceptable for tilt detection.
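As a minimal sketch (ours, not the authors' code), expressions (6) and (7) can be implemented directly; the clamping of noisy readings to the arcsin domain is an added safeguard, not part of the paper:

```python
import math

G_MG = 1000.0  # gravity expressed in mg, the unit used for the sensor readings

def tilt_angles(ax_mg, ay_mg):
    """Tilt angles per expressions (6) and (7): alpha = arcsin(Ay/g), beta = arcsin(Ax/g).

    ax_mg, ay_mg: gravity components on the x and y axes in mg.
    Returns (alpha, beta) in degrees. Readings are clamped to [-g, g] so
    sensor noise cannot push the arcsin argument out of its domain.
    """
    clamp = lambda v: max(-1.0, min(1.0, v / G_MG))
    alpha = math.degrees(math.asin(clamp(ay_mg)))
    beta = math.degrees(math.asin(clamp(ax_mg)))
    return alpha, beta
```

For example, `tilt_angles(0.0, 500.0)` gives α = 30° and β = 0°, matching a 30° rotation about the x axis, where Ay = g sin 30° = 500 mg.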
Figure 7. The calibration platform used for the tilt angle testing.
Figure 8. Acceleration of Ax, Ay, Az with the changing of the tilt angle α.
Figure 12. Acceleration of Ax, Ay, Az with the changing of the tilt angle β.
Figure 9. The acceleration error of Ay.
Figure 13. The acceleration error in axis y.
Figure 10. The sine function fitting of Ay.
The tilt angle of the robot is mostly between 0° and 90°, so we use (9) to calculate the fitted value of the angle. By comparing the fitted angles with the tested angles, we get the error of α, which is shown in Figure 11. The maximal error is −9.75° at 90°. The error using (9) is smaller than that using (6). If we take 0° to 70° as the acceptable detection range, the angle error is less than 2.3°.
Figure 11. The error of α from the inverse function testing.
We use a fitting method to fit Ay as a sine function of α. The result is shown in Figure 10. The fitting function is (8); its inverse function (9) will be used to calculate the tilt angle α.

Ay = 30.99275 + 981.73307 sin(π(α − 0.50729)/179.79139)  (8)

α = 0.5073 + (179.7914/π) arcsin(−0.0316 + 0.0010 Ay)  (9)

Using the same methods and procedures, the data from Ax, Ay, and Az are collected when axis y is rotating. Their relationships with the angle β are shown in Figure 12. Ax is used to calculate the tilt angle β. The error of Ax is shown in Figure 13. The maximal error is −19.30 mg at 55°. Using the testing value of 799.88343 mg at 55° in (7) gives β = 53.12°, an angle error of −1.88°. Similarly, using Ax = 988.05 mg at 90° in expression (7) gives β = 81.13°, an error of −8.87°.

Ax is fitted with a sine function in the same way, as shown in Figure 14. The fitting function is (10); its inverse function (11) will be used to calculate the tilt angle β.

Ax = 10.10042 + 994.14526 sin(π(β − 0.00436)/179.93949)  (10)

β = 0.0044 + (179.9395/π) arcsin(−0.0102 + 0.0010 Ax)  (11)
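A sketch of the calibrated inverse sine fits, expressions (9) and (11), in Python (the coefficients are taken from the fits; the sign conventions follow from inverting a sine fit of the form y0 + A sin(π(x − xc)/w) and are our reading of the printed formulas):

```python
import math

def alpha_from_ay(ay_mg):
    # Inverse sine fit (9): alpha = 0.5073 + (179.7914/pi) * arcsin(-0.0316 + 0.0010*Ay)
    return 0.5073 + (179.7914 / math.pi) * math.asin(-0.0316 + 0.0010 * ay_mg)

def beta_from_ax(ax_mg):
    # Inverse sine fit (11): beta = 0.0044 + (179.9395/pi) * arcsin(-0.0102 + 0.0010*Ax)
    return 0.0044 + (179.9395 / math.pi) * math.asin(-0.0102 + 0.0010 * ax_mg)
```

For example, a reading at the zero-crossing of the fit, Ay = 31.6 mg, maps back to α ≈ 0.51°, the fitted offset of expression (9).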
Figure 17. The calibration platform used for the azimuth testing.
Figure 14. The sine function fitting of Ax.
Figure 18. The compass output in axis x and y.
Figure 15. The error of β from the inverse function testing.
Figure 19. The azimuth of the compass compare to the theoretical value.
Figure 16. The 3D model of the calibration platform for azimuth test.
By comparing the fitted angles with the tested angles, we get the error of β, which is shown in Figure 15. The maximal error is −3.42° at 90°. If we take 0° to 70° as the acceptable detection range, the absolute error is less than 1.2°.
3.2.2 Azimuth Sensing with the Digital Compass

The digital compass is used to detect the locomotion direction of the robot in the proposed system. The principle of azimuth sensing is that the compass is sensitive to the earth's magnetic field. The azimuth γ is given by expression (12):
γ = arctan(Cy/Cx) · 180/π  (12)
where Cy is the component of the magnetic field on axis y and Cx is the component of the magnetic field on axis x. The compass is easily influenced by ferrous objects, the motors of the robot, and electrical wires near the system, so tests are conducted to get the real relationship between the tested values and the theoretical values. The 3D model of the calibration platform for the azimuth testing is shown in Figure 16. The base is put on the desk. The degree code board is fixed on the surface of the base. The sensor board is fixed on the rotation board, which rotates about the z axis. The calibration platform used for the azimuth testing is shown in Figure 17. The outputs of the x and y axes are shown in Figure 18. We use expression (12) to calculate the azimuth of the compass. The results are shown in Figure 19. The theoretical value is a straight line, and the testing curve is close to it. The error of the azimuth is shown in Figure 20. The maximal error is about 25.76° at a direction angle of 240°.
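A bare arctan(Cy/Cx) fails when Cx = 0 and cannot distinguish quadrants; in practice an atan2-style computation covers the full 0°–360° circle. A minimal sketch (ours, not necessarily the authors' implementation):

```python
import math

def azimuth_deg(cx, cy):
    """Azimuth from the horizontal magnetic-field components.

    atan2 supplies the quadrant (and handles the Cx = 0 case) that a bare
    arctan(Cy/Cx) misses; the result is normalised to [0, 360) degrees.
    """
    return math.degrees(math.atan2(cy, cx)) % 360.0

print(azimuth_deg(1.0, 0.0))    # field along +x: 0 degrees
print(azimuth_deg(0.0, 1.0))    # field along +y: 90 degrees
print(azimuth_deg(-1.0, 0.0))   # 180 degrees
print(azimuth_deg(0.0, -1.0))   # 270 degrees
```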
A 5th-order polynomial is used to fit the inverse function of the testing-value function. The result is shown in Figure 21. The polynomial is (13):

y = 0.04603 + 1.13239x + 0.00292x² − 4.77048×10⁻⁵x³ + 1.79301×10⁻⁷x⁴ − 2.00578×10⁻¹⁰x⁵  (13)
The residual of the azimuth after fitting is shown in Figure 22. The error is less than 1.3°.
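The correction polynomial (13) can be applied efficiently with Horner's rule. In this sketch the alternating coefficient signs are an assumption on our part, chosen so that the fit maps the 0°–360° range back onto itself:

```python
# Coefficients of the 5th-order azimuth correction (13), lowest order first.
# The alternating signs are an assumption, chosen so the polynomial maps
# the 0-360 degree range approximately onto itself.
COEFFS = (0.04603, 1.13239, 0.00292, -4.77048e-5, 1.79301e-7, -2.00578e-10)

def correct_azimuth(measured_deg):
    """Evaluate the calibration polynomial with Horner's rule."""
    result = 0.0
    for c in reversed(COEFFS):
        result = result * measured_deg + c
    return result
```

A quick sanity check: `correct_azimuth(0.0)` returns the constant term 0.04603, and a full-circle input of 360° comes back close to 360°.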
3.3 Remote Control and Display
In this paper, a remote control system based on a wireless sensor network is designed to control the jumping robot. The sensor data are sent to the PC server, which processes the data and displays the posture and state of the robot. The stability of the remote control is mainly influenced by the time delay of the wireless transmission of sensor data and control commands. The remote control of self-righting and steering is sensitive to this delay: if the pole leg is not commanded to stop after self-righting, the robot will fall down again. The biggest time delay that the proposed remote control system can tolerate is determined by the rotation speed of the self-righting motor. The no-load speed of the adjusting motor is 33 rpm, so the time taken to command the pole leg to stop rotating must be less than 2 s. In the proposed control system, the longest distance from which the robot can be controlled is about 300 m in barrier-free outdoor environments.
The 3D model of the robot is established using 3DS MAX and imported into Visual C++ software as a 3DS file. The 3D posture display of the robot is designed based on OpenGL. The operator can monitor the posture of the robot and control the robot in the remote places.
Figure 20. The error of azimuth testing.
The algorithm used in the 3D posture display is shown in Figure 23. When receiving data from the serial port, the server extracts the accelerometer data and compass data, then calculates the tilt angles α and β and the azimuth γ. If this is the first sensor data package received by the server, the server rotates the 3D model of the robot by α, β, and γ about the x, y, and z axes one after another. The rotation matrix is shown in expression (14).
Rxyz(α, β, γ) = R(x, α)R(y, β)R(z, γ)  (14)
Figure 21. 5 order fitting of the azimuth of the compass.
Then the server displays the 3D model in the software interface. The angles α, β, and γ are saved as α′, β′, and γ′ for the processing of the next rotation and display. If the sensor data package is not the first one, the 3D model is first rotated by γ′, β′, and α′ about the z, y, and x axes one after another, back to its previous state. The rotation matrix is shown in expression (15).
Rzyx(γ′, β′, α′) = R(z, γ′)R(y, β′)R(x, α′)  (15)
Then the server rotates the 3D model by the α, β, and γ calculated from the new data package about the x, y, and z axes one after another, using the same matrix as expression (14).
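The undo-then-apply update of expressions (14) and (15) can be sketched in pure Python (our illustration; the paper's implementation uses OpenGL, whose glRotatef post-multiplies the current matrix, and our reading is that undoing the saved pose uses negated angles):

```python
import math

IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

def rot(axis, deg):
    """3x3 rotation matrix about axis 'x', 'y' or 'z' by deg degrees."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return {'x': [[1, 0, 0], [0, c, -s], [0, s, c]],
            'y': [[c, 0, s], [0, 1, 0], [-s, 0, c]],
            'z': [[c, -s, 0], [s, c, 0], [0, 0, 1]]}[axis]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def update_pose(model, prev, new):
    """One display step: rotate back through the saved pose (gamma', beta',
    alpha' about z, y, x, as in (15), with negated angles) and then apply
    the new pose (alpha, beta, gamma about x, y, z, as in (14)).
    Post-multiplication mimics successive glRotatef calls."""
    a, b, g = prev
    for axis, ang in (('z', -g), ('y', -b), ('x', -a)):
        model = matmul(model, rot(axis, ang))
    a, b, g = new
    for axis, ang in (('x', a), ('y', b), ('z', g)):
        model = matmul(model, rot(axis, ang))
    return model
```

Starting from the identity, applying pose (30°, 40°, 50°) and then updating to (10°, 20°, 30°) leaves the model in the same orientation as applying (10°, 20°, 30°) directly, which is exactly why the algorithm stores α′, β′, and γ′.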
Figure 22. The residual of the azimuth after fitting.
Figure 23. 3D pose display algorithm.
Figure 25. The pose of the robot during self‐righting from left side.
Figure 24. Pose display of the robot model during self‐righting from left side.
4. Experiments

A testbed is built to test the posture detection method and the remote control of the jumping robot. The performance of the accelerometer and the compass is tested, and the robot posture calculation and display methods are verified by the self-righting and steering of the jumping robot.

4.1 Posture Display of the Robot during Self-Righting

The accelerometer and compass data are sent to the PC server. Using the algorithm proposed in Section 3.3, the server processes the data, calculates the tilt angles and azimuth, and displays the 3D model of the jumping robot dynamically. Figure 24 shows the posture display of the robot model during self-righting from the left side, and the corresponding posture of the robot is shown in Figure 25. Figure 26 shows the posture display of the robot model during self-righting from the right side, and the corresponding posture of the robot is shown in Figure 27. The operator can monitor the posture of the robot and control the pole leg to prop the robot up. The real-time display of the posture provides visualized monitoring and control for the operator.
Figure 26. Pose display of the robot model during self‐righting from right side.
Figure 27. The pose of the robot during self‐righting from right side.
Figure 28. The pose of the robot model during steering.
Figure 29. The pose of the robot during steering.
4.2 Posture Display of the Robot during Steering

When adjusting its locomotion direction, the robot is controlled to steer according to the azimuth from the compass. The posture of the 3D model during steering is shown in Figure 28, and the posture of the robot during steering is shown in Figure 29.

5. Conclusion

We have presented a remote control system for a jumping robot. The postures of the robot are detected by a three-axis accelerometer, and the azimuth of the robot is sensed by a compass. The sensor information is sent to a PC server
using a WSN. The server processes the information and displays the postures and states of the robot. By monitoring the states of the robot, operators can control it to self-right and steer from remote places. Experimental results verify the validity of the proposed methods and the performance of the system. The real-time display of the robot posture provides visualized monitoring and control for the operator. Future work will focus on two aspects: 1) localization and trajectory planning for continuous jumping of our robot, and 2) sensing of the robot's working environment based on a micro-camera.

6. Acknowledgments

The research reported in this paper was carried out at the Robotic Sensor and Control Lab, School of Instrument Science and Engineering, Southeast University, Nanjing, Jiangsu, China. This work was supported in part by the Scientific Research Foundation of Graduate School of Southeast University under Grant YBJJ-1221, the Research & Innovation Program for Graduate Students in Universities of Jiangsu Province under Grant CXLX-0102, the Program for New Century Excellent Talents in University under Grant NCET-10-0330, and the Natural Science Foundation of Jiangsu Province under Grant BK2011254.

7. References

[1] C. Lin, S. Yang, H. Chen, and K. Song, "Mobile robot intruder detection based on a Zigbee sensor network," in Conf. Proc. IEEE Int. Conf. Syst. Man Cybern., pp. 2786-2791, Singapore, Oct. 12-15, 2008.
[2] Y. Li and L. E. Parker, "Detecting and monitoring time-related abnormal events using a wireless sensor network and mobile robot," in IEEE/RSJ Int. Conf. Intelligent Rob. Syst., pp. 3292-3298, Nice, France, Sept. 22-26, 2008.
[3] J. Takahashi, T. Yamaguchi, K. Sekiyama, and T. Fukuda, "Communication Timing Control and Topology Reconfiguration of a Sink-Free Meshed Sensor Network with Mobile Robots," IEEE/ASME Trans. on Mechatronics, vol. 14, no. 2, pp. 187-197, April 2009.
[4] S. Collins, A. Ruina, R. Tedrake, and M. Wisse, "Efficient Bipedal Robots Based on Passive-Dynamic Walkers," Science, vol. 307, pp. 1082-1085, 2005.
[5] F. Doshi, E. Brunskill, A. Shkolnik, T. Kollar, K. Rohanimanesh, R. Tedrake, and N. Roy, "Collision Detection in Legged Locomotion using Supervised Learning," in Int. Conf. Intelligent Rob. Syst., pp. 317-322, San Diego, CA, USA, Oct. 29 - Nov. 2, 2007.
[6] E. Sayginer, T. Akbey, Y. Yazicioglu, and A. Saranli, "Task oriented kinematic analysis for a legged robot with half-circular leg morphology," in IEEE Int. Conf. Robot. Autom., pp. 4088-4093, 2009.
[7] R. Armour, K. Paskins, A. Bowyer, J. F. V. Vincent, and W. Megill, "Jumping robots: a biomimetic solution to locomotion across rough terrain," Bioinspiration and Biomimetics, vol. 2, pp. 65-82, 2007.
[8] J. Burdick and P. Fiorini, "Minimalist jumping robot for celestial exploration," Int. J. Robot. Res., vol. 22, no. 7, pp. 653-674, 2003.
[9] B. G. A. Lambrecht, A. D. Horchler, and R. D. Quinn, "A Small, Insect Inspired Robot That Runs and Jumps," in IEEE Int. Conf. Robot. Autom., pp. 1240-1245, Barcelona, Spain, 2005.
[10] U. Scarfogliero, C. Stefanini, and P. Dario, "The Use of Compliant Joints and Elastic Energy Storage in Bio-inspired Legged Robots," Mech. Mach. Theory, vol. 44, no. 3, pp. 580-590, 2009.
[11] H. Tsukagoshi, M. Sasaki, A. Kitagawa, and T. Tanaka, "Design of a higher jumping rescue robot with the optimized pneumatic drive," in IEEE Int. Conf. Robot. Autom., pp. 1276-1283, Barcelona, Spain, 2005.
[12] J. Zhao, N. Xi, B. Gao, M. Mutka, and L. Xiao, "Development of a Controllable and Continuous Jumping Robot," in IEEE Int. Conf. Robot. Autom., pp. 4614-4619, Shanghai, China, 2011.
[13] S. A. Stoeter and N. Papanikolopoulos, "Kinematic Motion Model for Jumping Scout Robots," IEEE Trans. Robot. Autom., vol. 22, no. 2, pp. 398-403, 2006.
[14] M. Kovac, M. Schlegel, J. Zufferey, and D. Floreano, "Steerable Miniature Jumping Robot," Auton. Robots, vol. 28, no. 3, pp. 295-306, 2010.
[15] W. Chang and S. Lee, "Autonomous vision-based pose control of mobile robots with tele-supervision," in Proc. IEEE Int. Conf. Control Appl., pp. 1049-1054, Taipei, Taiwan, Sep. 2-4, 2004.
[16] Q. Wang, S. Liu, and Z. Wang, "A New Internet Architecture for Robot Remote Control," in Proc. IEEE/RSJ Int. Conf. Intelligent Rob. Syst., pp. 4989-4993, Beijing, China, Oct. 9-15, 2006.
[17] F. Cui, M. Zhang, G. Cui, and X. Wu, "A Robot Teleoperation System Based on Virtual Reality and WLAN," in World Congr. Intelligent Control Autom., pp. 9193-9197, Dalian, China, June 21-23, 2006.
[18] X. Yang, D. C. Petriu, T. E. Whalen, and E. M. Petriu, "A web-based 3D virtual robot remote control system," in Can. Conf. Electr. Comput. Eng., pp. 955-958, Niagara Falls, Canada, May 2-5, 2004.
[19] L. A. Mateos, M. Sousa, and M. Vincze, "DeWaLoP - Remote Control for In-pipe Robot," in IEEE Int. Conf. Adv. Rob.: New Boundaries Rob., pp. 518-523, Tallinn, Estonia, June 20-23, 2011.
[20] Q. Ling, M. Meng, T. Mei, and H. Lu, "3D Simulation Design Based on OpenGL for Four-legged Robot," in Int. Conf. Robotics Biomimetics, pp. 713-717, Shatin, N.T., China, July 5-9, 2005.
[21] Y. Wang, Y. Gai, and F. Wu, "A robot kinematics simulation system based on OpenGL," in IEEE Conf. Rob. Autom. Mechatronics (RAM) Proc., pp. 158-161, Qingdao, China, Sep. 17-19, 2011.
[22] C. Marcu, G. Lazea, and R. Robotin, "An OpenGL Application for Industrial Robots Simulation," in IEEE Int. Conf. Automation, Quality and Testing, Robotics, pp. 254-259, Cluj-Napoca, Romania, May 25-28, 2006.
[23] Z. Zhang, S. Ma, B. Li, L. Zhang, and B. Cao, "Development of an OpenGL Based Multi-robot Simulating Platform," in Proc. Int. Conf. Control Autom. Rob. Vis., Kunming, China, Dec. 6-9, 2004.
[24] S. Guo, X. Ye, and B. Gao, "Motion Planning of Underwater Multi-microrobot System," in Proc. IEEE ICMA 2008, pp. 690-695, Takamatsu, Japan, Aug. 5-8, 2008.
[25] N. Abe, R. Mizokami, Y. Kinoshita, and S. He, "Providing Simulation of Medical Manipulation with Haptic Feedback," in Proc. Int. Conf. Artif. Real. Telexistence (ICAT), pp. 143-148, Esbjerg, Jylland, Denmark, Nov. 28-30, 2007.
[26] J. Qian, B. Fang, W. Yang, X. Luan, and H. Nan, "Accurate Tilt Sensing With Linear Model," IEEE Sensors Journal, vol. 11, no. 10, pp. 2301-2309, 2011.
[27] L. Ojeda and J. Borenstein, "FLEXnav: Fuzzy Logic Expert Rule-based Position Estimation for Mobile Robots on Rugged Terrain," in IEEE Int. Conf. Robot. Autom., pp. 317-322, Washington, DC, USA, May 11-15, 2002.
[28] P. Giguere and G. Dudek, "A Simple Tactile Probe for Surface Identification by Mobile Robots," IEEE Trans. Rob., vol. 27, no. 3, pp. 534-544, 2011.
[29] S. Łuczak, W. Oleksiuk, and M. Bodnicki, "Sensing Tilt With MEMS Accelerometers," IEEE Sensors Journal, vol. 6, no. 6, pp. 1669-1675, 2006.
[30] X. Chang, L. Chen, H. Wei, and J. Duan, "Design and Realization of Acceleration Sensor Static Error Detection System," in IEEE Int. Conf. Mechatronics Autom. (ICMA), pp. 2259-2263, Beijing, China, August 7-10.
[31] J. Vcelák, P. Ripka, J. Kubík, A. Platil, and P. Kašpar, "AMR Navigation Systems and Methods of Their Calibration," Sensors and Actuators A: Physical, vol. 123-124, pp. 122-128, 2005.
[32] J. Vcelák and J. Kubík, "Influence of Sensor Imperfections to Electronic Compass Attitude Accuracy," Sensors and Actuators A: Physical, vol. 155, no. 2, pp. 233-240, October 2009.
[33] M. J. Caruso and L. S. Withanawasam, "Vehicle Detection and Compass Applications using AMR Magnetic Sensors," Honeywell, http://www.ssec.honeywell.com/magnetic/datasheets/amr.pdf.
[34] L. Ojeda and J. Borenstein, "Experimental Results with the KVH C-100 Fluxgate Compass in Mobile Robots," in Proc. IASTED Int. Conf. Robotics and Applications, Honolulu, Hawaii, USA, Aug. 14-16, 2000.
[35] V. Skvortzov, H. Lee, S. Bang, and Y. Lee, "Application of Electronic Compass for Mobile Robot in an Indoor Environment," in IEEE Int. Conf. Robot. Autom., pp. 2963-2970, Roma, Italy, April 10-14, 2007.
[36] X. Li, R. Kang, X. Shu, and G. Yu, "Tilt-Induced-Error Compensation for 2-Axis Magnetic Compass with 2-Axis Accelerometer," in WRI World Congress on Computer Science and Information Engineering, pp. 122-125, Los Angeles, CA, USA, Mar. 31-Apr. 2, 2009.
[37] J. Zhang, Y. Zhu, H. Wang, and J. M. Zhang, "Design of a bio-inspired jumping robot for rough terrain," in Int. Conf. Environmental Science and Information Application Technology, pp. 40-43, Wuhan, China, Jul. 17-18, 2010.
[38] J. Zhang, G. Song, T. Meng, H. Sun, and G. Qiao, "An Indoor Security System with a Jumping Robot as the Surveillance Terminal," IEEE Trans. Consum. Electron., vol. 57, no. 4, pp. 1774-1781, Nov. 2011.
[39] J. Zhang, G. Song, G. Qiao, Z. Li, and A. Wang, "A Wireless Sensor Network System with a Jumping Node for Unfriendly Environments," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 568240, doi:10.1155/2012/568240.