DTCNN Implementation in a LEGO Mindstorm NXT for Infrared and Ultrasonic Sensor Data Processing
J. Albó-Canals, S. Consul-Pacareu, Jordi Riera-Baburés and X. Vilasis-Cardona
LIFAELS, La Salle - Universitat Ramon Llull, E-08022 Barcelona, Spain
Email:
[email protected]
Abstract—This paper presents an implementation of a DTCNN, programmed entirely on the LEGO Mindstorms NXT robot, that, together with a perceptron, guides the robot around obstacles. The DTCNN processes a map obtained from the ultrasonic and infrared sensors. The main objective of this implementation is to demonstrate the feasibility of implementing this and other CNN applications on the LEGO Mindstorms NXT.
I. INTRODUCTION

This paper presents a DTCNN-based solution to path planning from sensory information (ultrasonic and infrared) in uncertain environments where the size, shape and location of obstacles are unknown. This method is called local path planning with obstacle avoidance, carried out in an on-line manner [1]. To avoid the complexity of a rule-based controller requiring full knowledge of the environment, which takes a long time to design, we propose a DTCNN that processes the sensor map, together with a perceptron neural network with two outputs.

II. INTRODUCTION TO ROBOTIC CONTROL WITH CNNS

Autonomous navigation is still a complex problem that is hard to solve, and complex control solutions have been applied in different ways. Reference [2] implements CNNs for robot guiding in a line-following application. The problem evolves in [3], where complex feature detection and object recognition are used. Other works propose CNNs for long-range vision and path planning [4], and genetic algorithms to obtain the best coefficients for CNN templates [5]. What all of them have in common is the use of a camera as the sensor. Ultrasonic and infrared sensors do not have the same capabilities; however, they are a cheap solution whose performance can be improved through the use of CNNs.

The paper is organized as follows. Section II gives an introduction to robotic control implementations related to CNNs. Then, Section III details the LEGO Mindstorm NXT
specifications which are important for this application. The implementation structure is presented in Section IV, and finally, in Section V, concluding remarks are discussed.

III. LEGO MINDSTORM NXT SPECIFICATIONS

To understand the capabilities and limitations of the implementation, we first introduce the technical specifications of the robot equipment [6].

A. The Central Processing Unit
The processor is a 32-bit ARM7, the AT91SAM7S256. It contains 256 KB of FLASH memory for software and firmware, and 64 KB of RAM for data, and runs at 48 MHz. In addition, the NXT has an 8-bit processor fully dedicated to managing the inputs and outputs of the robot, and another small processor that controls the Bluetooth communications. The USB port is controlled directly by the central processor and reaches a speed of 12 Mbit/s. The CPU also drives a 100x64 pixel LCD screen and an 8-bit resolution sound system that supports sampling frequencies up to 16 kHz.

B. The Infrared Sensor
The LEGO light sensor, based on the SFH 3094 silicon NPN phototransistor, appears to have its peak sensitivity at around an 800 nm wavelength. Since wavelengths above 700 nm are infrared light, we refer to it as an infrared sensor.

C. The Ultrasonic Sensor
The ultrasonic sensor is used to measure the distance from the unit to the next object in its line of sight. It can measure up to 2.5 meters with a precision of up to ±3 cm, and its field of view is 40 degrees per side. The working frequency of the ultrasonic sensor is 40 kHz.
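As a rough illustration (not the paper's code, and independent of the NXT firmware API), the NXT ultrasonic sensor reports distance as a single byte in centimetres, saturating at 255 when nothing is in range. A minimal sketch, under that assumption, of turning one radar sweep of raw readings into a bipolar obstacle map suitable as CNN input:

```python
# Hypothetical helper: convert raw NXT ultrasonic readings (cm, saturated
# at 255) into a bipolar obstacle map; the -1/+1 coding matches the
# bipolar states commonly used as CNN inputs.
OUT_OF_RANGE = 255  # sensor reports 255 cm when nothing is detected

def obstacle_map(raw_readings, threshold_cm=50):
    """Mark each radar step as +1 (obstacle closer than threshold) or -1."""
    return [1 if r < threshold_cm and r != OUT_OF_RANGE else -1
            for r in raw_readings]

# Example sweep of 8 steps covering the field of view:
print(obstacle_map([255, 120, 60, 30, 25, 80, 255, 255]))
# -> [-1, -1, -1, 1, 1, -1, -1, -1]  (only the 30 cm and 25 cm steps hit)
```

The threshold value here is an arbitrary placeholder; in practice it would depend on the robot's speed and stopping distance.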
D. The NXT Motor
The motor works at 9 V. Above 7.2 V the relation between power and speed is linear, so the response of the robot is predictable; linearity decreases if the motor torque exceeds 12 Ncm. The linear equations that define the behaviour of the motor are as follows [7]:

Ra · ia + Kb · w = ea
Kτ · ia − B · w = τd        (1)

where Ra = 6.84, Kb = 0.468, Kτ = 0.317 and B = 1.13×10−3 Nm/rad; ea is the voltage applied to the motor, ia is the motor current, τd is the load torque and w is the angular speed.

IV. IMPLEMENTATION STRUCTURE

A key feature in the design of a mobile robot navigation system is the identification and classification of environment situations. Reference [1] presents seven situations, which have been used to design the perceptron weights. Our flowchart is presented in Fig. 1. Let us remark how the combination of CNN templates simplifies the neural network design: there is only one iteration, or two if we count the use of the last sample in the sensor map processing.
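Eliminating the motor current ia from the two motor equations gives the steady-state angular speed w = (Kτ·ea − Ra·τd)/(Kτ·Kb + Ra·B). A quick numerical check of this closed form with the quoted constants (assuming no load and ea = 9 V):

```python
# Steady-state solution of the NXT motor model with the constants
# quoted in the text (Ra in ohms, torque constants in SI-compatible units).
Ra, Kb, Ktau, B = 6.84, 0.468, 0.317, 1.13e-3

def steady_state_speed(ea, tau_d=0.0):
    """Angular speed w (rad/s) for applied voltage ea and load torque tau_d,
    obtained by eliminating the motor current ia from the two equations."""
    return (Ktau * ea - Ra * tau_d) / (Ktau * Kb + Ra * B)

w = steady_state_speed(9.0)   # no-load speed at 9 V
print(round(w, 2), "rad/s")   # about 18.28 rad/s (~175 rpm)
```

The result is consistent with the NXT motor's nominal no-load speed of roughly 170 rpm at 9 V, which supports the quoted parameter values.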
Fig. 2. Implementation Schematic
V. RESULTS & CONCLUSIONS

For as long as robots have performed non-contact obstacle avoidance, there has been a dilemma over which sensor to use for ranging. An obstacle avoidance application with a DTCNN has been implemented here. The sensor map is processed by the DTCNN to optimize the motor driving response computed by a one-layer neural network. The algorithm flowchart is simpler than in most implementations thanks to DTCNN parallelism [8]. The acquisition process could be improved by using more sensors instead of the radar system. Sensor-based obstacle avoidance cannot be compared to camera vision systems. However, just as CNNs are used as a pre-processing complement in image processing systems [9], these sensors can be considered a safety complement to avoid unexpected collisions. The debate between infrared and ultrasonic sensors is a long-standing one: infrared sensors are useful when light conditions are stable, high accuracy is not needed and a narrow beam is enough; on the other hand, ultrasonic sensors should be used when more accuracy is needed and there are no sound-absorbing obstacles.

REFERENCES
Fig. 1. Algorithm Flowchart
The DTCNN's function is to identify the situation and prepare the input for the perceptron's input layer. The perceptron then determines how each motor has to respond in order to best avoid the obstacle. To optimize the motor response, two sensor maps are used, as shown in Fig. 2. The sensor map is obtained using a radar system: the sensor turns in variable steps depending on the number of samples we want to process. Acquisition time can be improved by placing more sensors around the robot [1]. Several designs propose similar algorithms to solve obstacle avoidance [8].
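To make the DTCNN-then-perceptron pipeline concrete, here is a hedged sketch of one discrete-time CNN iteration over a 1-D sensor map followed by a two-output layer driving the motors. The template values and weights below are placeholders invented for illustration; the paper's actual templates and trained weights are not given:

```python
import math

# Placeholder 1-D DTCNN templates (NOT the paper's values):
A = [0.0, 1.0, 0.0]   # feedback template
Bt = [0.2, 0.6, 0.2]  # control (input) template
I = -0.1              # bias

def dtcnn_step(x, u):
    """One discrete-time CNN iteration on a 1-D map with zero boundary."""
    n = len(x)
    y = [max(-1.0, min(1.0, xi)) for xi in x]      # piecewise-linear output
    x_next = []
    for i in range(n):
        s = I
        for k in (-1, 0, 1):                       # 3-cell neighbourhood
            j = i + k
            if 0 <= j < n:
                s += A[k + 1] * y[j] + Bt[k + 1] * u[j]
        x_next.append(s)
    return x_next

def perceptron(y, W, b):
    """Two-output layer mapping the processed map to left/right motor powers."""
    return [math.tanh(sum(wij * yj for wij, yj in zip(Wi, y)) + bi)
            for Wi, bi in zip(W, b)]

u = [-1, -1, 1, 1, -1]          # obstacle ahead-right in a 5-step sweep
x = dtcnn_step([0.0] * 5, u)    # a single iteration, as in the flowchart
W = [[0.5, 0.5, -0.5, -0.5, 0.5],    # hypothetical weights chosen so the
     [-0.5, 0.5, 0.5, -0.5, -0.5]]   # two outputs steer away from obstacles
left, right = perceptron([max(-1.0, min(1.0, v)) for v in x], W, [0.0, 0.0])
```

The single call to dtcnn_step mirrors the one-iteration remark in Section IV; an on-robot implementation would run this loop over the real radar samples instead of the toy vector u.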
[1] H.R. Beom and H.S. Cho, A Sensor-based Obstacle Avoidance Controller for a Mobile Robot Using Fuzzy Logic and Neural Network, Robotica, Vol. 24, Issue 5, Sept. 2006
[2] X. Vilasis-Cardona et al., Guiding a Mobile Robot with Cellular Neural Networks, Int. J. of Circuit Theory and App., Vol. 30, n. 6, pp. 611-624, 2002
[3] Marco Balsi and X. Vilasis-Cardona, Robot Vision Using Cellular Neural Networks, Autonomous Robotic Systems, Soft Computing and Hard Computing Methodologies and Applications, Part 4: Vision and Perception, pp. 431-450, 2003
[4] Ayse Naz Erkan et al., Adaptive Long Range Vision in Unstructured Terrain, IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2421-2426, 2007
[5] Umair Ali Khan et al., Genetic Algorithm Based Template Optimization for a Vision System: Obstacle Detection, ISTET09, Lübeck, Germany, Jun. 2009
[6] Baum et al., Extreme Mindstorms: An Advanced Guide to Lego Mindstorms, Apress, 2000
[7] Michael Gasperi et al., Extreme NXT: Extending the LEGO MINDSTORMS NXT to the Next Level, Apress, 2007
[8] Eun-Young Park, Practical Study about Obstacle Detecting and Collision Avoidance Algorithm for Unmanned Vehicle, ICCAS, Gyeongju, Korea, Oct. 2008
[9] J. Albó-Canals et al., An Efficient FPGA Implementation of a DTCNN for Small Image Gray-scale Pre-processing, ECCTD09, Antalya, Turkey, Aug. 2009