AN ANDROID INTERFACE FOR AN ARDUINO BASED ROBOT FOR TEACHING IN ROBOTICS
Rodriguez, K., Crespo, J., Barber, R.
University Carlos III de Madrid

Abstract

This work is focused on the development of a platform based on Android to interact with robotic systems based on an Arduino microcontroller. Arduino is an open microcontroller system that simplifies the control of home-made robots and can be used to introduce students to robotics. Android is an OS widely used by students in phones and tablets. A simple robotic platform has been built, using Wi-Fi to communicate with the Android device. An Android interface has been developed to familiarize students with first steps in robotics, such as reading sensors, enabling actuators and closing low-level control loops. Several tests have been developed as a student guide for practical sessions, including the experimental results and conclusions for the students.

Keywords: Teaching Robotics, Android interface, Arduino microcontroller

1 INTRODUCTION

For the development of the proposed robotic system, several technologies have been combined; among them are Arduino and Android. Android [1] is an operating system based on Linux, designed to run on mobile devices with a touch screen. The Android structure consists of applications running on a Java framework for object-oriented applications, on top of core C/C++ libraries, in a virtual machine with run-time compilation. Android is a stable system that is widely used in current research in all fields that require technology on mobile devices. For example, in [2] an application for real-time audio streaming is created. The transmission of high-quality audio is an important branch that is constantly evolving; however, most applications are built on PC platforms, and that work successfully presents a solution for adaptive real-time transmission of high-quality audio developed on Android. It is also the system used in some medical applications. In [3], a machine-to-machine (M2M) prototype is shown that combines mobile and IPv6 technologies in a wireless sensor network to monitor the health status of patients. A server module displays the recorded biomedical signals on Android mobile devices in real time. Similarly, in [4] a platform for a real-time brain-computer interface (BCI) is designed, also based on Android. In this case, the user can control a portable device through electroencephalographic rhythms; a web service is used as the wireless technology to send these brain signals to a PC and vice versa. Regarding Arduino [5], it is an open hardware platform, based on a microcontroller board and a development environment designed to facilitate the use of electronics in multidisciplinary projects. It is low cost, versatile, user friendly and supported by a large community of users, which makes it a widely known and used platform.
Arduino can be used to develop stand-alone interactive objects or can be connected to computer software. The electronic board can be assembled by hand or purchased. Examples of this versatility can be seen in works such as [6], which researches the control of a snake robot, combining centralized and distributed control with a parallel control method for multichannel joints. Another example of the variety of forms and applications of robots controlled with Arduino is shown in [7], which presents the design, construction and control of a robot that keeps its balance on two wheels. That system consists of a pair of DC motors, an Arduino microcontroller, a single-axis gyroscope and a two-axis accelerometer; the experimental results show that equilibrium is achieved with PI-PD controllers. Besides controlling small robots, Arduino is a useful low-cost teaching device. In [8], Arduino is used as a data acquisition card in practical sessions for engineering students. In this work both technologies are joined. Arduino-Android combinations are widespread because both technologies have open source code that allows them to be used in many applications. In [9], a smart plug design for a real-time energy monitoring system is proposed, using an Arduino, an Ethernet module and a current transformer sensor, with the user interface developed on Android.

2 ARCHITECTURE DESCRIPTION

The general idea of this work is to develop a bi-directional communication architecture between a land rover robot and an Android device, in order to remotely control the robot from the Android device, and in turn to create a robot prototype that meets these specifications, using this architecture, which can serve as a starting point for future, more complex projects. In addition, it is intended that this platform can be built without expensive devices or materials, so that it is accessible to as many people as possible. That is why it combines two open source platforms that are widely used today, Android and Arduino; this will also help the project to be easily extended by other developers. Apart from the remote control of the robot with an Android device, it is intended that the user receives the data the robot perceives from the environment via its sensors, monitored directly on the display of the mobile device. A bidirectional communication, Fig. 1, has been developed:

• Android to Arduino: The user sends commands to control the robot.



• Arduino to Android: The robot collects information from its environment and sends it back to the user, who directly monitors the robot and the environment. This closes the communication loop.


Fig. 1: General system description
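The exchange above can be sketched as a simple text protocol. The paper's reference list mentions UDP sockets, so the sketch below assumes ASCII datagrams carrying a command one way and a sensor summary the other; it is written in Python for illustration (the real implementation would be Java on the Android side and Arduino code on the robot), and all field names and the message layout are assumptions, not taken from the paper.

```python
# Hypothetical text protocol: "CMD:<action>" out, "TLM:key=value;..." back.

def encode_command(action):
    """Build a command datagram for the robot (hypothetical format)."""
    allowed = {"FORWARD", "BACKWARD", "LEFT", "RIGHT", "STOP"}
    if action not in allowed:
        raise ValueError("unknown command: %s" % action)
    return ("CMD:" + action).encode("ascii")

def decode_telemetry(datagram):
    """Parse a sensor datagram like b'TLM:dist=42;ldr=510,320,700;incx=12;incy=-3'."""
    text = datagram.decode("ascii")
    if not text.startswith("TLM:"):
        raise ValueError("not a telemetry datagram")
    fields = {}
    for pair in text[4:].split(";"):
        key, value = pair.split("=")
        # comma-separated values become lists (e.g. the three LDR readings)
        if "," in value:
            fields[key] = [int(v) for v in value.split(",")]
        else:
            fields[key] = int(value)
    return fields
```

In a real app these byte strings would simply be the payloads of UDP send/receive calls on both ends.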

3 ROBOT DESCRIPTION

3.1 General overview

The robot has been built using low-cost commercial components. It is based on a toy car with wheels big enough to allow movement in environments with ridges and slopes; on top of it, the rest of the components that complete the design are mounted: motors, sensors and controllers. Fig. 2 shows a side view of the robot. This image shows the protective housing for the electronics placed on the robot's rear end, as well as the robot chassis and the wheels.

Fig. 2: Side view of the robot

In Fig. 3, the robot's front view is presented. The ultrasound proximity sensor mounted on the actuator can be observed. On the left-hand side lies the battery pack, which contains four 1.5 V AA batteries that power the Arduino microcontroller.

Fig. 3: Front view of the robot

3.2 Sensors and actuators

As mentioned above, low-cost hardware has been chosen for building the robot, in order to encourage students to build their own robots.

• As a wheeled platform, a toy with big wheels has been dismantled to obtain the chassis and the wheels.



• For the robot control, an Arduino Leonardo microcontroller has been chosen. For Wi-Fi communication with the Android device, the Arduino Wi-Fi Shield has been used.



• As actuators, DC motors have been chosen to drive the wheels, and a servo motor is used to position the ultrasound range sensor. The DC motors are connected to the Arduino microcontroller through an L293B chip, which acts as the power interface.



• As sensors, a SEN136B5B sonar sensor has been chosen for obstacle detection. For light detection, standard photoresistors have been used.



• As an inclinometer, a 9DOF Razor IMU sensor has been connected to the microcontroller through the serial port. Communication with the microcontroller has been configured and programmed.



• The robot power supply system is based on AA batteries. Finally, the robot is completed with a box where the electronics are housed.
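The L293B mentioned above drives each DC motor through two logic inputs per channel. A minimal sketch of that drive logic follows, in Python for illustration only: on the real robot it would live in the Arduino sketch as digitalWrite() calls. The pin truth table is the standard one for an H-bridge; the assumption that turning is done tank-style (one wheel forward, the other in reverse) is ours, not stated in the paper.

```python
# Sketch of H-bridge (L293B) drive logic. Names and the tank-style turning
# convention are assumptions for illustration.

def l293b_inputs(direction):
    """Return (IN1, IN2) logic levels for one L293B motor channel.

    IN1=1, IN2=0 -> forward; IN1=0, IN2=1 -> reverse; both low -> stop.
    """
    table = {"forward": (1, 0), "reverse": (0, 1), "stop": (0, 0)}
    return table[direction]

def drive(command):
    """Map a motion command to (left_motor, right_motor) L293B inputs."""
    mapping = {
        "FORWARD":  ("forward", "forward"),
        "BACKWARD": ("reverse", "reverse"),
        "LEFT":     ("reverse", "forward"),   # spin left: wheels opposed
        "RIGHT":    ("forward", "reverse"),
        "STOP":     ("stop", "stop"),
    }
    left, right = mapping[command]
    return l293b_inputs(left), l293b_inputs(right)
```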

4 ANDROID INTERFACE

The application has a single view or layout, which has the appearance shown in Fig. 4.

Fig. 4: Overview of the developed Android app

4.1 Views used

The layout that forms the user interface of the application is defined in an XML file and consists of several elements, presented below:

• SurfaceView: Provides a surface on which the programmer can draw. The programmer can change the format and size of the surface and then, with the help of a Canvas, draw freely on it. In this case, it is used to draw a visual representation of the data from the sensors the robot incorporates.



• TextView: Displays text to the user; its content can be changed by the programmer, but not by the user.



• SeekBar: Consists of a horizontal bar whose position can be modified by the user, by dragging a finger sideways within certain limits. It is used to remotely control the rotation angle of the servo motor on which the ultrasound sensor is mounted.



• ImageView: Images displayed as icons to the user. The advantage of this type of View is that images can be used as buttons. They are used as the directional buttons that drive the robot.



• Button: Represents a button that can be pressed. In this application there are three buttons. One button, labeled "Stop", is used to stop the Arduino robot from the ImageView control interface. Another button, whose text toggles between "B" and "A", switches the robot control mode between the ImageView controls and the Android device's accelerometer. Finally, the third button, "Statistics", is used at the end of the robot's exploration to obtain the statistics of the uneven terrain.
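The SeekBar described above steers the servo that carries the sonar. The mapping from bar position to servo angle is presumably a simple linear one; the sketch below shows it in Python for illustration (on the device this would be a few lines of Java in the SeekBar listener). The 0-100 progress range and the 0-180 degree servo range are assumptions.

```python
# Hypothetical linear mapping from SeekBar progress to servo angle.

def seekbar_to_servo_angle(progress, progress_max=100, angle_max=180):
    """Linearly map a SeekBar position to a servo angle in degrees."""
    if not 0 <= progress <= progress_max:
        raise ValueError("progress out of range")
    return round(progress * angle_max / progress_max)
```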

4.1.1 SurfaceView

To draw graphics, animations or any kind of illustration with different shapes, the use of a Canvas is widespread, both in Android and in other technologies such as HTML5. For this reason, this tool has been used to display the data from the 9DOF Razor IMU sensor (accelerometer), the proximity sensor and the LDRs (Light Dependent Resistors) of the Arduino robot in a visual, intuitive and easy-to-understand way. First, a SurfaceView must be defined: it is the surface on which any drawing that needs to be displayed is painted. Once its size and format are defined, that surface is handled in the paint functions with the help of the Canvas class.

This SurfaceView appearance is shown in Fig. 5.

Fig. 5: SurfaceView of the application working with the robot

The SurfaceView contains six painted elements and two TextViews, which are located on the SurfaceView but are not drawn directly into it. The six elements are:

• Two lines, one red and one green, used to represent the inclination of the terrain that the robot captures.



• A white text, showing the distance in centimeters to the nearest obstacle detected by the proximity sensor.



• Three circles, drawn lighter or darker depending on the amount of light captured by each LDR.

The two lines represent the terrain inclinations experienced by the robot along the X and Y axes, as shown in Fig. 6:

• The red line shows the robot's inclination in the X axis.



• The green line represents the robot's inclination in the Y axis.

Fig. 6: Robot coordinate system

When the robot is facing uphill, the red line simulates a proportional slope rising from the right side, and vice versa when facing downhill. Similarly, when the robot, seen from its rear (the wheels), rises more on the left side than on the right, the green line rises proportionally to the right, and vice versa.
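Drawing one of these inclination lines on the Canvas amounts to rotating a fixed-length segment around its center by the measured tilt angle. The sketch below shows one way to compute the endpoints, in Python for illustration (on the device this would feed Canvas.drawLine()). The line length, center coordinates and the sign convention (positive tilt raises the right end) are assumptions.

```python
# Endpoints of an inclination line centered at (cx, cy), rotated by the
# tilt angle. Screen Y grows downwards, hence the opposite signs on dy.
import math

def inclination_line(angle_deg, cx, cy, half_len=100):
    """Return ((x1, y1), (x2, y2)): left and right ends of the tilted line."""
    a = math.radians(angle_deg)
    dx = half_len * math.cos(a)
    dy = half_len * math.sin(a)
    return (cx - dx, cy + dy), (cx + dx, cy - dy)
```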

4.1.2 Brightness representation

The brightness captured by each of the three photoresistors placed at the robot's front end is represented by three circles painted on the SurfaceView, with the brightness encoded in the color of the circles, as shown in Fig. 7. If a large amount of direct light is received, a completely white circle is drawn; a completely black circle is drawn when no light is captured. Each circle may represent a different brightness level, so the color may differ between them; the idea is to help the user steer the robot in the direction of the maximum captured light.

Fig. 7: Brightness representation detected by the robot
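The black-to-white scale described above suggests a direct mapping from each LDR reading to a grey level. The sketch below illustrates it in Python; the 0-1023 input range matches the Arduino's 10-bit analogRead(), while the linear mapping to a 0-255 grey level is an assumption of the sketch, not taken from the paper.

```python
# Map an analogRead() value (0-1023) to a grey level: 0 = black (no light),
# 255 = white (full direct light). The linear scale is an assumption.

def ldr_to_grey(adc_value, adc_max=1023):
    """Clamp the reading and scale it to an 8-bit grey value."""
    adc_value = max(0, min(adc_value, adc_max))
    return round(adc_value * 255 / adc_max)
```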

4.2 Control modes of the robot using the mobile device

One of the most important aspects for the end user is the kind of control that can be performed over the robot. In this first version, control is carried out using touch buttons and the accelerometer of the Android device, although future work will consider voice control and image-based control.

4.2.1 Control using buttons

This is a simple control using buttons: the forward, backward, turn and stop buttons are enabled through the touch screen of the Android terminal. The advantage of this control mode is that it is quite easy to implement and intuitive for the user when the application is run.

4.2.2 Control using the device's accelerometer

Today, most Android devices have sensors that make certain tasks easier or enrich the user experience. Among them, the accelerometer is very common. The accelerometer helps interpret the inclinations applied to the device, providing three values depending on the inclination level in the X, Y and Z axes of the terminal. Based on this, only the inclinations in the X and Y axes have been taken into account to control the robot. As a starting point, when the Android device is held completely parallel to the ground with the screen facing up, the values returned for the X and Y axes are both zero. As it tilts in the direction of either of these two axes, the values range from 0 to +10 for positive inclinations of the axis, and from 0 to -10 for negative inclinations. Therefore, by reading these values, the position of the phone or tablet can be calculated and an action command can be sent to the robot according to this position.
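The tilt-to-command translation described above can be sketched as follows, in Python for illustration (on the device this would run in the Java SensorEventListener). The -10..+10 per-axis range comes from the text; the dead-zone threshold and the axis-to-command convention are assumptions.

```python
# Pick a command from the X/Y tilt of the device. Inside the dead zone
# (device nearly flat) the robot stops; otherwise the larger tilt wins.
# Threshold and axis conventions are assumptions for the sketch.

def tilt_to_command(ax, ay, dead_zone=2.0):
    """Map accelerometer tilt values (about -10..+10) to a robot command."""
    if abs(ax) < dead_zone and abs(ay) < dead_zone:
        return "STOP"
    if abs(ay) >= abs(ax):
        return "FORWARD" if ay > 0 else "BACKWARD"
    return "RIGHT" if ax > 0 else "LEFT"
```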

5 PROPOSED EXERCISES AND EXPERIMENTAL RESULTS

5.1 Exercise 1: Light follower robot

In this first experiment, only the photoresistor sensors are used to control the robot. The final goal of this experiment is that the robot is able to correctly follow the beam of a flashlight illuminating the LDRs, as shown in Fig. 8. For this test, the student must develop the Arduino code that allows the robot to follow a light beam.

Fig. 8: Exercise 1: Robot following the light

For the student guide, the following indications can be provided:

• When the three photoresistors detect a small luminance value when the 'analogRead' function is used, the robot remains motionless. This covers the possibility that a small light source is detected by the LDRs but is not strong enough to be considered valid.



• If any of the three LDRs detects a level higher than a threshold (0.14 V), the robot will move in the direction of the highest value. That is:




• If the highest LDR level is detected at the front left, the robot moves forward while turning left.



• If the highest LDR level is detected at the front right, the robot moves forward while turning right.



• If the highest LDR level is detected at the back, the robot moves backwards.



• If both front LDRs detect an equal level, higher than that of the rear LDR, the robot moves forward without turning.
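The decision rules above can be sketched as a single function. The sketch is in Python for illustration; the student's solution would be Arduino code reading the LDRs with analogRead(). The 0.14 V threshold from the guide corresponds to roughly 29 counts on a 5 V, 10-bit ADC (0.14/5 × 1023); that conversion, and the command names, are our assumptions.

```python
# Light-follower decision logic from the three raw LDR readings
# (analogRead() counts, 0-1023). Names are hypothetical.

THRESHOLD = 29  # ~0.14 V on a 5 V, 10-bit Arduino ADC

def light_follower(front_left, front_right, rear):
    """Choose a motion from the three LDR readings."""
    readings = {"front_left": front_left, "front_right": front_right, "rear": rear}
    if max(readings.values()) < THRESHOLD:
        return "STOP"                      # light too weak to be considered valid
    if front_left == front_right and front_left > rear:
        return "FORWARD"                   # equal front levels: go straight
    best = max(readings, key=readings.get)
    if best == "front_left":
        return "FORWARD_LEFT"
    if best == "front_right":
        return "FORWARD_RIGHT"
    return "BACKWARD"                      # strongest light behind the robot
```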

5.2 Exercise 2: Robot obstacle avoidance

This exercise is used to test the operation of the ultrasound proximity sensor. The final goal of this exercise (Fig. 9) is that the robot is able to move freely in an environment where obstacles are arranged in its way; it must avoid them and leave them behind, reaching the end of the path without colliding or getting blocked. The students are asked to program an exercise in which the robot moves forward while measuring the distance detected in front of it. If no obstacle is detected within 30 centimeters, the robot continues moving in the same direction. If an object is detected within that distance, the robot stops and immediately backs up for a second and a half, to separate from the object and thus have enough space to turn and avoid it. Once it has backed away a few centimeters, it stops again and starts a scan of the terrain. For the student guide, the following indications can be provided:

• Perform a scan of the sensors.



• Check which of the three directions has an object at the greatest distance; the robot will move in that direction. The three possible directions to be considered are described below:





• Straight ahead, if the greatest distance to an object is detected directly in front of the robot.



• Forward and to the right, if the maximum distance detected corresponds to the sensor positioned 45 degrees to the right.



• Forward and to the left, if it corresponds to the sensor positioned 45 degrees to the left.

• If all three directions are found to have obstacles within about 30 centimeters, the robot backs away a few centimeters in order to lengthen the distance to the object that made it stop.

Fig. 9: Exercise 2: Obstacle Avoidance
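The scan decision described in the indications above can be sketched as follows, in Python for illustration (the student's version would be Arduino code reading the sonar at three servo positions). The 30 cm limit comes from the exercise text; the function and command names are hypothetical.

```python
# Pick a heading after a three-position sonar scan, or back up if every
# direction is blocked. Distances are in centimeters.

SAFE_DISTANCE_CM = 30

def choose_direction(left_cm, straight_cm, right_cm):
    """Return the direction with the most free space, or BACKWARD."""
    distances = {"LEFT": left_cm, "STRAIGHT": straight_cm, "RIGHT": right_cm}
    best = max(distances, key=distances.get)
    if distances[best] <= SAFE_DISTANCE_CM:
        return "BACKWARD"   # all three directions blocked: back away first
    return best
```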

5.3 Robot telecontrol and data reception

Finally, a third test was carried out in which the goal is that students, with code previously given to them, evaluate several environments by processing the readings of the inclinometer mounted on the robot. The goal of this test is that users can control the robot's motion through their Android device, via the Arduino microcontroller, and at the same time receive real-time data captured by the sensors incorporated in the robot. In addition, to give more functionality to the data obtained and received by the user, a button is incorporated in the Android app: when it is pressed, the robot returns to the user the statistics of the uneven terrain it has been exploring. The user has two ways to control the robot through the Android terminal: by touch buttons or by the mobile device's accelerometer. At the same time, the robot sends the user the information its sensors have captured. This information is displayed on the user's device in real time by means of two straight lines that incline in the same range as the robot does. During the experiment, all the inclinations detected in each step of the environment exploration are stored in an array, so they can be accessed at the end. At the end of the experiment, the user has a button that shows a series of statistics of the terrain explored by the robot:

• Average elevation in the X axis.
• Average elevation in the Y axis.
• Maximum elevation in the X axis.
• Maximum elevation in the Y axis.
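These statistics can be sketched as a small reduction over the stored inclination array, shown here in Python for illustration (the app side is Java). Since the example results below include negative "maximum" slopes, the sketch assumes that the maximum is taken by magnitude while keeping its sign; the sample format is also an assumption.

```python
# Compute terrain statistics from stored (x, y) slope samples in percent.
# "Maximum" is taken as the largest magnitude, keeping its sign, which is
# an assumption based on the negative maxima in the reported results.

def terrain_statistics(samples):
    """Return average and maximum (by magnitude) slope for the X and Y axes."""
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    return {
        "avg_x": sum(xs) / len(xs),
        "avg_y": sum(ys) / len(ys),
        "max_x": max(xs, key=abs),
        "max_y": max(ys, key=abs),
    }
```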

As an example, the results obtained in three different scene areas are shown. Fig. 10 shows the example environment.

Fig. 10: Scenario for test 3 of the Arduino robot

As an example of exercises and results, the following are proposed:

1. Data collection in the paved area: The robot had to move in a straight line through the area that simulates ground covered with stones, which provides sloped terrain. The data obtained in this test were the following:

• Average slope in the X axis: 23 %
• Average slope in the Y axis: 11 %
• Maximum slope captured in the X axis: 58 %
• Maximum slope captured in the Y axis: 37 %

2. Data collection in an area with a pronounced slope and a paved area. The data obtained in this test were the following:

• Average slope in the X axis: 4 %
• Average slope in the Y axis: -43 %
• Maximum slope captured in the X axis: 33 %
• Maximum slope captured in the Y axis: -73 %

3. Data collection in a ramp area with a smooth slope. An uneven zone needs to be overcome, finishing in a descending ramp with a steeper slope than the first one. The data collected are the following:




• Average slope in the X axis: 6 %
• Average slope in the Y axis: 8 %
• Maximum slope captured in the X axis: -63 %
• Maximum slope captured in the Y axis: -53 %

6 CONCLUSIONS

This work presents a low-cost robot that can easily be implemented by students of robotics courses, leading them to program basic algorithms for mobile robots and showing the operation of basic sensory systems used in mobile robotics. An extension of the robot is planned, and new exercises will be proposed in order to introduce students to the concepts of mapping and the detection of traversable areas based on the inclination map generated by the robot.

REFERENCES

[1] Arduino [online]. Accessed: September 2013. Available at: http://arduino.cc/

[2] Enríquez Herrador, Rafael. "Guía de Usuario de Arduino" [online]. 13 November 2009. Accessed: September 2013. Available at: http://www.uco.es/aulasoftwarelibre/wp-content/uploads/2010/05/Arduino_user_manual_es.pdf

[3] Wikipedia contributors, "Puerto serie" [online]. Wikipedia, The Free Encyclopedia. 26 April 2013. Accessed: September 2013. Available at: http://es.wikipedia.org/wiki/Puerto_serie

[4] Campo, Celeste; García Rubio, Carlos. "Android: Introducción". Aplicaciones Móviles. Universidad Carlos III de Madrid. February 2013.

[5] "Getting Started with Android Studio" [online]. Accessed: September 2013. Available at: http://developer.android.com/sdk/installing/studio.html

[6] Wikipedia contributors, "Motor de corriente continua" [online]. Wikipedia, The Free Encyclopedia. 13 September 2013. Accessed: September 2013. Available at: http://es.wikipedia.org/wiki/Motor_de_corriente_continua

[7] Wikipedia contributors, "Servomotor" [online]. Wikipedia, The Free Encyclopedia. 25 June 2013. Accessed: September 2013. Available at: http://es.wikipedia.org/wiki/Servomotor

[8] "Control de velocidad y giro de motores" [online]. 6 December 2011. Accessed: September 2013. Available at: http://pepechorva.com/wordPress/control-de-velocidad-y-giro-de-motores/

[9] Salmerón Ruiz, Tamara. "El acelerómetro de un dispositivo Android" [online]. 19 April 2012. Accessed: September 2013. Available at: http://lsedkigo.blogspot.com.es/2012/04/el-acelerometro-de-un-dispositivo.html

[10] Seeed Studio Wiki, "Ultra Sonic range measurement module" [online]. 4 May 2013. Accessed: September 2013. Available at: http://www.seeedstudio.com/wiki/index.php?title=Ultra_Sonic_range_measurement_module

[11] Wikipedia contributors, "Fotorresistencia" [online]. Wikipedia, The Free Encyclopedia. 18 August 2013. Accessed: September 2013. Available at: http://es.wikipedia.org/wiki/Fotorresistencia

[12] Gonzales Ruiz, Vicente. "Sockets UDP y TCP" [online]. Universidad de Almería. Accessed: September 2013. Available at: http://www.hpca.ual.es/~vruiz/docencia/imagen_y_sonido/practicas/html/texputse57.html

[13] Wikipedia contributors, "User Datagram Protocol" [online]. Wikipedia, The Free Encyclopedia. 26 August 2013. Accessed: September 2013. Available at: http://es.wikipedia.org/wiki/User_Datagram_Protocol

[14] Google, "android.widget" [online]. Accessed: September 2013. Available at: http://developer.android.com/reference/android/widget/package-summary.html

[15] "Another exercise of SurfaceView, in a FrameLayout inside another LinearLayout" [online]. 30 May 2010. Accessed: September 2013. Available at: http://androider.blogspot.com.es/2010/05/another-exercise-of-surfaceview-in.html
