On a Matlab/Java Testbed for Mobile Robots

Sven Rönnbäck EISLAB Dept. of Computer Science and Electrical Engineering Luleå University of Technology Luleå, Sweden [email protected]

Supervisors: Dr. Kalevi Hyyppä and Prof. Åke Wernersson


To my relatives...


Abstract

This thesis presents the MICA (Mobile Internet Connected Assistant) testbed and how it can be used to interface a mobile robotic system; in this case a modern wheelchair equipped with a CAN bus. In MICA we do research in embedded Internet systems, sensor technologies, navigation algorithms and in knowledge about the user. To incorporate all of this we have developed the MICA testbed. The MICA wheelchair carries sensors such as laser scanners, cameras, rate gyros, odometric encoders and flash cameras.

The MICA testbed software can be divided into three main parts: the embedded server software, the Java clients and the MATLAB algorithms. The client software gives the user access to the sensor databases. The clients are written in Java, which makes the code portable and usable from MATLAB. MATLAB is a good environment in which to do research on new algorithms; as the algorithms mature, they are step-by-step ported to Java. The MICA wheelchair has an embedded PC running Linux that continuously collects sensor measurements and stores them in databases. All sensor databases can easily be accessed through the MICA software. Since all sensor measurements are time stamped and stored in the system, it is possible to access and manipulate the wheelchair and its sensors over the Internet. The MICA client/server solution with WaveLAN (IEEE 802.11) makes it possible to write complex distributed programs that run on a network. Partly analyzed and processed data, for instance from a MATLAB script, can be sent to other clients on the network for further use. Many algorithms are written during research, and if they are written in programming languages like C the software becomes encapsulated and hard to maintain for anyone but the programmer. In MICA, new algorithms are mainly coded in MATLAB, which makes it easy for other people to reuse them and to contribute to the development of new algorithms.

Tests have shown that the MICA testbed is easily understood and used by undergraduate students in education. The MICA testbed is also a good environment for research. As the sensors are directly available to arbitrary clients over the Internet, the system is ready for multi-vehicle implementations. This makes it possible to evaluate algorithms that use several platforms.

The thesis also presents a navigation filter used to estimate ten states of an airborne UAV: position, velocity and attitude. The Kalman filter was written in extended information form and used a quaternion to represent attitude. The observations fed to the filter were position, velocity and attitude, and the estimated errors were fed back to the INS. A setup with four GPS receivers was used to calculate the attitude of the airframe. The coming research in MICA will focus on navigation algorithms using cameras and laser scanners.


Contents

Chapter 1 - Thesis Introduction  1
  1.1 Background  1
  1.2 MICA - Mobile Internet Connected Assistant  1
  1.3 MICA Testbed Equipment  2
  1.4 Telecommands  9
  1.5 UAV - Unmanned Aerial Vehicles  9

Chapter 2 - The MICA Computer Hardware/Software and the Client/Server Approach  11
  2.1 The MICA Software  11
  2.2 The Embedded PC, a PC104  13
  2.3 Example of Hardware Setup on some Robotic Vehicles  13

Chapter 3 - Frames, Attitude Representations and Inertial Units  15
  3.1 On Frames and Coordinates  15
  3.2 The IMU - Inertial Measurement Unit  17
  3.3 Attitude Representation Using Euler Angles  18

Chapter 4 - Data Fusion and the Kalman Filter Estimator  23
  4.1 Data Fusion  23
  4.2 Kalman Filtering  23
  4.3 Extended Kalman Filter  26
  4.4 Decentralized Data Fusion - The Information Filter  27
  4.5 Calibration of the Fiber Optic Rate Gyro  28
  4.6 Mapping  30

Chapter 5 - Paper Summary  41
  5.1 Summary of Contributions  41
  5.2 Future Work  42

Paper A  51
  1 Introduction  53
  2 Presentation of the MICA Wheelchair Platform  54
  3 Software Description  56
  4 JAVA in the MATLAB Environment  59
  5 Soft Realtime Operations of the Wheelchair from MATLAB  60
  6 Results  63
  7 Conclusions  63
  8 Discussion and Future Work  63
  9 Acknowledgments  64

Paper B  67
  1 Introduction  69
  2 The CAN server  72
  3 Results  79
  4 Conclusions and Future Work  81
  5 Acknowledgments  82

Paper C  87
  1 Introduction  89
  2 The Navigation Frame and Body Frame  91
  3 Global Positioning System (GPS)-Receivers and Attitude Determination  91
  4 Vehicle Attitude and Euler Angles  91
  5 The IMU - Inertial Measuring Unit  95
  6 The INS - Inertial Navigation System  96
  7 The Nonlinear System Model  96
  8 The UAV Motion Model  97
  9 The Kalman Filter in Information Form  102
  10 Results  104
  11 Conclusion  105
  12 Acknowledgments  106

Preface

This thesis summarizes my research and contributions within the area of robotic platforms and embedded systems. The research was mainly done at EISLAB, Luleå University of Technology, under the supervision of Dr. Kalevi Hyyppä and Prof. Åke Wernersson. A part of the work was done at the Australian Centre for Field Robotics at the University of Sydney. Funding was provided by the European Union's support for regional development and by Luleå University of Technology faculty funds. I am grateful to those who have supported and guided me during my life. I have learned that it is easier to learn what is known than to find out things from the unknown; but still, the greatest knowledge lies in what is not yet known.

Sven Rönnbäck, March 2004


Abbreviations

AD/DA    Analog to Digital / Digital to Analog
CAN      Controller Area Network
CCD      Charge Coupled Device
DOF      Degree of Freedom
EKF      Extended Kalman Filter
EIF      Extended Information Filter
EISLAB   Embedded Internet System Laboratory at Luleå University of Technology
FIFO     First In First Out
GPS      Global Positioning System
GUI      Graphical User Interface
HTML     Hyper Text Markup Language
HTTP     Hyper Text Transfer Protocol
IF       Information Filter
IMU      Inertial Measuring Unit
INS      Inertial Navigation System
IO       Input/Output
IP       Internet Protocol
JVM      Java Virtual Machine
KF       Kalman Filter
LAN      Local Area Network
LCD      Liquid Crystal Display
LUSAR    A three-wheeled autonomous vehicle
MICA     Mobile Internet Connected Assistant
PC       Personal Computer
PC104    A stack-based PC format
POSIX    Portable Operating System Interface for UNIX
RS232    Recommended Standard 232C
RTAI     Real Time Application Interface
SICK     A range scanning laser from the SICK company
TCP      Transmission Control Protocol
TTCAN    Time Triggered Controller Area Network
UAV      Unmanned Aerial Vehicle
USB      Universal Serial Bus
UTC      Coordinated Universal Time. The time since the Epoch, 00:00:00 1 Jan 1970
WGS-84   World Geodetic System 1984
WLAN     Wireless Local Area Network


Nomenclature

A^T             Matrix transpose
A^{-1}          Matrix inverse
{a}             Indicates frame a
p^a             Coordinates expressed in frame {a}
C_a^b           A rotation matrix that rotates a vector from frame {a} into frame {b}
(.)_w           A variable that is bound to a specific physical object, in this case w = wheelchair
A(i,j)          Addresses an element of a matrix, where i is the row and j is the column
x(t)            A continuous variable over time t
x_k             A discrete variable with index k at time t_k
x_{k|k-1}       Conditional value at time t_k given the value at time t_{k-1}
G               Process noise matrix
E[.]            Expected value
F               State transition matrix
B               Control input matrix
H               Linear observation matrix
N(μ, σ)         Normal distribution; μ = mean value, σ = standard deviation
P               State covariance matrix
Q               Process noise covariance
q               A quaternion, q = [q_0 q_1 q_2 q_3]^T
R               Observation noise covariance
r               Distance
S               Innovation covariance
t_k             The time at discrete step k
T               Discrete time step
u               Control input
v               Observation noise vector
w               Process noise vector
x               State vector
x^a             x-axis of frame {a}
Y               Information matrix
y               Information vector
z               Observation vector
θ               Orientation of the wheelchair; cθ = cos θ and sθ = sin θ
ω_x, ω_y, ω_z   Angular velocities around the x, y and z axes
φ               Roll angle
ψ               Heading or yaw angle
θ               Pitch angle
θ_l             Pitch angle of the laser


Part I


Chapter 1

Thesis Introduction

1.1 Background

EISLAB (Embedded Internet System Laboratory) covers research in the area of embedded Internet systems. This incorporates electronics, robotics and computer engineering. The EISLAB technique makes it possible to remotely control and maintain a system over a physical distance. If all devices in a home were connected to the Internet, such as the refrigerator, TV, stereo, central heating, electric consumption units, burglar alarms and vacuum cleaners, the house would become a remote unit that could be manipulated over a distance. The user could, using a mobile phone or any Internet computer, check the contents of the refrigerator or, by a telecommand, activate the car motor heater.

The county of Norrbotten has five areas of strategic value:

- The forest industry
- The mining industry
- The vehicle testing areas
- Education

All of them require and use technologies found in robotics. This licentiate thesis will mainly focus on the Mobile Internet Connected Assistant (MICA) wheelchair project and how to make the wheelchair accessible from a user point of view, especially for autonomous operations and semi-autonomous navigation. The thesis also covers the area of Autonomous Unmanned Vehicles (AUV) [46][3], which can be used in almost any environment, from space down to the deepest ocean.

1.2 MICA - Mobile Internet Connected Assistant

The Mobile Internet Connected Assistant project includes a high-tech wheelchair designed by Boden Rehab. The project is a collaboration between Lapland University in the city of Rovaniemi in Finland and Luleå University of Technology. In Rovaniemi the research focuses on the user and gerontology, the study of the process by which we age and the problems that old people have. Through the MICA research, the MICA wheelchair will step by step become more and more accessible to a designated user. In Luleå the focus is on navigation and algorithms to make the system meet the needs of a user. Several different navigation units are mounted on the wheelchair; they can all be fused together and tested in the MICA testbed environment. In particular, the commercial LazerWay [57][50] navigation equipment can be used as a reference for the other navigation units and algorithms. As the algorithms function as required on the research testbed systems, they are bit by bit transferred and implemented on the target system computer onboard the wheelchair.

The MICA testbed has several sensors that can be accessed by an arbitrary user with the client software. The system is also used in education as a platform from which students can get instant measurements, for instance from the range scanner and the onboard navigation system.

An assisting wheelchair gives the user an opportunity to travel more efficiently [14, 55, 61, 58] in complex environments that might include obstacles and difficult passages. On such a vehicle the user should be able to switch the operating mode from semi-autonomous mode to vehicle stabilization control mode, or to complete manual control [82, 64]. A wheelchair can be operated in different ways, as in [81], which describes how to operate a wheelchair through head-mounted electrodes. The measured signal was used to move a cursor on a computer LCD screen. On the screen the user can point the cursor in any of the directions forward, backward, left and right. As the user selects a direction, the wheelchair is programmed to respond by starting to move. This method of manually controlling a sophisticated wheelchair can be very useful for a person with limited motoric capabilities. The technique would increase the user's mobility.

One of the ideas behind MICA is to create an intelligent wheelchair that is able to use both natural landmarks and reflective beacons for navigation. The beacons' physical positions are stored in the system, either by some SLAM (Simultaneous Localization and Map Building) method [21, 77][75] or entered into the system by a user. Reflective beacons are strong navigation points since their signal-to-noise ratio is very high. They can easily be detected with the right sensor, and the reflections from beacons provide enough information to make a vehicle move in a complex environment. To detect obstacles there are many different sensor techniques, such as ultrasound, laser scanners, contact switches or small distance measuring units based on triangulation [58]. The pose of the wheelchair can also be measured by matching range scans of the environment to a known map, using either the Hough transform or some sort of histogram technique [76][56].

1.3 MICA Testbed Equipment

A number of navigation units are mounted on the MICA wheelchair. Some of them are visible in Figure 1.2. They have unique characteristics and can be combined in arbitrary ways to meet the requirements of the environment.

Figure 1.1: The MICA wheelchair. The LazerWay unit is on the top of the wheelchair and the SICK range laser is mounted above the seat. The joystick is visible on the right armrest. The two front wheels are used for differential drive (they allow the wheelchair to turn and move forward or backward).

MICA equipment list:

- Range scanning laser
- LazerWay Navigation System
- GPS receivers
- Odometric encoders
- Rate gyros
- Navigation flash cameras
- Video cameras

Cameras are often needed in remote navigation; a distant user can then get a view of the vehicle's surroundings through the camera image. With the range scanner the user can easily detect obstacles in front of the vehicle. Indoors the GPS gives no position solution, but outdoors it provides valuable information about the absolute position as well as the velocity of the vehicle. The LazerWay Navigation System is programmed to operate with reflective beacons, and therefore its navigation is restricted to areas where beacons are found.


Figure 1.2: A) LazerWay unit. B) Flash camera unit. C) Web camera. D) SICK laser range scanner.

1.3.1 The LazerWay Navigation System

The LazerWay navigation system is based on a rotating laser beam that measures the angles to known beacons; the unit can be seen in Figures 1.1 and 1.2, mounted on the top of the wheelchair. The LazerWay system is a commercial product and was invented by Dr. Kalevi Hyyppä [50]. The system is now owned by the company Danaher Motion [4]. The system is widely used, for example in:

- Steelworks
- Post terminals
- Truck manufacturing
- Mines
- Distribution terminals like harbors

1.3.2 The Range Scanning Laser

A time-of-flight laser range scanner of type LMS200, from SICK [72], is mounted on the wheelchair, Figure 1.1. It emits a laser beam that is used to measure the ranges to the surrounding objects. The time between transmission and reflection is directly proportional to the distance. With the correct algorithms a range scanning laser can be used as a sensor to detect obstacles and extract features from the surrounding environment. In the standard setup the SICK laser covers a field of 180 degrees with a selectable angular resolution, Table 1.1.

Manufacturer              SICK
Type/Model                LMS200 [72]
Range max                 80 m
Angular resolution        0.25/0.5/1 degrees
Resolution                0.01 m
Standard deviation (σ)    0.005 m
Rate                      5 scans/s in polling mode

Table 1.1: Data for the SICK range laser scanner.

As with the other sensors, clients have full access to the SICK laser device over the Internet. For more specific demonstrations, a Windows executable client can be downloaded from the MICA wheelchair Web page¹. The client polls scans and shows range images in near realtime on any Internet computer that runs the Windows operating system.

1.3.2.1 SICK Range Laser Scan - script in MATLAB

As a demonstration, a MATLAB script polls the laser server for one range scan and plots the ranges in Cartesian space, Figure 1.3. Since the laser scans are directly available from the MATLAB environment, it is possible to modify and update algorithms while the system is online and running. Methods like Hough transforms, line segmentation methods and parameterization of data can easily be tested and, as a second step, implemented as Java programs once they function as desired (Paper A). Figure 1.4 is a view of the environment where the range scan was taken. The Cartesian X and Y coordinates of the range scan are calculated by the Java instance that holds the scan and are then plotted. All scans are time stamped, and since all other sensor measurements on the MICA platform are time stamped in the same way, with the same clock, it is possible to fuse sensor information and create advanced data fusion programs. For instance, it is possible to get the correct pose of the vehicle at the moment the scan was taken. This functionality demonstrates how the system is used to build a 3D map of the surrounding environment. For further information, read Section 4.6.5.
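The polar-to-Cartesian conversion performed by the Java scan instance can be sketched in a few lines. The class and method names below are illustrative, not the actual MICA client API:

```java
// Convert a SICK range scan from polar to Cartesian coordinates in the
// sensor frame. Illustrative sketch, not the real MICA Java client.
public class ScanToCartesian {
    /**
     * @param ranges   measured ranges in meters, one per beam
     * @param startDeg bearing of the first beam in degrees
     * @param stepDeg  angular resolution in degrees (e.g. 1.0)
     * @return         array of {x, y} points in the sensor frame
     */
    public static double[][] toCartesian(double[] ranges, double startDeg, double stepDeg) {
        double[][] points = new double[ranges.length][2];
        for (int i = 0; i < ranges.length; i++) {
            double a = Math.toRadians(startDeg + i * stepDeg);
            points[i][0] = ranges[i] * Math.cos(a); // x, forward
            points[i][1] = ranges[i] * Math.sin(a); // y, left
        }
        return points;
    }
}
```

Because the class is plain Java it is equally usable from a MATLAB script, where the returned array can be plotted directly.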

¹ http://rullstol.sm.luth.se, Jan 2004

Figure 1.3: An example of a laser scan taken from the MICA wheelchair. The measured ranges are converted to Cartesian space and plotted. The marked positions A, B, C and D are shown in Figure 1.4.

1.3.3 The GPS - Global Positioning System

Another navigation unit on the wheelchair is the GPS receiver. It uses a constellation of satellites to calculate its position. Each GPS satellite transmits its position and the current time. The satellites are time synchronized so that they send their information at the same instant. The transmitted signal travels at the speed of light to a receiver. Every satellite has its own modulation code, which makes it possible for a receiver to identify individual satellites in the received signal. By measuring the time lag between the satellite and the receiver it is possible to calculate the distance to the transmitting satellite; this is called the pseudorange. With pseudorange information from at least four satellites the receiver can calculate its longitude, latitude, altitude and the current GPS time. GPS receivers also estimate the time with high precision, and simple receivers can be used to synchronize clocks on network-connected computers. This is very important in GSM base stations, Internet servers and other sorts of relay stations. Only one satellite needs to be tracked by the receiver to calculate a time estimate.

1.3.4 Odometric Encoders for Dead Reckoning

On the MICA wheelchair, odometric encoders are mounted on the shaft of each driving motor. They register the angle of the wheel, and as the wheel rotates they also give its angular velocity. The velocity of the wheel is equal to its angular velocity multiplied by the radius of the wheel.
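The dead reckoning computation for a differential-drive vehicle like the wheelchair can be sketched as follows. The wheel radius and track width used in the test are illustrative values, not the MICA wheelchair's actual parameters:

```java
// Dead reckoning for a differential-drive vehicle from wheel encoder data.
// A minimal sketch; parameter values are illustrative only.
public class DeadReckoning {
    double x, y, theta;          // pose in the odometry frame (m, m, rad)
    final double wheelRadius;    // r, meters
    final double track;          // distance between the two drive wheels, meters

    public DeadReckoning(double wheelRadius, double track) {
        this.wheelRadius = wheelRadius;
        this.track = track;
    }

    /** Integrate one time step given the angular velocities of the two wheels. */
    public void step(double omegaLeft, double omegaRight, double dt) {
        double vl = omegaLeft * wheelRadius;   // wheel velocity v = omega * r
        double vr = omegaRight * wheelRadius;
        double v = 0.5 * (vl + vr);            // forward velocity of the vehicle
        double w = (vr - vl) / track;          // yaw rate
        x += v * Math.cos(theta) * dt;
        y += v * Math.sin(theta) * dt;
        theta += w * dt;
    }
}
```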

1.3.5 Flash Camera System

The MICA wheelchair has four flash cameras [59] mounted on the top, below the LazerWay unit, see Figure 1.2. A camera unit sends out a strong infrared flash, and the camera lens is located near the flash. As the light reflects back from the retro-reflective beacons, they become more visible because of the higher intensity in the picture. By thresholding the image, a beacon is identified and its offset angle from the optical axis is calculated. Most of the image processing is done in custom-built hardware. The estimated parameters are broadcast on the CAN bus.

Figure 1.4: A view from the Web camera which is mounted on the SICK range laser, see Figure 1.1. The horizontal line in the figure is the approximate position where the laser scanner sliced the scene. The related positions A, B, C and D can be found in Figure 1.3.
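The thresholding and offset-angle computation can be illustrated with a short sketch. This is only the principle; on MICA the processing runs in the custom-built camera hardware, and the pinhole parameters below are hypothetical:

```java
// Find the centroid of bright (beacon) pixels by thresholding a gray-scale
// image, then convert the pixel offset to an angle from the optical axis.
// A sketch of the principle only.
public class BeaconDetector {
    /**
     * @param image     gray-scale intensities, image[row][col], 0..255
     * @param threshold intensities above this count as beacon returns
     * @return          column centroid of the thresholded pixels, or -1 if none
     */
    public static double columnCentroid(int[][] image, int threshold) {
        double sum = 0;
        int count = 0;
        for (int[] row : image)
            for (int col = 0; col < row.length; col++)
                if (row[col] > threshold) { sum += col; count++; }
        return count == 0 ? -1 : sum / count;
    }

    /** Offset angle (rad) of a column from the optical axis, pinhole model. */
    public static double offsetAngle(double col, double centerCol, double focalPx) {
        return Math.atan2(col - centerCol, focalPx);
    }
}
```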

1.3.6 Controller Area Network (CAN)

The MICA wheelchair has a CAN bus and a CAN server implemented (Paper B). The CAN bus was invented by the company Bosch GmbH for use in cars [13]. It was designed to withstand the rough environment found in a modern car, such as electromagnetic disturbances. The CAN bus can be found in many electrical systems, including:

- Robotic machines
- Factory applications
- Home electronic equipment
- All sorts of vehicles
- Medical equipment
- Navigation equipment
- Measuring and data input/output units

The maximum speed of the CAN bus is 1 Mbit/s. CAN messages sent by one node are acknowledged by all other nodes on the bus.

1.3.6.1 CAN Bus Example



A CAN bus has two signal wires, CAN_H and CAN_L. CAN nodes only need to be interfaced to these two signals and to have a power supply. Each node on the bus has the same status, so there is no central unit. Figure 1.5 shows a bus with three connected nodes. The bus is terminated at each end with a resistor R. If the bus were that of the wheelchair, the nodes could be the embedded PC, the manual control unit or the central control unit.


Figure 1.5: A CAN bus with three visible connected nodes (A,B,C). The bus is terminated at each end with a resistor (R). The bus has two signal wires (CAN-HIGH and CAN-LOW). There is no central unit on the bus.

1.3.6.2 A CAN Message

Each message has an 11- or 29-bit identifier and a payload of at most eight bytes, Figure 1.6. The identifier is used as a marker so that other nodes can identify what sort of information the frame contains. Special hardware polls CAN frames from the bus and can be programmed to mask away CAN messages with a specific identifier field.


Figure 1.6: A simplified view of a CAN message. The identifier field is either 11 or 29 bit wide. The payload has a size from zero up to eight data bytes (D0-D7).
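A minimal data structure capturing these format constraints might look as follows. It is an illustration of the frame layout in Figure 1.6, not the MICA CAN server API:

```java
// A minimal representation of a CAN message: an 11- or 29-bit identifier
// plus up to eight payload bytes (D0-D7). Illustrative sketch only.
public class CanMessage {
    public final int id;           // 11- or 29-bit identifier
    public final byte[] data;      // 0..8 payload bytes
    public final boolean extended; // true for a 29-bit identifier

    public CanMessage(int id, byte[] data, boolean extended) {
        if (data.length > 8)
            throw new IllegalArgumentException("CAN payload is at most 8 bytes");
        int maxId = extended ? (1 << 29) - 1 : (1 << 11) - 1;
        if (id < 0 || id > maxId)
            throw new IllegalArgumentException("identifier out of range");
        this.id = id;
        this.data = data.clone();
        this.extended = extended;
    }
}
```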


1.4 Telecommands

Telecommands are often used to make a system easier to use, especially in remote operations [31][44][51]. With telecommands it is possible to remotely operate a system even over a communication link with no realtime performance. Telecommands are short instructions sent to a remote semi-autonomous or autonomous system. Telecommands can for example be:

- Follow a wall.
- Go to a way point.
- Follow or track a path or object.
- Run through a doorway.
- Share and distribute information.
- Distribute computer power and look at a team as one unit, or as separate units.
- Explore an environment and build a map for further use.
- Report and detect changes in an environment.
- Send information to an operator.
- Go into power save mode.
- Various motion commands.

January 2004 provided an excellent example of telecommands implemented and used by NASA in the remote control of the Mars rover Spirit [54]. This system would have been almost impossible to control without telecommands because of the distance between Earth and Mars.
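Short instructions like those above can be encoded as self-describing text lines that are cheap to send over a link with no realtime guarantees. The format below is a hypothetical sketch, not the actual MICA command protocol:

```java
// A telecommand as a name plus numeric arguments, encoded as a single
// text line, e.g. "GOTO_WAYPOINT 12.5 4.0". Hypothetical format.
public class Telecommand {
    public final String name;
    public final double[] args;

    public Telecommand(String name, double... args) {
        this.name = name;
        this.args = args;
    }

    /** Encode as one text line suitable for a low-bandwidth link. */
    public String encode() {
        StringBuilder sb = new StringBuilder(name);
        for (double a : args) sb.append(' ').append(a);
        return sb.toString();
    }

    /** Parse a line produced by encode(). */
    public static Telecommand parse(String line) {
        String[] parts = line.trim().split("\\s+");
        double[] args = new double[parts.length - 1];
        for (int i = 1; i < parts.length; i++)
            args[i - 1] = Double.parseDouble(parts[i]);
        return new Telecommand(parts[0], args);
    }
}
```

The remote system parses each line and acts on it autonomously, which is what makes the scheme tolerant of long link delays.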

1.5 UAV - Unmanned Aerial Vehicles

Paper C in this thesis presents work on a UAV system at ACFR (Australian Centre for Field Robotics) [66]. The Autonomous Navigation and Sensing Experimental Research (ANSER) [29] project demonstrates decentralized data fusion and simultaneous data fusion, localization and map building. The ANSER project uses unmanned aerial vehicles as the robotic platform. With the use of several aerial vehicles in formation flight it is important to prevent collisions in the air [28]; therefore it is necessary to know the position and velocity vectors of all flying units. The paper presents the development of an estimator based on Kalman filtering [32]. The filter estimated the position, attitude and velocity of the UAV in use.

UAVs have become more and more important in modern society, both in modern warfare and in surveillance. UAVs can be used:

- to detect fires in forests
- in surveillance
- in agricultural spraying
- to patrol coastlines
- as radio relay stations, as described in [73]

1.5.1 Different UAV systems

UAVs exist in all sizes; some units weigh below 1000 g while others weigh several tons. They all differ in the way they work and were developed for different tasks, Table 1.2. A tiny UAV equipped with a camera can be carried and operated by one man [35]. The endurance of such a system is rather limited, but it gives an eagle's eye view to the personnel on the ground. Other UAV systems are used as radio relay stations or flying base stations. They can be loaded with heavy equipment such as radar installations and signal tracking devices.

                    Pointer[5]  Pioneer[47]  Brumby[25]  Ugglan[42]  Global Hawk[67]
Operating altitude  0.3 km      3 km         -           1 km        max 20 km
Flight distance     10 km       185 km       -           75 km       26000 km
Velocity (km/h)     35-80       110-175      180         160         640
Endurance           1 h         5 h          30 min      3 h         42 h
Wing span           2.7 m       5 m          2.4 m       4.2 m       35.4 m
Weight              3.6 kg      190 kg       30 kg       -           11620 kg
Useful payload      1 kg        45 kg        11 kg       Video       950 kg

Table 1.2: A few different UAV systems, worldwide. The Pointer UAV has a three-man ground team to keep it airborne during a typical mission, and is ready for constant launch, control and recovery. The Pioneer is used by the US Navy in battlefield control. Ugglan was a Swedish UAV demonstrator that needed a 40-man ground team during a typical mission, from launch to recovery. Global Hawk is an American UAV for long endurance missions.


Chapter 2

The MICA Computer Hardware/Software and the Client/Server Approach

This chapter presents the software and the PC hardware mounted on the wheelchair. The basic testbed clients are explained, as well as how they are used in MATLAB. At the end of the chapter there is a table that shows the PC hardware setup for different robotic systems.

2.1 The MICA Software

There are many open software projects for interfacing robotic vehicles. Both the Player/Stage [34, 68] and Miro [45] projects are software environments for mobile robots. Both are programmed in C++ and offer an extensive sensor environment. Player/Stage comes with a robot simulator called Stage, and a lot of research groups all over the world have joined this software project. Player/Stage supports sensors like the SICK laser and ultrasound sensors as well as odometric sensors. In MICA we have developed our own software. This makes it easier to change the functionality of the code, since it is not yet public. The MICA project combines four programming languages in its software:

- The server software is written in C and C++.
- The server clients are written in Java.
- The development of new code is done in MATLAB.
- The operating system on the MICA embedded PC is Linux with a realtime patch.


Figure 2.1: An example of the MICA wheelchair and the nearby network. The embedded PC104 has a WLAN connection to a wireless access point. With the pocket PC it is possible to remotely control the wheelchair. Computers on the local network and the Internet are able to access the MICA testbed.

2.1.1 The Java clients

Java was developed by Sun to be a portable language that can run on different operating system platforms, from tiny computers to supercomputers. The clients in the MICA software are written in Java. This makes it possible to run the programs on an arbitrary computer system. The Java clients are used in MATLAB programs to poll information from the servers onboard the MICA wheelchair, which has an embedded PC. The clients are extended as the development goes on. As the MATLAB programs are finalized, they are step-by-step ported to Java.

In the future it would be nice if the Java clients could run in realtime. There exists a specification for realtime Java¹ in JavaX that would make it possible to run threads with the RealtimeThread class. The TimeSys corporation offers an integrated Real-Time Specification for Java implementation called JTime [78]. This is the reference implementation for the Real-Time Specification for Java and offers a C/C++/Java cross-platform development environment. More information about realtime Java can be found at the Java Community Process². The MICA Java clients are not involved in loops that need realtime performance, since they do not process time-critical information.

¹ RT-Java, http://www.rtj.org, Dec 2003
² RT-Java, Java Community Process, http://jcp.org/en/jsr/detail?id=1, Dec 2003.
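The pattern that makes this work is that MATLAB can instantiate any public Java class on its class path without glue code. A toy example of such a class (hypothetical, not part of the MICA software):

```java
// A trivial value object of the kind a Java client might hand to MATLAB:
// one sensor reading together with its UTC time stamp.
// Hypothetical class, shown only to illustrate the MATLAB/Java pattern.
public class SensorSample {
    private final double value;
    private final double utcSeconds; // seconds since the Epoch

    public SensorSample(double value, double utcSeconds) {
        this.value = value;
        this.utcSeconds = utcSeconds;
    }

    public double getValue() { return value; }
    public double getUtcSeconds() { return utcSeconds; }
}
```

In a MATLAB script the same class is used directly, e.g. `s = SensorSample(1.25, 1.07e9); v = s.getValue();`, which is how the MICA clients expose sensor data to MATLAB algorithms.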


2.1.2 The MATLAB environment

The MATLAB environment is a good environment in which to develop and investigate the performance of different algorithms. It has been shown that it can be used to remotely control robotic equipment [26, 33][Paper A]. In MATLAB, a lot of different sensors can be accessed through the MICA software servers. Some of the supported hardware devices are listed below:

- Inclinometer
- Rate gyro
- CAN interface
- Frame grabber
- GPS
- Servo card unit
- Accelerometer
- Incremental encoders

The servers have databases that record and store sensor measurements. The measurements are time stamped with Coordinated Universal Time (UTC), measured in seconds since the Epoch, 00:00:00 1 Jan 1970. In the Linux kernel the clock has microsecond resolution. The PC104 runs a Network Time Protocol (NTP) daemon that synchronizes the clock to UTC [23]. An option for synchronizing the clock outdoors would be to use a GPS receiver, since it provides a very good estimate of the current time. If both the server and the clients run time synchronization programs, the errors due to misaligned clocks can be neglected.
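The seconds-plus-microseconds time stamp convention above can be captured in a small helper type. This is a sketch; the class and method names are illustrative assumptions, not the actual MICA code.

```java
// Sketch of a UTC time stamp with microsecond resolution, following the
// seconds-since-the-Epoch convention described in the text.
public class UtcStamp {
    public final long seconds;  // whole seconds since 00:00:00 1 Jan 1970 (UTC)
    public final long micros;   // microsecond part, 0..999999

    public UtcStamp(long seconds, long micros) {
        this.seconds = seconds;
        this.micros = micros;
    }

    // Signed difference this - other, in seconds.
    public double diffSeconds(UtcStamp other) {
        return (this.seconds - other.seconds)
             + (this.micros - other.micros) / 1e6;
    }
}
```

A difference computed this way is what Figure 4.7 plots: the offset between the time stamp of a laser scan and the time stamp of the nearest position estimate.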

2.2 The Embedded PC, a PC104

Onboard the wheelchair there is an embedded PC built in the PC104 format [19]. The PC104 cards are stacked and fixed with screws, Figure 2.2. The PC104 format is a standard for embedded PC computers, and the market offers a wide range of cards and accessories in this format. The embedded PC is placed inside an aluminium container that protects it from dirt, water, cold, ice, snow, vibration and shock. A PCMCIA interface with a Wave-LAN card makes it possible to access the embedded PC104 wirelessly at a distance.

2.3 Example of Hardware Setup on some Robotic Vehicles

Robotic systems have different hardware setups. Table 2.1 compares the MICA wheelchair with the Brumby UAV and a three-wheeled semi-autonomous robot called Lusar [31]. Lusar


Figure 2.2: The PC104 stack with the mother card at the top. A single PC104 card and a floppy disk are placed to the right as size references. The additional cards in the stack are a DC/DC power supply card, an RS232 expansion card, a video card, a CAN controller card and a PCMCIA expansion card.

has a standard PC computer in AT format with a 486 processor and runs the QNX real-time operating system. The Brumby aerial vehicle has a 486 processor in PC104 format that meets the requirements of the UAV system. A modern embedded 486 processor uses less power than a Pentium-based system and therefore needs fewer batteries. The Brumby aerial vehicle is equipped with a mm-wave radar, located in the front, for distance sensing. Lusar and the MICA wheelchair have laser scanners for distance sensing.

Platform          | MICA wheelchair          | Brumby (UAV)              | Lusar
Onboard Computer  | PC104-Pentium (266 MHz)  | PC104-486 (133 MHz)       | AT-486
Range Sensing     | SICK LMS200              | Radar                     | IBEO laser
Inertial Sensing  | 1-axis Fiber Optic Gyro  | IMU (2*3-axis, 2*4-axis)  | -
Vision            | USB-camera/CCD-camera    | CCD-camera                | CCD-camera
Power             | 2*12 Volt                | Petrol Engine, 12 V       | 2*12 V
OS                | Linux                    | Linux                     | QNX

Table 2.1: Hardware setup for different mobile robots.

Chapter 3

Frames, Attitude Representations and Inertial Units

This chapter explains the basics of a frame: how coordinates are related between different frames, and how to describe the attitude of a system using different types of angular representations.

3.1 On Frames and Coordinates

It is important to know the position and attitude of a vehicle in order to control and move it. The position of a vehicle is expressed by coordinates bound to a frame. A good proposal for frame notation can be found in [48, 62]. A point is fixed in space, but its coordinates differ between frames; as the frame changes, the coordinates of the point change as well. A good frame example is longitude, latitude and altitude, which are used to describe a position relative to the Earth. Examples of different frames:

- Frames with polar coordinate representation.
- Frames with Cartesian coordinate representation.
- Sensor frame; fixed to a sensor.
- Vehicle or body frame; fixed to a body.
- Local navigation frame; usually described in Cartesian coordinates.
- Geocentric frame; longitude, latitude and altitude.
- Earth-Centred Earth-Fixed frame; the origin is the centre of the Earth.

It is possible to add an orientation to a body, which is called attitude. It can be the heading relative to the North Pole, and the elevation and roll angle relative to the horizon. Those angles will


give the view of what a person would see at a specific place when looking in the direction described by the angles. Any place on Earth can be pointed out with longitude, latitude and altitude using the geocentric frame. If the frame is instead set to the Earth-Centred Earth-Fixed frame, the same point is described by other coordinates and attitude angles.
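The relation between the geocentric frame and the Earth-Centred Earth-Fixed frame mentioned above can be made concrete with the standard geodetic-to-ECEF conversion. The sketch below assumes the WGS-84 ellipsoid; the thesis does not specify a particular ellipsoid, so the constants and names here are illustrative.

```java
// Sketch: geocentric (longitude, latitude, altitude) to Earth-Centred
// Earth-Fixed (ECEF) coordinates, assuming the WGS-84 ellipsoid.
public class Geodetic {
    static final double A  = 6378137.0;            // WGS-84 semi-major axis [m]
    static final double F  = 1.0 / 298.257223563;  // WGS-84 flattening
    static final double E2 = F * (2.0 - F);        // first eccentricity squared

    // latDeg/lonDeg in degrees, alt in metres above the ellipsoid.
    public static double[] toEcef(double latDeg, double lonDeg, double alt) {
        double lat = Math.toRadians(latDeg);
        double lon = Math.toRadians(lonDeg);
        double sinLat = Math.sin(lat), cosLat = Math.cos(lat);
        // prime vertical radius of curvature
        double n = A / Math.sqrt(1.0 - E2 * sinLat * sinLat);
        double x = (n + alt) * cosLat * Math.cos(lon);
        double y = (n + alt) * cosLat * Math.sin(lon);
        double z = (n * (1.0 - E2) + alt) * sinLat;
        return new double[] { x, y, z };
    }
}
```

At latitude 0, longitude 0, altitude 0 the result lies on the equator at one semi-major axis from the Earth's centre, which is a convenient sanity check.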

3.1.1 The Body Frame Representation on Ground Vehicles

A body frame is fixed to a vehicle or robotic system. The frame origin can be the centre of gravity or, as is usual on four-wheeled vehicles, the midpoint of the axis between the two rear wheels. On the MICA wheelchair the origin is between the two front driving wheels. The body frame is used to express surrounding positions and sensor measurements relative to the vehicle origin. The frame bound to the MICA wheelchair is denoted {W}, with axes X_W, Y_W and Z_W. We set the X_W-axis to point forward, the Y_W-axis to point to the left, and the Z_W-axis to point up, Figure 3.1.


Figure 3.1: The wheelchair is represented by the circle in the middle, with the heading marked by a line in the circle. The wheelchair position P_w is expressed in the wheelchair navigation frame with axes X_G and Y_G, and the heading angle α_w is marked in the figure by an arrow.

3.1.1.1 The Body Frame Representation used in Avionics

The frame of the MICA wheelchair is {W}. The axis setup of that frame is the most commonly used for a ground vehicle. For comparison, an aerial vehicle has a different frame setup, Figure 3.2. As stated in the German flight standard (LN9300 [1]), the vector from the centre of gravity to the nose of the aircraft is the x-axis, and the vector from the centre of gravity to the right wing tip is the y-axis. The z-axis completes a right-hand orthogonal system by pointing down.

3.1.2 The Wheelchair Navigation Frame
V | O$# 3&4  V  where X is the heading. The  and indicates a change from wheelchair -20&1 < frame to wheelchair navigation frame . The heading of the wheelchair at time #0&1 is expressed as X . 

 |

O

# 

P

`/ &1 I



# 3&1/ X  è # 3&1/ X

¥2¦)§

0&1 I

§+¨­©

P

`+ &1

F #è0&1 ·

†

§+¨­© ¥2¦)§

^ |

The laser range measurement -20&1 becomes: 

£¤

# 0&1+ X  è # 3&1/ X ^

«¬ ^ ^

(4.50) q 

expressed in navigation frame coordinates |

 O

# 

0&1

| P

`+ &4 

` ü± q



 u  î  ­N  î

q



P

`+ &1

at time (4.51)
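The frame change of equation (4.51) reduces, in the horizontal plane, to a rotation by the heading followed by a translation by the vehicle position. The sketch below is a minimal planar version; class and parameter names are illustrative assumptions.

```java
// Minimal planar sketch of equation (4.51): transform a point measured in
// the wheelchair frame into the navigation frame, given the heading psi
// (radians) and the wheelchair position (px, py) in the navigation frame.
public class FrameTransform {
    // Returns {xN, yN}.
    public static double[] toNav(double xW, double yW,
                                 double psi, double px, double py) {
        double c = Math.cos(psi), s = Math.sin(psi);
        double xN = c * xW - s * yW + px;  // rotate, then translate
        double yN = s * xW + c * yW + py;
        return new double[] { xN, yN };
    }
}
```

For example, a point one metre straight ahead of a wheelchair heading 90 degrees left ends up one metre along the navigation Y axis.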

4.6.4 Wheelchair Control

The control program on the MICA wheelchair is used to make the vehicle follow a predefined path or move in the environment. Many methods are available in this area; a simple one is proportional control [43, 20]. A virtual steering wheel is introduced on the wheelchair to make


it behave as a three-wheeled vehicle [30]. The desired heading angle and speed are set on the virtual steering wheel, and these are used to calculate the left and right wheel velocities. The controller has two states and tries to drive the steer angle error and the path offset to zero.
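The conversion from the virtual steering wheel to left/right drive-wheel velocities can be sketched with standard differential-drive kinematics. The wheelbase, track width and the bicycle-model conversion below are illustrative assumptions, not values from the thesis.

```java
// Sketch of the virtual-steering-wheel idea: desired speed v and steer angle
// delta are converted into left/right drive-wheel velocities for a
// differential-drive vehicle. L and B are assumed geometry, for illustration.
public class VirtualSteering {
    static final double L = 0.9;  // assumed distance to the virtual wheel [m]
    static final double B = 0.6;  // assumed track width of the drive wheels [m]

    // Returns {vLeft, vRight} in m/s.
    public static double[] wheelSpeeds(double v, double deltaRad) {
        double omega = v * Math.tan(deltaRad) / L;  // yaw rate, bicycle model
        return new double[] { v - omega * B / 2.0, v + omega * B / 2.0 };
    }
}
```

With zero steer angle both wheels get the commanded speed; a positive (left) steer angle makes the right wheel run faster than the left, turning the vehicle left.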

4.6.5 Data Fusion and Mapping Result

The system gives a distant user the ability to download measurements from the sensors onboard the MICA wheelchair. In the following two examples, the vehicle run was initiated in MATLAB. The wheelchair then went on a short and a long run, respectively. When the wheelchair stopped, the measurements were downloaded into the MATLAB environment, where a script processed the measurements and plotted the figures. Two different databases were used: one that collects position estimates and one that collects the laser scans. Since every laser scan is time stamped, it can be fused with the position estimates to obtain the vehicle pose and position at the instant the laser scan was taken. With this information the correct pose of the laser was calculated, and the laser measurements were then plotted in 3D using equation 4.51.

4.6.5.1 Data Fusion - Short Run

On the short run the wheelchair started at the position marked "Start Position" and followed a slightly curved path to the position marked "End Position" in Figure 4.5. The estimated positions are plotted as circles, the estimated floor as continuous lines and the estimated walls as dots. In Figure 4.6 the shadow areas, which occur since the laser is tilted and looking forward, are pointed out with markers. The MATLAB script that polls the sensor information and creates the plot has approximately 50 lines of code.
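The fusion step above pairs each time-stamped laser scan with the position estimate closest in time. A minimal sketch of that matching, assuming the pose time stamps are sorted ascending (class and method names are illustrative, not the actual MICA code):

```java
import java.util.Arrays;

// Sketch: for each laser scan time stamp, find the index of the pose
// estimate whose time stamp is nearest. Assumes poseTimes is sorted.
public class ScanPoseMatcher {
    public static int nearestPose(double[] poseTimes, double scanTime) {
        int i = Arrays.binarySearch(poseTimes, scanTime);
        if (i >= 0) return i;                    // exact time-stamp match
        int ins = -i - 1;                        // insertion point
        if (ins == 0) return 0;                  // before the first pose
        if (ins == poseTimes.length) return poseTimes.length - 1;
        // pick the nearer of the two neighbouring poses
        return (scanTime - poseTimes[ins - 1] <= poseTimes[ins] - scanTime)
                ? ins - 1 : ins;
    }
}
```

The residual time offsets of such a pairing are exactly what Figure 4.7 plots for a 5 Hz polling rate.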

4.6.5.2 Data Fusion - Long Run

The wheelchair went on a longer test in the basement of Luleå University. Figure 4.8 was processed in the MATLAB environment as soon as the wheelchair stopped. The sensor data were collected from the embedded PC, and a script was used to process and plot the data. The figure is made up of about 800 laser scans. During the test the wheelchair lost the Wave-LAN connection to the access point. A strength of the MICA testbed is that the sensor servers continue to collect sensor measurements even when there is no Wave-LAN connection; the data can be downloaded as soon as the wheelchair becomes accessible again.


Figure 4.5: An example of data fusion. Sensor information from the wheelchair is fused to plot a corridor at Luleå University. The estimated vehicle path is plotted with circles, the floor with lines and the estimated walls with dots.


Figure 4.6: The same situation as in Figure 4.5 but from another view angle. Near the right wall there is a bench. The vehicle path is plotted with circles, the floor is drawn as lines and the walls are plotted as dots.

[Plot: "Time difference between laser scan and position estimation on the wheelchair". Y-axis: time difference [s], roughly -0.1 to 0.25; x-axis: scan number, 0 to 250; circles mark measurements.]

Figure 4.7: Plot showing the time difference between the time stamp on the laser scans and the time stamp on the estimated positions. The script polled laser scans at a 5 Hz polling rate.


Figure 4.8: In this case the wheelchair drove in the basement for about 2 minutes. There is no visible drift in the heading, since the estimates of the walls are straight. The tilt angle of the range laser was approximately … degrees.

Figure 4.9: The tilt angle of the range laser was approximately … degrees. The MICA wheelchair completed a … run.


Figure 4.10: The tilt angle of the range laser was approximately … degrees. The accumulated dead-reckoning error of the … meter run is estimated to approximately … meter. There is a small error in orientation, but it is not estimated from the figure. From the right wall we can identify an offset in direction of approximately … meter.

Chapter 5

Paper Summary

5.1 Summary of Contributions

5.1.1 Paper A

The paper explains in detail how the developed sensor platform on the MICA wheelchair works, and how sensors can be added to the system and accessed over WLAN from distant clients. The clients can run in almost any environment, since they are written in Java. We focus on MATLAB as the development environment, from which all the sensors can be accessed almost directly if network and software delays are neglected. The sensor server programs continuously store incoming measurements and time stamp them with Coordinated Universal Time (UTC). When the programs work in MATLAB, they are ported to Java and the resulting code still runs in the same environment. When the algorithms work as required, they are moved to the mobile platform, where the Java code uses the local network and network lag and time jitter can be neglected. An edited version of the paper will be published at the conference.

5.1.2 Paper B

This paper describes the implementation of a CAN server that acts as a CAN tool for a client. It can be used to monitor, observe and send messages to a distant CAN network over IEEE802.11b (WaveLAN). The CAN server is controlled by one or several clients that connect to it over TCP/IP. CAN bus messages can be read and sent over WaveLAN from the MATLAB environment. The CAN server collects CAN messages and stores them in a ring buffer. The messages in the ring buffer are classified by their identifier and stored in a database. The TCP/IP client/server is used to remotely control and monitor the CAN bus on the MICA wheelchair. The CAN tool has been used in a demonstrative application example in which the wheelchair was programmed to drive in a square. The positions obtained from the CAN messages are compared with the positions from the navigation system onboard the wheelchair.
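The ring buffer plus classification-by-identifier scheme described for the CAN server can be sketched as below. Field, class and method names are illustrative assumptions, not the actual server code.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the CAN-server buffering: incoming messages go into a fixed-size
// ring buffer (oldest overwritten first) and are also indexed by identifier
// so the latest message per CAN id can be looked up.
public class CanBuffer {
    public static class CanMsg {
        public final int id;       // CAN identifier
        public final byte[] data;  // up to 8 data bytes
        public CanMsg(int id, byte[] data) { this.id = id; this.data = data; }
    }

    private final CanMsg[] ring;
    private int head = 0, count = 0;
    private final Map<Integer, CanMsg> latestById = new HashMap<>();

    public CanBuffer(int capacity) { ring = new CanMsg[capacity]; }

    public synchronized void put(CanMsg m) {
        ring[head] = m;                  // overwrite the oldest slot
        head = (head + 1) % ring.length;
        if (count < ring.length) count++;
        latestById.put(m.id, m);         // classify by identifier
    }

    public synchronized CanMsg latest(int id) { return latestById.get(id); }
    public synchronized int size() { return count; }
}
```

A monitoring client would then query `latest(id)` for the identifiers it cares about instead of scanning the whole buffer.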


5.1.3 Paper C

A state estimator was implemented on an Unmanned Aerial Vehicle (UAV). It uses an extended information filter (EIF) to estimate the position, velocity and attitude of the aerial vehicle. GPS position and attitude observations are fused with the accelerations and rotational velocities measured by the inertial sensors.

5.2 Future Work

The coming work will focus on the use of flash cameras for navigation. The idea is to keep the beacons that are already mounted in a navigation environment. The MICA wheelchair will go on end-user tests in Rovaniemi, where two test areas for disabled people are located: one nursing home for elderly people and a complex for people with muscular dystrophy.

References

[1] Normstelle Luftfahrt: LN9300. Leinfelden, 1970.
[2] Abidi Mongi A. Data Fusion in Robotics and Machine Intelligence. Academic Press Inc, 1992.
[3] Schoenwald David A. AUVs: In space, air, water and on the ground. IEEE Control System Magazine, 20(6):15-18, Dec 2000.
[4] Danaher-Motion Särö AB. Danaher-Motion. http://www.ndc.se, Sep 2003.
[5] Aerovironment. AV Pointer. http://www.aerovironment.com/pointer, Jan 2004.
[6] Folkesson John B. and Christensen Henrik I. Robust SLAM. In IAV2004 Proceedings, Jun 2004.
[7] Fraleigh John B. Abstract Algebra. Addison Wesley, 5 edition, 1994.
[8] Yaakov Bar-Shalom and Xiao-Rong Li. Estimation and Tracking: Principles, Techniques and Software. Artech House, 1993.
[9] Andersson B.D.O. and Moore J.B. Optimal Filtering. Prentice Hall, 1979.
[10] S. M. Bennett, R. Dyott, D. Allen, J. Brunner, R. Kidwell, and R. Miller. Fiber optic rate gyros as replacements for mechanical gyros. In Collection of Technical Papers, Pt. 2 (A98-37001 10-63), AIAA Guidance, Navigation, and Control Conference and Exhibit, Boston, Aug 1998.
[11] Blackman Samuel S. Multiple-Target Tracking with Radar Applications. Artech House, 1986.
[12] Johann Borenstein. Experimental evaluation of a fiber optic gyroscope for improving dead-reckoning accuracy in mobile robots. International Conference on Robotics and Automation, May 1998.
[13] Robert Bosch. CAN Specification 2.0. Robert Bosch GmbH, http://www.can.bosch.org, Sep 1991.
[14] G. Bourhis, O. Horn, O. Habert, and A. Pruski. A robotic wheelchair for crowded public environments. IEEE Rob. Automat. Mag., 7:20-48, 2001.
[15] Eli Brookner. Tracking and Kalman Filtering Made Easy. Wiley, 1998.


[16] Robert Grover Brown and Patric Y.C. Hwang. Introduction to Random Signals and Applied Kalman Filtering. Wiley, 3 edition, 1997.
[17] C.K. Chui and G. Chen. Kalman Filtering: with Real-Time Applications. Springer-Verlag, 2 edition, 1991.
[18] Hakyoung Chung, Lauro Ojeda, and Johann Borenstein. Accurate mobile robot dead-reckoning with a precision-calibrated fiber-optic gyroscope. IEEE Transactions on Robotics and Automation, 17(1), Feb 2001.
[19] PC/104 Consortium. PC104. http://www.pc104.org, Dec 2003.
[20] Ingmar J. Cox. Blanche - an experiment in guidance and navigation of an autonomous robot vehicle. IEEE Transactions on Robotics and Automation, 7(2):193-204, Apr 1991.
[21] Michael Csorba. Simultaneous Localisation and Map Building. PhD thesis, University of Oxford, Department of Engineering Science, 1997.
[22] Titterton D.H. and Weston J.L. Strapdown Inertial Navigation Technology. Peter Peregrinus Ltd, 1997.
[23] Mills D.L. RFC-1305, Network Time Protocol Definition Version 3. University of Delaware, http://www.faqs.org/rfcs/rfc1305.html, March 1992.
[24] Kalman Rudolph E. A new approach to linear filtering and prediction problems. Transactions of the ASME - Journal of Basic Engineering, 82(Series D):35-45, 1960.
[25] Sydney University's Aeronautical Engineering. Brumby UAV. http://www.aeromech.usyd.edu.au/wwwuav/uav_brumby_intro.html, Jan 2004.

[26] Ove Ewerlid, Claes Tidestad, and Mikael Sternad. Realtime control using MATLAB and Java. Nordic MATLAB Conference, 1997.
[27] Durrant-Whyte Hugh F. Integration, Coordination and Control of Multi-Sensor Robot Systems. Kluwer Academic Press, 1988.
[28] Giuletti Fabrizio, Pollini Lorenzo, and Innocenti Mario. Autonomous formation flight. IEEE Control System Magazine, 20(6):34-44, Dec 2000.
[29] Australian Centre for Field Robotics. ANSER - autonomous navigation and sensing experimental research. http://www.acfr.usyd.edu.au/projects/development/aerospace, Sep 2003.
[30] Johan Forsberg. A construction robot for autonomous plastering of walls and ceilings. ISARC-14, pages 260-268, June 1997.
[31] Johan Forsberg, Ulf Larsson, and Åke Wernersson. Tele-commands for mobile robot navigation using range measurements. 1998.
[32] Welch G. and Bishop G. An introduction to the Kalman filter. http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html, Apr 2002.


[33] G. Quaranta and P. Mantegazza. Using MATLAB-Simulink RTW to build real-time control applications in user space with RTAI-LXRT. http://www.rtai.org, 2001.
[34] Brian Gerkey, Richard T. Vaughan, and Andrew Howard. The Player/Stage project: Tools for multi-robot and distributed sensor systems. Proceedings of the 11th International Conference on Advanced Robotics, pages 317-323, June 2003.
[35] Joel M. Grasmeyer and Matthew T. Keennon. Development of the Black Widow micro air vehicle. In AIAA Aerospace Sciences Meeting and Exhibit, 2001-0127, volume 39, Reno, Jan 2001. AIAA.
[36] Mohinder Grewal and Angus P. Andrews. Kalman Filtering: Theory and Practice Using MATLAB. Wiley, 2001.
[37] Mohinder S. Grewal, Lawrence R. Weill, and Angus P. Andrews. Global Positioning Systems, Inertial Navigation, and Integration. New York: John Wiley, 2001.
[38] Ben Grocholsky. Information-Theoretic Control of Multiple Sensor Platforms. PhD thesis, University of Sydney, March 2002.
[39] Jose E. Guivant. Efficient Simultaneous Localization and Mapping in Large Environments. PhD thesis, University of Sydney, May 2002.
[40] Sir William Rowan Hamilton. On quaternions, or on a new system of imaginaries in algebra. Philosophical Magazine, 25, 1844-1850.
[41] Sir William Rowan Hamilton. On quaternions. In Proceedings of the Royal Irish Academy, volume 3, pages 1-16, Nov 1847.
[42] Lars Henriksson. Ugglan Sagem Sperwer (1999). http://www.avrosys.nu/aircraft/UAV/970_Ugglan.htm, http://www.markop.com/htm/Und.htm, Jan 2004.
[43] T. Högström. A semi-autonomous robot with rate gyro supported control and a video camera. IFAC Intelligent Autonomous Vehicles, 1993.
[44] Tomas Högström, Jonas Nygårds, Johan Forsberg, and Åke Wernersson. Telecommands for remotely operated vehicles. IFAC Intelligent Autonomous Vehicles, 1995.
[45] H. Utz, S. Sablatnög, S. Enderle, and G. Kraetzschmar. Miro - middleware for mobile robot applications. IEEE Trans. Robotics and Automation, 18(4):493-498, August 2002.
[46] Cox I.J. and Wilfong G.T., editors. Autonomous Robot Vehicles. Springer-Verlag, 1990.
[47] Pioneer UAV Inc. Pioneer UAV. http://www.puav.com, Jan 2004.
[48] Craig John J. Introduction to Robotics, volume 2. Addison-Wesley, 1989.
[49] Leonard John J. and Durrant-Whyte Hugh F. Directed Sonar Sensing for Mobile Robot Navigation. Kluwer Academic Press, 1992.


[50] Hyyppä Kalevi. On a Laser Anglemeter for Mobile Robot Navigation. PhD thesis, Luleå University of Technology, 1993:117D, http://www.luth.se, Apr 1993.
[51] Åke Wernersson, Mats Blomquist, Jonas Nygårds, and Tomas Högström. Telecommands for semiautonomous operations. In Proc. Telemanipulator and Telepresence Technologies, volume 2351, pages 2-12. SPIE, 1995.
[52] Jong-Hyuk Kim and Salah Sukkarieh. Airborne simultaneous localisation and map building. In Proceedings of the International Conference on Robotics and Automation, pages 406-411, Taipei, Sep 2003. IEEE.
[53] Hall David L. and Llinas James, editors. Handbook of Multisensor Data Fusion. The Electrical and Applied Signal Processing Series. CRC Press, 2001.
[54] Jet Propulsion Laboratory. Mars Exploration Rover mission. http://origin.mars5.jpl.nasa.gov/home/, Jan 2004.

[55] Axel Lankenau and Thomas Röfer. A versatile and safe mobility assistant. IEEE Robotics and Automation Magazine, 7(1):29-37, Mar 2001.
[56] Ulf Larsson, Johan Forsberg, and Åke Wernersson. Mobile robot localization: integrating measurements from a time-of-flight laser. IEEE Transactions on Industrial Electronics, pages 422-431, June 1996.
[57] LazerWay. LazerWay. http://www.lazerway.com, Jan 2004.
[58] E.F. LoPresti, R.C. Simpson, D. Miller, and I. Nourbakhsh. Evaluation of sensors for a smart wheelchair. In Proceedings of the 25th Annual Conference on Rehabilitation Engineering (RESNA), pages 166-168, June 2002.
[59] Evensson M., Kozmin K., Marklund A., and Åhsberg K. Ett kamerabaserat navigationssystem (A camera-based navigation system). Master's thesis, Luleå University of Technology, 2002:091.
[60] J. Manyika and H. Durrant-Whyte. Data Fusion and Sensor Management: A Decentralized Information-Theoretic Approach. Ellis Horwood, 1994.
[61] Manuel Mazo, J. Urena, J.C. Garcia, F. Espinosa, J.L. Lazaro, J. Rodriguez, L.M. Bergasa, J.J. Garcia, L. Boquete, R. Barea, P. Martin, and J.G. Zato. An integral system for assisted mobility. IEEE Robotics and Automation Magazine, 7(1):29-37, Mar 2001.
[62] Phillip John McKerrow. Introduction to Robotics. Addison-Wesley, 1991.
[63] Adams M.D. Sensor Modelling, Design and Data Processing for Autonomous Navigation, volume 13 of World Scientific Series in Robotics and Intelligent Systems. World Scientific, 1999.
[64] David P. Miller. Semi-autonomous mobility versus semi-mobile autonomy. AAAI Spring Symposium on Adjustable Autonomy, Stanford, March 1999.


[65] Grewal M.S. and Andrews A.P. Kalman Filtering - Theory and Practice. Prentice-Hall Inc, 1993.
[66] University of Sydney. Australian Centre for Field Robotics. http://www.acfr.usyd.edu.au, Jan 2002.
[67] John Pike. RQ-4A Global Hawk (Tier II+ HAE UAV). http://www.fas.org/irp/program/collect/global_hawk.htm, Jan 2004.

[68] The Player/Stage project. http://playerstage.sourceforge.net/index.html, 2003.
[69] Thrun S. Robotic mapping: A survey. In G. Lakemeyer and B. Nebel, editors, Exploring Artificial Intelligence in the New Millennium. Morgan Kaufmann, 2002.
[70] Thrun S., Burgard W., and Fox D. A real-time algorithm for mobile robot mapping with applications to multi-robot and 3D mapping. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), San Francisco, CA, 2000. IEEE.
[71] Blackman Samuel and Popoli Robert. Design and Analysis of Modern Tracking Systems. Artech House, 1999.
[72] SICK. SICK LMS200 laser measurement system. http://www.sick.com, Dec 2003.
[73] Ulf Sterner. Sambandsuaver i mobila radionät (Communication UAVs in mobile radio networks). Technical Report R-99-01369-504--SE, ISSN 1104-9154, FOA, Dec 1999.
[74] Lawrence D. Stone, Carl A. Barlow, and Thomas L. Corwin. Bayesian Multiple Target Tracking. Artech House, 1999.
[75] Åström K. Automatic mapmaking. In Proceedings of IAV93, 1993.
[76] Röfer T. Using histogram correlation to create consistent laser scan maps. In Proceedings of the IEEE International Conference on Robotics Systems (IROS-2002), pages 625-630, 2002.
[77] Bailey Tim. Mobile Robot Localisation and Mapping in Extensive Outdoor Environments. PhD thesis, University of Sydney, Australian Centre for Field Robotics, Aug 2002.
[78] TimeSys. JTime, Java technology for embedded and real-time development. http://www.timesys.com/index.cfm?bdy=java_bdy.cfm, Jan 2004.
[79] Aitken V.C. and Schwartz H.M. A comparison of rotational representations in structure and motion estimation for manoeuvring objects. In IEEE Transactions on Image Processing, volume 4, pages 516-520, Apr 1995.
[80] Bar-Shalom Y. and Fortmann T.E. Tracking and Data Association. Academic Press, 1988.

[81] H. A. Yanco and J. Gips. Preliminary investigation of a semi-autonomous robotic wheelchair directed through electrodes. In Proceedings of the Rehabilitation Engineering and Assistive Technology Society of North America Annual Conference (RESNA '97), pages 414-416, 1997.
[82] H.A. Yanco and J. Gips. Driver performance using single switch scanning with a powered wheelchair: Robotic assisted control versus traditional control. In Proceedings of the Rehabilitation Engineering and Assistive Technology Society of North America Annual Conference (RESNA '98), pages 298-300, 1998.

Part II


PAPER A A MATLAB/Java Interface to the MICA wheelchair

Authors: Sven Rönnbäck, David Rosendahl and Kalevi Hyyppä Reformatted version of paper originally published in: The 1st IFAC Symposium on Telematics Applications in Automation and Robotics. 21-23 June, Espoo, Finland

© 2004, IFAC. Reprinted with permission.


A MATLAB/Java Interface to the MICA wheelchair Sven Rönnbäck, David Rosendahl and Kalevi Hyyppä EISLAB Luleå University of Technology {Sven.Ronnback,Kalevi.Hyyppa}@ltu.se [email protected]

Abstract

In the MICA (Mobile Internet Connected Assistant) project a high-tech wheelchair has been controlled remotely with MATLAB/Java client-server software. Sensor servers that read, time stamp and store sensor device measurements in databases run concurrently on an embedded PC running Linux. The network clients used to read sensor values from the servers are written in Java, which gives them portability between different platforms and architectures. MATLAB client programs run on stationary computers; these are used to process and visualize data collected from the wheelchair. MATLAB programs are also used to control the wheelchair and make it run semi-autonomously. The combination of MATLAB and Java is well suited for fast development, implementation and testing of algorithms in both education and research. With the client/server approach it is possible to have one computer run a complex and computationally demanding control algorithm while another handles the GUI. Both computers execute MATLAB programs that run concurrently and exchange data over the Internet. When a program works properly in MATLAB it can be ported to Java for faster execution and portability.

1 Introduction

Robots and robotic vehicles are usually controlled by software written in C or C++ that runs on local computer systems. MATLAB has nice plotting facilities for visualizing data, and its fast development loop leads to fewer bugs in the software. MATLAB is very efficient at matrix calculations and gives the user good insight into variable values. The MATLAB environment, with its current support for the Java¹ programming language, has become an excellent environment for fast development, implementation and testing of algorithms in research and education. Java is supported in almost all computer environments, from tiny microcontrollers to supercomputers, and can run on anything from mobile phones to Internet web browsers such as Internet Explorer under Windows.

¹ Java technology, http://java.sun.com


The sensor clients are written in Java, and new MATLAB algorithms are ported to Java. Since the new software is Java based, it would be desirable for it to run in a real-time environment with soft real-time performance. Work on using the combination of MATLAB and Java for remote control has been done by Asmo Soinio [9]. He demonstrated that it is possible to remotely control a camera-equipped Lego robot from MATLAB using Java; he used the Java-Lego-Network-Protocol interface to send control commands to his robot. At the Signals and Systems group at Uppsala University, a Java/MATLAB implementation on a Linux computer was used to perform advanced process control on a laboratory-scale plant [1]. The RTAI project also showed that it is possible to control a process in MATLAB-Simulink over a network using sockets [3]. The MICA software is programmed using a client-server network approach, in a mix of the Java, MATLAB, C and C++ programming languages. With this software the wheelchair can be controlled over the Internet using MATLAB. This paper mainly describes the MICA server software on the MICA wheelchair. Wheelchair projects similar to MICA run all over the world, such as the one Erwin Prassler describes in [7]. His wheelchair has an architecture similar to the MICA vehicle: it has a field bus for communication, a joystick, an emergency button and a motor controller connected to a common data bus, and an on-board computer that runs Linux with a lot of hardware connected through serial ports. They do not have a Wave-LAN like MICA, and they do not focus on using MATLAB as the development platform for new algorithms. The paper is organized as follows. In section 2, the robotic wheelchair is presented as well as the hardware used in it. In section 3, the software is presented, both the client side and the server side. Subsection 3.1 describes how the communication between the client and server programs is done.
Section 4 describes how the Java programs work in MATLAB; subsection 4.1 explains how MATLAB programs are synchronized over the network. In section 5, we present how the wheelchair is operated over the network, including a short example of the information flow when some Java classes are used. Section 6 describes the results obtained in the tests and the approach used. In section 8 we discuss, rather freely, what may be needed in the future and directions the MICA project might take.

2 Presentation of the MICA Wheelchair Platform

The work has been done using a high-tech wheelchair from a local company. We have equipped it with many different sensors and other hardware devices, see Figure 1. The figure shows the wheelchair, the embedded PC and some of the sensors connected to it. Each sensor acts as an independent module that can be added or removed; to use a new sensor we only need to supply electrical power and interface it to the embedded PC. On the embedded PC we then start a server program (software module) that communicates with the sensor device. The wheelchair is equipped with a CAN (Controller Area Network) bus. On the CAN


Figure 1: The MICA wheelchair. The tilted rectangular unit above the seat is a range scanning laser (A). A GPS antenna is mounted on the top (C), near the LazerWay [6] system. The wireless LAN access point is placed on the seat (B). The rectangular unit seen under the seat, to the left of (E), is an inclinometer; next to the inclinometer a rate gyro is strapdown mounted. At position (D) we have the PC104.

bus, small messages are sent and read by the CAN nodes. The communication protocol that runs on the bus was specially made for the wheelchair and its hardware. The wheelchair has a manual control unit with setup facilities and a joystick. With the joystick it is possible for an able user to control the wheelchair with high precision. The vehicle comes with incremental encoders mounted on the shaft of each front wheel; they are used for feedback to the main microcontroller unit.

2.1 The PC104, an Embedded PC Built on the PC104 Stack Architecture

In this paper the embedded PC is called PC104, because it is a PC (Personal Computer) built using the PC104 stack architecture. The motherboard is a Pentium II class system and runs Linux kernel 2.4.20 with the RTAI² real-time patch.

RTAI, Real Time Application Interface, http://www.rtai.org


PAPER A

On the wheelchair several sensor devices are mounted, which are connected to the embedded PC or the CAN bus. Most of the equipment and sensor devices that are connected to the PC104 operates through serial port interfaces. To each sensor there exists a program that acts both as a Internet server and as a program that polls the sensor hardware for measurement and stores the information in a local database. EMBEDDED PC COMPUTER, PC104 LAPTOP

THREAD 5 COMMAND

NETWORK

SERVER

CLIENT 1

THREAD 3 COMMAND SERVER THREAD 2 HARDWARE DEVICE

DEVICE THREAD WITH DATABASE

NETWORK

WORKSTATION

THREAD 1 CONNECTION THREAD

CLIENT 2

THREAD 4 STREAMING DATABASE

NETWORK

DATA SERVER

CLIENT PDA

CLIENT 3

Figure 2: An example of a device server that runs on the PC104. The clients are arbitrary; one can run in MATLAB and another in a Web browser. The hardware device measurements are logged to the database and controlled using thread 2. Thread 1 is created first and waits for clients to connect. For each connected client a new thread is created, e.g. threads 3 and 5. A streaming thread (thread 4) is created by thread 1 after a request from client 2 and is used to send measurements continuously to the client. Client 3 is just about to connect to the server. A command thread executes commands sent by the connected client and sends the result back to the client. The communication is bidirectional. When a client disconnects the corresponding thread on the PC104 terminates automatically.

3 Software Description

The device servers were programmed in C++ and compiled in the Linux GCC environment. C++ was chosen on the server side because it gives better hardware access and allows reuse of code from previous work, where the code is based on C++ and the RTAI (Real Time Application Interface) version of Linux. Most of the hardware is connected through serial ports. The current server modules run as separate programs and are started separately.


Figure 3: System architecture overview. The figure illustrates what a computing network near the wheelchair can look like. From an end user point of view, the small hand-held pocket PC can be used to help a disabled person handle the wheelchair in difficult situations. The other clients can be research computers with plenty of computing power, used for instance for path planning. One client computer can belong to a remote operator at an emergency centre or a maintenance company, or indeed to any user on the Internet.

Java servers may be written and compiled with the GNU Java compiler (GCJ, http://gcc.gnu.org/java/) to generate pure executable machine code. Java is, however, not the first choice when the software needs low-level hardware access. Java does have support for serial port communication through the Java Communications API (http://java.sun.com/products/javacomm/), so most of the sensors could be accessed with it. For pure hardware access, such as IO pins and AD/DA cards, and for realtime performance, standard Java offers no support; therefore C and C++ are the better choices on the server side. There are solutions for realtime Java, like JTime [10], the reference implementation of the Real-Time Specification for Java, which offers a C/C++/Java cross-platform development environment. On the client side the language is Java. As Ewerlid [1] mentioned, Java has a lot of advantages, possibilities and features:

- Java has built-in generic network support that makes networked communication easy.

- Java has support for multi-threaded programs, which makes it possible to create advanced programs.

- Java is a rather simple language and comes with built-in garbage collection, so memory leaks are not a big problem. The language is taught to undergraduate students, who can therefore start immediately to create programs in Java.

- Java is a portable language, which makes it possible to run the programs on different types of computers.

- Java is a good language especially on the client side when the programs need a graphical interface, due to the rich support for graphical user interface (GUI) components.

- Java programs can be compiled to executable programs using an ahead-of-time compiler such as the GNU gcj front end to the GNU compiler collection.

The server software talks with the Java clients using TCP/IP sockets over wireless LAN.

3.1 Client-Server Communication

The communication between the programs and processes is done via a computer network using sockets and TCP/IP (Transmission Control Protocol/Internet Protocol), a form of peer-to-peer client-server communication. The data transfer over the network is done using pure readable ASCII text (American Standard Code for Information Interchange); by this choice the transferred data can easily be analyzed and debugged by a human. If multiple lines are sent, the transfer starts with a BEGIN command and ends with an END command. The received strings are parsed and put into a Java object. The data can also be sent in a binary format. This is more efficient with respect to time and bandwidth, but it makes the information harder to debug. The binary transfer was introduced as an option to transfer laser scans faster to the laser clients, due to the large amount of data that needs to be sent. It also reduced the CPU usage considerably, since the software does not need to convert integer numbers to ASCII text.
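The BEGIN/END framing can be sketched in Java roughly as follows. Only the framing idea is taken from the text; the exact keywords, class name and error handling of the MICA servers are assumptions made for illustration.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Minimal reader for a multi-line ASCII reply framed by BEGIN ... END.
public class AsciiFrameReader {

    // Reads one framed message and returns the payload lines without the markers.
    public static List<String> readFrame(BufferedReader in) throws IOException {
        List<String> payload = new ArrayList<>();
        String line = in.readLine();
        if (line == null || !line.trim().equals("BEGIN")) {
            throw new IOException("expected BEGIN, got: " + line);
        }
        while ((line = in.readLine()) != null) {
            if (line.trim().equals("END")) {
                return payload;            // frame complete
            }
            payload.add(line);             // ordinary payload line
        }
        throw new IOException("stream ended before END");
    }
}
```

The same reader works whether the lines arrive from a socket or, as in a test, from an in-memory string, since it only depends on a BufferedReader.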

3.2 Description of the Software on the Server Side

The servers are written in C and C++ and all of them have the same structure. The idea has been to make the code portable between real time Linux and normal Linux. The structure is such that the actual scheduling and frequency of a thread is handled by a central node. This makes it easier to change the operating system, to for instance QNX, since only the scheduling node has to be ported. Mutual exclusion of data is done using a semaphore class. Only this class has to be updated with a new semaphore structure to work on a new realtime OS. Each server module or program runs several threads. At the beginning only two threads are active: one thread communicates with the sensor or the actual hardware, and one waits for clients to connect. The sensor thread collects measurements from a sensor and puts them in a database. All measurements get an increasing index and are time stamped with the current system time of the PC104. When the measurements from all the sensors are time stamped in the same way, it is possible to do interesting sensor fusion of the information provided. The exchange of information between the threads is protected using a semaphore class based on POSIX 1003.1b semaphores (http://standards.ieee.org/regauth/posix/). Our semaphore class contains some facilities to debug the use of the semaphore object.

Another thread on the server side waits for clients to connect on a specific socket port. As a new client connects, yet another thread is created to serve that client's requests. In this way many clients can connect to the same server, and the server can serve them all. For example, in the CAN server, a connected client can put itself as a master; commands from other clients will then be ignored. This option is implemented for safety reasons, so that the automatic control can be taken over by a human with a remote control unit.
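The master-client rule described above can be sketched as a small piece of shared server state. The class and method names below are illustrative, not the actual MICA C++ code:

```java
// Sketch of the "master client" rule: once a client has claimed master
// status, commands from all other clients are ignored until it releases.
public class MasterLock {
    private Integer masterId = null;   // id of the current master client, if any

    // A client tries to claim master status; the first claim wins.
    public synchronized boolean claimMaster(int clientId) {
        if (masterId == null || masterId == clientId) {
            masterId = clientId;
            return true;
        }
        return false;
    }

    // A command is accepted if no master is set, or it comes from the master.
    public synchronized boolean acceptCommand(int clientId) {
        return masterId == null || masterId == clientId;
    }

    // Called when a client disconnects; releases the lock if it was master.
    public synchronized void release(int clientId) {
        if (masterId != null && masterId == clientId) {
            masterId = null;
        }
    }
}
```

Each per-client thread would consult such an object before forwarding a driving command to the CAN bus, which is what lets a human with a remote control unit override automatic control.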

3.3 The Java Clients

All the clients are written in Java. When a program is written in Java it is very easy to use the code in almost any environment or modern computer architecture. The data returned to a client is encapsulated in an object that holds an index with incremental values used to identify the measurement, the time when the measurement was made, and the actual measurement. In previous work the clients were written with the MEX-file support in MATLAB. With that approach the clients were locked to the MATLAB environment, and the code was therefore more tightly bound to the computer architecture.
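A minimal sketch of such a measurement object is given below; the field names are assumptions for illustration, but the three parts (incremental index, PC104 time stamp, and the measurement itself) are exactly those described above:

```java
// One time-stamped sample as returned by a MICA device client (sketch).
public class Measurement {
    public final long index;       // incremental identifier of the sample
    public final double time;      // PC104 system time when it was measured
    public final double[] data;    // the actual measurement values

    public Measurement(long index, double time, double[] data) {
        this.index = index;
        this.time = time;
        this.data = data;
    }
}
```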

4 JAVA in the MATLAB Environment

One nice property when Java objects are called in MATLAB is the possibility to return matrices of both integer and double type. For example, a double array return type results in a matrix in MATLAB. In the same way, if a Java function is called with a MATLAB matrix as an argument, it arrives as a two dimensional double array in Java. The Java classes are controlled using M-files in MATLAB. Functions to instantiate a class and to call the functions inside it are efficiently implemented using M-files.
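The MATLAB/Java boundary described above can be illustrated with a small class. The class and method names are hypothetical; the point is only that a double[][] return value appears as an ordinary matrix in MATLAB, and a MATLAB matrix argument arrives as double[][]:

```java
// Illustration of data exchange across the MATLAB/Java boundary.
public class MatrixBridge {
    // Returns an n x n identity matrix; in MATLAB this appears as eye(n).
    public double[][] identity(int n) {
        double[][] m = new double[n][n];
        for (int i = 0; i < n; i++) {
            m[i][i] = 1.0;
        }
        return m;
    }

    // Sums all elements of a matrix passed in from MATLAB.
    public double sum(double[][] m) {
        double s = 0.0;
        for (double[] row : m) {
            for (double v : row) {
                s += v;
            }
        }
        return s;
    }
}
```

In MATLAB the calls would look like oBridge = MatrixBridge; M = oBridge.identity(3); s = oBridge.sum(M);.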

4.1 MATLAB Client Synchronization Over The Network

One way to synchronize the MATLAB clients over the network would be to use network semaphores with a server maintaining the semaphore status. We could also use some sort of memory with shared variables that can be accessed mutually by MATLAB clients. The solution selected, however, was to implement a network pipe with the functions put and get. With this it is possible to send a matrix, vector or scalar from one MATLAB client to another. One client starts the server and waits for a client to connect; the pipe-client and the pipe-server act as a pair, and information can only be sent one way. If we want to send information both ways we need a pipe-server and a pipe-client on each side, see Figure 4. In the figure two MATLAB programs are running, and they exchange information through two data pipes.
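A one-way pipe of this kind can be sketched in Java as below. The class names, the wire format (an int length prefix followed by doubles) and the port handling are assumptions for illustration, not the actual MICA implementation; only the put/get pairing and the one-way property come from the text.

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

// One-way network pipe: a PipeClient put()s vectors, the paired
// PipeServer get()s them; a bidirectional link needs one pair each way.
public class Pipes {

    public static class PipeServer implements AutoCloseable {
        private final ServerSocket server;
        private Socket peer;
        private DataInputStream in;

        public PipeServer(int port) throws IOException {
            server = new ServerSocket(port);   // port 0 picks a free port
        }

        public int port() {
            return server.getLocalPort();
        }

        // Blocks until the paired pipe-client has connected.
        public void accept() throws IOException {
            peer = server.accept();
            in = new DataInputStream(peer.getInputStream());
        }

        // Receives one vector: an int length followed by that many doubles.
        public double[] get() throws IOException {
            int n = in.readInt();
            double[] v = new double[n];
            for (int i = 0; i < n; i++) {
                v[i] = in.readDouble();
            }
            return v;
        }

        public void close() {
            try {
                if (peer != null) peer.close();
                server.close();
            } catch (IOException ignored) { }
        }
    }

    public static class PipeClient implements AutoCloseable {
        private final Socket socket;
        private final DataOutputStream out;

        public PipeClient(String host, int port) throws IOException {
            socket = new Socket(host, port);
            out = new DataOutputStream(socket.getOutputStream());
        }

        // Sends one vector to the paired pipe-server.
        public void put(double[] v) throws IOException {
            out.writeInt(v.length);
            for (double x : v) {
                out.writeDouble(x);
            }
            out.flush();
        }

        public void close() {
            try {
                socket.close();
            } catch (IOException ignored) { }
        }
    }
}
```

From MATLAB such classes would be driven through the usual Java object syntax, with a matrix flattened to a double array on the sending side and reshaped on the receiving side.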



Figure 4: An example where two MATLAB programs are running. They use the network data pipe implemented in Java to exchange matrix information. The data pipe client (object one) sends matrix data to the server (object three). An answer can be sent back using objects two and four. The PC104 streams laser scan data at a set frequency to object five; MATLAB computer one processes that information and sends the result to computer two. From the PC104, odometric data and gyro data are streamed to object six in MATLAB computer two, which runs a control algorithm and sends control commands back to the PC104 through port 9009.

5 Soft Realtime Operations of the Wheelchair from MATLAB

When we talk about robots and robotic systems we want to control them in some way; we can talk about telecontrol and telecommands. A very simple example of telecontrol is to put the steering joystick of the wheelchair on another system, for instance in a MATLAB program or on a Pocket PC computer. This comes with risks as well: the longer the distance to the robot, the riskier it becomes, due to the transport time of the signals. One way to bypass this problem is to implement telecommands. As the development of new algorithms is done in MATLAB, the dynamics of the mobile robot is restricted to slow movements and low speed. This is not a problem as long as we run the system in slow mode. A discrete time period of 0.5 seconds has been used in some MATLAB implementations. Most of that time is consumed in the polling process of new measurements over the network. The MICA wheelchair has support for telecommands. Telecommands have been implemented by, for instance, Högström [4, 2, 5]. Here is a list of commands found in the MICA system:

- Set the speed of the wheelchair

- Turn the wheelchair

- Set turn rate with use of the rate gyro

- Set the heading

- Calibrate the inertial navigation system

- Take a laser scan

- Drive the vehicle a given distance with a specified velocity and heading

- Follow a path segment of length X with given coordinates, velocity and heading

- Stop the wheelchair or emergency-brake

- Add, change, remove, and execute way-points

- Run down the corridor

- Follow a wall

- Explore the environment

A CAN client has been implemented in Java; it has some driving commands that can be used to remote control the wheelchair. It uses a special CAN protocol called Bore-Can to send CAN messages on the wheelchair.

5.0.1 An Example of Remote Wheelchair Operations

Below follows an example of MATLAB code for remote operation and data collection. The example code is written with a special notation to indicate the return value or type of a variable, see Table 1.1.

1 >> oBore=BoreCan
2 >> oBore.bConnect
3 >> oBore.bDriveDirect(0.21,15*pi/180)
4 >> oControl=Control
5 >> oControl.bConnect
6 >> oControl.rSetHeading(90*pi/180)
7 >> oControl.rSetVelocity(0.2)

Bore is taken from Boden Rehab, the company that designed the wheelchair. Bore-Can is the special protocol for CAN messages on the embedded CAN bus in the wheelchair.

Notation  Explanation  Example          Example explanation
a         Array        aoScan           An array of laser scan objects
b         Boolean      oBore.bConnect   A method in the oBore object that has a Boolean return type
o         Object       oLaser           A Java instance of the laser client
r         Real         arVector         The variable is an array of Double values

Table 1.1: Notation used in the example code, Section 5.0.1.



8  >> oEncoder=Encoder
9  >> oEncoder.bConnect
10 >> rTime=oEncoder.rGetSystemTime
11 >> aoEncObj=oEncoder.aoGetEncoderByTime(rTime-10,10)
12 >> oLaser=Laser
13 >> oLaser.bConnect
14 >> aoScan=oLaser.aoGetScanByTime(rTime-10,10)
15 >> oGps=Gps
16 >> oGps.bConnect
17 >> aoGpsPos=oGps.aoGetNavUsrByTime(rTime-10,10)
18 >> oGyro=Gyro
19 >> oGyro.bConnect
20 >> aoRate=oGyro.aoGetRateByTime(rTime-10,10)
21 >> % plotting and analyzing of data,
   >> % calculate waypoint data from the measurements.
   >> % The waypoint has the desired path,
   >> % velocity and headings
   >> % until the end point is reached.
   >> % Create a waypoint instance called oWay
   >> % and store the needed information in oWay.
   >> oWay = WayPoint(heading, velocity, path, limits);
22 >> nWay=oControl.nAddWayPoint(oWay)
23 >> oControl.nRun(nWay)
24 >> oGyro.disconnect;oGps.disconnect;
25 >> oLaser.disconnect;oEncoder.disconnect;
26 >> oBore.disconnect
27 >> oControl.disconnect

On line 1 we create an object from the BoreCan class. On line 2 we call the function that connects to the PC104 on the wheelchair; it has a Boolean return type that indicates whether the connection succeeded. On success we have a TCP/IP connection to the CAN server, and in the server a new thread has been created to serve the requests and commands from the client object. On line 3 we give a command to drive the wheelchair forward with a speed of 0.21 m/s and an angular velocity, or turn-rate, of 15 degrees per second, passed to the function in radians per second. On lines 4-7 we use the controller on the PC104 to set the heading and the velocity of the wheelchair. On lines 8-11 we use the encoder class to receive odometric data objects from the wheelchair. The system time is returned to a variable on line 10. On line 11 we poll the server for the odometric data collected during the last 10 seconds. On line 12 we create a range laser client object from the Laser class, and then poll the laser server for the scans collected during the last 10 seconds. On lines 15-17 we create a GPS client object and in the same way poll the GPS server for the GPS position measurements collected during the last 10 seconds. On lines 18-20 we create a Gyro client object and poll the gyro server for measurement objects. On line 21 we do the calculation and analysis of the received data. On line 22 we add a new way-point node to the controller. The instance oWay has all the information the controller needs to reach the end position. Both the heading and the velocity during the whole run are represented as fourth order polynomials in the instance. The desired path segment and end position are read from the way-point node. The identifier of the node is found in the variable nWay. On line 23 we tell the controller to run node nWay. On lines 24-27 we disconnect the clients from the servers.

6 Results

It has been shown in this work that the combination of Java and MATLAB is a good environment for fast development and testing of new algorithms in the robotics field. The MATLAB environment gives the user full insight into variable values and extensive plotting facilities, and therefore provides fast debugging of the code. The Java programming language makes it possible to reuse the clients as standalone programs, and to make applets that can run under Web browsers and other applet viewers. The communication between the processes, both clients and servers, uses the network; this makes it possible to distribute information and computing power. During the development of new algorithms we write distributed programs with several clients that use and process data over an Internet connected network. This means that a lot of data is shuffled and sent between nodes over a network with limited bandwidth. If data is sent between threads using the localhost, the bandwidth of the data transfer is set by the operating system. The localhost network has no bottleneck set by Ethernet or Wave-LAN (IEEE802.11b). If programs and threads run on the localhost, the bandwidth is very high in comparison with Wave-LAN, and the data transfer is not affected by time jitter and network performance. On Wave-LAN the communication is very dependent on the environment, because it is based on radio waves and can therefore easily be jammed or suffer disturbances that affect the data transfer.

7 Conclusions

It is possible to use Java as a programming language in the MATLAB environment. A robotic wheelchair has been remote controlled with simple control algorithms written in MATLAB. The MATLAB environment is suitable for extensive algorithms like map building, localization, path finding, wall extraction and obstacle detection. As the MATLAB algorithms start to work properly, they are ported to the Java language for more efficient computing (especially of nested loops). This gives the possibility to keep them in MATLAB, run them on any networked client computer, or put them as a separate program on the PC104. For example, the Hough transform is used to extract walls and lines from laser scans. This algorithm is implemented in MATLAB and polls the SICK laser server for laser scans. The Hough transform code should be ported to Java and then used to write a Hough transform server that runs on the PC104. The program to control the speed and turn rate of the wheelchair should run locally on the PC104. If it is run on a remote client we can get very difficult problems with network lag and disturbances.
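As a sketch of the porting step mentioned above, a minimal Hough transform for extracting the dominant line from a set of 2-D points (for example wall points from a laser scan) could look as follows in Java. The accumulator resolutions and the API are assumptions for illustration, not the MICA implementation:

```java
// Vote in a (theta, rho) accumulator and return the strongest line.
public class HoughLines {
    // Returns {thetaRad, rho} of the line with the most votes.
    public static double[] strongestLine(double[][] pts, int nTheta,
                                         double rhoMax, int nRho) {
        int[][] acc = new int[nTheta][nRho];
        double dTheta = Math.PI / nTheta;
        double dRho = 2.0 * rhoMax / nRho;
        for (double[] p : pts) {
            for (int t = 0; t < nTheta; t++) {
                double theta = t * dTheta;
                double rho = p[0] * Math.cos(theta) + p[1] * Math.sin(theta);
                int r = (int) Math.floor((rho + rhoMax) / dRho);
                if (r >= 0 && r < nRho) {
                    acc[t][r]++;               // vote for this (theta, rho) cell
                }
            }
        }
        int bestT = 0, bestR = 0, bestV = -1;
        for (int t = 0; t < nTheta; t++) {
            for (int r = 0; r < nRho; r++) {
                if (acc[t][r] > bestV) {
                    bestV = acc[t][r];
                    bestT = t;
                    bestR = r;
                }
            }
        }
        return new double[] { bestT * dTheta, (bestR + 0.5) * dRho - rhoMax };
    }
}
```

The double loops over points and accumulator cells are exactly the kind of nested loops that motivate the port from MATLAB to Java.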

8 Discussion and Future Work

Future work includes the implementation of a better GUI in Java that makes it possible to teleoperate the vehicle both with telecommands and by manual control using a joystick. The GUI can then be run in MATLAB



or even in a modern mobile phone or any environment that supports Java. The Player/Stage project [8] is a big project for distributed multiple-robot systems; many researchers around the world contribute to it. It comes with simulators for multiple robots, and a model of the MICA wheelchair should be implemented in Stage, Stage being the simulator and Player being the robotic software. The MICA device servers should be adapted to have a mode so that Player clients can connect to them and collect device measurements. The big difference is that in the Player/Stage project there seems to be one writing thread and one reading thread. In MICA, new algorithms are developed in MATLAB. This makes it easy to share the code with others and to reuse previous MATLAB implementations.

9 Acknowledgments

The MICA project is partly funded by Interreg IIIA Nord. The authors want to thank the robotics project students of 2003 for helping to implement and use the ideas of the client-server Java-MATLAB approach in research.

References

[1] Ove Ewerlid, Claes Tidestad, and Mikael Sternad. Realtime control using Matlab and Java. Nordic MATLAB Conference, 1997.

[2] Johan Forsberg, Ulf Larsson, and Åke Wernersson. Tele-commands for mobile robot navigation using range measurements. unknown, 1998.

[3] G. Quaranta and P. Mantegazza. Using Matlab-Simulink RTW to build real time control applications in user space with RTAI-LXRT. http://www.rtai.org, 2001.

[4] Tomas Högström, Jonas Nygårds, Johan Forsberg, and Åke Wernersson. Telecommands for remotely operated vehicles. IFAC, Intelligent Autonomous Vehicles, 1995.

[5] Åke Wernersson, Mats Blomquist, Jonas Nygårds, and Tomas Högström. Telecommands for semiautonomous operations. In Proc. Telemanipulator and Telepresence Technologies, volume 2351, pages 2-12. SPIE, 1995.

[6] LazerWay. http://www.lazerway.com, Jan 2004.

[7] Erwin Prassler, Jens Scholz, and Paolo Fiorini. An autonomous vehicle for people with motor disabilities. IEEE Rob. Automat. Mag., 7:38-45, 2001.

[8] The Player/Stage project. http://playerstage.sourceforge.net/index.html, 2003.

[9] Asmo Soinio. A Lego-robot with camera controlled by Matlab. http://www.abo.fi/fak/ktf/rt/robot/, 2003.

[10] TimeSys. JTime, Java technology for embedded and real-time development. http://www.timesys.com/index.cfm?bdy=java_bdy.cfm, Jan 2004.


PAPER B

Remote CAN Operations in MATLAB using a TCP/IP Client/Server Solution over IEEE802.11b

Authors: Sven Rönnbäck, Kalevi Hyyppä, Åke Wernersson
To be published


Remote CAN Operations in Matlab using a TCP/IP Client/Server Solution over IEEE802.11b
Sven Rönnbäck, Kalevi Hyyppä and Åke Wernersson
EISLAB, Luleå University of Technology
{Sven.Ronnback,Kalevi.Hyyppa,Ake.Wernersson}@ltu.se

Abstract

This paper describes the implementation of a CAN server that acts as a CAN tool for a client. It can be used to monitor, observe and send messages to a distant CAN network over IEEE802.11b (Wave-LAN). The CAN server is controlled by one or several clients that connect to it by TCP/IP. CAN bus messages can be read and sent over Wave-LAN from the MATLAB environment. The CAN server collects CAN messages and stores them in a ring buffer. The messages in the ring buffer are classified by their identifier and stored in a database. The TCP/IP client/server is used to remotely control and monitor the CAN bus on the MICA wheelchair. The CAN tool has been used in a demonstrative application example that consists of a remotely controlled wheelchair. In the example the wheelchair was programmed to run in a square. The positions obtained from the CAN messages are compared with the positions from the navigation system onboard the wheelchair.

1 Introduction

In the early 1980's the Robert Bosch GmbH[3] company invented the Controller Area Network (CAN) to meet real time transfer requirements in the automobile industry. It can operate at up to 1 Mbit/s and has good error detection. CAN is a distributed network with no central unit and is flexible in size. It is a good design for nodes that send information in bursts. Messages sent by one node are broadcast to all other nodes. It can be used in real time tasks, since an identifier with a lower number automatically gets higher priority on the bus. CAN is often used to connect micro-controllers over a simple network. CAN messages are short, with an 11- or 29-bit identifier and a payload of at most eight data bytes. If a collision is detected by a node during the send process, the node loses the arbitration and tries to send the message again after a delay. All nodes acknowledge frames by a flag, and any node flags if a transmit error occurs. In the MICA (Mobile Internet Connected Assistant) project we have a robotic wheelchair. The robotic wheelchair is equipped with a CAN bus. We want to implement navigation algorithms in MATLAB using camera modules and other CAN nodes on the wheelchair as sensors. A camera module will report distance and angle to beacons coded in a CAN message. To make sensor information available in MATLAB we have programmed a client/server solution that can handle CAN information over TCP/IP. We operate the CAN bus from the MATLAB


environment through a CAN Java client, using a TCP/IP client/server solution that works over Wave-LAN. MATLAB comes with a Java Virtual Machine (JVM) that can execute Java byte code. We make use of this: since the CAN client software is written in Java, the CAN bus is functionally available in any MATLAB program. There exist lots of different software and hardware solutions to poll and send information to the CAN bus from a PC (Personal Computer). The CANbus toolset can, together with the appropriate hardware, be used to create an interface between MATLAB and the CAN bus[10]. This hardware has no Wave-LAN support, so it is difficult to remotely analyze CAN messages from a moving vehicle. This product therefore provides no solution for our project. TCP/IP (Transmission Control Protocol/Internet Protocol) is peer-to-peer communication. TCP/IP does not have a deterministic delivery time. Theoretically, the delivery time of a message can be unbounded if there is a collision at each attempt to send the message, so Wave-LAN under heavy traffic is a risky solution for a real time network. If a collision is detected, a Wave-LAN node backs off and tries to send the package after a short period of time[8, 14, 13, 2]. IEEE802.11 (Wave-LAN) is somewhat unreliable when terminals or nodes lose and re-acquire line of sight very suddenly[2]. When a WLAN node has a packet to be transmitted, it first listens to ensure that no other WLAN node is transmitting. It transmits the packet if the channel is clear; otherwise it backs off and randomly selects the amount of time it must wait until it retries to transmit the packet. The back-off factor is selected in such a way that the probability of two nodes getting the same factor is low. Collision detection, as in Ethernet, cannot be used for Wave-LAN, since a node is deaf while it transmits data. A Wave-LAN node first sends a request-to-send (RTS) packet. The RTS frame contains the length of the data and the destination. If the receiving node hears it, a clear-to-send (CTS) packet is sent back to the node. If the CTS frame is not received, the sender assumes a collision and sends the RTS frame again. When the first node receives the CTS packet it transmits its data packet. When a packet is received successfully, the receiving node sends back an acknowledgment (ACK) packet. TCP on Wave-LAN guarantees the delivery of all data, but not the delay or the rate of delivery[5]. Carnegie Mellon University (CMU) has a project named ROSES (Robust Self-Configuring Embedded Systems) that seeks methods for flexible, robust systems with built-in graceful degradation that improves the operational availability of an embedded system. Within this project they built a system that remotely reads CAN data from an automobile[12]. Our CAN server works much like the one in the ROSES project. It gives us the possibility to log and analyze CAN data online. The paper is organized as follows. In section 1.1 the wheelchair and some of its hardware are presented. Section 2.1 describes the basics of the CAN server. Section 2.2 presents the CAN client. In sections 3 and 4 we have some results and discussion.
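The CAN frame properties described in the introduction, an 11- or 29-bit identifier, at most eight data bytes, and arbitration priority for lower identifiers, can be sketched as a small Java class. The class is illustrative only; it is not the representation used in the MICA CAN server:

```java
// A CAN frame: short payload, identifier doubles as bus priority.
public class CanFrame implements Comparable<CanFrame> {
    public final int id;        // 11-bit (standard) or 29-bit (extended) identifier
    public final byte[] data;   // payload, at most eight bytes

    public CanFrame(int id, byte[] data) {
        if (data.length > 8) {
            throw new IllegalArgumentException("CAN payload is at most 8 bytes");
        }
        this.id = id;
        this.data = data;
    }

    // Lower identifier first, mirroring bus arbitration priority.
    public int compareTo(CanFrame other) {
        return Integer.compare(this.id, other.id);
    }
}
```

With this ordering, a java.util.PriorityQueue of pending frames would always release the highest-priority (lowest-identifier) frame first, which mirrors how arbitration resolves simultaneous senders on the bus.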


Figure 1: The MICA wheelchair. The tilted rectangular unit above the seat is a range-scanning laser (A). The flash camera system is right below the LazerWay[7, 4] navigation system on the top (C). The wireless LAN access point is placed on the seat (B). The box under the seat, to the left of (E), is an inclinometer. Next to the inclinometer a fiber optic rate gyro is strapdown mounted. The embedded PC computer (PC104), with the CAN interface, is mounted at position (D).

1.1 The wheelchair and its CAN bus

The MICA wheelchair shown in Figure 1 is a research platform equipped with a CAN bus and an embedded PC. The CAN bus operates at 250 Kbit/s and is the common communication link for the different modules, such as the joystick and the main micro-controller. The wheelchair has a manual control unit with setup facilities. The joystick makes it possible for an able user to control the wheelchair with high precision. The vehicle has incremental encoders mounted on the shaft of each front wheel. They are used as feedback to the micro-controller unit. The micro-controller controls the translational and rotational velocities of the wheelchair. It listens to the CAN bus for driving commands and broadcasts encoder information.

MICA wheelchair Web page, http://rullstol.sm.luth.se, 2003


The embedded PC (Personal Computer) is called the PC104 since it is built on the PC104 stack architecture. The mother board is a Pentium II class system that runs Linux kernel 2.4.20 with the RTAI patch. The PC104 operates at 266 MHz with 64 MB of memory and has a hard disk with Red Hat Linux installed on it. The hard disk gives the possibility to develop software and log data locally. The camera sensor modules send out infrared flashes and detect reflective tapes after image processing, which is done in custom built hardware. This produces estimates of angle and distance to reflectors, which are streamed out on the CAN bus[9]. Each sensor is an independent module that can be added to or removed from the system, Figure 2. We need electrical power and an interface to be able to use it. On the PC104 we start a program that communicates with the sensor device and acts as a server. In this article we discuss the CAN client/server solution. On the wheelchair, 50-600 CAN messages per second are collected by the CAN server. Most of them are time ticks, drive commands, flash camera estimates, LazerWay[1, 7] navigation information, messages with incremental encoder information, and messages from the manual control unit.

2 The CAN server

When the server program starts, it immediately starts to collect messages from the CAN hardware and store them in a database, Figure 3. CAN messages are added as long as the PC104 still has unallocated memory and the maximum size of the database has not been reached. When the maximum size is reached, the oldest information is released and replaced by new. This is done with a ring buffer: when the maximum size of the ring buffer is reached, it wraps around and starts from the beginning. With this system we can easily do our experiments, store all measurements in the database, and after a test recall the data into the MATLAB environment. We need to know the start time and end time of the test, or the indexes, to recall the data needed. If we want to be sure that our measurements will be kept in the database, we can stop the collection of new data.
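The wrap-around behaviour described above can be sketched with a small generic ring buffer. The class below is illustrative only; the actual MICA server is written in C++:

```java
// Fixed-capacity ring buffer: once full, the oldest entry is overwritten.
public class RingBuffer<T> {
    private final Object[] slots;
    private int next = 0;      // index where the next element is written
    private int count = 0;     // number of stored elements, up to capacity

    public RingBuffer(int capacity) {
        slots = new Object[capacity];
    }

    // Adds a message; once full, the buffer wraps and overwrites the oldest.
    public void add(T item) {
        slots[next] = item;
        next = (next + 1) % slots.length;
        if (count < slots.length) {
            count++;
        }
    }

    public int size() {
        return count;
    }

    // Returns the i-th oldest element still kept in the buffer.
    @SuppressWarnings("unchecked")
    public T get(int i) {
        if (i < 0 || i >= count) {
            throw new IndexOutOfBoundsException();
        }
        int oldest = (count == slots.length) ? next : 0;
        return (T) slots[(oldest + i) % slots.length];
    }
}
```

The same structure, with one buffer (or list) per CAN identifier, gives the classified database described in the next section.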

2.1 The CAN Server Threads

The server runs several threads. In the beginning three threads are running, Figure 4. One thread communicates with the CAN232 interface, see Figure 6 and [15]. The polling thread reads messages from the CAN bus and puts them into a database, Figure 5. After each poll the status flag is read to check for errors. In the database, Figure 5, the incoming messages are classified with respect to their identifier. For every new identifier a new list is created and added to the database. Each list holds a set of messages with a maximum length of 10000 messages by default. Encoder information is streamed onto the bus 40 times per second, so the list with encoder values will be full after

2 PC104, http://www.pc104.org, 2003
3 RTAI, Real Time Application Interface, http://www.rtai.org, 2003
4 Red Hat, http://www.redhat.com, 2003

MICA, WHEEL CHAIR

Figure 2: Two powerful electrical motors drive the wheelchair. Using a joystick the operator can control the wheelchair. Commands from the manual control unit (joystick) are sent over the CAN bus to the micro-controller, which executes them and controls the motors on the wheelchair. On the shaft of each motor an incremental encoder is mounted. The encoders are read by the micro-controller and odometric information is streamed to the CAN bus. The cameras process the images and stream out the estimated angle and distance to reflective beacons.

about 250 seconds; however, the maximum list length can be set to an arbitrary number during operation. CAN messages older than 15 minutes are automatically deleted. If there exist four unique CAN identifiers there exist four different CAN message lists, as shown in Figure 5. It is also possible to create lists that trigger on the identifier using simple Boolean algebra with the code and mask registers. The server software keeps track of encountered errors and the number of sent and received messages. A connected client can register itself as master; commands from other clients will then be ignored. This option is implemented for safety reasons, so that a human can take over the automatic control with a remote control unit.

PAPER B

Figure 3: The CAN server on the PC104. The CAN messages are collected and stored in the database by thread 2. Thread 1 is created first and waits for clients to connect. For each connected client a new thread is created, e.g. threads 4 and 6. A streaming thread (thread 5) is created by thread 1 after a request from client 2 and is used to send incoming CAN messages continuously to the client. Client 3 is just about to connect to the server. A command thread executes commands sent by the connected client and sends the result back to the client; the communication is bidirectional. When a client disconnects the corresponding thread on the PC104 terminates automatically.

The server also has a message transfer database. CAN messages that the clients want to send can be placed in a list. The client sets the start time for when the message should be sent the first time, its duration and its repetition frequency. The message stays in this database for as long as the duration option specifies, or until it is deleted by a command. A fourth thread checks for connecting clients. When a connection is made a new thread is created to serve the requests from that client. There is no upper limit on the number of clients that can connect, except for the limit set by the amount of memory.
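The transfer database semantics (start time, duration, repetition frequency) can be sketched as follows. The class and method names are illustrative, not the server's actual types, and a String again stands in for a CAN message.

```java
import java.util.ArrayList;
import java.util.List;

// Transmit-database sketch: each entry has a start time, a duration and a
// repetition period; due(t) returns the messages to put on the CAN bus at time t.
public class TransmitDb {
    static class Entry {
        final String msg; final double start, duration, period;
        double nextSend;
        Entry(String msg, double start, double duration, double period) {
            this.msg = msg; this.start = start; this.duration = duration;
            this.period = period; this.nextSend = start;
        }
    }
    private final List<Entry> entries = new ArrayList<>();

    public void schedule(String msg, double start, double duration, double period) {
        entries.add(new Entry(msg, start, duration, period));
    }

    /** Messages due at time t; entries past their duration are dropped. */
    public List<String> due(double t) {
        List<String> out = new ArrayList<>();
        entries.removeIf(e -> t > e.start + e.duration);
        for (Entry e : entries)
            while (t >= e.nextSend) { out.add(e.msg); e.nextSend += e.period; }
        return out;
    }
}
```

Calling due() from a periodic transmit thread reproduces the behaviour described above: a message scheduled for 5.0 seconds at 0.1 s intervals is repeated until its duration expires or it is removed.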

2.2 The CAN client

Our CAN client is written in Java. We decided that MATLAB is a good environment in which to develop software and algorithms. MATLAB is not a fast-executing language, since it interprets the program lines; however, using MATLAB speeds up the development cycle (planning, coding, testing and analyzing). Sensor information is directly available over Wave-LAN because the server in the PC104 buffers measurements. In MATLAB we use the Java client to connect to the CAN server, since Java can run in that environment.

Figure 4: In the server three threads run from the beginning. One waits for clients to connect. One polls messages from the CAN bus. One checks the transmit database for messages to be sent.

Figure 5: View of the linked-list structure used in the CAN server database. This example has a head list that holds four different CAN identifiers. The first identifier has four stored messages in its tail. At the bottom the ring buffer, which keeps a copy of the latest messages received, is shown.

Java code is a portable language and

Figure 6: The CAN bus interface used is a dongle called CAN232. It operates through a standard serial interface (RS232) with a maximum transfer rate of 230400 baud and a CAN bus speed of 1 Mbit/s. The dongle operates in either ASCII or binary mode and time stamps the messages when they are read.

it is very easy to reuse the code in other applications and environments. The client instance has a number of methods that can be used to manipulate, poll and send messages to the CAN server. The connect method makes a TCP/IP socket connection to the CAN server. When the connection is made, a new thread is created on the server to serve the requests from the client.

2.2.1 A MATLAB script that polls CAN messages from the CAN server

An illustrative MATLAB script that dumps CAN messages to the screen:

 1  >> oCan=CanClient;
 2  >> oCan.bConnect('rullstol.sm.luth.se');
    % Sequential poll,
    % 1000 requests are sent to the server
 3  >> nIndex=oCan.nGetIndex;
 4  >> for n=1:1000
 5  >>   oCanMsg=oCan.oGet(nIndex-1000+n);
 6  >>   oCanMsg.toText
 7  >> end;
    % One request that asks for the last 1000 CAN messages
 8  >> aoCanMsg=oCan.aoGet(nIndex-1000,1000);
 9  >> oCanMsgs=CanMsgs(aoCanMsg);
10  >> oCanMsgs.toText
11  >> oCan.disconnect;

The MATLAB script opens a TCP/IP connection to the server on line 2. On line 3 it sends a request to the server for the current cyclic buffer index. Lines 4-7 poll 1000 CAN messages one at a time and print them as text on the screen. Line 8 asks the server, in a single request, for an array of 1000 CAN messages. Line 9 encapsulates the CAN message array in an object. Line 10 calls the toText method, which dumps all 1000 CAN messages as text to the screen. Line 11 disconnects the client.

In the two examples above we polled CAN messages from the ring buffer by using the ring buffer index. It is also possible to request CAN messages by their identifier or by their time stamp. In the development of new algorithms that use the CAN interface it can be very useful to get statistics about the CAN server; the statistics are directly available through a method in the CAN client. With the CAN client it is possible to drive the wheelchair and get the odometric and camera sensor information that we need for data fusion. The LazerWay navigation system is also accessible through the CAN interface, so we can compare our navigation algorithms with a known working system.

2.2.2 A MATLAB script that sends a CAN message to the CAN server in two different ways

To send a CAN message in MATLAB we create a message instance and send it using the CAN client. An illustrative example of how this can be done follows below:

1  >> oCan=CanClient('rullstol.sm.luth.se');
2  >> oCanMsg=CanMsg(513,[34 12 01 255]);
3  >> oCan.bSend(oCanMsg);  % Sends the CAN msg, oCanMsg
4  >> oCan.bSend(oCanMsg,5.0,0.1);
5  >> oCan.disconnect;

Line 1 connects the CAN client to the CAN server on the wheelchair. A CAN message with the identifier 0x201 and the data field [0x22 0x0C 0x01 0xFF] is created on line 2. On line 3 the CAN message is sent over Wave-LAN to the wheelchair using the bSend method, which blocks and returns a Boolean result of the operation. On line 4 we send the message using the send database in the server by adding two extra arguments: they say that we want to send the message for 5.0 seconds, every 0.1 second. Line 5 disconnects the client from the CAN server.

2.2.3 Some CAN Server Commands

We have implemented functions and commands that make it possible to operate the CAN server over TCP/IP. All messages received by the server get a unique, increasing index and are time stamped using the system clock; when 100 messages have been received the ring buffer index is 100. Some of the implemented CAN functions are listed below:

Get system time - Returns the system time on the server. It is taken from the clock and used to time stamp messages.

Get database status - Returns information about the number of collected and transmitted messages, the number of different message types with respect to identifier, and the number of messages for each identifier.

Search for ring buffer index using time - Searches for the ring buffer index of a message stored in the ring buffer, given a time.

Send immediately - Sends a message directly to the CAN bus without using the transfer database.


Get status - Returns the client status and statistics about the CAN server: the number of connected clients, the number of errors encountered, the number of messages received and sent, and the number of messages in the receive and transmit databases. It also sends a list of the different messages sorted with respect to CAN identifier.

Get message by index - The client asks the server for a message with a specific index. If this index is bigger than the current ring buffer index the server blocks until that index is reached. If the requested index is zero the most recent message is sent back to the client.

Get messages by identifier and time - Returns an array of message objects with a specific identifier, between a selected start and end time.

Get message by identifier and list index - Returns the message with the desired index from the list holding the messages with a specific identifier.

Send message through transmit database - Sends messages via the transmit database. It is possible to set the duration of the message, the frequency with which it should be broadcast and the time when the first transmission starts.

Remove message from transmit database - Removes a message from the transmit database.

Get index - Returns the index of the most recent message in the database.

Set code/mask filter - Makes it possible to set both the code and the mask registers in the controller chip. It is used to mask out messages on the CAN bus in hardware.

Set software code/mask filter in the database - All received messages that are not masked away are inserted into a message list. This command is very useful for collecting all messages that belong to the drive unit, such as maneuver commands and messages with wheel encoder information. The wheelchair has two different encoder messages, one with absolute encoder values and one with differential encoder values; both of them can be masked into the same list.
Send message and wait for identifier - Sends a message to the CAN bus and waits for a specific identifier. It is possible to mask the answer with the implemented software filter, which works like the acceptance filter used in the CAN controller, the SJA1000 by Philips. The acceptance filter compares the received identifier with the acceptance filter values and decides whether the message is valid or not. The acceptance filter can be described using Boolean algebra. If the following statement is true the identifier has passed the filter and an 11-bit CAN message is accepted:

NOT (((ID XOR CODE) OR MASK) XOR 0x3FF)

In the list above some commands were mentioned that poll the server for messages or message arrays using the index. It is also possible to use other commands to poll the server for CAN messages between a selected start time and end time. These commands are very useful when experiments are done, since we can poll the server for messages using a negative time value; the server will then give us the CAN messages that were collected during the experiment. A time value less than 100000 is treated as an offset from the current time; this means that 1000 will result in a wait for 1000 seconds.

5 http://www.semiconductors.philips.com/pip/SJA1000.html

2.2.4 CAN Server HTTP Interface

The CAN server also gives the user an opportunity to check the CAN statistics with a WWW browser, using HTML code. It is possible to check both the incoming and the outgoing message queues. The incoming CAN identifiers are listed in ascending order, with the possibility to check the last read messages in an identifier list. Through the WWW page it is easy to check for errors and different message identifiers in the message database. This is very useful during debugging, when we quickly can check the incoming and outgoing databases.
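The code/mask acceptance filtering described in Section 2.2.3 can be sketched in a few lines. Note the hedge: the thesis prints its own Boolean expression for the SJA1000-style filter, while this sketch uses the common convention that a mask bit set to 1 marks an identifier bit as "don't care" and all other bits must match the code register.

```java
// Software CAN acceptance filter in the style of the SJA1000's code/mask
// registers. Assumed convention: mask bit 1 = don't care; every other bit
// of the 11-bit identifier must equal the corresponding code bit.
public class CanAcceptanceFilter {
    private final int code;
    private final int mask;

    public CanAcceptanceFilter(int code, int mask) {
        this.code = code;
        this.mask = mask;
    }

    /** Returns true if the 11-bit identifier passes the filter. */
    public boolean accepts(int id) {
        return ((id ^ code) & ~mask & 0x7FF) == 0;
    }
}
```

For example, code 0x200 with mask 0x00F accepts every identifier in the range 0x200-0x20F, which is how a whole group of drive-unit messages can be routed into one list.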

2.3 CAN Server Database Status as Text

In MATLAB it is convenient to get a view of the different messages in the CAN database. It is therefore possible to call a function that polls the CAN server for database information and returns a Java instance. The instance contains a topic line for each identifier list: the identifier, the current index and the length of the list. For each list we also see the most recently received message, presented as the topic.

2.4 CAN Bridge Over TCP/IP Using CAN Servers

It can be useful to connect physically isolated CAN networks over a network such as Ethernet or IEEE 802.11b [6]. A setup has been made that connects two CAN networks using a TCP/IP client/server solution, Figure 7. A program was written in Java that mutually checks and polls each CAN server for new messages and sends them to the other network. It cannot guarantee real-time behavior, but it gives a system where all messages sent by a CAN bus node in one network appear in the other network. The CAN server works as the link between the CAN bus and the network. The CAN@net from Nohau [11] is a comparable CAN-Ethernet gateway; that product has a program interface that works in the Windows operating system.
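The mutual polling loop of the bridge can be sketched as follows. The Link interface is a stand-in for the thesis' CAN client classes (whose exact Java API is not reproduced here); pollNew() is assumed to return the messages received since the previous call.

```java
// Sketch of the CAN bridge loop between two physically isolated CAN networks.
public class CanBridge {
    interface Link {
        java.util.List<String> pollNew();   // new CAN messages seen on this network
        void forward(String msg);           // transmit a message on this network
    }

    private final Link a, b;
    public CanBridge(Link a, Link b) { this.a = a; this.b = b; }

    /** One bridging round: messages seen on one bus are repeated on the other. */
    public void pump() {
        for (String m : a.pollNew()) b.forward(m);
        for (String m : b.pollNew()) a.forward(m);
    }
}
```

Running pump() in a loop gives the best-effort forwarding described above: every message broadcast by a node on one bus eventually appears on the other bus, with no real-time guarantee.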

3 Results

The example shown in Figure 8 uses the Java CAN client in MATLAB to plot the accumulated position of the vehicle using CAN messages with encoder information. It is an illustrative example showing how easily CAN information can be accessed from MATLAB. In the example a set of telecommands was sent to the wheelchair navigation system that makes the vehicle move in a square. The command sequence is:

- Move forward, stop and turn 90 degrees to the left.
- Move forward, stop and turn 90 degrees to the left.


Figure 7: The "CAN BRIDGE" Java program acts as a bridge between two physically isolated CAN networks. Server 1 has a remote connection to the CAN adapter (CA) using a serial link connection over the Internet and the TCP/IP protocol. This makes it possible to use the CAN232 on a very tiny computer with little memory, while the information is streamed back to the server. With the CAN bridge, microcontrollers (uC) on different buses can broadcast messages to each other.

- Move forward, stop and turn 90 degrees to the left.
- Move forward, stop and turn 90 degrees to the left.

And it is back at the start. The wheelchair used the dead reckoning system to navigate, based on odometric and rate gyro information, and was running for about one minute. The encoder information was broadcast from the central micro-controller in the wheelchair with a frequency of 10 Hz, which gives about 600 CAN messages. The CAN messages with encoder information were polled from the CAN server and run through a loop to calculate the wheelchair position. The incremental encoder information was converted into differential odometric values. A model for a differentially driven vehicle was used in the example. The wheelchair velocity expressed in the vehicle frame at discrete time t_k is:

v(t_k) = ( ω_l(t_k) r_l + ω_r(t_k) r_r ) / 2    (1)

v(t_k): the wheelchair velocity expressed in the wheelchair frame.
r_l: the left wheel radius.
r_r: the right wheel radius.
ω_r(t_k): the measured angular velocity of the right wheel, calculated from CAN messages with encoder information.
ω_l(t_k): the measured angular velocity of the left wheel, calculated from CAN messages with encoder information.
t_k: the time obtained from the CAN message.

The angular velocity, or turn rate ω(t_k), expressed in the navigation frame and calculated from the encoder information becomes:

ω(t_k) = ( ω_r(t_k) r_r − ω_l(t_k) r_l ) / B

where B denotes the distance between the drive wheels.
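The velocity and turn-rate relations above, together with a simple Euler integration of the pose, can be sketched as follows. This is a minimal illustration: the wheel radii and wheel base passed in below are arbitrary example values, not the wheelchair's actual parameters.

```java
// Dead reckoning for a differentially driven vehicle:
//   v = (w_l*r_l + w_r*r_r)/2   -- forward velocity
//   w = (w_r*r_r - w_l*r_l)/B   -- turn rate, B = wheel base
public class DeadReckoning {
    double x, y, heading;                 // pose in the navigation frame
    final double rl, rr, base;            // wheel radii and wheel base [m]

    DeadReckoning(double rl, double rr, double base) {
        this.rl = rl; this.rr = rr; this.base = base;
    }

    /** Integrate one encoder sample: wheel angular velocities [rad/s] over dt [s]. */
    void update(double wl, double wr, double dt) {
        double v = 0.5 * (wl * rl + wr * rr);
        double w = (wr * rr - wl * rl) / base;
        x += v * Math.cos(heading) * dt;
        y += v * Math.sin(heading) * dt;
        heading += w * dt;
    }
}
```

Running update() once per encoder message (10 Hz in the experiment above) accumulates the wheelchair position exactly as the MATLAB loop does.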


Two coordinate frames are used: the navigation frame and the body frame. The navigation frame is fixed to the surface of the Earth; it has a north, an east and a down axis, and the position of the UAV is presented in this frame. The body frame axes coincide with the axes of the aerial vehicle: the x-axis points to the nose, the y-axis to the right wingtip, and the z-axis completes a right-handed orthogonal system. This arrangement, with one frame fixed to the airframe and one navigation frame, is commonly used in strapdown inertial systems.

3 Global Positioning System (GPS) Receivers and Attitude Determination

The GPS receivers use the transmitted signals from GPS satellites to estimate the attitude, velocity and position of the aerial vehicle. With a radio link between the UAV and the ground station, the GPS data were corrected using DGPS (a differential GPS ground station). The receivers also give a velocity estimate with high accuracy. Figure 3 shows the GPS positions of the aircraft in the navigation (NED) frame (north, east, down) during a flight. With three GPS receivers it is possible to determine the attitude of an aerial vehicle; four receivers were used to make the system redundant. The actual information about the GPS attitude implementation can be found in [12] and some theory in [9]. The four GPS antennas were mounted on the airframe: one on each wing, one near the nose and the fourth near the tail. The four GPS receivers were placed inside the vehicle. Each GPS receiver sends epoch raw data at a rate of 10 Hz, which is used to calculate the position, velocity and attitude estimates of the vehicle.

4 Vehicle Attitude and Euler Angles

The attitude of the aerial vehicle can be represented using three angles called the Euler angles, (φ, θ, ψ). In aeronautics these angles are called roll, pitch and yaw. They are defined in the regions:

−π ≤ φ < π    (1)

−π/2 ≤ θ ≤ π/2    (2)

−π ≤ ψ < π    (3)


PAPER C

Figure 3: The estimated flight path and the GPS position plotted in three dimensions. A trained eye can see the estimated position error, especially as the vehicle makes turns.

The angle φ represents wingtips up/down (roll), θ is nose up/down (pitch) and ψ is the heading angle (yaw) relative to the north vector. When all angles are equal to zero, the vehicle is heading north, the wings are horizontal and the nose points toward the horizon. This is normally the starting position of the UAV before each flight session.

4.1 The Direction Cosine Matrix, C_b^n

The direction cosine matrix C_b^n rotates a vector from the body frame to the navigation frame. In terms of cos and sin of the Euler angles, the direction cosine matrix is expressed as:

              [ cosθ cosψ    sinφ sinθ cosψ − cosφ sinψ    cosφ sinθ cosψ + sinφ sinψ ]
C_b^n(φ,θ,ψ) = [ cosθ sinψ    sinφ sinθ sinψ + cosφ cosψ    cosφ sinθ sinψ − sinφ cosψ ]    (4)
              [ −sinθ        sinφ cosθ                     cosφ cosθ                  ]

Figure 4: Roll, pitch and yaw angles. Yaw is the angle relative to north; pitch is the angle relative to the horizontal plane.

The Euler angles can be calculated from the direction cosine matrix with:

ψ = arctan( C21 / C11 )    (5)

θ = −arcsin( C31 )    (6)

φ = arctan( C32 / C33 )    (7)

The limitation of the Euler angle parameterization appears when one tries to calculate the Euler angles from the direction cosine matrix: when θ = ±90° there is a singularity, since cosθ = 0 and equations (5) and (7) are not numerically stable.

4.2 Attitude Representation using a Quaternion

Quaternions are another attitude representation, based on abstract algebra, and were invented by Sir William Rowan Hamilton [5, 4]. His idea was to introduce a number called a quaternion, q = (q0, q1, q2, q3)^T, that has one real part and three imaginary parts. More theory and mathematical rules about quaternions can be found in [3] and [1]. One advantage of the quaternion representation is that it has no singularity at pitch θ = ±90°.


The relations between the Euler angles and the corresponding quaternion are:

q0 = cos(φ/2) cos(θ/2) cos(ψ/2) + sin(φ/2) sin(θ/2) sin(ψ/2)
q1 = sin(φ/2) cos(θ/2) cos(ψ/2) − cos(φ/2) sin(θ/2) sin(ψ/2)
q2 = cos(φ/2) sin(θ/2) cos(ψ/2) + sin(φ/2) cos(θ/2) sin(ψ/2)
q3 = cos(φ/2) cos(θ/2) sin(ψ/2) − sin(φ/2) sin(θ/2) cos(ψ/2)    (8)

When a quaternion is used for attitude representation its 2-norm should always be one. If this is not the case the quaternion can be normalized using equation (9), where q̂ is the normalized quaternion:

q̂ = ( q0, q1, q2, q3 )^T / sqrt( q0² + q1² + q2² + q3² )    (9)

4.2.1 The Direction Cosine Matrix Written with Quaternion Elements

Normally a direction cosine matrix is expressed with sin and cos terms. Trigonometric functions require many CPU instructions to calculate. The direction cosine matrix, equation (4), has many cos and sin terms and is more complex to calculate than if it is expressed with quaternion elements; calculated from quaternion elements it is restricted to simple algebra:

        [ q0² + q1² − q2² − q3²    2(q1q2 − q0q3)            2(q1q3 + q0q2)         ]
C_b^n = [ 2(q1q2 + q0q3)           q0² − q1² + q2² − q3²     2(q2q3 − q0q1)         ]    (10)
        [ 2(q1q3 − q0q2)           2(q2q3 + q0q1)            q0² − q1² − q2² + q3²  ]
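Equations (4) and (10) describe the same rotation, which can be checked numerically. The following is a small self-contained sketch, not code from the thesis:

```java
// Compare the Euler-angle DCM (eq. 4) with the quaternion DCM (eq. 10),
// building the quaternion from the Euler angles via eq. (8).
public class DcmCheck {
    static double[][] dcmEuler(double f, double t, double p) {
        double cf=Math.cos(f), sf=Math.sin(f), ct=Math.cos(t), st=Math.sin(t),
               cp=Math.cos(p), sp=Math.sin(p);
        return new double[][] {
            {ct*cp, sf*st*cp - cf*sp, cf*st*cp + sf*sp},
            {ct*sp, sf*st*sp + cf*cp, cf*st*sp - sf*cp},
            {-st,   sf*ct,            cf*ct}
        };
    }

    static double[][] dcmQuat(double[] q) {
        double q0=q[0], q1=q[1], q2=q[2], q3=q[3];
        return new double[][] {
            {q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2-q0*q3),         2*(q1*q3+q0*q2)},
            {2*(q1*q2+q0*q3),         q0*q0-q1*q1+q2*q2-q3*q3, 2*(q2*q3-q0*q1)},
            {2*(q1*q3-q0*q2),         2*(q2*q3+q0*q1),         q0*q0-q1*q1-q2*q2+q3*q3}
        };
    }

    /** Quaternion from Euler angles (eq. 8). */
    static double[] quat(double f, double t, double p) {
        double cf=Math.cos(f/2), sf=Math.sin(f/2), ct=Math.cos(t/2), st=Math.sin(t/2),
               cp=Math.cos(p/2), sp=Math.sin(p/2);
        return new double[] {cf*ct*cp + sf*st*sp, sf*ct*cp - cf*st*sp,
                             cf*st*cp + sf*ct*sp, cf*ct*sp - sf*st*cp};
    }
}
```

Both matrices agree element by element, while the quaternion form avoids all trigonometric calls once the quaternion is known.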

4.2.2 Conversion from Quaternion Representation to Euler Angles

The relation between the quaternion attitude representation and the Euler angles can be derived from (10) and (4). Hence:

φ = arctan( 2(q2q3 + q0q1) / (q0² − q1² − q2² + q3²) )

θ = −arcsin( 2(q1q3 − q0q2) )

ψ = arctan( 2(q1q2 + q0q3) / (q0² + q1² − q2² − q3²) )    (11)
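As a sanity check of equations (8) and (11), the following sketch converts Euler angles to a quaternion and back; it is a minimal illustration (using four-quadrant atan2 for the arctangents), not code from the thesis:

```java
// Euler (roll=phi, pitch=theta, yaw=psi) -> quaternion (eq. 8) and back (eq. 11).
public class EulerQuat {
    /** Returns {q0, q1, q2, q3} for the given Euler angles [rad]. */
    static double[] toQuat(double phi, double theta, double psi) {
        double cf = Math.cos(phi/2), sf = Math.sin(phi/2);
        double ct = Math.cos(theta/2), st = Math.sin(theta/2);
        double cp = Math.cos(psi/2), sp = Math.sin(psi/2);
        return new double[] {
            cf*ct*cp + sf*st*sp,
            sf*ct*cp - cf*st*sp,
            cf*st*cp + sf*ct*sp,
            cf*ct*sp - sf*st*cp
        };
    }

    /** Returns {phi, theta, psi} recovered from a unit quaternion. */
    static double[] toEuler(double[] q) {
        double q0=q[0], q1=q[1], q2=q[2], q3=q[3];
        double phi   = Math.atan2(2*(q2*q3 + q0*q1), q0*q0 - q1*q1 - q2*q2 + q3*q3);
        double theta = -Math.asin(2*(q1*q3 - q0*q2));
        double psi   = Math.atan2(2*(q1*q2 + q0*q3), q0*q0 + q1*q1 - q2*q2 - q3*q3);
        return new double[] {phi, theta, psi};
    }
}
```

The round trip recovers the angles exactly as long as the pitch stays away from the ±90° singularity discussed above.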

4.2.3 The Quaternion Change Rate

When the aerial vehicle changes its attitude in roll, pitch or yaw, this is measured by the strapdown-mounted rate gyros. Three gyros are used to register changes in all three attitude angles; the rotation rates are measured along the body frame axes. The angular velocities obtained from the rate gyros are (ω_bx, ω_by, ω_bz). They are used to calculate the quaternion element derivatives, equation (12). The quaternion change rate, q̇ = (q̇0, q̇1, q̇2, q̇3)^T, tells us how fast the attitude of the vehicle changes for an observer in the navigation frame. It can be related to the angular velocities in roll, pitch and yaw, (φ̇, θ̇, ψ̇).

[ q̇0 ]         [ −q1  −q2  −q3 ]
[ q̇1 ]  =  ½  [  q0  −q3   q2 ]  [ ω_bx ]
[ q̇2 ]         [  q3   q0  −q1 ]  [ ω_by ]    (12)
[ q̇3 ]         [ −q2   q1   q0 ]  [ ω_bz ]

5 The IMU - Inertial Measuring Unit

A strapdown inertial measuring unit is placed inside the airframe; it has three accelerometers and three rate gyros. More information about strapdown inertial sensors is given in [3]. The unit was mounted in such a way that the sensor axes coincide with the axes of the airframe. The IMU was sampled at 450 Hz and the measurements were processed to get the rotation rates and the acceleration vector. In Figure 5 the three registered rotation rates (ω_bx, ω_by, ω_bz) are plotted. The registered acceleration components, (a_bx, a_by, a_bz), are plotted in Figure 6.

Figure 5: The plots show the angular velocities (ω_bx, ω_by, ω_bz) in a coordinate system fixed to the vehicle. From 100 s to about 150 s the vehicle is taxiing on the ground, and at 150 s it lines up for take-off. At 140 s the aerial vehicle made a turn, and this can be seen in the yaw rate. There is some clutter (spikes) in the measured yaw rate; it might be caused by an erroneous use of a status bit in the analog-to-digital converter.


Figure 6: The acceleration expressed in the body frame, obtained from the inertial sensors inside the IMU. Until about 150 seconds the vehicle is taxiing on the runway; at 140 s the aircraft turns and lines up for take-off acceleration. From the plot we see that the vehicle bumps a lot during the take-off acceleration. The a_bz (down direction) acceleration is strongly colored by the engine vibrations, Figure 5.

6 The INS - Inertial Navigation System

If we want to control a UAV there is a need for a fast update rate of the states. The inertial navigation system (INS) integrates the measurements from the IMU to get the attitude, velocity and position of the vehicle. The implemented INS can work with different vehicle attitude models: the quaternion model, the rotation matrix and Euler angles. When the rotation matrix C_b^n is used, it is updated as the aerial vehicle changes attitude.

7 The Nonlinear System Model

In a nonlinear system the state transition function is built from nonlinear functions such as trigonometric functions, higher-order polynomials or square-root terms. The nonlinear system is represented by:

ẋ(t) = f( x(t), u(t), t )    (13)

The input vector u(t) is the registered IMU measurements, in this case:

u(t) = ( a_bx, a_by, a_bz, ω_bx, ω_by, ω_bz )^T

Figure 7: Power spectra diagram of the measured acceleration in the z direction; the PSD of the whole flight is shown. From take-off until landing the acceleration measurements are colored with a frequency of about 120 Hz, which probably is caused by engine vibrations. In the beginning there are vibrations as the UAV taxis on the runway.

The input vector u(t) consists of the ideal signal ũ(t) and a noise term w(t):

u(t) = ũ(t) + w(t)    (14)

The noise term w(t) is modeled as Gaussian distributed with an expected zero mean. The process noise covariance matrix is Q:

E[ w(t) ] = 0    (15)

E[ w(t) w(τ)^T ] = Q δ(t − τ)    (16)

8 The UAV Motion Model

Equations (10) and (12) are used to form a nonlinear transition function that describes the motion of the UAV. The model function is a nonlinear vector holding the velocity, the acceleration and the quaternion derivatives. The Earth gravity influence must be subtracted from the vehicle acceleration. The state vector x = ( p_n, v_b, q )^T has the position p_n, the velocity v_b and the attitude q represented as a quaternion. The position is given in navigation coordinates.

The velocity vector of the vehicle, rotated from the navigation frame into the body frame, is v_b = ( v_bx, v_by, v_bz )^T. The component v_bx is the forward velocity (in the direction of the nose), v_by is the side velocity in the direction of the y-axis, and v_bz is the down velocity. The acceleration vector obtained from the accelerometers, a_b = ( a_bx, a_by, a_bz )^T, includes the Earth gravity g = 9.81 m/s² [6]. The symbol r denotes the radius of curvature as the UAV turns. Both the centripetal acceleration and the Earth gravity must be subtracted from the measured accelerations to get the tangential acceleration along the flight path, which should point in the direction of the aerial vehicle. The measured acceleration vector expressed in its components is:

[ a_bx ]   [ v̇_bx ]   [ ω_bx ]   [ v_bx ]           [ 0 ]
[ a_by ] = [ v̇_by ] + [ ω_by ] × [ v_by ]  −  C_n^b [ 0 ]    (17)
[ a_bz ]   [ v̇_bz ]   [ ω_bz ]   [ v_bz ]           [ g ]

with C_n^b = (C_b^n)^T and ω_b × v_b the centripetal term. The tangential acceleration a_tan = ( v̇_bx, v̇_by, v̇_bz )^T then follows by rearranging:

a_tan = a_b − ω_b × v_b + C_n^b ( 0, 0, g )^T    (18)

The tangential acceleration a_tan is solved from equation (18) and used in the nonlinear model for the UAV. Another term in the acceleration vector is caused by the Coriolis force, which occurs since the Earth is rotating. That term has been neglected, since the flight was restricted to a local area.

8.0.4 UAV Motion Model, the Nonlinear 10-State Transition Function

The nonlinear motion model uses a quaternion to represent attitude. The function generates a vector of size 10x1:

ẋ(t) = f( x(t), u(t), t )    (19)

                 [ ṗ_n ]   [ C_b^n v_b                               ]
f( x, u, t )  =  [ v̇_b ] = [ a_b − ω_b × v_b + C_n^b ( 0, 0, g )^T   ]    (20)
                 [ q̇   ]   [ quaternion change rate of equation (12) ]

The three blocks (3 position rates, 3 velocity rates and 4 quaternion rates) together give the 10x1 vector.
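One propagation step of equation (20) can be sketched with simple Euler integration. This is an illustrative sketch only: it assumes the NED gravity convention g^n = (0, 0, 9.81 m/s²) and the state order (position, body velocity, quaternion) used in the reconstruction of the signs above, and it is not the thesis implementation.

```java
// One Euler-integration step of the 10-state model:
// state = {p_n (3), v_b (3), q (4)}; input = {a_b (3), w_b (3)}.
public class InsStep {
    static final double G = 9.81;

    /** Quaternion-derived DCM, body -> navigation (eq. 10). */
    static double[][] dcm(double[] q) {
        double q0=q[0], q1=q[1], q2=q[2], q3=q[3];
        return new double[][] {
            {q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2-q0*q3),         2*(q1*q3+q0*q2)},
            {2*(q1*q2+q0*q3),         q0*q0-q1*q1+q2*q2-q3*q3, 2*(q2*q3-q0*q1)},
            {2*(q1*q3-q0*q2),         2*(q2*q3+q0*q1),         q0*q0-q1*q1-q2*q2+q3*q3}
        };
    }

    /** Advance state {p, v_b, q} by dt given accelerometer a_b and gyro w_b. */
    static void step(double[] p, double[] v, double[] q,
                     double[] a, double[] w, double dt) {
        double[][] C = dcm(q);                       // C_b^n
        // p_dot = C_b^n v_b
        for (int i = 0; i < 3; i++)
            p[i] += (C[i][0]*v[0] + C[i][1]*v[1] + C[i][2]*v[2]) * dt;
        // v_dot = a_b - w_b x v_b + C_n^b (0,0,g); note C_n^b = transpose(C_b^n)
        double[] cross = { w[1]*v[2]-w[2]*v[1], w[2]*v[0]-w[0]*v[2], w[0]*v[1]-w[1]*v[0] };
        for (int i = 0; i < 3; i++)
            v[i] += (a[i] - cross[i] + C[2][i]*G) * dt;
        // q_dot = 1/2 * W(q) * w_b (eq. 12), using the pre-update quaternion
        double q0=q[0], q1=q[1], q2=q[2], q3=q[3];
        q[0] += 0.5*(-q1*w[0] - q2*w[1] - q3*w[2]) * dt;
        q[1] += 0.5*( q0*w[0] - q3*w[1] + q2*w[2]) * dt;
        q[2] += 0.5*( q3*w[0] + q0*w[1] - q1*w[2]) * dt;
        q[3] += 0.5*(-q2*w[0] + q1*w[1] + q0*w[2]) * dt;
        // renormalize the quaternion (eq. 9)
        double n = Math.sqrt(q[0]*q[0]+q[1]*q[1]+q[2]*q[2]+q[3]*q[3]);
        for (int i = 0; i < 4; i++) q[i] /= n;
    }
}
```

For a level, stationary vehicle the accelerometer reads (0, 0, −g) in this convention, and the gravity term cancels it exactly, so the integrated velocity and position stay at zero.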

8.1 Observation Model

A navigation system adds information from different sensors; each sensor has its own characteristics and a corresponding sensor model h_i( x(t), t ). The observation z_i has a noise term v_i with zero mean and is modeled as Gaussian:

z_i(t) = h_i( x(t), t ) + v_i(t)    (21)

The observation covariance matrix is:

R(t_k) = E[ v_i(t_k) v_i(t_k)^T ]    (22)

This matrix must be recalculated each time, because it depends on the states. The noise term has zero mean:

E[ v_i(t) ] = 0    (23)

The noise in the input vector u and the noise in the observations z_i are assumed to be uncorrelated:

E[ w(t) v_i(τ)^T ] = 0    (24)

The velocity of the airframe in the navigation frame is v_n = ( v_nx, v_ny, v_nz )^T, which is related to the state vector through:

v_n(t_k) = C_b^n v_b(t_k)    (25)

The position, velocity and attitude observation matrix at time-step k, H(t_k), will be as follows:

         [ I_3x3    0_3x3    0_3x4               ]
H(t_k) = [ 0_3x3    C_b^n    ∂(C_b^n v_b)/∂q     ]    (26)
         [ 0_3x3    0_3x3    ∂(φ,θ,ψ)/∂q         ]

The velocity observation matrix is derived from equation (25). The attitude observation matrix is derived from equation (11), through differentiation with respect to the quaternion elements.

8.2 Complementary Filter with Feedback

The INS keeps track of the states of the aircraft. The INS integrates inertial sensor measurements at a rate of 100 Hz. It is used to keep track of the fast dynamics of the UAV, like fast turns, accelerations and rapid changes in attitude. The integrated errors in the INS are corrected by the error estimator, which uses a GPS receiver for absolute observations of the position and attitude. Since the INS uses accelerometers to calculate the position, a small error will grow over time. The integrated INS position, calculated from inertial accelerometer measurements $a^b$, is:

$$p^n(t) = p^n(t_0) + \iint_{t_0}^{t} \left( C_b^n(\tau)\, a^b(\tau) + g^n \right) d\tau\, d\tau \qquad (27)$$

The INS position error growth can be approximated as $t^2$ since there is a double integrator, and the error growth in attitude is approximated as linear in $t$. The INS attitude $q$ is updated with IMU angular rate measurements using equation (12). The INS attitude is used to calculate the INS direction cosine matrix $C_b^n$.




PAPER C



The difference $\delta z$ between the INS states and the observed states is used to estimate the error $\delta \hat{x}$. The estimated error is fed back to correct the states in the INS, see Figure 8. The Kalman filter matrices are calculated using the states in the INS. The nonlinear filter is linearized at a given time at the working point $(x_k, u_k)$ read from the INS. We have external observations of the states; after each step the state error $\delta \hat{x}_k$ is estimated and fed back to the INS. After each feedback step the filter matrices are recalculated and a new prediction of the state vector is made.


Figure 8: A complementary filter with feedback. The INS integrates measurements from the IMU and runs at 100 Hz. The extended Kalman filter runs at 10 Hz and is used to stabilize the drift in the INS. The position, velocity and attitude observations are read from the GPS software.
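The timing of this feedback loop can be illustrated with a toy one-dimensional sketch in Java: the INS propagates at 100 Hz while the correction runs at 10 Hz. The bias value, the unit feedback gain and the ideal GPS signal are invented for illustration and are not the paper's filter.

```java
public class FeedbackLoop {
    public static void main(String[] args) {
        double dt = 0.01;            // INS mechanization step: 100 Hz
        double vel = 1.0;            // true (constant) velocity
        double accelBias = 0.05;     // uncorrected accelerometer bias -> drift
        double insVel = vel;         // INS-integrated velocity
        double pos = 0.0;            // INS-integrated position (1-D toy)
        for (int k = 1; k <= 100; k++) {       // one second of operation
            insVel += accelBias * dt;          // bias integrates into velocity...
            pos += insVel * dt;                // ...and velocity into position
            if (k % 10 == 0) {                 // every 10th INS step: 10 Hz correction
                double gpsPos = vel * (k * dt);    // ideal GPS observation
                double err = pos - gpsPos;         // innovation
                pos -= err;                        // feed the estimated error back
                insVel -= accelBias * 10 * dt;     // cancel the accumulated velocity error
            }
        }
        // Without the feedback the position would drift; with it, it stays near truth.
        System.out.println(Math.abs(pos - 1.0) < 0.01);
    }
}
```

The point of the sketch is the rate split: the fast loop integrates, the slow loop estimates the accumulated error from an absolute observation and feeds it back, exactly as in Figure 8.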

8.3 Linearization of the Nonlinear System Function

Equations (21) and (13) are linearized and used in the Extended Kalman Filter (EKF) for error estimation. The system function $f(x, u)$ is differentiated with respect to both the state vector $x$ and the input vector $u$:

$$\delta \dot{x} = F \delta x + G \delta u, \qquad F = \left.\frac{\partial f}{\partial x}\right|_{(x_k, u_k)}, \quad G = \left.\frac{\partial f}{\partial u}\right|_{(x_k, u_k)} \qquad (28)$$

The observation models $h_i(x)$ are linearized in the same way:

$$\delta z_i = H_i \delta x, \qquad H_i = \left.\frac{\partial h_i}{\partial x}\right|_{x_k} \qquad (29)$$
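When closed-form derivatives are inconvenient, $F$ can be computed numerically by central differences. A minimal Java sketch, with a made-up two-state system function (not the paper's airframe model) whose analytic Jacobian is known:

```java
import java.util.function.BiFunction;

public class Linearize {
    // Central-difference Jacobian F = df/dx of f(x, u) at a working point (x, u).
    static double[][] dfdx(BiFunction<double[], double[], double[]> f,
                           double[] x, double[] u, double eps) {
        int n = x.length, m = f.apply(x, u).length;
        double[][] F = new double[m][n];
        for (int j = 0; j < n; j++) {
            double[] xp = x.clone(), xm = x.clone();
            xp[j] += eps; xm[j] -= eps;
            double[] fp = f.apply(xp, u), fm = f.apply(xm, u);
            for (int i = 0; i < m; i++) F[i][j] = (fp[i] - fm[i]) / (2 * eps);
        }
        return F;
    }

    public static void main(String[] args) {
        // Toy system: x = [p, v], u = [a]; pdot = v, vdot = a  =>  F = [[0,1],[0,0]]
        BiFunction<double[], double[], double[]> f =
            (x, u) -> new double[] { x[1], u[0] };
        double[][] F = dfdx(f, new double[] {0, 1}, new double[] {0.5}, 1e-6);
        System.out.println(Math.round(F[0][1]) + " " + Math.round(F[1][1]));
    }
}
```

The same routine applied with respect to $u$ gives $G$, and applied to $h_i$ gives $H_i$; in the filter these are re-evaluated at each working point read from the INS.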

8.4 Linearized System Discrete in Time

We now have a discrete system ready to implement in a computer system. This is a first-order linearization at the working point $(x_k, u_k)$. The filter is discrete in time and $k$ indicates the time slice. The time at step $k$ is $t_k = kT$, where $T$ is the update period of the error estimator. The discrete error dynamics become

$$\delta x_{k+1} = \Phi_k\, \delta x_k + \Gamma_k\, \delta u_k, \qquad \Phi_k \approx I + F(t_k)\, T, \quad \Gamma_k \approx G(t_k)\, T$$
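The first-order discretization $\Phi \approx I + FT$ and one error-propagation step can be sketched as follows; the double-integrator $F$, the 10 Hz period and the error values are invented for illustration:

```java
public class Discretize {
    // First-order discretization of continuous dynamics: Phi = I + F*T.
    static double[][] phi(double[][] F, double T) {
        int n = F.length;
        double[][] P = new double[n][n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                P[i][j] = (i == j ? 1.0 : 0.0) + F[i][j] * T;
        return P;
    }

    public static void main(String[] args) {
        double T = 0.1;                        // 10 Hz error estimator period
        double[][] F = { {0, 1}, {0, 0} };     // toy double-integrator dynamics
        double[][] Phi = phi(F, T);
        // Propagate an error state dx = [position error, velocity error].
        double[] dx = {0.0, 0.2};
        double[] next = new double[2];
        for (int i = 0; i < 2; i++)
            for (int j = 0; j < 2; j++)
                next[i] += Phi[i][j] * dx[j];
        // Scale and round only to print exactly (avoids floating-point noise).
        System.out.println(Math.round(next[0] * 1000) + " " + Math.round(next[1] * 1000));
    }
}
```

A velocity error of 0.2 leaks into a position error of 0.02 over one 0.1 s step, which is the discrete-time counterpart of the $t^2$ position error growth discussed in Section 8.2.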
