CSIS 0801 Final Year Project Implementation of An Intelligent Hexapod Robot
Final Report FYP Account: fyp14013 FYP Website: http://i.cs.hku.hk/fyp/2014/fyp14013/ Member ● Lai Chak Yui (Arack) ● Lee Siu Man (Sherman) ● Ng Siu Lung (Bryce) ● Yip Sit Wun (Terry) Supervisor ● Dr. Tam Anthony, Mr. Lee David
Brief Content 1 Project Background and Objective 1.1 Project Background 1.2 Objective 1.3 Work Allocation and Collaboration 2 Architecture 2.1 Abstract 2.2 Components 2.3 Brief Overview - How all things work together 2.4 Circuit Design 2.5 C++ Program Design 3 3D Printing 3.1 Basic Steps of 3D Printing 3.2 Design of the Robot 3.3 Other important note 3.4 3D Printing Conclusion 4 Kinematics 4.1 Kinematics Algorithm 4.2 Locomotion 4.3 Design & Implementation 4.4 Conclusion 5 Application 5.1 Introduction 5.2 Development 5.3 User Interface 5.4 Technique Used 5.5 Future Development 5.6 Conclusion 6 Reference
HKU CS FYP14013 Implementation of An Intelligent Hexapod
1
Detailed Content 1 Project Background and Objective 1.1 Project Background 1.2 Objective 1.3 Work Allocation and Collaboration 2 Architecture 2.1 Abstract 2.2 Components 2.2.1 Skeletons 2.2.1.1 Body 2.2.1.2 Legs 2.2.2 Micro Computers 2.2.2.1 Introduction 2.2.2.2 Raspberry Pi 2.2.2.2.1 Pin Allocation 2.2.2.3 WRTnode 2.2.2.4 Firmware and Operating System 2.2.3 Servos 2.2.3.1 Introduction 2.2.3.2 Control Principle 2.2.4 Accessories 2.2.4.1 Speaker 2.2.4.2 Web Camera 2.2.4.3 Ultrasound Sensor 2.2.5 Electronic Components 2.2.5.1 Battery 2.2.5.2 Power regulator 2.2.5.3 Power Relay 2.2.5.4 Connectors and Wires 2.2.6 Kinematic Algorithm
2.2.7 Mobile Application Controller 2.3 Brief Overview - How all things work together 2.4 Circuit Design 2.5 C++ Program Design 2.5.1 Introduction 2.5.2 Main Object 2.5.3 GPIOClass Object 2.5.4 Point Object 2.5.5 ServoController Object 2.5.5.1 How to convert degrees into PWM signal? 2.5.5.2 How to convert program calculated degrees to servo actual degrees? 2.5.6 Kinematic Object 2.5.6.1 Implementation of Leg Inverse Kinematic Algorithm 2.5.6.2 Implementation of Body Inverse Kinematic Algorithm 2.5.6.3 How does the hexapod walk in a smooth gait? 2.5.6.4 How does the hexapod move with different altitudes of the center? 2.5.6.5 How does the hexapod move in different speeds? 2.5.7 UltrasoundSensorController Object 2.5.8 Speak Object 2.5.9 PowerClipController Object 2.5.10 XmlrpcServer Object 2.5.10.1 Methods interface for the controller 3 3D Printing 3.1 Basic Steps of 3D Printing 3.1.1 3D Modelling 3.1.1.1 What is 3D Modelling? 3.1.1.2 Why Sketchup? 3.1.1.3 Download objects from 3D WareHouse? 3.1.1.4 Download objects from Thingiverse? 3.1.1.5 How to edit the imported STL? 3.1.2 Convert to STL 3.1.2.1 Why convert to STL? 3.1.2.2 How to export to STL in Sketchup?
3.1.3 Fix-up 3.1.3.1 What is fix-up and why it’s needed? 3.1.3.2 When will we encounter fix-up errors? 3.1.3.3 How to fix the fix-up errors? 3.1.4 Slicing 3.1.4.1 What is slicing and why it’s needed? 3.1.5 Print 3.1.5.1 How does the 3D Printer print? 3.1.5.2 Why do we need to print support? 3.1.5.3 How to print objects with support? 3.1.5.4 Are there any alternatives other than printing support? 3.1.5.5 What to do if the plastic doesn’t stick on the platform? 3.1.5.6 What to do if the nozzle is stuck? 3.1.5.7 How to keep objects from shrinking? 3.2 Design of the Robot 3.2.1 Body Design 3.2.1.1 Single layer vs. Sandwich? 3.2.1.2 Embedded electronics design 3.2.2 Clamp Design 3.2.2.1 Loose Clamp Design 3.2.2.2 Tight Clamp Design 3.2.2.3 Tight + Space efficient Design 3.2.3 Leg Design 3.2.3.1 Short leg joint (Femur) 3.2.3.2 Legs design 3.2.3.3 High density leg joint 3.2.3.4 Both side handle of the legs 3.2.4 Design Tips 3.2.4.1 Design is about trials 3.2.4.2 Always draft first 3.3 Other important notes 3.3.1 Assembly also takes time 3.3.2 Collaboration with team is needed
3.3.3 Be Patient with the Printer 3.3.4 Filaments are expensive 3.3.5 Search for extension 3.4 3D Printing Conclusion 4 Kinematics 4.1 Kinematics Algorithm 4.1.1 Forward Kinematics 4.1.1.1 General Approach 4.1.1.2 Analytical Approach 4.1.2 Inverse Kinematics 4.1.2.1 General Approach 4.1.2.2 Analytical Approach 4.1.3 Body Kinematics 4.1.3.1 Body Translation 4.1.3.2 Body Rotation 4.1.3.3 Rotation Matrix 4.1.4 Leg Kinematics 4.2 Locomotion 4.2.1 Alternating Tripod Gait 4.2.2 Metachronal Gait 4.3 Design & Implementation 4.3.1 Coordinate System 4.3.1.1 Body Frame 4.3.1.2 Leg Frame 4.3.2 Implementation of Leg Kinematics 4.3.3 Implementation of Body Kinematics 4.3.4 Implementation of Gait 4.3.4.1 Tripod Walking Straight 4.3.4.1.1 Forward 4.3.4.1.2 Backward 4.3.4.1.3 Right 4.3.4.1.4 Left 4.3.4.2 Tripod Wheeling
4.3.4.2.1 Forward Right/Left Wheel 4.3.4.2.2 Backward Right/Left Wheel 4.3.4.3 Tripod Rotating 4.3.4.3.1 Right Rotating 4.3.4.3.2 Left Rotating 4.3.4.4 Implementation of Loop 4.3.4.4.1 Initial Design 4.3.4.4.2 Final Design 4.4 Conclusion 5 Application 5.1 Introduction 5.2 Development 5.2.1 Building Platform – Android 4.0.4 (API15) 5.2.2 Building Environment – Android Studio 5.3 User Interface 5.3.1 Design Strategy 5.3.2 Material Design 5.3.3 Landing Page 5.3.4 Basic Mode 5.3.5 Dance Mode 5.3.6 Immersive Mode 5.3.7 Settings 5.4 Technique Used 5.4.1 XML-RPC 5.4.2 MJPEG Decoder 5.4.3 Accelerometer 5.5 Future Development 5.5.1 WebCam Video Analysing 5.5.2 Full Support on lower Operating System 5.5.3 Support on other Mobile Operating System 5.6 Conclusion 6 Reference
1 Project Background and Objective 1.1 Project Background Robots now play an important role in our lives. HKU/QMH has used da Vinci robotic surgery since 2007 and has saved more than 500 patients' lives. With the development of 3D printing technology, it is easier than ever to produce 3D objects. Therefore, the four of us are taking on the challenge of building an intelligent hexapod robot using the skills we learnt in CS.
1.2 Objective We want to build a hexapod robot controlled by mobile apps to walk naturally and avoid obstacles automatically.
In our project, due to limited time, we only implement a basic hexapod. The hexapod consists of 6 legs, each with 3 degrees of freedom of movement. It is equipped with a camera, a speaker and an ultrasound sensor.
1.3 Work Allocation and Collaboration The work is mainly divided into 4 parts (i.e. Architecture, 3D Printing, Kinematics and Mobile Application). Arack is responsible for the Architecture part. He designed the whole architecture of the hexapod, from the outermost physical build to the innermost program design, and connected all the components required to make the hexapod. Sherman is responsible for 3D Printing. He designed the 3D models for the robot and printed them out. Terry is responsible for Kinematics. He studied the kinematics and gait theories and designed a feasible solution to apply to the hexapod. The implementation of the Kinematics and Gait Algorithm was done together by Terry and Arack. Bryce is responsible for the Mobile Application and also studied the feasibility of integrating OpenCV into the application. He is also responsible for purchasing the necessary materials for the robot. The work allocation and collaboration between us are shown below: ● Architecture (Arack) ○ Design the whole architecture of the hexapod ○ Discuss the 3D printed holder designs for electronic components and micro-computers with Sherman ○ Implement general functions and the Kinematic Algorithm in the C++ program ■ Discuss the Kinematic Algorithm with Terry ○ Discuss the methods required to control the hexapod with Bryce.
● 3D Printing (Sherman) ○ Body case design ■ Electronics positioning (+Arack) ○ Leg & joint design ■ Length of the joint (+Terry) ○ Speaker/Camera/ultrasound case design ○ Clamp design
● Kinematics (Terry) ○ Study and design of the Kinematics solution ○ Study and design of the Gait algorithm ○ Implementation of the Kinematics and Gait algorithm (+Arack) ● App (Bryce) ○ Application Design and Implementation ○ Design the required Remote Procedure Calls with Arack ○ Order necessary products for robot building
2 Architecture 2.1 Abstract This section explains the whole architecture of the hexapod, from the outermost physical build to the innermost program design. First we go through all the components needed to build an intelligent hexapod. Then we see how those components work together. After that, we move to the Circuit Design to see how all electronic components are connected. Finally, we see how the C++ program is designed to interact with the different components.
2.2 Components 2.2.1 Skeletons The skeletons that support the hexapod are printed by a 3D printer. Their shapes can be easily customized to suit our needs. (For details of 3D printing, please refer to Section 3.)
2.2.1.1 Body The body connects the 6 legs. The two microcomputers and most electronic components are placed inside it.
2.2.1.2 Legs They give the hexapod the ability to move. Each leg holds 3 servos to support 3 degrees of freedom of movement.
2.2.2 Micro Computers 2.2.2.1 Introduction Two different microcomputers are used on this hexapod, each responsible for different duties. Heavy duties are distributed between the two to reduce the workload of each.
2.2.2.2 Raspberry Pi The Model B+ is the final revision of the original Raspberry Pi. It is small and powerful enough to be the brain of the hexapod. [2.1] 1. It has 40 pins which can support 18 servos and some GPIOs for sensors. 2. It has a Micro SD card socket which can support our self-customized firmware and operating system. 3. It has low power consumption which can reduce the demand of power supply.
The main duty of the Raspberry Pi is to run the main program, which: 1. Controls the servos 2. Controls the ultrasound sensor 3. Controls the speaker 4. Launches an XML-RPC server to receive requests from the controller.
2.2.2.2.1 Pin Allocation
Raspberry Pi has 26 General Purpose Input & Output (GPIO) pins. GPIO 2, 3, 4, 17, 18, 27, 22, 23, 24, 10, 9, 11, 25, 8, 7, 5, 6, 12, 13 and 19 are allocated as Pulse Width Modulation (PWM) outputs to control servos.
2.2.2.3 WRTnode WRTnode is an open-source hardware board for OpenWrt. It is a mini Linux computer with Wi-Fi and provides a complete development environment. [2.2] 1. It has an embedded Linux operating system which provides us an easy-to-use development platform. 2. It is small (45mm * 50mm) and can be placed into the body of the hexapod easily. 3. It consumes little power, so one battery is enough to power up both the Raspberry Pi and the WRTnode. 4. It comes with Wi-Fi so that the controller can connect to it easily. The main duties of the WRTnode: 1. Emits Wi-Fi as an access point to the hexapod. 2. Connects to the Raspberry Pi via its LAN port, so that the controller can access the Raspberry Pi over Wi-Fi. 3. Launches a video-streaming server that captures the video in front of the hexapod through a webcam connected via USB port.
2.2.2.4 Firmware and Operating System
OpenWrt Barrier Breaker 14.07 with a self-customized firmware is used as the operating system for the micro-computer. OpenWrt is described as a Linux distribution for embedded devices. It works just like a normal Linux operating system and provides a fully writable filesystem with package management. This frees us from the application selection and configuration provided by the vendor and allows us to customize the device through the use of packages to suit any application. OpenWrt is user-friendly and perfect for customization. [2.3]
2.2.3 Servos 2.2.3.1 Introduction A classic servo can rotate from 0 to 180 degrees. It has 3 wires: 2 for power supply (red and brown) and 1 for signal communication (yellow or orange). The hexapod has 18 Tower Pro MG996R servos and 2 Power HD-1900A servos. The Tower Pro MG996R is larger and has much higher stall torque, so we use them to drive the legs of the hexapod. The Power HD-1900A is smaller, so we use them to drive the clip movement of the hexapod.
2.2.3.2 Control Principle
Pulse Width Modulation (PWM) is used to control servos. PWM is a technique used to encode a message into a pulsing signal, and it is commonly used to control electrical devices like servos. [2.4] When a servo receives pulses of different widths, it rotates to different positions. Our servos rotate from 0 degrees to 180 degrees upon receiving pulses with durations from 500,000 to 2,500,000 ns. One Raspberry Pi has 26 General Purpose Input & Output (GPIO) pins; 20 GPIOs are assigned as PWM outputs to control servos.
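This degree-to-pulse mapping can be sketched as a small helper (a linear interpolation over the range described above; the function name is illustrative and not from the project code):

```cpp
#include <cmath>

// Map a servo angle in degrees to a pulse duration in nanoseconds,
// following the 0-180 degree -> 500,000-2,500,000 ns range described above.
long degreeToPulseNs(double degree) {
    double pulse = (degree / 180.0) * 2000000.0 + 500000.0;
    return std::lround(pulse);
}
```

For instance, 90 degrees sits at the midpoint of the range, i.e. a 1,500,000 ns (1.5 ms) pulse.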
2.2.4 Accessories The hexapod has 3 main accessories to make it more intelligent.
2.2.4.1 Speaker A speaker is connected to the 3.5mm jack of the Raspberry Pi. The eSpeak Linux command is used to control the speaker to speak the words we want. eSpeak is a compact open-source software speech synthesizer for English and other languages, for Linux and Windows. eSpeak uses a "formant synthesis" method, which allows many languages to be provided in a small size, and it is available as a command line program (Linux) to speak text. [2.5] The default audio output for the Raspberry Pi is the 3.5mm headphone jack. We made a connector between the speaker and the 3.5mm jack so that sounds can be played through the speaker.
2.2.4.2 Web Camera
MJPG-streamer software is used to control the web camera. MJPG-streamer takes JPGs from Linux-UVC compatible webcams, the file system or other input plugins and streams them as M-JPEG via HTTP to web browsers, VLC and other software. [2.6] After installing the software on WRTnode and connecting the camera to WRTnode via USB port, we can launch a video streaming server. The video captured by the camera can then be accessed at “http://IP:PORT/?action=stream”, where IP is the IP address of WRTnode and PORT is the server port, e.g. “http://192.168.4.1:9090/?action=stream”.
2.2.4.3 Ultrasound Sensor
Ultrasound Sensor is used to detect the distance between the hexapod and an obstacle ahead. Once it detects a distance smaller than a certain threshold, it can signal the robot to avoid crashing into the obstacle.
There are 4 pins in an Ultrasound Sensor:
● two pins for power supply (5V and ground)
● one GPIO pin for trigger
● one GPIO pin for echo
2.2.5 Electronic Components A number of electronic components are needed to power up and connect the abovementioned components.
2.2.5.1 Battery We used two types of battery: a Xiaomi battery for the microcomputers and power relays, and a Li-Po battery for the servos. We intentionally separate the microcomputers and servos into two different power sources for the following reasons: 1. The Xiaomi battery provides a stable 5V and is user-friendly, making it a perfect power source for the microcomputers. 2. The Xiaomi battery can only output around 2.1A, which is far from enough for the servos. 3. Separating the power sources to the microcomputers and servos enhances power stability.
2.2.5.2 Power regulator As the LiPo battery provides a voltage too high for the servos, we need power regulators to step down the voltage.
2.2.5.3 Power Relay A power relay is like an electronic switch: it can switch power on or off upon receiving an electronic signal. By using it, we can control the power supply to the servos from the program efficiently.
2.2.5.4 Connectors and Wires We need a lot of wires and connectors to connect the abovementioned components.
2.2.6 Kinematic Algorithm
Kinematic Algorithm is regarded as one of the most important components of a hexapod. It is implemented in the C++ program to tell the hexapod how to move naturally like a spider. The details of the Kinematic Algorithm will be discussed in Section 4.
2.2.7 Mobile Application Controller
It is a user-end component. It allows users to control the hexapod easily and efficiently via XML-RPC calls. XML-RPC is a remote procedure call (RPC) protocol which uses XML to encode its calls and HTTP as a transport mechanism [2.7]. An XML-RPC server is launched on the Raspberry Pi, and the server provides a list of method calls. The controller can command the hexapod easily by calling different methods. The details of the Mobile Application Controller design will be discussed in Section 5.
2.3 Brief Overview - How all things work together Step 1: Print the legs, the body of the hexapod and the holders for the other abovementioned components with a 3D printer.
Step 2: Assemble the hexapod with all components.
Step 3: Implement the kinematic algorithm and other control functions in a C++ program.
Step 4: Launch the program on the Raspberry Pi as a server to receive commands from the controller. Step 5: Connect the controller to the hexapod via WiFi. Step 6: Send XML-RPC requests from the controller to the server to instruct the hexapod to behave.
2.4 Circuit Design A hexapod consists of many electronic components. Below is a complete circuit diagram illustrating how all components are connected.
Points to be noted: 1. Connector 0, Connector 1 and Connector 2 have to be common-grounded by wire Ground 0 and wire Ground 1. 2. Two power regulators are used to reduce the current load on each regulator. 3. The two signal wires from the Raspberry Pi to Relay 0 and Relay 1 are omitted. The program controls the Raspberry Pi to send signals to the relays to switch them on/off.
2.5 C++ Program Design 2.5.1 Introduction The design approach is object oriented. There are 9 objects in the program:
Each object is dedicated to specific jobs. We will go through them one by one.
2.5.2 Main Object It initializes other objects and controls the action of the robot.
After initializing all objects, Main launches the XML-RPC server in a new thread, while a while loop in the original thread keeps checking the action variable and asks Kinematic to perform the corresponding action. For example, say Main finds that the current action is MOVE_FORWARD. Main asks Kinematic to perform the move_forward function once. After the action finishes, Main checks the current action again, and so on.
To ensure both the original thread and the XML-RPC server thread reference the same action variable, the shared memory technique is used. Shared memory is memory that may be simultaneously accessed by multiple processors with an intent to provide communication among them or avoid redundant copies [2.8]. Declare shared memory:
const key_t ipckey = 24568;
const int perm = 0666;
size_t shmSize = 4096;
kine_action_shmId = shmget(ipckey, shmSize, IPC_CREAT | perm);
Retrieve shared memory:
int *pointer = (int *) shmat(kine_action_shmId, NULL, 0);
*pointer = action;
2.5.3 GPIOClass Object It is responsible for controlling GPIOs. Steps to control a GPIO: 1. Export a GPIO: write the GPIO number to /sys/class/gpio/export 2. Set the direction of the GPIO: write in/out to /sys/class/gpio/gpioXX/direction where XX is the GPIO number. 3. Read or write the value of the GPIO from/to the file /sys/class/gpio/gpioXX/value where XX is the GPIO number. 4. Unexport a GPIO: write the GPIO number to /sys/class/gpio/unexport
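The sysfs steps above can be sketched as small file-writing helpers. The sysfs root is a parameter here purely for illustration and off-robot testing; on the robot it would be "/sys/class/gpio". Helper names are illustrative, not the project's GPIOClass API.

```cpp
#include <fstream>
#include <string>

// Write a value to a sysfs-style file; returns false if the file
// cannot be opened (e.g. the GPIO was never exported).
bool writeSysfs(const std::string &path, const std::string &value) {
    std::ofstream f(path.c_str());
    if (!f.is_open()) return false;
    f << value;
    return true;
}

// Step 1: export a GPIO by writing its number to <root>/export.
bool exportGpio(const std::string &root, const std::string &num) {
    return writeSysfs(root + "/export", num);
}

// Step 2: set direction by writing "in"/"out" to <root>/gpioXX/direction.
bool setDirection(const std::string &root, const std::string &num,
                  const std::string &dir) {
    return writeSysfs(root + "/gpio" + num + "/direction", dir);
}
```

Reading a value (step 3) is the mirror image with std::ifstream on the gpioXX/value file.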
2.5.4 Point Object It turns an (x,y,z) coordinate into an object. It also calculates a new point from an existing point by the rotation matrix algorithm. Consider the right diagram and say (x,y) is the original coordinate of the endpoint of a leg in the x-y plane. We want the endpoint to go to (x’,y’) by rotating θ degrees anti-clockwise. We can calculate the new coordinate by x’ = x·cosθ − y·sinθ, y’ = x·sinθ + y·cosθ.
But in our case, it is more complicated as a leg consists of 3 joints. We call them coxa (red), femur (blue) and tibia (green). (For details of the leg joints and the hexapod coordinate system, please refer to 4.1.4 and 4.3.1 respectively.)
Let’s imagine viewing a leg like the above graph. We want to calculate the new coordinate (y’,z’,x’) after rotating the femur joint A degrees anti-clockwise. y’ = (y-k)cosA - z·sinA + k z’ = (y-k)sinA + z·cosA x’ = x The Point Object uses this methodology to implement the coxa_rotate, femur_rotate and tibia_rotate functions.
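A minimal sketch of such a rotation helper, assuming angles in radians and an illustrative Point3 struct (the project's Point class and function names differ; the trigonometry is the same as the formulas above):

```cpp
#include <cmath>

// Illustrative stand-in for the project's Point object.
struct Point3 { double x, y, z; };

// Rotate a point by A radians anti-clockwise about the femur axis,
// which is offset by k along y, following:
//   y' = (y-k)cosA - z sinA + k,  z' = (y-k)sinA + z cosA,  x' = x
Point3 femurRotate(Point3 p, double A, double k) {
    Point3 r;
    r.y = (p.y - k) * std::cos(A) - p.z * std::sin(A) + k;
    r.z = (p.y - k) * std::sin(A) + p.z * std::cos(A);
    r.x = p.x;
    return r;
}
```

With k = 0 this reduces to the plain 2D rotation shown earlier; a 90-degree rotation sends (y,z) = (1,0) to (0,1).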
2.5.5 ServoController Object It aims at passing signals to Raspberry Pi pins to control servo rotation.
2.5.5.1 How to convert degrees into PWM signal? As mentioned before, our servos can rotate from 0 degrees to 180 degrees by receiving pulses with durations from 500,000 to 2,500,000 ns. The ServoController calculates the corresponding pulse duration for a given degree (servos are identified by predefined indexes).
double tmp = degree / 180.0;
double pulse;
pulse = tmp * 2000000.0 + 500000.0;
int roundedPulse = round(pulse);
this->setPulse(index, roundedPulse);
The setPulse function then writes the pulse duration to the file
“/sys/class/pwm/gpio_pwm:XX/duty_ns” to generate the signal for the corresponding servo, where XX is a GPIO number. 2.5.5.2 How to convert program calculated degrees to servo actual degrees?
The coordinate system in the program may be reversed relative to the servo itself, as the following diagram shows:
Besides, a servo's actual rotation may differ from our expectation. Say we need to input 87 degrees in order to make a servo actually move to 90 degrees. The (-3) degree difference is the offset. So before asking the servos to rotate, we need to calibrate the degrees first.
Pseudo code:
given offset, isReversed
if isReversed
    degree -= offset
    degree = 180 - degree
else
    degree += offset
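The calibration step can be written as a small function (the name calibrate is illustrative; the logic follows the pseudo code above):

```cpp
// Apply the per-servo offset, and mirror the angle when the servo's
// direction is reversed relative to the program's coordinate system.
double calibrate(double degree, double offset, bool isReversed) {
    if (isReversed) {
        degree -= offset;          // undo offset first
        degree = 180.0 - degree;   // mirror across the 0-180 range
    } else {
        degree += offset;
    }
    return degree;
}
```

With the example from the text, offset = -3: a requested 90 degrees becomes 87 in both the normal and reversed cases.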
2.5.6 Kinematic Object It calculates the mapping from hexapod movements to servo degrees with the help of the leg inverse kinematic and body inverse kinematic algorithms. (For details of the theory of the kinematic algorithms, please refer to Section 4.) It can also control the speed of movement.
2.5.6.1 Implementation of Leg Inverse Kinematic Algorithm Each leg has its own coordinate system. Given an endpoint, with the mathematical leg inverse kinematics in section 4.1.4, we can calculate the servo degrees required to drive the leg to that endpoint.
2.5.6.2 Implementation of Body Inverse Kinematic Algorithm There is one universal coordinate system for the whole hexapod and there are 6 variables to describe a body center change. Three variables are for body translation (please refer to 4.1.3.1): posX, posY and posZ. Three variables are for body rotation (please refer to 4.1.3.2): rotX, rotY and rotZ. With them, we can calculate all legs' new endpoint coordinates when the body center is changed.
Consider the above graph. There are two coordinate systems in the graph, X1-Y1 and X2-Y2. (For details of the coordinate systems of the hexapod, please refer to 4.3.1.) (0,0) (red point) is the original body center of the hexapod. (x,y,z) (blue point) is the original coordinate of an endpoint of a leg in the X2-Y2 system. The blue point in the X1-Y1 system should be (x+h,y-k,z). Now, say we want the body center to shift to the right to a new coordinate (posX, 0) in X1-Y1. Then the new coordinate for the blue point should be (x+h-posX,y-k,z) in the X1-Y1 system, i.e. (x-posX,y,z) in the X2-Y2 system. Similarly, we can get the 5 remaining legs' endpoint coordinates when the center of the hexapod shifts to (posX, 0). Then each endpoint coordinate, in its leg's coordinate system, is passed to the leg inverse kinematics to obtain the corresponding servo degrees.
The above example only shows the effect of posX. The effects of posY and posZ can be handled in a similar manner, while the effects of rotX, rotY and rotZ can be handled by the rotation matrix which will be discussed in section 4.1.3.3.
2.5.6.3 How does the hexapod walk in a smooth gait?
First, we need to predefine a sequence of position lists. By the Leg Inverse Kinematic Algorithm, we can obtain all the servo degrees required for the different positions. The degree lists are then passed to the servo controller to perform the movements. For example, we predefine the following sequence of position lists. Each list has 6 points as the hexapod consists of 6 legs: Sequence 1: [point A1, point B1, point C1, point D1, point E1, point F1] Sequence 2: [point A2, point B2, point C2, point D2, point E2, point F2] Sequence 3: [point A3, point B3, point C3, point D3, point E3, point F3] Sequence 4: [point A4, point B4, point C4, point D4, point E4, point F4] By the Leg Inverse Kinematic Algorithm, we get a sequence of degree lists. Each list has 18 degrees as the hexapod has 18 servos: Sequence 1: [degree 1A, degree 2A, degree 3A, ..., degree 18A] Sequence 2: [degree 1B, degree 2B, degree 3B, ..., degree 18B] Sequence 3: [degree 1C, degree 2C, degree 3C, ..., degree 18C] Sequence 4: [degree 1D, degree 2D, degree 3D, ..., degree 18D] Then we pass those degree lists to the servo controller to perform the movements.
Pseudo Code:
positionList, sequenceDegreeList
Kinematic->leg_kinematic(positionList, sequenceDegreeList)
for (i = 0; i < sequenceDegreeList.length; i++)
    ServoController->runAllServo(sequenceDegreeList[i])
2.5.6.4 How does the hexapod move with different altitudes of the center?
Like the above pictures, the hexapod can move in different altitudes.
Say the loop of endpoints of a leg to perform move forward is (A->B->C->A->...).
It is obvious that the coordinates of A, B and C should change with the altitude. If we predefined a sequence of position lists for each altitude, the outcome would be hard-coded and not general enough. So we need a general approach to calculate the coordinates of A, B and C precisely after changing the altitude of the hexapod.
Point R is the reference endpoint. The initial coordinate of R is (0, coxa length + femur length, tibia length). It changes with the altitude: when the altitude is changed, it is re-calculated by the Body Inverse Kinematics. Points A, B and C then make use of point R to find their coordinates.
With the help of Point Object functions (mentioned in 2.5.4), point A, B, C can be re-calculated when the altitude is changed. pointA = pointR.coxaRotate(degreeA) pointB = pointR.femurRotate(degreeB) pointC = pointR.coxaRotate(degreeC)
As a result, the hexapod can move at different altitudes without hard-coding different coordinate sets.
2.5.6.5 How does the hexapod move in different speeds? The hexapod can move in a range of speeds. The methodology is to partition the movement (here we assume that the servo can rotate to any degree in a negligible time). Say a servo needs to run from 0 degrees to 90 degrees to perform a movement, and you want the duration of this movement to be 0.5 seconds. Then:
partition = (90-0) * delay / duration, where delay is the time to wait before performing the next rotation. For example, if delay is set to 0.1 second, then partition = 18 degrees.
When the servo is going to rotate from 0 degrees to 90 degrees, it first waits 0.1 second, then rotates to 18 degrees, and so on until it reaches 90 degrees. The total time elapsed = 0.1 * (90/18) = 0.5 seconds, which meets our duration requirement. By setting different durations, the program calculates the corresponding partitions so that we can control the movement speed of the hexapod.
Pseudo Code:
partition = abs(toDeg - fromDeg) * delay / duration
numOfLoop = duration / delay
currDeg = fromDeg
for (i = 0; i < numOfLoop; i++)
    if currDeg < toDeg
        currDeg = min(currDeg + partition, toDeg)
    else if currDeg > toDeg
        currDeg = max(currDeg - partition, toDeg)
    sleep(delay)
    ServoController->runServo(currDeg)
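The speed-partitioning approach above can be exercised off-robot as a small sketch that records the intermediate degrees instead of sleeping and driving a servo (names are illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Generate the intermediate target degrees for a move from fromDeg to
// toDeg, stepping every `delay` seconds over `duration` seconds total.
std::vector<double> partitionMove(double fromDeg, double toDeg,
                                  double delay, double duration) {
    double partition = std::fabs(toDeg - fromDeg) * delay / duration;
    int numOfLoop = (int) std::lround(duration / delay);
    std::vector<double> steps;
    double currDeg = fromDeg;
    for (int i = 0; i < numOfLoop; i++) {
        if (currDeg < toDeg)
            currDeg = std::min(currDeg + partition, toDeg);
        else if (currDeg > toDeg)
            currDeg = std::max(currDeg - partition, toDeg);
        // on the robot: sleep(delay); ServoController->runServo(currDeg);
        steps.push_back(currDeg);
    }
    return steps;
}
```

For the 0-to-90-degree, 0.5-second example with delay = 0.1, this yields five steps of 18 degrees each.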
2.5.7 UltrasoundSensorController Object It switches the sensor on or off and calculates the distance between the sensor and the obstacle ahead. Steps to calculate the distance: 1. Write “1” (high state) to the trigger pin, wait 10 us, then write “0”, to trigger the sensor to send out an ultrasonic burst. 2. The sensor is then ready to detect the reflected burst. Whenever it detects a reflected burst, it sets the echo pin to “1” (high state). 3. Calculate the duration (T), in microseconds, for which the echo pin remains in the high state. 4. Calculate the distance (in centimeters) of the object by T/58.
trig_gpio->setVal("1");
usleep(10);
trig_gpio->setVal("0");
string echoVal;
struct timeval tv_start;
struct timeval tv_end;
int start;
int end;
while ((echoVal = echo_gpio->getVal()) == "0") {
    gettimeofday(&tv_start, NULL);
    start = tv_start.tv_sec * 1000000 + tv_start.tv_usec;
}
while ((echoVal = echo_gpio->getVal()) == "1") {
    gettimeofday(&tv_end, NULL);
    end = tv_end.tv_sec * 1000000 + tv_end.tv_usec;
}
double distance = (end - start) / 58.0;
return distance;
2.5.8 Speak Object It controls the speaker to speak a given string.
void Speak::speakOnce(string s) {
    string tmpS = "espeak \"XXX\" -p100 -s100 --stdout | aplay -D \"default\"";
    popen(tmpS.replace(8, 3, s).c_str(), "r");
}
The popen function is used to call the Unix command directly from the C++ program. It responds fast and does not block the program thread.
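The placeholder substitution can be sketched in isolation; buildSpeakCommand is an illustrative name, and the command string is the one shown above, returned instead of passed to popen() so it can be inspected:

```cpp
#include <string>

// Build the espeak shell command the same way Speak::speakOnce does,
// replacing the "XXX" placeholder (at index 8, length 3) with the phrase.
std::string buildSpeakCommand(const std::string &s) {
    std::string tmpS =
        "espeak \"XXX\" -p100 -s100 --stdout | aplay -D \"default\"";
    return tmpS.replace(8, 3, s);
}
```

Note the hard-coded index 8 points at the X of "XXX" in the template, so the phrase lands inside the quotes.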
2.5.9 PowerClipController Object It controls the action of the hexapod clip.
It has only 2 functions: one for releasing the clip and another for tightening it. Both are achieved by sending PWM signals to the corresponding servos.
2.5.10 XmlrpcServer Object It is a server running in an independent thread to receive commands from the controller continuously. Whenever it gets a request, it accomplishes it by asking other components to perform the corresponding actions.
2.5.10.1 Methods interface for the controller The implementation of the methods is completely invisible to and isolated from the controller. All methods only require simple primitive-type data as arguments, like int, double and string. It
helps code abstraction and maintainability. The server provides a comprehensive list of methods so that the controller can fully control the hexapod easily. There are 4 groups of methods to control the hexapod: Power Control, Accessories Control, Servo Control and Kinematic Control:
Power Control:
● servo_power_off: shut down the power supply to the servos
● servo_power_on: resume the power supply to the servos
Accessories Control:
● ultrasound_sensor_read: read the distance between the hexapod and the obstacle ahead once
● pcc_tight_method: command the clip to be tightened
● pcc_release_method: command the clip to be released
● speak: speak out the given string
Servo Control:
● servo_run: rotate an individual servo to a certain degree
● leg_move: move an individual leg to a certain position
Kinematic Control:
● kine_set_action: change the action of the Kinematic (mentioned in 2.5.2)
● kine_set_delay: set the delay in Kinematic (mentioned in 2.5.6.5)
● kine_set_interval: set the duration in Kinematic (mentioned in 2.5.6.5)
● kine_set_gait_point_delay: set the delay between each point in a gait
● kine_set_posX: set the center of body in the x-axis
● kine_set_posY: set the center of body in the y-axis
● kine_set_posZ: set the center of body in the z-axis
● kine_set_rotX: set the body rotation about the x-axis
● kine_set_rotY: set the body rotation about the y-axis
● kine_set_rotZ: set the body rotation about the z-axis
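For illustration, an XML-RPC request body for one of these methods might look as follows. The shape follows the standard XML-RPC methodCall format; the parameter type (an integer action code for kine_set_action) is an assumption, as the actual signature is defined by the server:

```xml
<?xml version="1.0"?>
<methodCall>
  <methodName>kine_set_action</methodName>
  <params>
    <param><value><int>1</int></value></param>
  </params>
</methodCall>
```

The controller POSTs this XML over HTTP to the server on the Raspberry Pi, and the server replies with a methodResponse document.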
3 3D Printing The hexapod robot requires highly customized shapes because of the electronic components we bought. For example, we needed to customize the holders based on the size of the servos, and to make holes that fit components such as the ultrasound sensor and the speaker. In the following sections, we discuss how we use Sketchup to do simple 3D modelling (Section 3.1.1). As beginners, we can download existing designs from the web and modify them. Then we can print directly using the printer software, which performs fix-up and slicing to enable the 3D printer to print.
3.1 Basic Steps of 3D Printing 3.1.1 3D Modelling 3.1.1.1 What is 3D Modelling? It is the process of developing a mathematical representation of any three-dimensional surface of an object. The model can also be physically created using 3D printing devices. [3.1]
3.1.1.2 Why Sketchup? Sketchup’s primary usage is for civil and mechanical engineering, architectural and interior design. It allows entering exact measurements in the metric system (e.g. millimetres, centimetres) or the imperial system (inches, feet). Another 3D modelling software is Blender, whose primary usage is visual effects, interactive applications and video games. It is good at lighting effects, but its default scale uses “Blender units”. Sketchup is not only easier to use but also more suitable for accurate measurement of the objects. Therefore, we chose Sketchup.
3.1.1.3 Download objects from 3D WareHouse? To save time, we can start by searching for 3D models in the built-in 3D WareHouse, which are directly editable in Sketchup.
3.1.1.4 Download objects from Thingiverse? Thingiverse is a website to which people using different 3D modelling software can upload their designs. Usually they upload STL files, which can be imported into Sketchup but are not directly editable. To import an STL, we need to download the “Sketchup STL” extension from the Extension Warehouse.
3.1.1.5 How to edit the imported STL? To edit the imported STL, we can download the “CleanUp” extension to clean out unnecessary lines.
3.1.2 Convert to STL 3.1.2.1 Why convert to STL? We need to convert the SketchUp file (.skp) to .stl for the 3D Printing Software to read it.
3.1.2.2 How to export to STL in Sketchup? We need the “Sketchup STL” extension to allow exporting STL files.
3.1.3 Fix-up 3.1.3.1 What is fix-up and why is it needed? During fix-up, the STL to be printed is checked for errors, since an invalid structure cannot be translated directly into a 3D print. [3.2] When we click “Print” or “Export”, the model goes through fix-up and slicing.
3.1.3.2 When will we encounter fix-up errors? When there is empty space in the model, a fix-up error will occur.
3.1.3.3 How to fix the fix-up error? First, we can try the “Auto repair” option provided by the printer software.
If the problem still exists, we need to export each component and find out which one has the problem. After identifying the problematic object, we can either amend it or simply re-draw it.
3.1.4 Slicing 3.1.4.1 What is slicing and why is it needed? Slicing converts the .stl file into a g-code file by cutting the model into a series of thin layers for the printer to print. The STL file format has become the Rapid Prototyping industry's de facto standard data transmission format. STL approximates the surfaces of a solid model with triangles. For a simple model such as the box shown in figure 1, its surfaces can be approximated with twelve triangles, as shown in figure 2. [3.3]
G-Code is a language in which people tell computerized machine tools what to make and how to make it. The "how" is defined by instructions on where to move, how fast to move, and through what path to move. The slicer automatically generates the G-Code required by the printer when slicing a part. [3.4]
3.1.5 Print 3.1.5.1 How does the 3D printer print? 3D printing is a form of additive manufacturing technology where a three-dimensional object is created by laying down successive layers of material. [3.5]
3.1.5.2 Why do we need to print support? Some models cannot be printed because their top layers hang in mid-air with nothing beneath them. We can solve this problem easily by enabling the supports option before printing.
3.1.5.3 How to print objects with support? To print an object with support, we can enable the “Support” option in the printer software.
3.1.5.4 Are there any alternatives other than printing support? Yes. We can split the object into multiple parts, drill holes, and then use screws to assemble the parts. This is preferred if you want to save expensive filament and avoid the trouble of removing the support layers. We used this method to print our robot clamps.
3.1.5.5 What to do if the plastic doesn’t stick on the platform? This happens if we didn’t clean the platform properly before applying a layer of glue, so we have to make sure we clean and glue the platform after each print. Usually we only need to apply glue for large objects (covering >70% of the platform) to avoid the shrinking problem.
3.1.5.6 What to do if the nozzle is stuck? Heat up the nozzle and brush it with a metal brush.
3.1.5.7 How to prevent objects from shrinking? Clean and glue the platform.
3.1.5.8 Rotate it to the right orientation Remember the gravity problem: the model oriented as on the left cannot be printed. Rotate it to the correct orientation (right) before printing.
3.2 Design of the Robot 3.2.1 Body Design 3.2.1.1 Single layer vs. Sandwich?
This is a single-layer approach. It is a simple design, and the square holes allow the servos to be held tightly in the centre. However, the body may bend, leading to inaccurate and unstable hexapod movement.
This is a sandwich design. We need to align the upper and lower layers, which requires extra design work, and it is also heavier. The benefit is that it will not bend easily, because both layers act as a support for the whole body. This design also gives extra space in the middle to carry more complex electronics.
3.2.1.2 Embedded electronics design I worked closely with the architecture and kinematics teammates to embed the electronics in the body. Thanks to the flexibility of 3D printing, we can firmly hold the different parts together, and the design can also be reproduced easily.
3.2.2 Clamp Design The clamp is a difficult part of the robot. It has to hold two servos tightly between the two body layers. The challenge is to make a clamp design that is robust and space-efficient.
3.2.2.1 Loose Clamp Design At the beginning, we made the design by intuition, which left a lot of space between the servos. It had to be printed with support, which wastes filament and is time-consuming to remove. The most fatal design problem is that it breaks easily.
3.2.2.2 Tight Clamp Design Then we designed a clamp that doesn’t require support and holds the servos closer to each other.
3.2.2.3 Tight + Space-efficient Design This is the final design of the clamp. It has the closest possible spatial arrangement of the servos without needing to print support.
3.2.3 Leg Design 3.2.3.1 Short leg joint (Femur) Collaborating with the whole team, we tested and designed a shorter joint to allow a larger force (moment = force * distance), as suggested by Arack and Terry (detail can refer to ). This gives a more stable walking performance.
3.2.3.2 Legs design We gradually refined the leg into a more beautiful design.
3.2.3.3 High-density leg joint We print the leg joint with 50% density (the default is 10%) to avoid bending during walking.
3.2.3.4 Both-side handle of the legs We found that the robot was a bit unstable with the one-side joint design: it would shake and bend while walking. Therefore, we designed a cover to make the whole joint movement more robust.
3.2.4 Design Tips 3.2.4.1 Design is about trials A difference of one millimetre in diameter or thickness can lead to failure. Multiple failures are expected.
3.2.4.2 Always draft first We should always measure the actual length and draft on paper first, so that we can type the exact sizes directly into Sketchup. Since printing takes a long time, we will waste time adjusting if we don’t know the exact measurements.
3.3 Other important note 3.3.1 Assembly also takes time This robot needs around 150 screws to assemble. It is recommended to buy a drill and confirm the design before assembling all the parts.
3.3.2 Collaboration with the team is needed There is no doubt that the design of the whole robot requires the whole team’s continuous effort. Designing a robot suitable for the kinematic movement takes a long time to test, with issues ranging from the length of the legs to the look and feel of the robot. Always print out prototypes to verify with the team.
3.3.3 Be Patient with the Printer We encountered quite a number of issues with the printer, as mentioned in the printing steps above. It took the first two months before things printed smoothly, and there is usually no direct online solution available. This job trained me to be more patient: we need to stay calm to find the root cause of a problem, and be creative and proactive in finding the solution.
3.3.4 Filaments are expensive We used 5 filaments at $300 each, or $1500 in total. Print prototypes with the least density and only the minimal critical parts.
3.3.5 Search for extensions There are many useful extensions that can save you a lot of time, for example the CleanUp extension.
3.4 3D Printing Conclusion 3D printing is a fast and cheap solution for producing small quantities, and it can print almost any shape we want. It is a great way to verify ideas. I believe 3D printing technology will become cheaper, faster and more accurate in the coming years, so there is a lot of potential for future enhancement.
4 Kinematics The primary goal of this project is to make the hexapod robot able to walk. To achieve this, there are two components that we have to study and implement, then combine so that they work together: the first is Kinematics and the second is the Gait.
4.1 Kinematics Algorithm Kinematics, a branch of Mechanics, is the study of the motion of objects. In our case, what concerns us most is the Kinematics of the legs and the body of the hexapod robot. Kinematics is important not only because we need to know how the legs and the body should move, but also for keeping the hexapod robot balanced so that it does not fall during any body motion. In our robot architecture design, we make the robot move by controlling the servos directly: we send a signal to a servo and it turns to the desired angle (for the detailed design and control please refer to 2.2.3 Servos by Arack). In this way, we only need to deal with the joint parameters (servo angles) when implementing the Kinematics solution. To keep the robot balanced when moving, we have to consider its centre of mass, which should always stay above and within the centre of pivot, i.e. the area bounded by the edges where the end-effectors (endpoints of the legs) touch the ground. The Kinematic problem is that we need to know exactly where the legs are and where they need to be next when walking. There are two approaches to solving the Kinematics problem, Forward Kinematics and Inverse Kinematics, both commonly used in real-life robotics products. In the following, we study and discuss the ideas behind the two Kinematics theories.
4.1.1 Forward Kinematics The basic idea of Forward Kinematics is that, given all the joint parameters (servo angles) and the lengths of all the links (robot arms), we can compute the exact coordinate (with respect to the Origin of the leg) of its end-effector (the endpoint of the leg); for details of the coordinate system of the robot please refer to 4.3.1 Coordinate System. With Forward Kinematics we always have exactly one solution, and there are two different approaches to solving the problem.
4.1.1.1 General Approach A general approach to solving the problem for any robot arm of n links uses a matrix called the D-H Matrix. It was derived by Denavit and Hartenberg in the 1950s, who found that the
representation of a general transformation of a link (between two joints) requires four parameters, known as the D-H parameters. The coordinate transformation is represented as follows:
where d is the distance measured between the two perpendiculars along the joint axis, θ is the joint angle about the vertical axis measured between the two perpendiculars, r is the distance measured between the two joint axes along the mutual perpendicular, and α is the relative twist (angle) between the two joint axes measured about the mutual perpendicular. d, θ, r and α are the D-H parameters. The four operations are then represented as follows:
And together they give the following final representation:
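For reference, under the classic D-H convention the four elementary operations (a rotation and a translation along the joint axis, then a translation and a twist along the common normal) compose into the link transformation:

```latex
{}^{n-1}T_{n} \;=\; \operatorname{Rot}_{z}(\theta)\,\operatorname{Trans}_{z}(d)\,\operatorname{Trans}_{x}(r)\,\operatorname{Rot}_{x}(\alpha)
```

Multiplying these transformations for all n links gives the pose of the end-effector with respect to the base frame.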
However, this general solution is not trivial and is seldom the most computationally efficient, as in practice it needs more specialization to fit the hexapod robot. We now look at another, simpler approach.
4.1.1.2 Analytical Approach When there are only two or three links in a robot arm, it may be possible to solve the problem analytically using geometry and trigonometry. The simple example below demonstrates this idea clearly. The diagram below shows a robot arm lying in a 2D plane. There are two links, Arm1 and Arm2, and two joint parameters, θ1 and θ2. The red point is the end-effector, where (x, y) is our desired location.
By using simple trigonometry, x = L1 cos θ1 + L2 cos(θ1 + θ2) and y = L1 sin θ1 + L2 sin(θ1 + θ2), so we have the exact coordinate (x, y). But having the exact coordinate (x, y) does not really help in our case. We want the hexapod robot to walk, which means we want the end-effector to move to a specific coordinate. A change of coordinate of the end-effector usually cannot be achieved by changing a single joint parameter. This makes Forward Kinematics not very useful, as we would have to input joint parameters one by one and test whether the end-effector reaches the desired coordinate. Therefore, we want the opposite approach, and that is Inverse Kinematics.
4.1.2 Inverse Kinematics Inverse Kinematics is the inverse of Forward Kinematics. The idea is that, given the desired end-effector coordinates and the lengths of all the links, we can compute the joint angles required to achieve the goal. Again, there are two different approaches to solving the problem using Inverse Kinematics.
4.1.2.1 General Approach The general idea is that, given the current coordinate and the desired coordinate of the end-effector, we start with the joint nearest to the end-effector. We rotate that joint so that the end-effector moves toward the required coordinate, then repeat with the next joint in an iterative manner until the base joint is rotated. This idea can be implemented using the Jacobian matrix. However, in our case this method is not what we focus on, as the solution is not as trivial as the analytical approach and a matrix-and-vector implementation does not give satisfying performance.
4.1.2.2 Analytical Approach As with the analytical approach used for Forward Kinematics, we can solve the problem using trigonometry. Below is a 2D-plane example: the green dot is the desired coordinate where the end-effector is located, L1 is the length of the link and γ is the required angle.
Given the desired coordinate (x, y), the required angle is computed by γ = tan−1(y / x)
4.1.3 Body Kinematics We now know there are two Kinematics theories and two approaches to solving the Kinematics problem; however, knowing the equations alone does not achieve our goal. We need to apply the Kinematics equations to the body of the hexapod robot to achieve all the desired body motions. Although our primary goal is to make the hexapod robot walk, there are two further body motions we need to consider. These motions help balance the body: Body Rotation and Body Translation.
4.1.3.1 Body Translation Body Translation is a shift of the centre of the body horizontally in the X-Y plane or vertically along the Z-axis without changing the coordinates of any of the end-effectors. The diagrams below show Body Translation along the three axes.
We can see that the actual positions of all the leg endpoints remain the same in the two diagrams; the centre of the body shifts, so there is a relative change in the leg coordinates, but the actual positions remain unchanged. The approach to Body Translation is straightforward: we pass the change of coordinate of the body (with the Origin of the body taken at its centre of mass) to the Leg Kinematics. Say we want the robot to perform a Body Translation along the Z-axis: the value of z can be passed directly to the Leg Kinematics of the six legs, and all six legs will move together with the same displacement along the Z-axis, achieving the desired Body Translation.
4.1.3.2 Body Rotation Similar to Body Translation, Body Rotation is a body motion that changes only the relative positions of the legs, not their actual positions. There are three kinds of Body Rotation, and the diagrams below show them along the three different axes. a. Roll (Rotation about Y-axis) b. Yaw (Rotation about Z-axis) c. Pitch (Rotation about X-axis)
For all three Body Rotations, we can see that the endpoints of all the legs remain at the same actual positions throughout the motion. There are multiple uses of Body Rotation and Translation; a simple one is a demo of the hexapod robot “dancing” to the beat of music using these body motions. More practically, these Body Kinematics provide the foundation for achieving a walking robot. Unlike Body Translation, we do not have a direct and simple approach to Body Rotation. What we need is the Rotation Matrix.
4.1.3.3 Rotation Matrix One solution to these Body Rotations is the Rotation Matrix, a matrix used to perform a rotation in 3D space. There are three basic rotations about the three axes, shown as follows.
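The three standard basic rotation matrices, one per axis, are:

```latex
R_x(\theta) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix},\quad
R_y(\theta) = \begin{pmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{pmatrix},\quad
R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}
```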
where θ is the angle rotated. In our case, we want a general equation representing the overall rotation of the body. Such an equation is given as follows:
where α, β, γ are the yaw, roll and pitch angles respectively (as we take the X-axis as the axis along the head-tail of the body and the Y-axis as the left-right axis; for details please refer to 4.3.1 Coordinate System). By expanding this equation we have the following:
So, given the current coordinate and the desired yaw, roll and pitch angles, we can compute the new desired coordinate by the following derived equation.
where x′, y′, z′ is the new desired coordinate, and x, y, z and R are the previous coordinate and the Rotation Matrix defined above. We can take the body rotation Yaw as a simple example of how to solve the Body Kinematics problem. Say we want the hexapod robot to yaw 30 degrees: the robot needs to move its legs to new positions. The new desired coordinate of each leg can easily be found using trigonometry, sin(30) or cos(30) in this case, or more precisely using the equation above. By passing the new desired coordinates to the Leg Kinematics, the Leg Kinematics calculates all the joint angles required to move the legs to the new coordinates. Thus the Body Rotation Yaw is achieved. We will look into how the Leg Kinematics works in the following.
4.1.4 Leg Kinematics Leg Kinematics is needed for achieving all the desired body motions. Its working principle is that, given any desired coordinate of the end-effector, the Leg Kinematics calculates all the joint parameters required to reach that position. We start solving our leg kinematics problem by analysing the proposed design. Our hexapod robot has six legs; each leg has three links and three joints (servos), of which two joints lie in the vertical plane and one in the horizontal plane. The three links are named Coxa, Femur and Tibia. The three joint parameters are named γ (Gamma, horizontal), α (Alpha, vertical) and β (Beta, vertical). The graph below on the left is a sketch of one of the legs.
Now we take a look at the graph on the right, which demonstrates how we solve the problem using Inverse Kinematics. First, we solve the joint parameter γ by viewing the problem from above. The colours blue, red and green represent the three links, and the green dot represents the position of the end-effector. Similar to the example in 4.1.1.2 Analytical Approach, we obtain the required angle γ and the perpendicular length L1 from the end-effector to the base of the link easily by using trigonometry and the Pythagoras Theorem as follows: γ = tan−1(y / x), L1 = √(x² + y²). Then we take the front view of this leg as shown below. The blue link is the Coxa, the red one the Femur and the green one the Tibia. To make the calculation easier, we divide the required joint parameter α into α1 and α2.
We first measure the lengths of all three links, Coxa, Femur and Tibia, and the height Z_offset. The length L can easily be found using the Pythagoras Theorem:
L = √((L1 − Coxa)² + Z_offset²)
Then, together with the Cosine Law, we can find the two required joint parameters easily:
α1 = cos−1(Z_offset / L), α2 = cos−1((Femur² + L² − Tibia²) / (2 × Femur × L)), β = cos−1((Femur² + Tibia² − L²) / (2 × Femur × Tibia)), and α = α1 + α2.
So, using all these equations, when we input a desired coordinate (x, y, z), the implemented function outputs the required joint angles α and β. However, since our derived equations may give one or more solutions, or no solution, for the inputted coordinate, we have to set some limits with respect to the physical constraints of the robot. For instance, the total physical length of the links Coxa, Femur and Tibia is the limit to which the leg can stretch at its farthest. We now have the solutions for Body Kinematics and Leg Kinematics, but the hexapod robot still would not be able to walk without the last major component: the solution of the Gait.
4.2 Locomotion
After studying and designing the Kinematics solution, we come to the study of the Gait. The reason we need Gait study is that the Kinematics solution only solves the problem of
finding where the leg should be and how it gets there, by determining the required joint angles. It does not solve how the robot should walk. Merely moving all the legs from one coordinate to the desired coordinate does not achieve the desired body movement: when we give a new desired coordinate to the Leg Kinematics, it only solves the joint parameters required to move the legs to that position. For instance, if we want the robot to walk forward, the legs will only shift forward, without the lifting and placing of a natural walking motion. Gait study is the solution to this problem. Gait study is the study of the pattern of movement of the limbs of animals and insects. Most animals use different gaits; we will study some of them and decide which to implement so that the hexapod robot can walk naturally. As the hexapod robot looks very much like a spider, there are a few common insect gaits we can start with: the Alternating Tripod Gait and the Metachronal Gait.
4.2.1 Alternating Tripod Gait The Tripod Gait is the fastest gait found in insects. When walking, the insect always has three legs on the ground (two on one side and one on the other), forming a tripod for balance. That means the insect moves the other three legs, also in the form of a tripod, at the same time. The figure below shows a hexapod walking in the Tripod Gait.
4.2.2 Metachronal Gait The Metachronal Gait is the slowest gait found in insects. When moving, the insect moves only one leg at a time, keeping the other five in contact with the ground.
The graph above shows the comparison between a) the Metachronal Gait and b) the Tripod Gait. Both gaits give great stability, as the centre of mass stays consistently within the centre of pivot during the leg movements.
4.3 Design & Implementation After going through the basic ideas of Kinematics and Gait, we now come to the design and implementation of these solutions. To achieve all the desired robot movements, we need to implement the Gait, Body and Leg Kinematics so that all these essential components work together. The initial idea is to first apply Inverse Kinematics to the legs and body of the hexapod robot, with the first goal of achieving all desired leg and body motions; we then implement the Gait on top of the Leg and Body Kinematics. After studying the two Kinematics theories and the two approaches to solving the problem, we decided to use Inverse Kinematics together with the Analytical Approach in this project. The reason is that Inverse Kinematics fits our needs better: we want the legs to reach a destined position without knowing in advance how it can be achieved, and Inverse Kinematics solves for the required joint angles. Besides being straightforward, the Analytical Approach also has advantages in implementation. Since it requires only simple trigonometric functions, we can implement it easily using the standard library of many programming languages. Also, trigonometric functions are fast to compute, which enhances the performance of our hexapod robot. So we decided to use Analytical Inverse Kinematics to solve the kinematics problem. To make the implementation easier to manage, we split the Analytical Inverse Kinematics solution into two parts, the Body Kinematics and the Leg Kinematics. The idea behind the split is that if we want to control the hexapod robot walking straight forward, we should consider the body motion rather than each leg separately. Say we take the centre of mass of the body as the Origin (0, 0, 0), the Z-plane as the vertical plane and
the X-Y plane as the horizontal plane: moving the robot forward, backward, left or right should then only change the X-Y coordinates. We can pass these coordinate changes to the Leg Kinematics of the six legs, resulting in changes to the coordinates of all the end-effectors, so that each leg reaches its desired coordinate. This makes the Kinematics problem easier to solve and extends the usability of these Kinematics solutions. Before we move on to the design and implementation of the leg and body, we first describe how we define the Coordinate System of our hexapod robot.
4.3.1 Coordinate System
Before we start designing the Leg and Body Kinematics solutions, the first thing we need to do is set a Coordinate System as the basis of all our design and implementation. As the Kinematics solution involves many rotation matrix equations, we want a Coordinate System that eases the implementation. After discussion with the other teammates, for ease of calculation and implementation, we decided to set the Coordinate System as follows. We define different Coordinate Systems for the Body Frame and the Leg Frame, where a Frame is an axis system. The reason is that the body and the legs are in different frames: the body of the robot has its own axis system, and every leg also has its own axis system.
4.3.1.1 Body Frame
As shown in the graph above, this is a sketch of the hexapod robot viewed from above (a vertical view of the horizontal X-Y plane). The red O is the Origin (0, 0, 0) of the Body Frame, located at the centre of the body of the hexapod robot. The distance between the Origin of a leg and the Origin of the body, indicated by the blue arrow, is called the leg offset. The two red arrows indicate the positive directions of the X and Y axes. The six legs of the robot, named leg0 to leg5, are also indicated in the graph: leg0, leg1 and leg2 belong to the negative Y-axis (the right side of the robot), while leg3, leg4 and leg5 belong to the positive Y-axis (the left side). Whether a leg lies on the positive or the negative side is very important, as this difference directly affects the calculation with the rotation matrix and the implementation of the equations. The graph of the robot viewed from the front (a horizontal view of the Z-Y plane) is shown below; the Z-axis is the vertical axis.
The scale of this coordinate system is taken as 1 = 1 cm for ease of design, implementation and testing.
4.3.1.2 Leg Frame After defining the Body Frame, we need to define the frame for each of the six legs. The graph below shows how we define the Leg Frame. We take the base of the leg (the end of the link Coxa that is connected to the body) as the Origin (0, 0, 0) of the leg. The green O indicates the Origin of the leg and the green arrows indicate the directions of the X and Y axes. The vertical Z-axis is not shown in the graph, but it is the same as in the Body Frame: the positive side is vertically up and the negative side down. We define this Leg Frame for all six legs.
As mentioned in 4.1.3.3 Rotation Matrix, since we use the Rotation Matrix to solve most of our Kinematics problems, we need to consider one important property of the Rotation Matrix: given a rotation angle θ, if θ is positive (e.g. θ = 90°) the direction of rotation is counterclockwise, and if θ is negative (e.g. θ = −90°) the direction is clockwise. So in our case, when designing the Gait and Kinematics solutions, we need to consider the following diagram for every Leg Frame and the Body Frame when deciding the value of the input and the direction of movement.
The above diagram shows the value of the rotation angle θ and its corresponding rotation direction in our Coordinate System when applied to the Hexapod Robot.
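To make the sign convention concrete, the effect of a positive rotation about the Z-axis can be sketched as follows. This is illustrative code, not an extract from our program:

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch: rotating a point about the Z-axis.
// A positive theta (in radians) rotates counterclockwise,
// matching the convention described above.
struct Vec2 { double x, y; };

Vec2 rotateZ(Vec2 p, double theta) {
    return { p.x * std::cos(theta) - p.y * std::sin(theta),
             p.x * std::sin(theta) + p.y * std::cos(theta) };
}
```

For example, rotating a point on the positive X-axis by +90° moves it onto the positive Y-axis, while −90° moves it onto the negative Y-axis.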
4.3.2 Implementation of Leg Kinematics
In our implementation, we implement the derived equations (please refer to 4.1.4 Leg Kinematics) directly using the ACOS and ATAN2 functions with the four measured parameters Coxa, Femur, Tibia and Zoffset. When we input a desired coordinate (x, y, z), the implemented function outputs the required joint angles α and β. As mentioned, since our derived equations may give one solution, more than one solution, or no solution for the inputted desired coordinate, we have to set some limits with respect to the physical constraints of the robot. For instance, for our first prototype (a single leg of the hexapod robot), we measured the links Coxa = 3 cm, Femur = 9.5 cm, Tibia = 16.5 cm. We take the coordinate scale as 1 = 1 cm, so for example the desired coordinate (0, 29, 0) is the physical limit to which the leg can stretch (since the total length of Coxa, Femur and Tibia is 29 cm). Besides this physical limit of the leg, we also did some fine-tuning in the implementation by providing different offsets measured after assembling the leg. Such offsets are essential because mounting the servos on the leg is never perfect, so we need to measure and test the offset of each component. However, when we tested the leg implementation, we found that the degrees calculated by the algorithm were different from what we expected to pass to the servo controls: the angles of the servos differ from the angles calculated by the Kinematics algorithm. An adjustment was made to solve the problem; for details of the problem and solution please refer to 2.5.5.2 How to Convert Program Calculated Degrees to Servo Actual Degrees by Arack.
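As an illustration of the ideas above, a minimal sketch of such a leg IK routine might look like the following. The function name, frame conventions and angle signs are assumptions for the example, not the exact project code; it checks reachability against the physical limit and solves the hip, femur and tibia angles with atan2 and the law of cosines (acos):

```cpp
#include <cassert>
#include <cmath>

// Hedged sketch of a 3-DOF leg inverse kinematics solver.
// Angles are in radians; femurAngle = 0 and tibiaAngle = pi
// correspond to a fully stretched leg in this convention.
struct LegAngles { double coxa, femur, tibia; bool reachable; };

LegAngles legIK(double x, double y, double z,
                double COXA, double FEMUR, double TIBIA) {
    LegAngles out{0.0, 0.0, 0.0, false};
    out.coxa = std::atan2(y, x);                  // hip yaw angle
    double horiz = std::sqrt(x * x + y * y) - COXA;
    double L = std::sqrt(horiz * horiz + z * z);  // femur base to foot
    // No solution if the target is beyond or inside the leg's reach.
    if (L > FEMUR + TIBIA || L < std::fabs(FEMUR - TIBIA)) return out;
    double a1 = std::atan2(-z, horiz);
    double a2 = std::acos((FEMUR * FEMUR + L * L - TIBIA * TIBIA) /
                          (2.0 * FEMUR * L));
    out.femur = a1 + a2;                          // positive points down
    out.tibia = std::acos((FEMUR * FEMUR + TIBIA * TIBIA - L * L) /
                          (2.0 * FEMUR * TIBIA)); // interior knee angle
    out.reachable = true;
    return out;
}
```

With our prototype's link lengths, the limit coordinate (0, 29, 0) mentioned above yields a fully stretched leg, while anything farther out is reported as unreachable.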
The implementation of the Leg Kinematics is thus fairly straightforward. With it in place as the basis, we move on to the implementation of the Body Kinematics.
4.3.3 Implementation of Body Kinematics
The implementation of Body Kinematics is again straightforward: we apply the equations derived in 4.1.3 Body Kinematics directly. As declared in 4.3.1 Coordinate System, we take the centre of the body as the Origin (0, 0, 0) of the Hexapod robot. However, for a neater and tidier design, we did not implement a general Body Kinematics solution. Instead, we implemented it so that it works on each leg separately, in connection with the implementation of the Leg Kinematics solution. In our implementation there are six required parameters, representing the two kinds of body motion, Translation and Rotation (Yaw, Pitch and Roll angles), and they work together with the current coordinates of the endpoints of the legs. The six parameters are: translation_X (posX), translation_Y (posY), translation_Z (posZ), rotation_X (rotX), rotation_Y (rotY), rotation_Z (rotZ). These parameters store exactly the translation along the three axes and the rotation in degrees about the three axes. Together with the current coordinates of all the end-effectors, the implemented solution returns the new desired coordinates and passes them to the Leg Kinematics solution for solving. For a detailed elaboration on these parameters please refer to 2.5.6.2 Implementation of Body Inverse Kinematic Algorithm by Arack. Again, as with the implementation of the Leg Kinematics in 4.3.2, there are physical limitations on the degree of movement that the Hexapod Robot can perform. We tested these limitations and set restrictions in our implementation according to the test results. In our first implementation, we found that the Body Kinematics solution did not really work. After reviewing the implementation, we found that the problem came from failing to consider the offset of each leg from the centre of the body.
So in our fixed implementation, for each leg we consider the current coordinate of the end-effector (with respect to the base of the leg) as well as the leg offset (the distance between the base of the leg and the Origin of the body), indicated by the blue arrow in the diagram above. The implementation of Body Kinematics together with Leg Kinematics is then able to perform all our desired movements. At this step, we finished the implementation of most of the body motion. However, the Hexapod Robot still does not know how to walk! Imagine we want to move the robot by changing, say, the X-coordinate of the centre of the body (walking straight forward/backward is the same as changing the X-coordinate): such a change would only shift the coordinates of all the endpoints of the legs by the same amount. That would produce a body X-translation but not actual walking. With only the Leg and Body Kinematics the robot does not know how to walk; what we need now is the design and implementation of a Gait, which performs the walking of the robot together with the Leg and Body Kinematics solutions.
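The per-leg fix described above can be sketched as follows. Only yaw (rotZ) is shown for brevity, and the names, frames and sign conventions are assumptions for the example; the real solution composes all three rotations:

```cpp
#include <cassert>
#include <cmath>

// Hedged sketch of the fixed per-leg body-kinematics step: the
// end-effector is expressed relative to the body centre by adding
// the leg offset, rotated and translated in the body frame, and
// then converted back to the leg frame by subtracting the offset.
struct P3 { double x, y, z; };

P3 bodyIK(P3 foot, P3 legOffset, double rotZ,
          double posX, double posY, double posZ) {
    // Foot position relative to the body centre (this was the
    // missing step in our first, broken implementation).
    double bx = foot.x + legOffset.x;
    double by = foot.y + legOffset.y;
    // Rotate about Z, then apply the body translation.
    double rx = bx * std::cos(rotZ) - by * std::sin(rotZ);
    double ry = bx * std::sin(rotZ) + by * std::cos(rotZ);
    // Back to the leg frame: subtract the leg offset again.
    return { rx - legOffset.x + posX,
             ry - legOffset.y + posY,
             foot.z + posZ };
}
```

With zero rotation and translation, the foot coordinate is returned unchanged; the result would then be fed to the Leg Kinematics solver.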
4.3.4 Implementation of Gait
In 4.2 Locomotion, we covered two different kinds of gait: the Alternating Tripod Gait and the Metachronal Gait. We decided to implement the Alternating Tripod Gait, as it is easier to design and achieves the fastest walking speed. Though the Metachronal Gait is the best for balancing the body (since only one leg is off the ground at any
moment), it is also the slowest gait, and the Alternating Tripod Gait can likewise achieve excellent balance with the correct settings. Our initial idea for implementing the gait is to predefine a loop of sequences of steps (predefined coordinates). From the graph below you can see that each leg performing the Tripod Gait actually repeats its own sequence of steps.
So in our approach, first of all we define the Natural Position of the Hexapod Robot. The Natural Position is when the Tibia is vertically aligned to the ground and the Femur is aligned horizontally with the Coxa. This Natural Position is useful for calibrating all the joint offsets and also serves as the base of the gait loop. Assume we have a predefined loop of 10 steps; each leg is then assigned to start at a different step (assigned according to the steps of the Tripod Gait shown above). Then we can simply loop the sequence of steps to achieve the desired body movement. There are three kinds of walking that we want to achieve:
a. Walking straight (forward, backward, left, right)
b. Wheeling (forward and backward right, forward and backward left)
c. Rotating (right, left)
We are going to cover them one by one in the following sections.
4.3.4.1 Tripod Walking Straight
After studying the loop demonstrated in the last section, we came up with the idea of designing two different loops for each motion. There are two loops of different phase, as the Tripod Gait works in such a way that two legs on one side and one leg on the other side move in the same phase, while the other three legs move in the opposite phase at the
same time. Consider the graph below:
For instance, leg0, leg2 and leg4 are in the same phase (green circles) and leg1, leg3 and leg5 are in the other phase (yellow circles). So when we consider just the legs on the two different sides, we have the graph below.
This graph is a draft of how the legs should move in the loop. The graph on the left shows a leg on the right side of the robot, viewed from the right side of the body. The graph on the right shows a leg on the left side of the robot, viewed from the left side of
the body. Axis directions are drawn in grey. The red dots are the endpoints of the leg, and the black arrows indicate how the endpoint of the leg should move in order to achieve the desired movement. A diagram of the actual Hexapod Robot moving its leg through the loop A → B → C is shown below.
In our design, point D is the Natural Position of the leg, point B is vertically above point D, and points A and C are horizontally aligned with point D.
4.3.4.1.1 Forward
By analyzing the graph above, we figured out that we need two different loops of steps to make this movement feasible. They are as follows:
Phase 1 (leg0, leg2, leg4): A → B → C → D → A
Phase 2 (leg1, leg3, leg5): C → D → A → B → C
The two phases are assigned to the legs as shown. Each loop is a complete loop that goes through every point stated in the graph.
4.3.4.1.2 Backward
This is basically the inverse of the forward walking sequence.
Phase 1 (leg0, leg2, leg4): C → B → A → D → C
Phase 2 (leg1, leg3, leg5): A → D → C → B → A
4.3.4.1.3 Right
This case is quite different from the two directions above. We can consider the following graphs for the design idea.
This graph shows the loops of leg1 and leg4 when walking to the right; the two legs walk in the same direction. For an overview of all the legs' movements, see below.
This graph demonstrates exactly how each leg should move in its own loop in order to walk rightwards. Deriving from this graph, we have the following loops.
Right side Phase 1 (leg1): A → B → C → D → A
Right side Phase 2 (leg0, leg2): C → D → A → B → C
Left side Phase 1 (leg3, leg5): C → B → A → D → C
Left side Phase 2 (leg4): A → D → C → B → A
4.3.4.1.4 Left
For walking leftwards, we simply invert the flow of walking rightwards.
Right side Phase 1 (leg1): A → D → C → B → A
Right side Phase 2 (leg0, leg2): C → B → A → D → C
Left side Phase 1 (leg3, leg5): C → D → A → B → C
Left side Phase 2 (leg4): A → B → C → D → A
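The phase structure above can be captured compactly in code: each gait is the same cyclic sequence of the points A, B, C and D, with the two tripod groups starting at different offsets into the cycle. A minimal sketch with illustrative names, not our actual implementation:

```cpp
#include <array>
#include <cassert>
#include <cstddef>

// Illustrative sketch: the forward gait cycle A -> B -> C -> D.
// Phase 1 (leg0, leg2, leg4) starts at A (offset 0); phase 2
// (leg1, leg3, leg5) starts at C (offset 2), half a cycle ahead.
enum GaitPoint { A, B, C, D };

const std::array<GaitPoint, 4> kForwardCycle = {A, B, C, D};

// Gait point executed by a leg at global step t for a given offset.
GaitPoint stepAt(std::size_t t, std::size_t phaseOffset) {
    return kForwardCycle[(t + phaseOffset) % kForwardCycle.size()];
}
```

Walking backward, rightward and leftward then amount to choosing a different cycle direction or offset per leg, exactly as listed in the phase tables above.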
4.3.4.2 Tripod Wheeling
The idea behind Tripod Wheeling is direct and simple: the Hexapod Robot walks just as in 4.3.4.1 Tripod Walking Straight, but with the legs on one side taking smaller steps while the legs on the other side take larger steps. Therefore Forward Wheeling uses the same loop as walking straight forward in 4.3.4.1.1 Forward, and Backward Wheeling uses the same loop as walking straight backward in 4.3.4.1.2 Backward. For a Right Wheel, we make the movement of the legs on the right side (leg0, leg1, leg2) smaller and that of the legs on the left side (leg3, leg4, leg5) larger. For a Left Wheel it is exactly the opposite: larger movement for the legs on the right side and smaller for those on the left.
4.3.4.2.1 Forward Right/Left Wheel
Phase 1 (leg0, leg2, leg4): A → B → C → D → A
Phase 2 (leg1, leg3, leg5): C → D → A → B → C
For the diagrams of the steps, please refer to 4.3.4.1 Tripod Walking Straight.
4.3.4.2.2 Backward Right/Left Wheel
Phase 1 (leg0, leg2, leg4): C → B → A → D → C
Phase 2 (leg1, leg3, leg5): A → D → C → B → A
For the diagrams of the steps, please refer to 4.3.4.1 Tripod Walking Straight.
4.3.4.3 Tripod Rotating
Tripod Rotating means we want the robot to rotate in place (i.e. change its heading direction without any displacement). This can easily be achieved by having one side of the body walk forward and the other side walk backward, depending on the direction of rotation. For instance, if we want the robot to perform a Right Rotation, the legs on the right side of the body should make the same movements as walking straight backward, and the legs on the left side should make the same movements as walking straight forward. For a Left Rotation it is just the opposite.
4.3.4.3.1 Right Rotating
For the legs on the right side we use the walking loop of 4.3.4.1.2 Backward, and for the legs on the left side we use the loop of 4.3.4.1.1 Forward.
Right side Phase 1 (leg1): A → D → C → B → A
Right side Phase 2 (leg0, leg2): C → B → A → D → C
Left side Phase 1 (leg3, leg5): C → D → A → B → C
Left side Phase 2 (leg4): A → B → C → D → A
For the diagrams of the steps, please refer to 4.3.4.1 Tripod Walking Straight.
4.3.4.3.2 Left Rotating
Rotating left is again just the opposite of Right Rotating.
Right side Phase 1 (leg1): A → B → C → D → A
Right side Phase 2 (leg0, leg2): C → D → A → B → C
Left side Phase 1 (leg3, leg5): C → B → A → D → C
Left side Phase 2 (leg4): A → D → C → B → A
For the diagrams of the steps, please refer to 4.3.4.1 Tripod Walking Straight.
4.3.4.4 Implementation of Loop
We now have the design of the loops of steps for the different kinds of movement, but how do we actually implement a loop? The points of the loop, A, B, C and D, are actually different coordinates of the endpoint of each leg, so a very direct approach is to input all the respective coordinates directly, store them as an array/list, and use that for looping.
4.3.4.4.1 Initial Design
We need a loop of different coordinates, so how do we find the desired coordinates? Our initial approach was empirical: we simply measured the X, Y and Z lengths of the endpoint with respect to the Origin of its leg, since our Coordinate System uses a scale of 1 = 1 cm. For instance, when implementing the Tripod Walk Straight Forward function, we measured the coordinates shown below. The diagram below is an extract from the implementation; for details about the design of the code (i.e. the Point Object) please refer to 2.5.4 Point Object by Arack.
Our approach is simple: we define the number of legs and the number of points required, store each desired coordinate in a Point Object, and use arrays to store the whole loop of steps. When we tested this code we immediately ran into a major problem: the legs moved too fast from the current coordinate to the desired coordinate, so the robot was not
able to walk at all. The problem originates from the control of the servos, and we solved it by dividing the motion between the current coordinate and the desired coordinate into smaller slices, so that we can control the moving speed of each servo. For the detailed problem and solution please refer to 2.5.6.5 How does the hexapod move in different speed? by Arack. After solving this problem, the robot was able to walk. However, we found the movement so unstable that the robot could hardly keep its balance while walking: it kept falling after a few steps. This problem was found to be caused by the lack of power of the servos and also by design problems in the legs. After several discussions and a review of the design, we suggested that Sherman shorten the Femur link, since the shorter the distance between the endpoint of the leg and the centre of the body, the more load the leg can support. We also changed all the servos to more powerful ones to provide better balance; for the detailed problem and solution please refer to 3.2.3.1 Short leg joint (Femur) by Sherman.
4.3.4.4.2 Final Design
After we solved all the control and physical problems affecting the body movement of the Hexapod Robot, our initial implementation worked and was able to perform all the desired body movements. However, the initial implementation was not good in the sense that we hard-coded each step of every loop that performs a body movement. The problem with this design is that it relies solely on the physical properties of our robot (i.e. the length of each link of the leg and the coordinates we measured by hand). This becomes a problem whenever we want to change the design of the robot or the gait. We need a more flexible design that allows us to change the gait and the robot without updating a lot of code.
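The slicing fix mentioned above, dividing the motion into small intermediate targets so that servo speed can be controlled, can be sketched as follows; the names and the linear interpolation are illustrative, not our exact code:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Hedged sketch: instead of jumping straight to the next gait
// point, the motion from the current to the desired coordinate is
// divided into small slices. The walking speed is then set by the
// number of slices and the delay inserted between them.
struct Pt { double x, y, z; };

std::vector<Pt> slice(Pt from, Pt to, int slices) {
    std::vector<Pt> path;
    for (int i = 1; i <= slices; ++i) {
        double t = static_cast<double>(i) / slices;
        path.push_back({ from.x + (to.x - from.x) * t,
                         from.y + (to.y - from.y) * t,
                         from.z + (to.z - from.z) * t });
    }
    return path;  // each point is sent to the IK solver in turn
}
```

Each intermediate point is solved by the Leg Kinematics routine before being sent to the servos, which is what restores a controllable walking speed.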
The approach we use here is that, instead of measuring each desired coordinate of the loop by hand, we use the design data from the 3D modelling (the lengths of all the links) as the basis for computing new coordinates. With those data, we can obtain the coordinates of the Natural Position and use it as a reference point for calculating all other desired coordinates using Forward Kinematics (for details please refer to 2.5.6.4 How does the hexapod move with different altitudes of the center? by Arack). The idea of using a loop of coordinates to perform body movement remains the same, but the way we obtain the desired coordinates differs from the initial approach. The code shown below is an extract from our implementation.
You can see that we specify the joint angles for the rotation of the reference point (natural_gait_right), and the new desired coordinates are then stored in the array of Point
Objects. By using Forward Kinematics to calculate our desired coordinates, the result is much more accurate, which in turn improves the accuracy of the results given by our Inverse Kinematics solution when solving for the joint parameters required to reach these new desired coordinates. This is our final design and implementation of the Gait: it works with the Forward Kinematics and Inverse Kinematics solutions and a loop of the calculated coordinates to achieve our desired movements.
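To illustrate how gait points can be derived from the design link lengths rather than hand measurement, here is a hedged planar (side-view) forward-kinematics sketch. The names and angle conventions are assumptions for the example (femurAngle measured from the horizontal, positive pointing down; tibiaAngle the interior knee angle):

```cpp
#include <cassert>
#include <cmath>

// Hedged sketch: foot position computed from link lengths and
// joint angles. A fully stretched leg has femurAngle = 0 and
// tibiaAngle = pi; the Natural Position has the Femur horizontal
// (femurAngle = 0) and the Tibia vertical (tibiaAngle = pi/2).
struct FootPos { double horiz, z; };

FootPos forwardK(double COXA, double FEMUR, double TIBIA,
                 double femurAngle, double tibiaAngle) {
    const double kPi = std::acos(-1.0);
    double kneeDir = femurAngle + (kPi - tibiaAngle);  // tibia direction
    return { COXA + FEMUR * std::cos(femurAngle) + TIBIA * std::cos(kneeDir),
             -(FEMUR * std::sin(femurAngle) + TIBIA * std::sin(kneeDir)) };
}
```

With our link lengths (3, 9.5, 16.5 cm), this places the Natural Position foot at roughly 12.5 cm out and 16.5 cm below the leg origin, and other gait points can be generated by varying the joint angles about this reference.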
4.4 Conclusion
To conclude this part of our Hexapod Robot project, Kinematics is surely a major component, or even the core, of our robot. Through the study of different kinematics problems and solutions, we were able to find applicable theories to solve our problem. The approach we used to solve the Kinematics problem is simple and easy to follow. Future development of more and better gait implementations is also possible: there are more natural animal gaits than the two we mentioned in this report. In short, with the design and implementation of the Body and Leg Kinematics and the Gait interacting with each other, all the components work together to achieve the goal of making the robot walk, and we are glad to see the robot able to perform all the desired motions.
5 Application
5.1 Introduction
Intelligent robots are becoming more and more popular nowadays; we can see robots everywhere from factories to households. However, one criticism of these intelligent robots is that they are often hard to control: it can take a long time to learn all the necessary commands, and the process is error-prone and easily frustrates users.
In our project, one of the targets is to build a companion application for navigating our robot. The focus of the application is to give robot owners an easy-to-use yet responsive interface. To achieve this goal, we implemented the application using several techniques and followed Google's latest user-interface design guidelines. We hope to build a user-friendly mobile app for our robot.
5.2 Development
5.2.1 Building Platform – Android 4.0.4 (API 15)
Android is the most widely used mobile operating system nowadays: more than 57% of smartphones are preloaded with Android, and more than 85% of these devices run Android 4.0.4 or a newer version. [4.1][4.2]
As Android is widely used and relatively easy to develop for, we develop our control application on Android 4.0.4, API 15. This version was chosen because we think it is the sweet spot between compatibility and functionality. For devices running older versions, we have prepared a version with basic functionality.
5.2.2 Building Environment – Android Studio
The application is developed with Android Studio, version 1.1.0. Android Studio is an integrated development environment for the Android platform with cross-platform support, including Windows, OS X and Linux. Compared to Eclipse with ADT, Android Studio has several features that benefited our development:
1. It comes with Gradle-based build support, so we can add Gradle dependencies to the project with a single line of configuration.
2. It contains Live Layout, which provides real-time app rendering, so the user interface of different activities can be inspected easily. [4.3]
3. It is the current official IDE for Android and integrates with many Google services, such as Google App Engine.
The application is tested on the following devices:
1. Sony Xperia Z Ultra, Android 4.4 KitKat
2. Sony Xperia acro S, Android 4.1 Jelly Bean
3. Desire Eye, Android 4.4 KitKat
To check the consistency of the user interface, the application is also test-run on emulators of the Nexus 5 and Nexus 6.
5.3 User Interface
5.3.1 Design Strategy
In designing the user interface, we put several factors into consideration:
1. A clean and intuitive layout for the user
2. A design that is consistent throughout the application
3. A user interface coherent with common apps the public already uses
4. Integration of related features to minimize the workflow
We believe that putting these ideas into the implementation makes the application more user-friendly and intuitive.
5.3.2 Material Design
To simplify the control of the robot, we want to make our application as user-friendly as possible. We have observed that an application is easier to use if its user interface is consistent with other common apps, like Google Maps, Evernote etc. Therefore, in this project we adopt Material Design, a new design language proposed by Google for Android, which is widely used in Google's mobile apps and some other chart-topping apps.
The principle of Material Design is to provide a design framework for apps on different devices, including Android smartphones, smart wearables, TVs etc. First of all, instead of the grayish Holo theme introduced in Android 3.0 Honeycomb, Material Design features a bold and energetic color scheme, which echoes the trend of flat design and looks attractive. It also emphasizes the clear presentation of information, including the use of padding to highlight necessary information and animation to illustrate the relationships between activities.
In our product, we have implemented several elements of Material Design:
1. We adopted the bold color scheme, with Teal as the main theme color.
2. We implemented a floating action button in Basic Mode to capture photos streamed by the webcam, as picture capture is one of the highlight functions.
3. We implemented a navigation drawer for mode switching.
We use elements from Material Design as much as possible, and we expect that with its influence our application will be more intuitive and user-friendly.
5.3.3 Landing Page
The launcher activity of our application is the landing page, where the user is required to input the address of the robot in order to proceed.
Once the user finishes typing the address and clicks the 'Send' button, the application stores the URL as a SharedPreference, which is heavily reused by other components of the application. Moreover, because the typed address is stored, when the application is opened later the user does not need to re-type the address if it is unchanged.
5.3.4 Basic Mode
Once the activity is created, it connects to the robot using XML-RPC; only when the connection is established is the user directed to this activity. This part contains most of the functions of the application, including basic movement control, translation kinematics demonstration, webcam streaming, photo capture and the text-to-voice function.
1) Webcam Streaming
When the user starts the activity, the application first connects to the robot, the corresponding webcam and the speaker. If the connection is made successfully, the media server streams the video in the upper part of the application.
2) Picture Capture
There is a floating action button over the video streaming area. When the user clicks it, the current scene is captured as an image and saved to the folder HexaPodControl on the external SD card (Jelly Bean or older versions) or in internal storage (KitKat or newer versions of Android).
3) Text-to-Voice Function
When the user inputs a string into this textbox, the robot "speaks" the string through its internal speaker. At the moment, only English text strings are supported.
4) Height Adjustment Slider
The user can adjust the height of the center of gravity of the robot with the slider, which sends the corresponding remote procedure call.
5) Left Joystick: Basic Movement Control
In previous builds we used buttons for navigating the robot; these were replaced by this joystick. When the user drags the joystick, the robot moves based on the position of the pointer head: moving forward, backward, leftward and rightward can all be performed by dragging the joystick.
6) Right Joystick: Translation Movement Control
When the user drags the right joystick, the robot performs a translation movement based on the position of the pointer head: the center of gravity of the robot is translated accordingly.
7) Auto Mode
Once this mode is activated, the robot moves forward autonomously. This function demonstrates the obstacle avoidance of the robot.
8) Pick Mode
When the button is clicked, the clamp positioned on the front of the robot is activated.
5.3.5 Dance Mode
This activity demonstrates the rotation of the robot and features the use of the accelerometer. In this mode, the user can tilt the smartphone to control the tilting angle of the robot. The built-in accelerometer continuously records and returns the XYZ orientation readings of the smartphone, and our application analyzes the readings and converts them into suitable coordinates for the robot to execute. As a result, once the user tilts the smartphone, the robot rotates to reflect the smartphone's orientation. This control mode targets rotation about the X and Y axes. The user can also deliberately disable the dance mode by toggling off the switch.
To demonstrate rotation about the Z-axis, there is a joystick in this activity. When the user controls the joystick, the application records the angle and makes the related remote procedure call; the robot then rotates horizontally by the same angle as the joystick.
5.3.6 Immersive Mode
This activity is landscape in orientation to simulate first-person control of the robot, and it makes use of the accelerometer.
This activity is inspired by FreeFlight 3, the official controller app for Parrot's popular consumer drones. The activity is filled with the video streamed from the webcam, giving the user the robot's view. In the bottom right part of the screen there is a virtual joystick which allows the user to control the movement of the robot.
Moreover, within this activity the user can tilt the controlling smartphone to control the motion of the robot. With the built-in accelerometer, the app calculates the tilting angle so that it can send appropriate commands to instruct the robot how to move.
5.3.7 Settings
This activity allows users to modify different settings of the application and the robot. The information is stored as SharedPreferences, so that once a modification is saved, the change is reflected in the other activities and fragments.
Users can modify the different addresses, including the addresses of the robot, the webcam and the speaker. The input is set to accept URIs only, to prevent crashes due to malformed URIs. Modifying a value updates the corresponding SharedPreference, so the modification takes effect across the whole app; for example, when the robot address is updated, the value in the input box on the landing page is updated as well.
The delay time of robot movement can be changed in the advanced settings section, which allows the user to change the speed of movement and the gait point delay. The modification is immediately sent to the robot to take effect.
Debug mode can also be toggled on here to generate debug LogCat output in Android Studio when necessary; it gives developers more runtime messages in the Android Studio console.
5.4 Technique Used
5.4.1 XMLRPC
The purpose of our application is to control the robot by sending commands to it. In our project, we use XML-RPC as the protocol of communication between the robot and the mobile app.
XML-RPC is a remote procedure call protocol which is encoded in XML and transmitted over HTTP. It allows a client (i.e. the mobile app) to send a method call request with multiple parameters. The relevant execution code is stored only on the server side and is executed only when the call is received. As the picture illustrates, the package contains only the necessary information, and the server can return a value to the client after execution. The procedure call body is therefore short, allowing rapid and responsive communication.
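For illustration, a typical XML-RPC exchange looks like the following; the method name robot.walk and its parameter are made-up examples, not our actual interface:

```xml
<?xml version="1.0"?>
<!-- Request from the app (method name and parameter are illustrative) -->
<methodCall>
  <methodName>robot.walk</methodName>
  <params>
    <param><value><string>forward</string></value></param>
  </params>
</methodCall>

<!-- Response from the robot -->
<methodResponse>
  <params>
    <param><value><int>0</int></value></param>
  </params>
</methodResponse>
```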
We use the aXMLRPC library to implement the XML-RPC functionality. It is a lightweight Java XML-RPC client which performs well on the Android platform.
In our application, every communication between the robot and the application uses XML-RPC: from sending movement instructions to receiving distance readings from the ultrasound sensor, all of those actions are done with XML-RPC. For example, when the user activates the application, it sends an XML-RPC request to the robot via the WiFi network. Once the WRTnode board receives the message, the XML-RPC server executes the stored method, working with the ultrasound sensor to obtain the distance reading. Finally, the server sends an XML-RPC response to the application to refresh the distance reading shown in the app.
A small problem we faced when using XML-RPC during development is that we did not know that networking-related calls cannot be executed on the main (UI) thread. It took some time to figure out the mechanism of Android threads.
5.4.2 MJPEG Decoder
In our project, we installed a camera on our robot in order to stream images of the surroundings to the app. For most webcams, including the one we used, the video output is in MJPEG format, which is essentially a series of JPEG images. However, there is no built-in MJPEG decoder in stock Android. To stream the video recorded by the webcam, we use an AsyncTask which decodes the video in the backend, on a non-UI thread, while the video is played on the UI thread. Therefore, even while the application is continuously decoding the series of JPEG pictures in the backend, the user interface front end remains free for the user to interact with.
Flow of implementing the MJPEG streamer
Assume the user has already entered a valid webcam address in the settings. Once the user starts the activity, the AsyncTask takes the webcam address and opens the JPEG stream. The background thread connects to the webcam address, receives the video, and continuously splits the MJPEG into a series of bitmaps using the Bitmap library functions. The series of bitmaps is later consumed by the UI thread.
The UI-thread part first initializes the MJPEG view on the UI, setting basic playback parameters such as the height and width. A thread is then started that continuously obtains bitmaps from the background thread and displays them on the MJPEG view. If the user pauses the current activity and the streaming, a signal is issued to join the thread.
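One way to sketch the frame-splitting step is to scan the incoming bytes for the JPEG start-of-image (FF D8) and end-of-image (FF D9) markers; each delimited span is one frame that the app would then decode into a bitmap. This is a simplified illustration of the idea under that assumption, not the exact decoder used in the app.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of splitting an MJPEG byte stream into individual JPEG frames by
// scanning for the JPEG start-of-image (FF D8) and end-of-image (FF D9)
// markers. In the app each extracted frame would then be handed to the
// bitmap decoder; here we just return the raw frame bytes.
public class MjpegSplitter {
    public static List<byte[]> split(byte[] stream) {
        List<byte[]> frames = new ArrayList<>();
        int start = -1;
        for (int i = 0; i + 1 < stream.length; i++) {
            if ((stream[i] & 0xFF) == 0xFF && (stream[i + 1] & 0xFF) == 0xD8) {
                start = i;                 // frame begins at the SOI marker
            } else if (start >= 0 && (stream[i] & 0xFF) == 0xFF
                       && (stream[i + 1] & 0xFF) == 0xD9) {
                int end = i + 2;           // frame ends just after the EOI marker
                byte[] frame = new byte[end - start];
                System.arraycopy(stream, start, frame, 0, frame.length);
                frames.add(frame);
                start = -1;
            }
        }
        return frames;
    }
}
```

A real decoder would also handle frames split across network reads and the multipart boundary headers the webcam inserts between frames, but the marker scan above captures the core of turning one continuous stream into discrete images.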
5.4.3 Accelerometer Most Android devices have a built-in accelerometer. To better demonstrate the rotation of the robot and give the user more intuitive control, we use the accelerometer to control the rotation of the robot.
When putting the accelerometer to use, we found that readings arrive so fast that we cannot turn every reading received into a separate XML-RPC call; even the slowest sensor rate offered by stock Android is still too fast for us. We first implemented a filter to discard out-of-range values; it could not help with the frequency problem, but it did eliminate spikes in the sensor readings.
We then implemented a counter to drop some readings at intervals, but that made the rotation less smooth than we expected. Finally, we used the simplest yet most effective approach, putting the thread to sleep between dispatches, to solve the problem.
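A minimal sketch of the thread-sleep approach: the sensor callback just overwrites a shared "latest value", while a worker thread wakes at a fixed period, forwards whatever is current, and sleeps again. The period and the class and method names here are illustrative assumptions, not the app's actual code.

```java
// Sketch of the thread-sleep throttling approach. Sensor callbacks may fire
// hundreds of times per second; only the worker thread's period determines
// how often a value is actually forwarded (e.g. as an XML-RPC call).
public class ThrottledSender {
    private volatile double latest;        // overwritten by the sensor callback
    public volatile int sentCount = 0;     // exposed here only for illustration

    public void onReading(double value) {  // called at the sensor's own rate
        latest = value;
    }

    // Start a worker that sends the latest value, then sleeps, `iterations` times.
    public Thread start(long periodMs, int iterations) {
        Thread worker = new Thread(() -> {
            for (int i = 0; i < iterations; i++) {
                send(latest);              // in the app: an XML-RPC call
                try {
                    Thread.sleep(periodMs);
                } catch (InterruptedException e) {
                    return;
                }
            }
        });
        worker.start();
        return worker;
    }

    private void send(double value) {
        sentCount++;                       // stand-in for the network call
    }
}
```

Compared with the counter approach, this decouples the send rate from the sensor rate entirely, which is why the resulting rotation felt smoother.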
Flow of using accelerometer
Modern smartphones have a built-in accelerometer. By calling getOrientation() we obtain an array of the current orientation readings: azimuth (rotation around the z-axis), pitch (rotation around the x-axis) and roll (rotation around the y-axis). After obtaining the array of raw values, we do some arithmetic to get the smartphone's tilt angle in degrees. A low-pass filter is also applied to the calculated values to remove background noise and spikes, and the filtered values are then saved to an array for later use.
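The conversion and filtering steps above can be sketched as follows; the smoothing factor 0.25 is an illustrative assumption, not the exact value used in the app.

```java
// Sketch of the post-processing applied to each orientation reading:
// convert radians to degrees, then run a simple low-pass filter to damp
// spikes. getOrientation() returns {azimuth, pitch, roll} in radians.
public class OrientationFilter {
    private static final float ALPHA = 0.25f;      // illustrative smoothing factor
    private final float[] smoothed = new float[3]; // filtered values in degrees

    public float[] update(float[] rawRadians) {
        for (int i = 0; i < 3; i++) {
            float degrees = (float) Math.toDegrees(rawRadians[i]);
            // Low-pass filter: move a fraction of the way toward the new value,
            // so isolated spikes are damped rather than passed through.
            smoothed[i] += ALPHA * (degrees - smoothed[i]);
        }
        return smoothed.clone();
    }
}
```

Feeding the filter a constant reading makes it converge smoothly toward that value, while a one-sample spike moves the output only a quarter of the way and is quickly pulled back.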
5.5 Future Development
5.5.1 WebCam Video Analysing Since we have attached a camera to our robot, we could make better use of it beyond merely capturing the surroundings. At the mid-point of the project we came up with the idea that, with the power of OpenCV, the smartphone could analyse the webcam stream for object recognition. For example, once an image of an apple is stored in the app's database, the application would alert the user whenever the webcam "finds" an apple.
However, the current OpenCV for Android only supports capturing video from the smartphone's own camera; other video sources are supported only in the desktop version of OpenCV.
5.5.2 Full Support on Lower Operating System Versions Although the application is not especially demanding, some of its functions currently cannot be implemented on devices running Android versions below 4.0.4, due to missing Android API support. We therefore implemented a Lite version with reduced functionality to support more devices. Given more time, we could support additional devices by using custom libraries.
5.5.3 Support on Other Mobile Operating Systems With appropriate feasibility research, developing an iOS version of the application is possible, since XML-RPC clients are also available there for calling functions on the robot. Before building an iOS application, an Apple Developer account is required.
5.6 Conclusion To provide a user-friendly way to navigate the robot, building a controller application is one feasible solution. Smartphones are now popular and powerful, allowing us to put more interesting ideas into the application and make it a great companion to the robot. Implementing this application has been a valuable experience. However, as mobile development moves very rapidly nowadays, skills from two or three years ago may already be outdated; it takes time to catch up with new technology while maintaining compatibility with older devices.
6 Reference
[2.1] Raspberry Pi 1 Model B+. Retrieved from http://www.raspberrypi.org/products/model-b-plus/
[2.2] WRTnode. Retrieved from http://wrtnode.com/w/
[2.3] OpenWrt. Retrieved from https://openwrt.org/
[2.4] Pulse-width modulation - Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Pulse-width_modulation
[2.5] eSpeak: Speech Synthesizer. Retrieved from http://espeak.sourceforge.net/
[2.6] MJPG-streamer | SourceForge.net. Retrieved from http://sourceforge.net/projects/mjpg-streamer/
[2.7] XML-RPC - Wikipedia. Retrieved from http://en.wikipedia.org/wiki/XML-RPC
[2.8] Shared memory (interprocess communication) - Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Shared_memory_(interprocess_communication)
[3.1] 3D Modeling - Wikipedia. Retrieved from https://www.wikiwand.com/en/3D_modeling
[3.2] Make: 3D Printing: The Essential Guide to 3D Printers. Retrieved from https://books.google.com.hk/books?id=go9lAgAAQBAJ&dq=Fixup+3d+printing&hl=zh-TW&source=gbs_navlinks_s
[3.3] Quickparts - What is an STL File? Retrieved from http://www.3dsystems.com/quickparts/learning-center/what-is-stl-file
[3.4] G-code - Wikipedia. Retrieved from https://www.wikiwand.com/en/G-code
[3.5] 3D Printing - Wikipedia. Retrieved from https://www.wikiwand.com/en/3D_printing
[4.1.1] Forward Kinematics - Rich's Robot Musings. Retrieved from http://www.learnaboutrobots.com/forwardKinematics.htm
[4.1.1] Forward Kinematics - Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Forward_kinematics
[4.1.1.1] Denavit–Hartenberg parameters - Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Denavit%E2%80%93Hartenberg_parameters
[4.1.2] Inverse Kinematics Basics Tutorial - OscarLiang.net. Retrieved from http://blog.oscarliang.net/inverse-kinematics-and-trigonometry-basics/
[4.1.2] Inverse Kinematics - Rich's Robot Musings. Retrieved from http://www.learnaboutrobots.com/inverseKinematics.htm
[4.1.2] Inverse Kinematics - Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Inverse_kinematics
[4.1.3] Inverse Kinematics for Hexapod and Quadruped Robots - OscarLiang.net. Retrieved from http://blog.oscarliang.net/inverse-kinematics-implementation-hexapod-robots/
[4.1.2.1] Jacobian matrix and determinant - Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant
[4.1][4.2] iitsii the Hexapod Robot - Robugtix. Retrieved from http://www.youtube.com/watch?v=kMKxwBRqtBw
[4.2] Gait - Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Gait
[4.2] Insect Locomotion - MindCreators.com. Retrieved from http://www.mindcreators.com/insectlocomotion.htm
[4.1.3.3] Rotation Matrix - Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Rotation_matrix
[5.1] iSuppli Predicts Windows Phone's Edge Over iOS by 2015. Retrieved from http://www.macobserver.com/tmo/article/isuppli_joins_others_in_predicting_windows_phones_edge_over_ios_by_2015
[5.2] Dashboards. Retrieved from https://developer.android.com/about/dashboards/index.html?utm_source=suzunone
[5.3] Download Android Studio and SDK Tools | Android Developers. Retrieved from https://developer.android.com/sdk/index.html