2017 IEEE 2nd International Conference on Signal and Image Processing
IR Sensing Embedded System Development for Prototype Mobile Platform for Autonomous Convoy
H. Bryan Riley and Mehmet Celenk
School of Electrical Engineering and Computer Science, Stocker Center, Ohio University, Athens, Ohio, USA 45701
e-mail: [email protected]; [email protected]
Abstract—Sensing techniques using varied configurations of infrared (IR) devices are rapidly becoming a proven approach for autonomous vehicles. In this paper, we present an investigation, and the corresponding results, of embedding these sensors in a prototype robotic model and examine its performance. The results support IR sensing as a viable alternative to a RADAR-based system for object detection and travel in a forward path direction. A Proportional-Integral (PI) controller is implemented, due to its simplicity and effectiveness, in the Python programming language on the Raspberry Pi microcomputer. In this research, we model the operational characteristics of IR sensors in accordance with the physical principle that the measured distance is inversely proportional to the IR sensor output voltage. This provides a device-independent, robust sensing methodology, as demonstrated in the experimental section of this research. As a result, the proposed approach is expected to help major manufacturers and corporations field fully autonomous passenger cars and trucks on roads in the USA by 2021 [1].
Figure 1. Self-driving vehicle technology adoption.
Keywords-autonomous vehicles; self-driving cars; embedded systems; IR sensing
I. INTRODUCTION
This paper examines the performance of infrared (IR) sensing using a prototype mobile platform for autonomous vehicle convoy following. IR spectrum sensing for autonomous vehicle applications is a compelling approach to detecting objects in the path of vehicle travel. Since cost, reliability, and detection performance are the critical criteria for bringing the technology to production, the study of performance is the primary interest in this work. Beyond the practical benefits, autonomous cars could contribute $1.3 trillion in annual savings to the US economy alone, with global savings estimated at over $5.6 trillion [2], along with a technology adoption timeline as shown in Figure 1. The design of self-driving or autonomous vehicles with smart sensing systems will positively impact planning decisions for optimal traffic flow, minimize traffic congestion, and alleviate the human errors that lead to personal injuries and property damage. Gannes reported that Google introduced its self-driving car at the Code Conference, and automakers have stated such vehicles will be ready by 2020 [3], [4].
978-1-5386-0967-5/17/$31.00 ©2017 IEEE
As a vehicle subsystem, IR sensing has the capability to ascertain the surroundings by measuring the temperature variation of an object in motion. The vehicle sensor set, with a field of view (FOV) typically varying from 120° to 150°, and the respective processing software are imperative for the safe operation of autonomous vehicles (AVs). Characterizing the performance of sensing to determine the location of the host vehicle relative to other moving vehicles, all in-path targets, and road lane boundaries is equally important. Approximately 90% of the primary factors behind crashes are human errors, according to 2012 National Highway Traffic Safety Administration (NHTSA) statistics [5], [6]. AVs with a high-performance sensor suite are predicted to reduce crash and injury rates by upwards of 50% compared to non-AVs. Vehicle manufacturers are motivated to contribute transformative and highly beneficial technologies to support Intelligent Transportation Systems (ITS) and the future of the mobility industry. The state of the art reports the rationale for selecting IR sensors, rather than ultrasonic sensors, RADARs, or cameras, to perform the sensing function. The underlying research proceeded with the design, implementation, and testing of an IR sensor configuration, in conjunction with a single-board computer, to follow a lead remote-controlled (RC) scaled-model vehicle. Earlier work conducted by Kuo et al. utilized two low-cost sensors based on a kinematic model of a car-like mobile robot (CLMR) [7]. Alcalá et al. compared two non-linear model-based control strategies for autonomous vehicles [8].
Safety-critical criteria for driver-assistance systems and their enabling technologies are studied by Carsten et al. and Englund et al., respectively [9], [10]. This paper is organized as follows. Section II provides a description of the system architecture and the electrical interfaces that accomplish the autonomous tracking capability; the embedded system design is also presented. Section III develops the testing, data collection, and analysis of the IR sensor configuration. An analysis of the measured test results is presented in Section IV. Section V provides a summary and conclusions of this work.

II. DESCRIPTION OF EMBEDDED SYSTEM DESIGN

This section describes the design and implementation of an embedded system on 1/10-scale remote-controlled (RC) robotic vehicles. The system design began with a tradeoff study between the Arduino and Raspberry Pi single-board computers. Based on ease of programming and the large suite of software algorithms readily available in the open-source domain, the decision was made to select the Raspberry Pi Model B+. It is a fully featured, very compact computer operating at 700 MHz that runs the Raspbian operating system (OS). The computer interfaces consist of four USB 2.0 ports, a single SD card slot, a 40-pin header with GPIO, an Ethernet connector, a CSI connector for a camera, and an HDMI output; power is supplied by a micro-USB connector.

The IR sensors exhibit nonlinear operating characteristics (i.e., the measured distance is inversely proportional to the output voltage reading): they are effective for short-distance measurements but not very effective for long-distance measurements [8]. In this research, we model the operational characteristics of IR sensors in accordance with the physical principle that the measured distance d_i is inversely proportional (∝) to the IR sensor voltage reading v_i; namely,

d_i ∝ 1 / v_i     (1)

Further, the inverse proportionality is expressed as

d_i = k (1 / v_i^a)     (2)

where k is a scaling constant and a is the degree of proportionality; both constants are to be determined empirically during system implementation and experimentation. Since the Sharp GP2Y0A21YK IR sensor was selected, the generalized mathematical relationship between distance and output voltage is given by

d = 1 / (c · ADC + b) − K     (3)

where d is the distance in cm, K is a corrective constant, ADC is the digitized value of the sensor output voltage, and b and c denote constants to be determined from the trend-line equation. Notice that our proposed equation (2) has only one scaling constant (k) to be determined, whereas the governing sensor equation (3) has three constants (b, c, and K) that must be determined empirically from the measurement data. To this end, it is concluded that our model equation (2) accounts for a multiple-sensor configuration and is easier to use and implement than the governing sensor equation (3) as the RC vehicles convoy from a random starting location to another.
III. TESTING AND DATA COLLECTION
This embedded system applies time-division multiplexing principles: the sensor data streams are formatted into a single signal by separating it into short-duration segments. The sensor data is collected and stored in memory for processing. Each of the Sharp IR distance-measuring sensors was independently tested to benchmark performance and to verify the electrical interface with the Raspberry Pi Model B+ single-board computer. Vcc is the source voltage, and the device tolerates a range of 4.5 V to 5.5 V. Figure 3 shows the electrical interconnection block diagram of the Sharp IR sensor, where the output Vo is directed to the digital I/O channel of the Raspberry Pi Model B+ single-board computer [12], [13]. The system is designed so that the path of travel contains minimal deviations in radius of curvature; the acceleration
Figure 2. IR sensors mounted on 1/10 scaled RC vehicle.
The selected transducer is a Sharp GP2Y0A21YK IR sensor, which operates at an IR wavelength of λ = 780 nm [11]. It is an active data-acquisition transducer consisting of a transmitter unit and a receiver unit, both engaged in continuous-time transmitting and receiving operations. Figure 2 shows the mounting and configuration of the three IR sensors that provide the forward-looking operational FOV. The sensors are cost-effective; easy to install, calibrate, and operate; low-voltage and low in energy consumption; and nonlinear in their operating characteristics.
and speed are modeled by control variables for the following experiments. The software for the IR sensing system depicted in Figure 3 is written in Python version 2.7. The algorithm is designed as a Proportional-Integral-Derivative (PID) controller for this closed-loop system; for this application, the PI form is selected as the most intuitive, effective, and simple [15]. The servo motor speed of the robotic RC vehicle is the process variable, and the expected tracking performance is smooth. The principle of this algorithm is to manipulate the error (e), defined as the difference between the Set Point (SP) and the Process Variable (PV). Figure 4 depicts the PI controller block diagram. A defined headway, or fixed distance, in Figure 3 is defined as the SP, and the computed distances, independent of the measurement sensor, are the PVs. The MATLAB R2017a function "PIDSYS" is employed and returns the continuous-time PI controller in parallel form.
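A minimal discrete-time sketch of the PI loop described above can be written in Python; the sample time is illustrative, and the mapping of the controller output onto the servo motor command is omitted here as an implementation detail.

```python
# Discrete PI controller sketch: accumulates e = SP - PV each sample
# and outputs Kp*e + Ki*integral, the parallel PI form.
class PIController:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, setpoint, measured):
        e = setpoint - measured          # error between SP and PV
        self.integral += e * self.dt     # accumulate error over time
        return self.kp * e + self.ki * self.integral
```

With the gains reported later in this section (Kp = 1.14, Ki = 0.454), each call to update() returns the controller output for one sample of the headway loop.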
The proportional (P) term corrects instances of error, and the integral (I) term corrects the accumulation of error. As e(t) increases or decreases, the amount added to the Controller Output (CO) increases or decreases immediately and proportionately. Mathematically, the CO is governed by

CO_out = Kp · e(t) + Ki · ∫ e(τ) dτ     (4)

where Kp is the proportional gain and Ki is the integral gain. Based on the sensor voltages defined in Figure 3 and the computed distances, independent of the sensor specification, tracking by the host vehicle occurs per equation (5):

d1 > (d2 + d3) ⇒ host tracks target to the left     (5a)
d1 ≃ d2 ≃ d3 ⇒ host tracks target straight     (5b)
d3 > (d1 + d2) ⇒ host tracks target to the right     (5c)
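The decision rule of equations (5a)-(5c) can be sketched as follows; the sensor ordering (d1 = left, d2 = center, d3 = right) and the tolerance used for "approximately equal" are assumptions made for illustration.

```python
# Steering decision per equations (5a)-(5c), comparing the three
# IR sensor distances (assumed ordering: d1 left, d2 center, d3 right).
def tracking_direction(d1, d2, d3, tol=2.0):
    """Return 'left', 'straight', or 'right'.
    tol (cm) decides when the readings count as approximately equal."""
    if abs(d1 - d2) <= tol and abs(d2 - d3) <= tol:
        return "straight"        # (5b): d1 ~= d2 ~= d3
    if d1 > d2 + d3:
        return "left"            # (5a): d1 > d2 + d3
    if d3 > d1 + d2:
        return "right"           # (5c): d3 > d1 + d2
    return "straight"            # fallback between the explicit cases
```

The fallback branch covers readings that satisfy none of the three published inequalities exactly; in that case the host simply continues straight.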
Upon successful implementation and debugging of the PI software, the control loop is tuned by starting with a low proportional gain and no integral gain: the proportional gain is doubled until the loop begins to oscillate and is then reduced by one half. Next, a small integral gain is introduced and doubled until oscillation begins, then reduced by one half. Tuning continues until a reasonable headway distance and smooth following behavior are obtained as the host tracks the target. MATLAB "PIDSYS" returns the proportional and integral controller gains as Kp = 1.14 and Ki = 0.454, respectively. The corresponding step response for this PI controller is computed, and the error, or deviation of the actual measured PV from the SP, is shown in Figure 5.
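To illustrate the step-response computation behind Figure 5, a simple closed-loop simulation with the reported gains can be sketched. The first-order plant here (unity gain, 1 s time constant) is an assumed stand-in for the RC vehicle's servo dynamics, not the paper's actual model.

```python
# Euler-integrated closed loop: the PI law of equation (4) driving an
# assumed first-order plant tau*dy/dt = u - y, for a unit-step setpoint.
def step_response(kp=1.14, ki=0.454, tau=1.0, dt=0.01, t_end=20.0):
    y, integral, out = 0.0, 0.0, []
    for _ in range(int(t_end / dt)):
        e = 1.0 - y                  # error vs. unit step setpoint
        integral += e * dt
        u = kp * e + ki * integral   # PI control law, equation (4)
        y += dt * (u - y) / tau      # plant update (forward Euler)
        out.append(y)
    return out
```

Because the integral term drives the steady-state error to zero, the simulated response settles at the setpoint, mirroring the behavior reported for the tuned controller.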
Figure 3. Embedded system for autonomous convoy.[14].
Figure 5. Step response for PI controller.
IV. ANALYSIS OF TEST RESULTS

Trial runs with the embedded system showed that the host vehicle exhibited no jerking effect and smoothly tracked the target.
Figure 4. PI controller block diagram.
443
Figure 7 provides the error curve for this prototype system.
Figure 6. Sensor output vs. distance (cm).
During several test trials, the change in sensor reading from 5 cm to 10 cm was much greater than the change in sensor reading from 30 cm to 35 cm. To account for the nonlinear characteristics of the IR sensors, the embedded software was modified to approximate the exponential curve computed from the measured performance. The host RC vehicle was positioned at distances ranging from 10 cm to 40 cm while the sensor output was measured. Uniform testing was conducted per the outlined procedures, and the data was plotted in Microsoft Excel; the results are shown in Figure 6. A Microsoft Excel data function is used to calculate an exponential trend line, given by equation (6):
d_i = 43000 · v_i^(−1.236)     (6)

where d_i is the measured distance in cm and v_i is the digitized output reading of the IR sensor. Review of the curve in Figure 6 for an actual case indicates that for v_a = 900, the actual distance is d_a = 10 cm. Alternatively, the estimated distance is

d̂_a = 43000 · (900)^(−1.236)     (7)

The average error Ɛ, in %, is given by equation (8); in this case,

Ɛ = (|d_a − d̂_a| / d̂_a) · 100 = 4.054%     (8)
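The Excel power trend line of equation (6) can be reproduced with an ordinary least-squares fit on log-transformed data. In the sketch below, synthetic readings generated from equation (6) itself stand in for the measured data, so the fit recovers the published constants.

```python
# Power-law trend-line fit, d = k * v**(-a), via linear least squares
# on log-log data (the same kind of fit Excel's power trend line does).
import math

def fit_power_law(vs, ds):
    """Fit d = k * v**(-a); returns (k, a)."""
    xs = [math.log(v) for v in vs]
    ys = [math.log(d) for d in ds]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    k = math.exp(my - slope * mx)
    return k, -slope

# synthetic readings generated from the trend line of equation (6)
vs = [300, 500, 700, 900, 1100]
ds = [43000 * v ** (-1.236) for v in vs]
k, a = fit_power_law(vs, ds)   # recovers k and a of equation (6)
```

On real sensor data the fitted k and a would absorb the measurement noise, which is why they are determined empirically per setup.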
where d_a is the actual distance in cm based on the three forward-looking IR sensors. The recorded data is subjected to equation (6) via a MATLAB algorithm, converting the arbitrary sensor readings into real distance values. This caused the errors generated in the PI loop to be linear with distance, and the control system's performance improved significantly. The MATLAB function "POLYFIT" is utilized to return the coefficients of a 6th-order polynomial that is a best fit, in the least-squares sense, to the error data, where the data is scaled to have unit standard deviation.
V. CONCLUSION

This paper presents an embedded system design that was implemented and successfully tested as a testbed for evaluating IR sensors for autonomous vehicle sensing applications. The system architecture allows sensors with a varied range of specifications to be incorporated, ultimately to demonstrate the performance of maintaining a set headway and smoothly following a target vehicle. The process of starting a research project and seeing it through has many moving parts with which the team was initially unfamiliar but now understands as its members continue to learn. Since many real-world applications are cost sensitive, low-cost sensors were intentionally implemented on scaled models to replicate a host-vehicle-to-target-vehicle following system. The resulting headway (i.e., following distance) is maintained to within 5% error, and the steady-state response is measured to be on the order of 8 seconds. A comparison with other research results reported in the literature confirms similar results, where the tradeoff is the use of two sensors (i.e., a narrower FOV), whereas this work utilized a configuration of three sensors (i.e., a broader FOV). Suggested next steps for this research include replicating real-world environmental conditions (e.g., fog) for testing and analysis of the "lead vehicle" and "sensor vehicle." The students continue to be enthusiastic about this project and are interested in investigating more complex sensing techniques. One such approach resolves distances based on the speed of light by measuring the time of flight (ToF) between the sensor and the targeted image using a ToF camera or LIDAR. Robust sensing technologies will be enablers for major corporations such as Tesla, Mercedes-Benz, Ford, Google, General Motors, and Uber, which have boldly asserted that they will have fully autonomous passenger cars and trucks on public roads by 2021.
REFERENCES
Figure 7. Host tracks target as per equation (5).
ACKNOWLEDGMENT The authors would like to thank members of the Digital and Image Processing and Machine Learning team, in particular, C. Schumann, J. Petersen and N. Dhinagar for their work in implementing this research project in The School of Electrical Engineering and Computer Science at Ohio University.
[1] R. Shanker et al., "Autonomous Cars: Self-Driving the New Auto Industry Paradigm," Morgan Stanley Blue Paper, 2013.
[2] A. Jonas et al., "Autonomous Cars: Self-Driving the New Auto Industry Paradigm," Morgan Stanley Blue Paper, pp. 1-109, 2013.
[3] L. Gannes, "Google Introduces New Self Driving Car at the Code Conference," Re/code, 2014.
[4] L. Mearian, "Nissan plans to offer affordable self-driving cars by 2020," Computerworld, 2013.
[5] NHTSA, "U.S. Department of Transportation releases policy on automated vehicle development," NHTSA 14-13, 2013.
[6] C. Pozna and C. Antonya, "Issues about autonomous cars," in Proc. SACI 2016 - 11th IEEE International Symposium on Applied Computational Intelligence and Informatics, 2016.
[7] C. L. Kuo, N. S. Pai, Y. P. Kuo, and Y. C. Hu, "Following method for a car-like mobile robot using two IR sensors," in Proc. IEEE International Conference on Control and Automation (ICCA), 2014.
[8] E. Alcalá et al., "Comparison of two non-linear model-based control strategies for autonomous vehicles," in Proc. 24th Mediterranean Conference on Control and Automation (MED), 2016.
[9] O. M. G. Carsten and L. Nilsson, "Safety assessment of driver assistance systems," Eur. J. Transp. Infrastruct. Res., 2001.
[10] C. Englund et al., "Enabling technologies for road vehicle automation," in Road Vehicle Automation 4, 2017.
[11] Sharp Corporation, "Distance Measuring Sensor GP2Y0A21YK," 2016.
[12] Raspberry Pi Foundation, "Raspberry Pi - Teach, Learn, and Make with Raspberry Pi," www.raspberrypi.org, 2012.
[13] E. Upton and G. Halfacree, Raspberry Pi User Guide, 2014.
[14] D. Gunnarsson, S. Kuntz, G. Farrall, A. Iwai, and R. Ernst, "Trends in Automotive Embedded Systems," in Proc. CASES'12: 2012 ACM International Conference on Compilers, Architectures and Synthesis for Embedded Systems, 2012.
[15] R.-E. Precup, A.-D. Balint, E. M. Petriu, M.-B. Radac, and E.-I. Voisan, "PI and PID controller tuning for an automotive application using backtracking search optimization algorithms," in Proc. SACI 2015 - 10th Jubilee IEEE International Symposium on Applied Computational Intelligence and Informatics, 2015.