Annual Reviews in Control (2017) 1–19
Review article

Perception, information processing and modeling: Critical stages for autonomous driving applications

Dominique Gruyer a,∗, Valentin Magnier a, Karima Hamdi a, Laurène Claussmann a,b, Olivier Orfila a, Andry Rakotonirainy c

a LIVIC (COSYS-IFSTTAR), 25 allée des Marronniers, 78000 Versailles Satory, France
b VEDECOM, 77 rue des Chantiers, 78000 Versailles Satory, France
c CARRS-Q, Queensland University of Technology, 130 Victoria Park Road, Q 4059 Kelvin Grove Campus, Australia

Article history:
Received 14 July 2017
Revised 23 September 2017
Accepted 24 September 2017
Available online xxx

Keywords: Perception; Automated driving; Quantified self; Internet of Things; Artificial intelligence; Driver modeling

Abstract

Over the last decades, the development of Advanced Driver Assistance Systems (ADAS) has become a critical endeavor to attain different objectives: safety enhancement, mobility improvement, energy optimization, and comfort. In order to tackle the first three objectives, a considerable amount of research focusing on autonomous driving has been carried out. Most of this work has been conducted within collaborative research programs involving car manufacturers, OEMs, and research laboratories around the world. Recent research and development on highly autonomous driving aims to ultimately replace the driver's actions with robotic functions. The first successful steps were dedicated to embedded assistance systems such as speed regulation (ACC), obstacle collision avoidance or mitigation (Automatic Emergency Braking), vehicle stability control (ESC), and lane keeping or lane departure avoidance. Partially automated driving will require co-pilot applications (which replace the driver in all driving tasks) involving a combination of the above methods, algorithms, and architectures. Such a system is built on complex, distributed, and cooperative architectures requiring strong properties such as reliability and robustness. These properties must be maintained despite complex and degraded working conditions, including adverse weather, fog, or dust as perceived by sensors. This paper is an overview of reliability and robustness issues related to sensor processing and perception. Indeed, before a high level of safety can be ensured in the deployment of autonomous driving applications, a very high level of quality must be guaranteed for the perception mechanisms. Therefore, we will detail these critical perception stages and provide a presentation of usable embedded sensors. Furthermore, in this state-of-the-art study of recent highly automated systems, we will provide remarks and comments about the limits of these systems and potential future research directions. Moreover, we will also give some advice on how to design a co-pilot application with driver modeling. Finally, we discuss a global architecture for the next generation of co-pilot applications. This architecture is based on the use of recent methods and technologies (AI, Quantified Self, IoT, …) and takes into account human factors and driver modeling.

© 2017 Elsevier Ltd. All rights reserved.

Contents

1. Introduction: contextual elements on autonomous driving
2. General architecture and main processing stages for copilot application development
   2.1. The perception of the environment
   2.2. Planning and decision-making
   2.3. The control
3. Perception of the environment: information sources, processing and dedicated architectures
   3.1. Cameras
   3.2. RADAR
   3.3. LIDAR
   3.4. Evaluation of perception algorithms
   3.5. Remarks and comments
4. Perception of the environment: information fusion
   4.1. Constraints on perception
   4.2. Multi-sensors fusion architecture for localization
   4.3. Modeling theory for data fusion
   4.4. Functional and software architecture
5. Prospective and limitations of drivers' information
   5.1. Driver in-the-loop requirements
   5.2. Driver supervision: data management
   5.3. Driver's information modeling
6. Conclusion
References

∗ Corresponding author. E-mail address: [email protected] (D. Gruyer).

https://doi.org/10.1016/j.arcontrol.2017.09.012
1367-5788/© 2017 Elsevier Ltd. All rights reserved.




1. Introduction: contextual elements on autonomous driving

We have witnessed, over the last four decades, the development of Advanced Driver Assistance Systems (ADAS), automated driving systems, and Intelligent Transport Systems (ITS). Most of these systems have been developed within the framework of numerous research projects and programs bringing together car manufacturers and research laboratories around the world. Until very recently, these systems could be seen either as informative systems or as active short-term assistance relying on short-range information. Now, with the challenge of highly autonomous driving, research and recent developments target the highest autonomy levels in order to completely relieve the driver of the driving task.

ADAS are designed to improve safety by avoiding collisions, to minimize energy consumption, and to provide comfort to the vehicle's occupants. More recently, ADAS have clearly appeared as the most important and critical step on the way towards either semi-automated or fully automated vehicles.

In order to undertake the challenges of driving automation and to provide a level of intelligence sufficient to autonomously perform a range of driving tasks, it is mandatory to use a set of processing functions, algorithms, and applications, as shown in Fig. 1. These functionalities allow perceiving, predicting, and estimating the state of the road scene. The key components can be classified into five main categories: obstacles (dynamic, static, vulnerable, non-vulnerable), road attributes and free driving zone (road marking, number of lanes, intersections, …), ego vehicle (positioning and dynamic state), environmental conditions (weather conditions, vertical road signs, …), and the driver (biological and psychological state, current actions, …). Estimating their current or predicted states, as well as their interactions, makes it possible to build local or extended dynamic perception maps (see Fig. 2). Automated driving requires accurate, reliable, and robust information about the first four key components. Table 1 provides an overview of part of the data extracted by sensor data processing for these main key components.

These partially or fully automated systems can be classified according to three types of purposes: (i) mobility functions, (ii) safety functions, and (iii) energy management functions. A mobility function (i) is intended to make the navigation or driving tasks easier and more comfortable for the driver while minimizing, for example, travel time or distance to a geographical goal. The technologies required in this category of applications include Adaptive Cruise Control, Parking Assist, and Lane Keeping Assist. On the other hand, many driving assistance systems have been developed more specifically to improve the user safety aspect


(ii). This began with the anti-lock braking system (ABS) in the 1960s and 1970s, followed by electronic stability control (ESC) in 1995. Finally, over the last decade, many other, more complex embedded systems have reached the mass market: for example, lane departure alert in 2001 and automatic emergency braking (AEB) for collision impact mitigation in 2003. Regarding the energy consumption aspect (iii), the main issue is to identify and implement the most relevant strategies to minimize the consumption of energy (gasoline or electricity).

In all cases, these three issues are interdependent. In order to optimize the consumption aspect (iii), one has to act on the mobility and comfort strategies (i), and acting on the mobility strategies has an impact on the level of safety (ii). In addition, to ensure a higher level of autonomy, it is important to manage the energy available in the vehicle. So, multi-criteria optimization approaches and adaptive strategies have to be implemented in order to find the best balance between the three issues (a minimal, hypothetical sketch of such a trade-off is given below).

Most of the systems mentioned above are now available on mainstream vehicles and not just on luxury vehicles. In 2015, the main ADAS included in consumer vehicles were parking assistance with ultrasonic sensors, driver's view enhancement around the vehicle during parking thanks to multiple cameras, and adaptive cruise control (ACC). The state of current research clearly shows that we are very close to embedded systems enabling partial automation during low-speed driving in heavy traffic. The ADAS market of the current decade is mainly composed of the functionalities listed in Table 2.

Most of the applications listed in Table 2 are commercialized by OEM suppliers such as Valeo, Continental, ZF, TRW, Delphi, etc. Many suppliers propose systems supporting lane keeping and active lane departure avoidance. The same holds for the Park Assist service from Daimler, which uses RADARs to detect vehicles parked on the roadside and identify potentially free parking zones. For the same application, Valeo is working on a fully autonomous parking valet (Park4U) allowing the car to park by itself, with a call mechanism so that it returns to the driver autonomously.

With the increasing number of these embedded systems and the improvement of their capabilities, reliability, and robustness, the field is progressing rapidly towards automated driving, with the main objective of replacing the driver in the driving task. This can be problematic in case of unmanageable situations or failures of the automated control system (due to sensor, actuator, electronic equipment, or software errors or breakdowns), because the system must then be able to quickly warn the driver. To limit the risk level in this machine/human transition situation, it is necessary to predict and anticipate these situations in order to alert the driver sufficiently ahead of time to allow him to take over control of the car. Currently, the management of this machine/human transition step is still a real problem and is therefore a major issue for the development of highly automated driving systems.
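To make the interdependence between the three purposes concrete, the following minimal Python sketch scalarizes mobility, safety, and energy scores into a single cost. It is an illustration only: the metric names, normalization references, weights, and candidate values are our hypothetical assumptions, not quantities taken from this paper, and a real copilot would adapt the weights online.

from dataclasses import dataclass

@dataclass
class StrategyOutcome:
    # Hypothetical metrics for one candidate driving strategy.
    travel_time_s: float   # mobility (i): lower is better
    collision_risk: float  # safety (ii): normalized to [0, 1]
    energy_wh: float       # energy (iii): consumed energy

def cost(o: StrategyOutcome,
         ref_time_s: float = 600.0, ref_energy_wh: float = 1000.0,
         w_mobility: float = 0.3, w_safety: float = 0.5,
         w_energy: float = 0.2) -> float:
    # Weighted sum over dimensionless terms; an adaptive strategy would
    # re-tune the weights (e.g. raise w_energy when the battery is low).
    return (w_mobility * o.travel_time_s / ref_time_s
            + w_safety * o.collision_risk
            + w_energy * o.energy_wh / ref_energy_wh)

candidates = {
    "eco":    StrategyOutcome(680.0, 0.05, 900.0),
    "normal": StrategyOutcome(600.0, 0.08, 1050.0),
    "sport":  StrategyOutcome(520.0, 0.15, 1350.0),
}
best = min(candidates, key=lambda name: cost(candidates[name]))
print(best)  # "eco" with these illustrative numbers and weights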

Please cite this article as: D. Gruyer et al., Perception, information processing and modeling: Critical stages for autonomous driving applications, Annual Reviews in Control (2017), https://doi.org/10.1016/j.arcontrol.2017.09.012

ARTICLE IN PRESS

JID: JARAP

[m5G;October 27, 2017;9:15]

D. Gruyer et al. / Annual Reviews in Control 000 (2017) 1–19

3

Fig. 1. Active and informative driving applications: a complex multiple-layer system for information filtering, processing, and modeling.

Fig. 2. A view of the detection of the three main components (obstacles, road attributes, and ego vehicle).

Table 1. Information taken into account in the local map, with the corresponding number of octets.

• Ego-vehicle: Position (6), Speeds (6), Variances (6), Heading (2)
• Obstacles: Id (1), Position (6), Speeds (6), Variances (6), Heading (2), Confidence (1)
• Roadway: Attributes (6), Type (2), Confidence (1)
• Road signs: Position (6), Type (1), Information (1)
• Weather conditions: Type (1), Density (1), Visibility distance (1)
• Additional information: Risk level (1), Warning (1), Mode (1)
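As a rough illustration of the octet budget in Table 1, the sketch below packs one ego-vehicle record into exactly 6 + 6 + 6 + 2 = 20 octets. Only the field widths come from the table; the little-endian int16 encoding, the (x, y, z) layout, and the fixed-point scaling are assumptions made for this example.

import struct

# position (6), speeds (6), variances (6), heading (2):
# ten little-endian int16 values = 20 octets in total
EGO_VEHICLE_FMT = "<3h3h3h1h"

def pack_ego_vehicle(position, speeds, variances, heading):
    # Each triple is a hypothetical (x, y, z) in fixed-point units.
    return struct.pack(EGO_VEHICLE_FMT, *position, *speeds, *variances, heading)

record = pack_ego_vehicle((1200, -350, 0), (250, 0, 0), (3, 3, 1), 9000)
assert len(record) == 20  # matches the octet counts of Table 1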

In order to structure the different levels of active ADAS and automated driving systems according to the level of automation and the level of interaction and involvement of the driver, a classification distinguishing six different levels has been proposed by the SAE. The first three levels (SAE levels 0, 1, and 2) only deal with driving assistance systems (intervening or warning) that help the driver while he keeps the task of perceiving the environment (the driver is asked to stay aware of the driving task). The three following levels (SAE levels 3, 4, and 5) have been added to classify the various possible modes, from partial and shared automation (system-driver) to full automation without any driver needed in the vehicle. Level 3 provides a reproduction of the driving task, but the driver is expected to stay aware and be able to take over in case of failure. Level 4 is much more difficult to reach: if the driver does not take back control of the vehicle when the system asks, the automated driving system must be able to conduct an emergency maneuver so as to stop the vehicle safely.
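The machine/human transition described above can be pictured as a small state machine. The sketch below is our own illustrative reading of the SAE level 4 requirement, not a mechanism specified in this paper; the 10 s warning horizon is a hypothetical value.

from enum import Enum, auto

class Mode(Enum):
    AUTOMATED = auto()
    TAKEOVER_REQUESTED = auto()
    DRIVER = auto()
    SAFE_STOP = auto()  # emergency maneuver bringing the vehicle to a safe stop

TAKEOVER_BUDGET_S = 10.0  # hypothetical time left to the driver after the alert

def step(mode: Mode, situation_manageable: bool,
         driver_took_control: bool, since_request_s: float) -> Mode:
    if mode is Mode.AUTOMATED and not situation_manageable:
        return Mode.TAKEOVER_REQUESTED           # alert the driver ahead of time
    if mode is Mode.TAKEOVER_REQUESTED:
        if driver_took_control:
            return Mode.DRIVER                   # successful machine/human transition
        if since_request_s > TAKEOVER_BUDGET_S:
            return Mode.SAFE_STOP                # level 4: stop the vehicle safely
    return mode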


Table 2. Main embedded ADAS applications. For each application: proprioceptive sensors and observers / exteroceptive sensors; actuators; main goal and functionality.

• ACC (Adaptive Cruise Control). Sensors: yes, speed (odometers, CAN bus data) / yes (RADAR, cameras, LiDAR). Actuators: yes (longitudinal control). Goal: automatically adjusts the vehicle speed to maintain a safe distance from vehicles ahead.

• ESP/ESC/DSC (Electronic Stability Program). Sensors: yes / no (only in the case of perceptual ESP). Actuators: yes. Goal: detects and reduces loss of traction by applying the brakes to help "steer" the vehicle where the driver intends to go.

• VE (Visibility Enhancement). Sensors: no / yes (cameras). Actuators: no. Goal: improves the driver's visibility with cameras able to perceive in night or foggy conditions.

• AEB (Autonomous Emergency Braking). Sensors: yes (speed), but not mandatory / yes (RADAR, cameras, laser scanner). Actuators: yes (longitudinal control). Goal: at low speed, avoids collisions and mitigates crash severity by identifying critical situations; can warn the driver and reduce the vehicle speed.

• AFL (Adaptive Front Lighting). Sensors: yes (speed and angular velocity (yaw rate)) / yes (RADAR, cameras, laser scanner). Actuators: yes (does not affect vehicle dynamics). Goal: adapts the range and shape of the lighting area to prevent glare and improve the driver's visibility; illuminates the future evolution area.

• BSM (Blind Spot Monitoring). Sensors: no / yes (short-range RADAR and ultrasonic sensors). Actuators: no. Goal: alerts the driver to an obstacle when manoeuvring a lane change (sound and visual alert).

• DMS (Driver Monitoring Systems). Sensors: yes (steering wheel angle, pressure level on the pedals, vehicle dynamics) / yes (cameras and IR). Actuators: no. Goal: monitors the driver's state; requires non-intrusive biometric sensors.

• BCW/FCW (Backward/Forward Collision Warning). Sensors: yes, speed (odometers, CAN bus data) / yes (RADAR, cameras, LiDAR). Actuators: no. Goal: warns the driver of a potential rear or front collision event.

• HUD (Head-Up Display). Sensors: no / no. Actuators: no. Goal: interface with the driver using ADAS outputs and CAN bus data.

• ISA (Intelligent Speed Adaptation). Sensors: yes (ego-localization: positioning, speed, angular velocity, steering angle, accelerations) / yes (cartography, perception of the environment). Actuators: yes (longitudinal control). Goal: alerts the driver or automatically reduces the speed in case of potential speeding.

• LDW (Lane Departure Warning). Sensors: yes (speed and yaw angle) / yes (cameras). Actuators: no. Goal: warns the driver or provides haptic feedback (shared driving task, resistive torque, vibration, …).

• NVS (Night Vision System). Sensors: no / yes (IR camera). Actuators: no. Goal: improves night vision for obstacle detection based on the object temperature (heat level).

• IPAS (Intelligent Parking Assist System). Sensors: yes (speed, steering angle, yaw rate, accelerations) / yes (ultrasonic sensor belt). Actuators: yes (longitudinal and lateral controls). Goal: detects a free parking space and computes the optimal parking manoeuvre.

• PDS (Pedestrian Detection System). Sensors: yes (speed, steering angle or yaw rate) / yes (RADAR, cameras, laser scanner). Actuators: no. Goal: in urban and suburban areas, detects pedestrians in order to warn the driver.

• RSR (Road Sign Recognition). Sensors: no / yes (cameras). Actuators: no. Goal: detects vertical road signs to help the driver respect driving rules; very dependent on visibility conditions.

• SVC (Surround View Cameras). Sensors: no / yes (cameras). Actuators: no. Goal: informative system only; very dependent on visibility conditions.

• CSW (Curve-Speed Warning). Sensors: yes (speed and ego-location) / yes (cartography, map matching). Actuators: no. Goal: warns the driver of dangerous situations when approaching a turn at excessive speed.

• LKA/LDWS (Lane Keeping Assist). Sensors: yes / yes (cameras). Actuators: yes (lateral control). Goal: monitors the ego-lane markings; helps the driver stay in the ego-lane.

• eCall (emergency call). Sensors: yes (shock sensor) / no. Actuators: no. Goal: alerts an emergency service after a crash; requires communication means.

Finally, the last level (level 5) describes systems offering full autonomy, where no human intervention is required. At this level of automation, the control system has to rely on a great number of information sources in order to understand and model the road scene in any situation, and the vehicle should take, at any time, the most appropriate decision in order to guarantee the highest level of safety. This topic is discussed in more detail in the next section. For this reason, the perception stage and the topology of embedded sensors remain a critical research topic and a technological bottleneck. In this context, many research laboratories have developed their own prototypes in order to develop, test, and evaluate autonomous and connected driving applications. Such prototypes are presented in Fig. 3.

Public authorities and certification agencies (such as NCAP) also play a major role in pushing the spread of active safety systems by regulating the introduction of ADAS. This was first the case for ABS, then for ESP, and more recently for AEB. As of today, only a few car manufacturers, such as Tesla, Mercedes, Volvo, Hyundai, and Volkswagen, offer partially automated driving functions. These advanced features require the driver to remain continuously alert, monitor the system, and be ready to take back control at any time. As a result, systems currently sold on passenger vehicles can be ranked in SAE levels 2 and 3, and in very rare cases in level 4 (see Table 2).

Moreover, if we consider the number of possible different road events (pedestrians, two-wheelers, intersections, dense traffic, driving rule violations, deterioration of roads and infrastructure, etc.) to which these systems must respond, it is clear that fully autonomous vehicles equipped with a real copilot system working in all conditions still require research and development.


Fig. 3. Different types of automated vehicles produced by research laboratories.

The will of society to develop autonomous vehicles has revolutionized the automotive market, previously restricted to car manufacturers. The functionalities of environment perception, data management, embedded systems, and connected vehicles have offered research opportunities to new companies, a priori totally outside the automotive world. As an example, the graphics card manufacturer Nvidia now offers dedicated GPU-based cards called PX2 (https://www.nvidia.com/content/tegra/automotive/pdf/automotive-brochure-web.pdf, accessed on 3rd February 2017). These cards are specifically designed to provide an efficient solution, with sufficient power and computing resources, to embed complex multi-function applications involving perception algorithms, safe path planning, decision modules, and the computation of orders for actuator control. It is also interesting to note that, recently, companies like Muse, Eazymile, and Navya (formerly Induct) (http://navya.tech/) have developed and marketed autonomous shuttles for urban areas (city centres) and dedicated zones (airport parking lots, campuses, theme parks, industrial areas, resorts, …). These shuttles operate at low speed (
