Advanced Driving Assistance Systems for an Electric Vehicle Pau Muñoz-Benavent, Leopoldo Armesto, Vicent Girbés, J. Ernesto Solanes, Juan Dols, Adolfo Muñoz, Josep Tornero Universitat Politècnica de València. Inst. of Design and Manufacture. {pmunyoz, leoaran, vgirbes, jesolanes, jdols, amunyoz, jtornero}@idf.upv.es

Abstract – This paper describes the automation of a Neighbourhood Electric Vehicle (NEV) and the embedded distributed architecture for implementing an Advanced Driving Assistance System with haptic, visual and audio feedback in order to improve safety. For the automation, the original electric signals have been conditioned and mechanisms for actuation and haptic feedback have been installed. An embedded distributed architecture based on two low-cost boards has been chosen and implemented under the ROS framework. The system includes features such as collision avoidance and motion planning.

I. INTRODUCTION
According to statistics reported by Guoyong [1], 95% of traffic accidents are related to human factors, while about 70% are directly related to human errors. In this sense, driving a vehicle is a process that requires sufficient skill to adapt to changes in the environment. Nowadays, many solutions are available on commercial vehicles. However, such solutions lack the flexibility to adapt, and to compete at low prices, that would be needed to become part of a massively industrialized solution. The current trend in vehicle automation is to develop solutions where the driver and an intelligent system coexist and interact with a common aim: improving safety. This research falls within one of the most relevant areas of interest not only in the automotive sector, but also in sectors related to forklift, bus and wheelchair manufacturers, among others.
In this paper, the automation process of a Neighbourhood Electric Vehicle (NEV) is described, including a description of its particular hardware and software architecture, as well as of the Advanced Driving Assistance System with haptic, visual and audio feedback implemented in the vehicle to avoid collisions and improve safety.
A. Related Work
A wide range of technologies implement Advanced Driving Assistance Systems (ADAS), most of them in the field of automobiles. Some of the most significant are: Anti-lock Braking System (ABS) [2], Adaptive Cruise Control (ACC) [3], Adaptive Headlights (AH) [4], Lane Change Assistant (LCA) [5] / Blind Spot Detection (BSD) [6], Driver Drowsiness Monitoring and Warning (DDMW)

[7], Electronic Brake Assist System (EBAS) [8], Electronic Stability Control (ESC) [9], Gear Shift Indicator (GSI) [10], Lane Departure Warning System (LDWS) [11], Night Vision (NV) [12], Obstacle and Collision Warning (OCW) [13], Pedestrian / Vulnerable Road User Protection (VRUP) [14], Tire Pressure Monitoring System (TPMS) [15], Up-to-date Traffic Information (UTI) [16], Intelligent Speed Adaptation or Intelligent Speed Advice (ISA) [17], Adaptive Light Control (ALC) [18], Automatic Parking (AP) [19], Traffic Sign Recognition (TSR) [20], Hill Descent Control (HDC) [21].
In ADAS, the driver plays an important role, demanding more and more functionality. This leads researchers to focus on how people interact with these systems: driving based on selectable maneuvers [22], interface design dedicated to satisfying drivers' needs [23], development of human-machine interface components, visual, auditory and haptic, around the concepts of safe speed and safe distance [24], and the combination of user preferences and safety margins to generate optimal maneuvers [25]. Likewise, monitoring the vehicle's environment is an important field in ADAS, demanding a large number of sensors and leading to work on sensor fusion, for example of laser scanner and video [26]. There is also an ongoing effort in feedback components for the driver, where haptic interfaces are gaining ground, such as pedals for controlling deceleration [27]. Many studies focus on developing simulation methods for evaluating and verifying the quality, safety and functionality of these systems [28], as well as on analysing the effects, in terms of distance and velocity, that they produce with regard to safety [29].

II. AUTOMATION OF AN ELECTRIC VEHICLE
The Bombardier is a Neighbourhood Electric Vehicle (NEV) built in fibreglass, designed for short trips on low-speed roads (45 km/h max.). The vehicle has been adapted to perform manually assisted driving, although it can also be used to perform autonomous tasks. The system is composed of a direction subsystem, related to the steering wheel, and a speed subsystem, related to the main drive electric motor. We have extended its capabilities in order to effectively implement an ADAS by installing additional range and imaging sensors such as lasers, sonars and cameras.

Fig. 1. Bombardier NEV

A. Direction Subsystem
In order to fully automate the Bombardier NEV, a Renault Twingo steering column has been installed on the direction subsystem to act as a power-assisted steering wheel. The steering column motor is a permanent-magnet DC motor with an integrated electric clutch to engage the motor to the column. The motor is managed through a power stage over an I2C interface, while the clutch is controlled with a digital signal. In addition, a GMR (Giant Magnetoresistive) sensor has been conditioned to deliver a continuous analogue signal proportional to the stress applied over the steering wheel, and is therefore used as a torque sensor. An absolute encoder has been attached to the steering column to measure the axis angle.

Fig. 3. Speed Subsystem diagram

For that purpose, the speed subsystem has been adapted to regulate the vehicle's speed by installing two additional subsystems, one for each pedal (see Fig. 4). On the throttle pedal we have installed a proportional blocking system with a lever controlled by a servo. This system acts as a haptic device, providing feedback to the driver when the speed limit is exceeded. For the brake pedal, we have installed a mechanism with a servo that allows us to control the brake position by mechanically pulling the pedal in.

Fig. 4. Throttle and brake haptic feedback system

Fig. 2. Steering column mounted on the Bombardier

B. Speed Subsystem
The original speed subsystem is composed of a speed sensor, throttle and brake pedals, and the vehicle speed controller, as shown in Fig. 3. The speed sensor is a Hall-effect sensor, originally included with the vehicle, which delivers a frequency-modulated pulse that has been conditioned into a continuous voltage-level signal. Both pedals are in fact potentiometers whose position can be measured. When the brake pedal is pressed, a hydraulic pump activates the brake pads on the left front wheel; however, its electric signal has no effect on the vehicle speed controller. As a consequence, only the throttle pedal can be used to regulate the vehicle speed from an external electronic device, while the position of the brake pedal can be monitored but not used to regulate the speed. In any case, for an ADAS it is very important for the user to have haptic feedback as a natural interface for sensing the risk of a given situation.

C. Exteroceptive Sensors
In order to perform intelligent tasks for the implemented ADAS, such as detecting people and other vehicles and avoiding collisions with the environment, additional sensors have been installed: a ring of sonar sensors based on the Polaroid 6500 module, networked through a CAN bus, a front SICK LMS200 laser scanner, and camera modules.

III. HARDWARE ARCHITECTURE FOR EMBEDDED PROCESSING
We use a two-layered embedded processing system. The low-level system handles all vehicle-related electric signals, while the high-level system processes images from the vision system and executes the main parts of the assisted-driving application.
The RoBoard RB-110 is based on a Vortex86DX 32-bit CPU @1 GHz with 256 MB RAM, running a Linux-based OS with the Robot Operating System (ROS) as its main software component. The dimensions of the board are 96 mm x 50 mm and it accepts supply voltages from 6 to 24 volts, with very low power consumption. It is equipped with several interfaces that allow us to connect to all kinds of vehicle signals: 16 PWM/GPIO channels, high-speed serial bus, TTL serial port, RS-485, 3 USB 2.0 ports, AD converter, I2C bus, 10/100M LAN and a miniPCI socket.
The IGEPv2 board is based on an OMAP3730 processor @1 GHz with 512 MB RAM and a dedicated DSP for image processing, with 10/100M LAN, USB ports, DVI output (for a touch-screen panel) and audio in/out connectors. The main characteristic of the OMAP3 processor is its dual-processor capability: tools are provided for cross-compiling for Linux-based ARMv7 platforms while accessing the DSP through a shared memory space and DSPLINK messages using the Codec Engine framework. It also provides hardware-accelerated graphics based on the PowerVR SGX540, which can be used to develop real-time multimedia applications for assisted driving.

Fig. 5. Hardware connections

IV. ASSISTED DRIVING SYSTEM WITH HAPTIC-VISUAL-AUDIO FEEDBACK
A. Concept Description
In autonomous driving solutions, the robot is usually commanded by specific targets that should be reached within a map. In order to reuse most of the existing implementations and algorithms for autonomous robots, an ADAS must be adapted to provide valid goals according to the driver's intentions. In manual-assisted mode, the goal towards which the vehicle should be directed can be computed directly from odometry sensor data. The driver steers the wheel and throttles the vehicle according to his intentions. This generates a deviation in the steering wheel angle α and linear velocity v_t of an equivalent artificial "front wheel" of a type (1,1) vehicle [30], which are treated as inputs to compute the target linear velocity v_target and angular velocity ω_target of the vehicle, based on the standard tricycle kinematic model and non-holonomic constraints:

v_target = v_t · cos(α),    ω_target = (v_t / l) · sin(α)
ẋ = v_target · cos(θ),    ẏ = v_target · sin(θ)        (1)

where l is the distance between the front and rear wheels. In order to compute the target goal x_target and y_target, (1) must first be solved:

x_target = x + 2·(v_target/ω_target)·sin(ω_target·T_hor/2)·cos(θ + ω_target·T_hor/2)
y_target = y + 2·(v_target/ω_target)·sin(ω_target·T_hor/2)·sin(θ + ω_target·T_hor/2)        (2)

where x, y and θ are the current robot Cartesian position and orientation and T_hor is the time horizon. (2) is singular when ω_target ≈ 0, in which case the approximation sin(ω_target·T_hor/2) ≈ ω_target·T_hor/2 should be used instead:

x_target = x + v_target·cos(θ)·T_hor
y_target = y + v_target·sin(θ)·T_hor        (3)

Fig. 6 shows the locus of target goals for different time-horizons, steering wheel angles and linear velocities.

a) Time-horizon T_hor = 2 sec.

b) Time-horizon T_hor = 3 sec.

Fig. 6. Target goal geometric locus for a non-holonomic vehicle.
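As an illustration, the target-goal computation (1)-(3) can be sketched as follows; this is a minimal example, not the vehicle's actual implementation, and the function name and singularity tolerance are our own:

```python
import math

def target_goal(x, y, theta, v_t, alpha, l, T_hor, eps=1e-6):
    """Compute the target goal (x_target, y_target) from the driver's
    steering angle alpha and linear velocity v_t, per (1)-(3)."""
    v_target = v_t * math.cos(alpha)         # (1): equivalent linear velocity
    w_target = v_t * math.sin(alpha) / l     # (1): equivalent angular velocity
    if abs(w_target) < eps:                  # (3): near-singular, straight-line case
        return (x + v_target * math.cos(theta) * T_hor,
                y + v_target * math.sin(theta) * T_hor)
    half = w_target * T_hor / 2.0            # (2): constant-curvature arc
    x_t = x + 2.0 * (v_target / w_target) * math.sin(half) * math.cos(theta + half)
    y_t = y + 2.0 * (v_target / w_target) * math.sin(half) * math.sin(theta + half)
    return x_t, y_t
```

With a zero steering angle the goal lies T_hor seconds ahead along the current heading, matching (3).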

As mentioned before, using such a target we can integrate generic navigation frameworks to deal with the driving assistance problem rather than the autonomous navigation one. In particular, we use the move_base framework to design a global planner that provides a valid plan taking into account the obstacles of the environment. This plan is used by a local planner to propose desired linear and angular velocities, as if it were controlling an autonomous robot. The haptic interface takes these desired velocities into account in order to provide haptic feedback to the driver.
A standard global planner takes as inputs the proposed goal, the current robot pose and a 2D map containing information about obstacles. Assuming that the vehicle will describe an arc in the short term, the global planner reconstructs such a path, based on the current velocities and position, and trims it into a shorter path (a.k.a. the collision-free plan) that is guaranteed obstacle-free using the map. In this step, the shape of the robot is taken into account, inflated with a safety margin distance using the inscribed radius of the vehicle

footprint. The local planner takes the proposed plan and computes desired linear and angular velocities (v_d and ω_d) based on the plan length d (to avoid frontal collisions) and on the closest point of a potential collision using the true robot footprint (to determine which turning direction should be avoided). Therefore, in order to safely decelerate the vehicle and avoid a frontal collision with the closest obstacle, we assume, for simplicity, a second-order dynamic model of an electric motor:

V(s)/U(s) = K/(τ·s + 1),    D(s)/V(s) = 1/s        (4)

where V(s) ≡ L[v(t)] is the vehicle's linear velocity, D(s) ≡ L[d(t)] the travelled distance and U(s) ≡ L[u(t)] is the driver's input with range u(t) ∈ [−u_max, u_max], L being the Laplace transform. Without loss of generality, we assume that the vehicle reaches a top linear velocity v_max when the driver fully accelerates, that is, u(t) = u_max; therefore K = v_max/u_max, and τ is the time constant. Applying the inverse Laplace transform to (4) under maximum deceleration, with initial conditions v(0) = v and d(0) = 0, where v is the current vehicle velocity:

d(t) = τ·(1 − e^(−t/τ))·v − v_max·[t − τ·(1 − e^(−t/τ))]
v(t) = e^(−t/τ)·v − v_max·(1 − e^(−t/τ))        (5)

we can compute the time needed to reach null velocity when applying the maximum negative acceleration:

t = τ · ln((v_max + v)/v_max)        (6)

Therefore, the minimum travel distance d_min is:

d_s = τ·[v − v_max · ln((v_max + v)/v_max)]        (7)

d_min = d_s + d_front        (8)

where d_front is the distance from the vehicle centre to the front of its body (plus a security margin). On the other hand, a maximum distance, indicating the influence distance of an obstacle, is simply computed as d_max = d_min + D_max, where D_max is a design parameter representing the anticipation distance, so that the vehicle reduces its speed before reaching the "inevitable collision" distance d_min. Therefore, in order to assist vehicle speed control, we propose the following desired velocity:

v_d = { 0                                    if d < d_min
      { v − v·(d_max − d)/(d_max − d_min)    if d_min < d < d_max        (9)
      { v                                    otherwise

Based on this desired velocity, our ADAS modifies the positions of the throttle servo and the brake as follows:

servo = { φ_min                                            if v ≤ v_d
        { (φ_max − φ_min)·(v − v_d)/(v·(1 − α)) + φ_min    if α·v < v_d < v        (10)
        { φ_max                                            otherwise

brake = { 0                                 if α·v < v_d
        { −a_max·(α·v − v_d)/(α·v)          if 0 < v_d ≤ α·v        (11)
        { −a_max                            otherwise

where φ_min and φ_max are the servo's minimum and maximum reachable positions, and α ∈ (0,1) is a design parameter that establishes the desired behaviour when d_min < d < d_max: the higher the α value, the more aggressive the behaviour. As can be seen from (10) and (11), the proposed haptic device linearly cancels the user's action over the throttle pedal once the plan length d falls below d_max, proportionally pushing the throttle pedal out from its minimum allowed position φ_min towards its maximum position φ_max. Extra deceleration may be required if the vehicle gets closer to the nearest potential obstacle than a distance d_brake, related to the design parameter α; in that case, the brake assistance becomes active and increases linearly up to the maximum allowed deceleration −u_max as the vehicle approaches d_min.
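The stopping-distance and desired-velocity computations (6)-(9) can be sketched in Python as follows; this is an illustrative fragment under the first-order model (4), with function names of our own choosing:

```python
import math

def safe_distance(v, v_max, tau, d_front):
    """Minimum stopping distance (6)-(8) under maximum deceleration.
    Returns (d_min, t_stop)."""
    # (6): time until the velocity reaches zero
    t_stop = tau * math.log((v_max + v) / v_max)
    # (7): distance travelled while decelerating
    d_s = tau * (v - v_max * math.log((v_max + v) / v_max))
    # (8): add the distance from vehicle centre to its front (plus margin)
    return d_s + d_front, t_stop

def desired_velocity(v, d, d_min, d_max):
    """Desired linear velocity (9): stop inside d_min, scale linearly
    in the obstacle's influence zone, otherwise keep the current velocity."""
    if d < d_min:
        return 0.0
    if d < d_max:
        return v - v * (d_max - d) / (d_max - d_min)
    return v
```

At d = d_min the desired velocity is zero and at d = d_max it equals the current velocity, so the assistance engages smoothly at the boundaries of the influence zone.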

Fig. 7. Haptic device response
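The throttle-servo and brake mappings (10) and (11), whose response Fig. 7 depicts, can be sketched as follows; a minimal illustration, not the paper's implementation:

```python
def servo_position(v, v_d, alpha, phi_min, phi_max):
    """Throttle-blocking servo position (10): fully released below v_d,
    pushed out linearly between alpha*v and v, fully engaged otherwise."""
    if v <= v_d:
        return phi_min
    if alpha * v < v_d:   # alpha*v < v_d < v
        return (phi_max - phi_min) * (v - v_d) / (v * (1.0 - alpha)) + phi_min
    return phi_max

def brake_command(v, v_d, alpha, a_max):
    """Brake deceleration command (11): inactive above alpha*v, then
    increasing linearly up to the maximum allowed deceleration."""
    if alpha * v < v_d:
        return 0.0
    if 0.0 < v_d:         # 0 < v_d <= alpha*v
        return -a_max * (alpha * v - v_d) / (alpha * v)
    return -a_max
```

Both mappings are continuous at their branch boundaries, which matters for a haptic device: the driver feels a gradually stiffening pedal rather than a step.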

In order to provide a desired angular velocity we propose:

ω_d = ω + ω_comp        (12)

where ω_comp ∈ {1, 0, −1} is a compensation value that depends on the quadrant in which the potential collision is found, as shown in Fig. 8. The torque τ_motor applied to the power-assisted steering wheel is affected by this compensation value as follows:

τ_motor = τ_torque + K_τ · sign(ω − ω_d)        (13)
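The steering compensation (12)-(13) can be sketched as below; the compensation value w_comp is taken as an input because its quadrant mapping is given graphically in Fig. 8, and the gain K_tau is an assumed parameter:

```python
def steering_torque(omega, tau_torque, w_comp, K_tau):
    """Assisted steering torque (12)-(13). w_comp in {-1, 0, +1} is the
    compensation chosen from the quadrant of the potential collision."""
    omega_d = omega + w_comp            # (12): desired angular velocity
    delta = omega - omega_d             # equals -w_comp
    sign = (delta > 0) - (delta < 0)    # sign() of the velocity error
    return tau_torque + K_tau * sign    # (13): torque sent to the column
```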


The Vehicle_Controller node is responsible for reading the vehicle's proprioceptive sensor data and publishing the "/state" topic, representing the current state of the vehicle, which includes pedal measurements, gear sensing, torque measurement, etc. The Vehicle_Controller node is also responsible for publishing the odometry data on the "/odom" topic. Moreover, it executes the lowest level of the control loop between the driver and the vehicle. Reference commands, containing the desired throttle servo and brake positions and the torque to be applied to the steering column, are received from the ADAS_Controller through the "/reference" topic. ADAS_Controller has two main objectives: on the one hand, it computes a tentative goal based on the current state of the vehicle, read from the "/state" topic, following the algorithm described in Section IV, (1)-(3); on the other hand, it processes desired linear and angular velocities, read from "/cmd_vel", to compute the reference commands (10), (11) and (13). The Move_Base node supports any global planner adhering to the nav_core::BaseGlobalPlanner interface and any local planner adhering to the nav_core::BaseLocalPlanner interface, both specified in the nav_core package [31]. For that purpose we have implemented two plug-ins adhering to these interfaces.
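The topic wiring between Vehicle_Controller and ADAS_Controller can be illustrated with a tiny in-memory publish/subscribe bus standing in for the ROS transport; the topic names follow the paper, while the message payloads and class structure are placeholders of our own (real nodes would use rospy/roscpp publishers and subscribers):

```python
class Bus(object):
    """Minimal publish/subscribe bus mimicking ROS topics."""
    def __init__(self):
        self.subs = {}
    def subscribe(self, topic, cb):
        self.subs.setdefault(topic, []).append(cb)
    def publish(self, topic, msg):
        for cb in self.subs.get(topic, []):
            cb(msg)

class VehicleController(object):
    """Low level (RoBoard): publishes /state and /odom, consumes /reference."""
    def __init__(self, bus):
        self.bus = bus
        self.last_reference = None
        bus.subscribe('/reference', self.on_reference)
    def on_reference(self, msg):
        # servo, brake and steering-torque commands to apply to the actuators
        self.last_reference = msg
    def tick(self, state, odom):
        self.bus.publish('/state', state)
        self.bus.publish('/odom', odom)

class ADASController(object):
    """High level (IGEPv2): turns /state and /cmd_vel into /reference."""
    def __init__(self, bus):
        self.bus = bus
        self.state = None
        bus.subscribe('/state', self.on_state)
        bus.subscribe('/cmd_vel', self.on_cmd_vel)
    def on_state(self, state):
        self.state = state
    def on_cmd_vel(self, cmd):
        # here (10), (11) and (13) would compute the actuator commands
        self.bus.publish('/reference', {'servo': 0.0, 'brake': 0.0, 'torque': 0.0})

bus = Bus()
vc = VehicleController(bus)
adas = ADASController(bus)
vc.tick({'v': 1.0}, {'x': 0.0})          # low level publishes its state
bus.publish('/cmd_vel', {'v': 0.5})      # local planner proposes velocities
```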


Fig. 8. Desired angular velocity assigned based on the quadrant in which the potential collision is found.

B. Software Architecture
The software architecture is implemented under ROS (Robot Operating System) [31] as a distributed architecture, publishing topics and subscribing to them transparently through an Ethernet port. The system is implemented as different nodes using the ROS framework, defining our own nodes compatible with the Move_Base node of the navigation stack [31]. An overview of the system is given in Fig. 9, where grey boxes represent ROS nodes and white ovals represent additional components such as plug-ins. The software architecture is divided into two coarse control levels: the low level, implemented on the RoBoard RB-110, is in charge of executing the Vehicle_Controller node, while the high level, on the IGEPv2, integrates the decision control unit and the visualization unit, executing ADAS_Controller, Move_Base, Laser_Driver and Visualization.

Fig. 9. ROS-based implemented solution

As mentioned in Section IV, the global planner computes a collision-free plan according to the vehicle kinematics, ensuring that the plan is directed towards the desired goal. The local planner, in turn, implements (4)-(9) and (12), providing the desired linear and angular velocities.

Fig. 10. Visualization unit GUI


The Visualization node subscribes to all topics and provides a Graphical User Interface (GUI) on the touch-screen panel. The GUI uses the OpenGL ES 2.0 library for the OMAP3 platform (IGEPv2). Its aim is to visualize the current state of the vehicle and its surroundings so that the driver receives visual feedback. The GUI shows a simplified overview of the position and heading of the vehicle in its near environment. This is accomplished with an orthographic top view of the scene containing the footprint of the vehicle, nearby obstacles (both real and inflated) and a heading arrow. The arrow encodes the direction of heading, the current velocities and the risk of collision in a colour map: its length depends on the ratio between the current and desired linear velocities, while its colour varies from green (no risk of collision) to red (potential collision). The system is mainly designed to support the driver in case of danger or poor visibility of the surroundings, while remaining idle during normal driving. Future versions of the system are planned to include additional visual and auditory feedback, such as a proposed direction of heading, camera feeds of blind spots and signal tones.

V. CONCLUSIONS AND FURTHER WORK
This paper has described the automation of an electric vehicle and the embedded distributed architecture for implementing an Advanced Driving Assistance System with haptic, visual and audio feedback in order to improve safety. An embedded distributed architecture based on two low-cost boards has been chosen and implemented under the ROS framework. The system includes features such as collision avoidance and motion planning.
Research will be carried out on haptic-audio-visual interfaces for driving assistance, extending the current development with more elaborate information, such as mobile object detection and 3D map generation: development of new haptic interfaces adapted to the problem, integration of surround sound to localize mobile objects from previously processed information, and 3D visualization of the environment with image interpretation of information relevant to the driver.
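The heading-arrow encoding used by the GUI (length from the velocity ratio, colour from green to red with risk) can be sketched as below; the linear colour interpolation and the normalized risk value in [0, 1] are assumptions of ours:

```python
def heading_arrow(v, v_d, risk):
    """Arrow drawn by the visualization GUI: length reflects the ratio of
    current to desired linear velocity, colour interpolates from green
    (risk = 0, no collision risk) to red (risk = 1, potential collision)."""
    length = v / v_d if v_d > 0 else 0.0
    colour = (risk, 1.0 - risk, 0.0)   # RGB components in [0, 1]
    return length, colour
```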

Fig. 11. Vehicle simulator

Furthermore, we will focus on researching and implementing vision, laser and radar algorithms for pedestrian and mobile object detection, prediction and tracking. Sensor fusion with non-conventional techniques and incremental learning for mobile object prediction will also be studied.

Moreover, we plan to use a vehicle simulator currently in development (see Fig. 11) to generate different scenarios in which to evaluate the safety performance of the proposed system as well as of other assistance systems. In addition, the evaluation of the methodological benchmark based on the different metrics already proposed in [32] will be studied.

VI. BIBLIOGRAPHY
[1] Mu Guoyong, Chen Mingwei, "Study on Status Quo and Countermeasure of Road Traffic Safety Based on Accident Stat.", Communications Standardization, 2005(02), pp. 37-39.
[2] Bosch, 1978.
[3] Labuhn, Pamela I., Chundrlik Jr., William J., "Adaptive Cruise Control", United States Patent 5454442, 1995.
[4] Norman E., "Adaptive/Anti-blinding Headlights", United States Patent 6144158, 2000.
[5] Kaller, J. and Hoetzer, D., "Lane-Change Assistant for Motor Vehicles", United States Patent Application 20050155808, 2009.
[6] Brett A., "Vehicle Blind Spot Detector", United States Patent 4694295, 1987.
[7] Otman, Adam, Jean Pierre, Fakhreddine, Desrochers, Kristopher, "Drowsiness Detection System", United States Patent 6822573, 2004.
[8] Mercedes-Benz (S-Class, SL-Class), "Brake Assist", 1998.
[9] Sawada, Mamoru, Matsumoto, Toshiki, "Vehicle Stability Control System", United States Patent 7577504, 2009.
[10] Nurse, Charles Alexander, "Gear Shift Indicator", United States Patent 3985095, 1976.
[11] Nissan Motors (Cima), "Lane Keeping Support", 2001.
[12] Toyota (Landcruiser Cignus), windshield night vision system, 2002.
[13] Grosch, Theodore O., "Radar Sensors for Automotive Collision Warning and Avoidance", Proceedings of SPIE - The International Society for Optical Engineering, vol. 2463, 1995, pp. 239-247.
[14] Parks, Brent, "Hinge Device for Pedestrian Protection System", United States Patent Application 20070246281, 2009.
[15] Doerksen, Ben F., Nattinger, Donald M., "Tire Pressure Monitoring System", United States Patent 4816802, 1989.
[16] Shyu, Jia-ming, "Traffic Information Inter-Vehicle Transference and Navigation System", United States Patent 5428544, 1995.
[17] Päätalo, M., Peltola, H., Kallio, M., "Intelligent Speed Adaptation - Effects on Driving Behaviour", Traffic Safety on Three Continents Conference, 2001.
[18] Kong, Hongzhi, Sun, Qin, Ansari, Adil, Burns, Jeffrey H., "Adaptive Lighting Control for Vision-based Occupant Sensing", United States Patent 7095002, 2006.
[19] Macphail, Margaret Gardner, Kumhyr, David Bruce, "System and Method for Automated Parking", United States Patent 6646568, 2003.
[20] Stromme, Oyvind, "Automatic Traffic Sign Recognition", United States Patent 6813545, 2004.
[21] Gallery, Michael John, Parsons, Keith Gary Reginald, Beever, Paul Adrian, "A Wheeled Vehicle with Hill Descent Control", European Patent EP0856446, 2001.
[22] Kauer, M., Schreiber, M. and Bruder, R., "How to conduct a car? A design example for maneuver based driver-vehicle interaction", Intelligent Vehicles Symposium (IV), 2010 IEEE, pp. 1214-1221, 2010.
[23] L. Anders, C. Fang, A. Per and C. Per, "Using Personas and Scenarios as an Interface Design Tool for Advanced Driver Assistance Systems", Springer Berlin / Heidelberg, pp. 460-469, 2007.
[24] Adell, E., Varhelyi, A., Alonso, M. and Plaza, J., "Developing human-machine interaction components for a driver assistance system for safe speed and safe distance", Intelligent Transport Systems, IET, vol. 2, no. 1, pp. 1-14, 2008.
[25] Biral, F., Da Lio, M. and Bertolazzi, E., "Combining safety margins and user preferences into a driving criterion for optimal control-based computation of reference maneuvers for an ADAS of the next generation", Intelligent Vehicles Symposium, 2005. Proceedings. IEEE, pp. 36-41, 2005.
[26] N. Kaempchen and K. Dietmayer, "Fusion of laserscanner and video for advanced driver assistance systems", in Proceedings of ITS 2004, 11th World Congress on Intelligent Transportation Systems, Japan, 2004.
[27] Mulder, M., van Paassen, M.M., Pauwelussen, J. and Abbink, D.A., "Haptic car-following support with deceleration control", IEEE International Conference on Systems, Man and Cybernetics (SMC 2009), pp. 1686-1691, 2009.
[28] B. Schick, B. Kremer, J. Henning and M. zur Heiden, "Simulation Methods to Evaluate and Verify Functions, Quality and Safety of Advanced Driver Assistance Systems", IPG Technology Conference, 2008.
[29] Adell, E., Várhelyi, A. and Fontana, M. D., "The effects of a driver assistance system for safe speed and safe distance - A real-life field study", Transportation Research Part C: Emerging Technologies, 19(1), pp. 145-155, 2011.
[30] G. Campion, G. Bastin and B. d'Andréa-Novel, "Structural Properties and Classification of Kinematic and Dynamic Models of Wheeled Mobile Robots", in Proc. ICRA (1), 1993, pp. 462-469.
[31] www.ros.org
[32] H. Yuste, L. Armesto and J. Tornero, "Benchmark tools for evaluating AGVs at industrial environments", in Proc. IROS, 2010, pp. 2657-2662.