A Survey of ADAS Technologies for the Future Perspective of Sensor Fusion

Adam Ziebinski1, Rafal Cupek1, Hueseyin Erdogan2 and Sonja Wachter2
1 Institute of Informatics, Silesian University of Technology, Gliwice, Poland
{Adam.Ziebinski,Rafal.Cupek}@polsl.pl
2 Conti Temic microelectronic GmbH, Ingolstadt, Germany
{Hueseyin.Erdogan,Sonja.Wachter}@continental-corporation.com
Abstract. Traffic has become more complex in recent years and therefore the expectations that are placed on automobiles have also risen sharply. Support for drivers and the protection of the occupants of vehicles and of other persons involved in road traffic have become essential. Rapid technical developments and innovative advances in recent years have enabled the development of a wide range of Advanced Driver Assistance Systems (ADAS) that are based on different working principles such as radar, lidar or camera techniques. Some systems only warn the driver of a danger via a visual, audible or haptic signal. Other systems actively engage in the control of a vehicle in emergency situations. Although technical development is already quite mature, there are still many development opportunities for improving road safety. The further development of current applications and the creation of new applications that are based on sensor fusion are essential for the future. This paper presents a short summary of the capabilities of ADAS and of selected ADAS modules. The review is oriented towards the future perspective of sensor fusion applied on an autonomous mobile platform.
Keywords: ADAS · Radar sensor · Lidar sensor · Camera sensor · Sensor fusion
1 Introduction
The expectations that are placed on automobiles have changed continuously in recent
years and have risen sharply. Nowadays, in particular, safety plays an enormous role,
in addition to the performance and the comfort of a vehicle. The protection of a vehicle’s
occupants and other persons involved in road traffic has become essential. In today’s
traffic, a variety of manoeuvers have to be performed by a driver, which can easily lead to excessive demands. These manoeuvers range from navigation to driving and stabilizing a vehicle. With the expansion of infrastructures and ever-evolving technology,
the technical possibilities in the field of automobile production have also improved
continuously. To this end, vehicles are equipped with numerous electronic components
that provide support for a driver. However, the application of these systems differs. Some
systems only warn a driver via a visual, audible or haptic signal of a danger. Other
systems are used to actively engage in the control of a vehicle in emergency situations,
which is intended to avoid accidents that result from mistakes and carelessness in traffic. A broader discussion of the technologies leading the way in the realization of intelligent transportation systems is out of the scope of this paper. Instead, the authors focus on selected application examples (in Sect. 2) and
sensors used in contemporary Advanced Driver Assistance Systems (in Sect. 3).
The presented level of detail should be sufficient to compare the areas of application for different ADAS solutions and sensors. This analysis does not cover the whole range of possible options but only shows selected representative examples of the given technologies in order to provide adequate coverage for the sensor fusion discussed in Sect. 4. The final
conclusions are presented in Sect. 5.
The aim of Advanced Driver Assistance Systems (ADAS) is to reduce the consequences of an accident, to prevent traffic accidents and, in the near future, to facilitate fully autonomous driving. The development of driver assistance systems began with the Anti-lock Braking System (ABS), which was introduced into serial production in the late 1970s. A comprehensive description of the evolution of driver assistance systems as well as the expected development directions can be found in [10]. The main steps on the development path of driver assistance systems can be classified as: proprioceptive sensors, exteroceptive sensors, and sensor networks. Proprioceptive sensors were able to detect and respond to dangerous situations by analysing the behaviour of the vehicle. Exteroceptive sensors such as ultrasonic, radar, lidar, infrared and vision sensors are able to respond at an earlier stage and to predict possible dangers. Further improvements are expected from the application of multi-sensor platforms and traffic sensor networks. The aim of this section is to survey the capabilities of contemporary exteroceptive sensors and of the ADAS that are based on them.
ADAS do not act autonomously but provide additional information about the traffic situation in order to support a driver and assist them in critical actions. The synchronization of a driver's actions with information from the environment, together with the recognition of the current situation and the possible vehicle manoeuvers, is essential for the efficient performance of the various ADAS applications [10]. Some
ADAS application examples are described below.
Blind Spot Detection (BSD) monitors the area next to a vehicle. The function of a
Blind Spot Detection system is to warn a driver with a visual signal, such as a sign in the side-view mirror [6], or with an audible signal when there are objects in the blind spots. The aim of this system is to avoid potential accidents, especially during lane-change manoeuvers in heavy traffic [11].
The Rear Cross Traffic Alert (RCTA) can help to avoid accidents when reversing
out of a parking space, which can often lead to serious accidents with pedestrians or
cyclists that involve personal injuries. For this function, the environment behind a
vehicle is monitored and checked for objects. In the event that an object is detected in the reverse driving direction, a driver receives an audible and a visible warning [6].
The Intelligent Headlamp Control (IHC) regulates the lights of a vehicle automatically according to the environmental conditions. This application optimizes changes
between full-beam and dipped headlights during night-time drives. Driving at night or
through tunnels is therefore more comfortable and safer. Moreover, drivers in oncoming
vehicles are no longer blinded by a vehicle’s lights [6].
Another safety application is the Traffic Sign Assist (TSA). This system automatically recognises traffic signs (including the signs of different countries) and can process the information they contain. Therefore, a driver is able to receive important information such as the legal speed limit or the current priority rules. As a result of providing such information, the Traffic Sign Assist enables more relaxed and also safer driving [11].
The Lane Departure Warning (LDW) scans the sides of the road and detects when
a vehicle is leaving the lane or the road. By monitoring the steering movements, the system is able to evaluate whether a lane change is intentional. The system warns a driver that the lane has been changed inadvertently with a visual or a haptic warning such as steering wheel vibrations. Traffic accidents that are caused by vehicles leaving the road
or collisions with passing or parked cars can be reduced [6].
The Emergency Brake Assist (EBA) enhances driving safety via active braking
support and by automatically braking in dangerous situations. Rear-end collisions can
therefore be avoided entirely. Furthermore, the consequences of accidents are reduced
because of the reduction in the impact speed and impact energy. The emergency brake
assist is also a possible interface for pre-crash applications and restraint systems or
pedestrian protection [11].
The Adaptive Cruise Control with Stop & Go functions (ACC + S&G) controls the
distance to the vehicle in front, even in stop-and-go situations. It either warns a driver
or actively slows a vehicle’s speed if the relative distance becomes too small. This
application supports a driver, particularly in congested traffic and tailback situations.
One consequence is more comfortable and stress-free driving with the flow of traffic.
Furthermore, safety is enhanced due to a pre-defined distance and a warning when emergency braking is needed [11].
In addition to the above-mentioned systems, many new applications [10] are being developed
and optimized continuously to enhance the safety of passengers [12] and pedestrians or
animals [13, 14] and also to provide more comfortable and economical driving.
A Short Range Radar (SRR) detects objects and can measure their relative distance and
velocity. The radar sensors are integrated at the four corners of a car behind the bumpers. Two sensors are used for the forward detection of objects and two for detection to the rear. The sensor can be mounted in a vertical mounting window of 0.5 m up to 1.2 m from the ground. The sensor is based on Pulse Compression Modulation, which means that the sensor emits pulsed electromagnetic waves that propagate at the speed of light. The
operating frequency is 24 GHz. The emitted waves are reflected when they hit an object.
The reflection is dependent on the material, size and shape of the object. Due to special
antennas, the electromagnetic waves can be bundled in a particular direction. Thus, it is
possible to determine the exact angular coordinates of the reflecting object. By measuring the time between transmission and reflection, the distance and speed of the object can be detected in a field of view of ±75° up to a range of 50 m [21]. These reflected radar
signals are combined in clusters according to their position and movement. The cluster
information is transmitted via CAN every cycle. The position of the object is calculated
relative to the sensor and the output is in a Cartesian coordinate system. In this way the sensor measures the distances to an object, the relative velocity and the angular relationship between two or more objects simultaneously in real-time. The sensor provides
two values for the distance and two values for the velocity. The relative distance between
the sensor and an object is measured in the longitudinal and lateral directions with a
resolution of up to 0.1 m. Additionally, the velocity is given for the lateral and longitudinal directions in a measurement range of up to 35 m/s and a resolution of 0.2 m/s in the longitudinal and 0.25 m/s in the lateral direction. Because it has the functions of
object detection and distance measuring, the sensor is used for several safety applications
such as Blind Spot Detection and Rear Cross Traffic Alert. Different sensor types for
several measurement ranges are available for the radar systems. Long-range radar is
designed for long-distance applications, for example, Emergency Brake Assist and
Adaptive Cruise Control [21]. With an operating frequency of 77 GHz, the sensor has
a measurement range of up to 250 m.
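To illustrate how such cluster data might be turned into physical quantities, the sketch below scales raw CAN signal values using the resolutions quoted above (0.1 m for distance, 0.2 m/s and 0.25 m/s for velocity). The frame layout, signal widths and byte order are illustrative assumptions only and do not correspond to the actual SRR CAN matrix.

# Minimal sketch: scaling raw SRR cluster signals into physical values.
# The 8-byte layout, byte order and signedness below are assumptions for
# illustration only; the scaling factors follow the resolutions quoted above.

from dataclasses import dataclass

DIST_RES_M = 0.1      # longitudinal/lateral distance resolution [m]
VLON_RES_MPS = 0.2    # longitudinal velocity resolution [m/s]
VLAT_RES_MPS = 0.25   # lateral velocity resolution [m/s]


@dataclass
class RadarCluster:
    dist_long_m: float
    dist_lat_m: float
    vel_long_mps: float
    vel_lat_mps: float


def decode_cluster(raw: bytes) -> RadarCluster:
    """Decode one hypothetical 8-byte cluster frame: an unsigned 16-bit
    longitudinal distance, a signed 16-bit lateral distance and two signed
    16-bit velocities, each multiplied by its resolution."""
    return RadarCluster(
        dist_long_m=int.from_bytes(raw[0:2], "big") * DIST_RES_M,
        dist_lat_m=int.from_bytes(raw[2:4], "big", signed=True) * DIST_RES_M,
        vel_long_mps=int.from_bytes(raw[4:6], "big", signed=True) * VLON_RES_MPS,
        vel_lat_mps=int.from_bytes(raw[6:8], "big", signed=True) * VLAT_RES_MPS,
    )


# Example: a frame encoding 12.5 m ahead, 1.0 m to the side, closing at 3.0 m/s.
print(decode_cluster(bytes([0x00, 0x7D, 0x00, 0x0A, 0xFF, 0xF1, 0x00, 0x00])))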
3.4 Lidar
A Short Range Lidar (SRL) sensor [6] is installed in the area of the rear-view mirror of
a vehicle and is useful for a measurement range of between 1 and 10 m. A Lidar sensor
is used for the emergency brake assist. The distance to objects in front of a vehicle, in
proportion to the vehicle’s speed, can be measured with an SRL sensor. The sensor can
distinguish between obstacles for which a vehicle needs to stop, such as pedestrians who are crossing a street, and those for which a vehicle need not stop, such as rising exhaust fumes, road damage, etc. These parameters, including the braking force, differ for each customer. Infrared sensors operate by transmitting energy from a laser diode. The
reflected energy is focused onto a detector consisting of an array of pixels. The measured
data is then processed using various signal-processing methods. They can operate both
night and day. Their disadvantages are their sensitivity to inclement weather conditions
and ambient light. This sensor uses three independent channels. A field of view of 27° in the horizontal and 11° in the vertical direction is achieved via the three laser pulses [21]. By
measuring the time between the emitted and received pulses, the relative distance to the
reflecting object can be determined with an accuracy of ±0.1 m. The results for the relative distance and speed are provided in CAN frames. One value for the
distance and one value for the speed are measured by each of the three independent laser
beams. Thus, the sensor provides three distance frames as well as three speed frames.
All of the frames have the same structure. The measured data are communicated via a
High-Speed-CAN bus [21].
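The measuring principle described above can be summarised in a few lines: the distance follows from half the pulse round-trip time multiplied by the speed of light, with one value per laser channel. The input format below is an illustrative assumption; only the three-channel structure and the 0.1 m quantisation follow the description above.

# Minimal sketch of the lidar time-of-flight principle used by the SRL:
# distance = speed of light * round-trip time / 2, one value per channel.

C_MPS = 299_792_458.0  # speed of light [m/s]


def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting object for one laser pulse."""
    return C_MPS * round_trip_s / 2.0


def srl_distances(round_trips_s: tuple) -> list:
    """One distance per independent laser channel, quantised to 0.1 m."""
    return [round(tof_distance_m(t), 1) for t in round_trips_s]


# Example: round-trip times of about 66.7 ns, 20 ns and 6.7 ns correspond
# to roughly 10 m, 3 m and 1 m, the working range quoted for the SRL.
print(srl_distances((66.7e-9, 20.0e-9, 6.7e-9)))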
4 Sensor Fusion
The different ADAS all have their advantages and disadvantages and can be used for
different functions and in different application areas (Fig. 2). A camera, for example, has better performance in object detection. It can not only detect an object but can also provide specific information about what kind of object has been detected. A camera also has better resolution laterally and in elevation. However, a camera does not perform very well at night or in bad weather conditions. A safety sensor is supposed to work in all conditions. In such cases, radar and lidar perform much better. These two sensor types also have a reliable principle for measuring the distance and velocity to an object. A camera is only able to measure distance in a stereo configuration. Each sensor is suitable for different applications and use cases. The aim of this section is to describe how the ADAS devices specified in Sect. 3 can be joined in order to gain additional advantages through sensor fusion.
A good solution is the merging of data from multiple environment detecting sensors
in order to improve the efficiency of environmental detection. One possibility is to use
sensors that combine several operating principles such as the Multi-Function Lidar
sensor, which was already introduced. This sensor combines the benefits of a camera
and of an infrared sensor. Each working principle provides additional information. For example, due to the higher lateral resolution of the camera, the cross-location and
cross-speed can be determined more precisely. The complementary information helps
to determine whether an emergency manoeuver has to be carried out.
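One simple way to exploit such complementary measurements is to weight each sensor's estimate by its confidence. The sketch below fuses a lateral position reported by a camera (laterally precise) with one reported by a radar (laterally coarse) using inverse-variance weighting; the variance values are illustrative assumptions and this is only one common fusion rule, not the algorithm of any particular product.

# Minimal sketch: inverse-variance weighting of two lateral-position
# estimates. The more confident (lower-variance) sensor dominates the result.

def fuse_lateral(cam_y_m: float, cam_var: float,
                 radar_y_m: float, radar_var: float):
    """Return the fused lateral position and its variance."""
    w_cam, w_rad = 1.0 / cam_var, 1.0 / radar_var
    fused = (w_cam * cam_y_m + w_rad * radar_y_m) / (w_cam + w_rad)
    return fused, 1.0 / (w_cam + w_rad)


# Camera: 1.95 m with 0.05 m^2 variance; radar: 2.30 m with 0.40 m^2 variance.
# The fused value (about 1.99 m) stays close to the more precise camera estimate.
print(fuse_lateral(1.95, 0.05, 2.30, 0.40))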
Another example is an image-based approach for fast and robust vehicle tracking from a moving platform that is based on a stereo-vision camera system. Position, orientation, and
full motion state, including velocity, acceleration, and yaw rate of a detected vehicle,
are estimated from a tracked rigid 3-D point cloud. This point cloud represents a 3-D
object model and is computed by analyzing image sequences in both space and time,
i.e., by fusion of stereo vision and tracked image features [23, 24].
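The basic geometric step behind such a stereo-based point cloud can be illustrated as follows: a tracked image feature with a known disparity is back-projected to a 3-D point (Z = f·B/d), and the motion of that point between frames gives a first velocity estimate. The camera parameters and the simple finite-difference update are illustrative assumptions; the cited work uses a full rigid-body tracking filter rather than this minimal sketch.

# Minimal sketch: back-projecting a tracked image feature to a 3-D point
# from its stereo disparity and estimating its velocity between two frames.

def triangulate(u: float, v: float, disparity_px: float,
                f_px: float, baseline_m: float, cx: float, cy: float):
    """Camera-frame coordinates (x, y, z) of an image point (u, v)
    with the given disparity, focal length and stereo baseline."""
    z = f_px * baseline_m / disparity_px
    x = (u - cx) * z / f_px
    y = (v - cy) * z / f_px
    return x, y, z


def point_velocity(p_prev, p_curr, dt_s: float):
    """Finite-difference velocity of one tracked 3-D point."""
    return tuple((c - p) / dt_s for p, c in zip(p_prev, p_curr))


# Example with assumed camera parameters (f = 800 px, baseline = 0.30 m,
# principal point at (640, 360)) and two consecutive frames 50 ms apart.
p0 = triangulate(700.0, 360.0, 20.0, 800.0, 0.30, 640.0, 360.0)
p1 = triangulate(702.0, 360.0, 21.0, 800.0, 0.30, 640.0, 360.0)
print(p0, point_velocity(p0, p1, 0.05))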
The risk of false triggering can be reduced through more accurate calculations. ADAS use information from a vehicle's environment. Individual solutions
calculate information independently and are partially redundant. With central environment modelling, synergy effects can be exploited, thus resulting in an economically better technical feasibility. Moreover, an increased performance of the systems can be achieved via the fusion of the information from different environmental sensors [25], which could also be used for indoor location, the identification of objects and navigation systems [26, 27]. A basic question for the design process is what type of hardware and software architecture is required for such sensor data fusion. As yet, no unified architecture has been determined for the automobile industry. The targeted development of
a unified architecture and algorithms is desirable.
The fusion of several sensors with different working principles such as radar, infrared and camera is possible using an embedded system [28, 29]. Autonomous driving requires the preparation of a functional architecture that allows the signals from all of the ADAS to be used for control [30]. The use of multiple sensor systems to realize several active safety applications implies a number of other problems associated with merging the signals [25]. In this case, a problem occurs with assessing the reliability of the information provided by each of the sensors. While in the case of a single sensor the information about its result can be used directly, merging several sensors, whose information may overlap and possibly conflict, requires additional output information from the devices providing the component information [25]. As a consequence, when standard devices are merged, algorithms must be developed with functionality that allows for a reliability assessment of the information provided. Such solutions are being realized in the AutoUniMo project. The
project deals with such new challenges for intelligent driving. A concept for a mobile
platform is being developed, including the implementation of real-time CAN communication between several different ADAS in order to create autonomous and intelligent
driving. The ADAS that are used in actual vehicle applications were chosen for this project. Some of the sensors described above will be implemented on a mobile platform, which could additionally be used in an indoor environment. In order to increase the number of advanced control units in a vehicle, the development of efficient solutions for combining these different units is essential. The system will be controlled using a
Raspberry Pi (RPI), which will receive and process the measurement frames from the
connected sensors. The RPI will be connected to the electronic sensors and ADAS
modules by several interfaces (I2C, SPI, UART, CAN). According to the technical requirements, the ADAS sensors will be connected via four separate CAN buses. These four CAN buses will allow two SRR on the front, two SRR on the back, the SRL and the MFC to be
connected. The requirements for the preparation of communication with the ADAS
modules using CAN buses are presented in Fig. 3.
Fig. 3. The requirements for the preparation of communication with the ADAS modules
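A minimal sketch of how the RPI could receive frames from the four CAN buses is shown below, using the python-can library on top of SocketCAN. The interface names and the simple print handler are assumptions made only for illustration; the actual assignment of sensors to buses follows the requirements in Fig. 3.

# Minimal sketch: receiving ADAS frames on a Raspberry Pi over four
# SocketCAN interfaces with python-can. Interface names and the handler
# are illustrative assumptions.

import time

import can

CHANNELS = ["can0", "can1", "can2", "can3"]  # e.g. SRR front, SRR rear, SRL, MFC


def handle_frame(msg: can.Message) -> None:
    """Placeholder processing step: print the raw frame."""
    print(f"{msg.channel}: id=0x{msg.arbitration_id:X} data={msg.data.hex()}")


def main() -> None:
    buses = [can.Bus(channel=ch, interface="socketcan") for ch in CHANNELS]
    # One Notifier dispatches received messages from all buses to the handler.
    notifier = can.Notifier(buses, [handle_frame])
    try:
        while True:
            time.sleep(1.0)  # real application: decode, fuse and act on the data
    finally:
        notifier.stop()
        for bus in buses:
            bus.shutdown()


if __name__ == "__main__":
    main()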
The capability of working in real-time is the essential basis for autonomous driving.
Fast data processing and artificial intelligence functions will be realised using an RPI or, for some of the functions, partially in a Field Programmable Gate Array [31]. The RPI solution allows for fast prototyping. Many common applications are available for the RPI. They are useful for the rapid implementation of new features: remote monitoring of the mobile platform using WiFi; viewing, processing and analysing measurements; communication with the ADAS modules; etc. In the future, the RPI solution on a mobile platform with connected ADAS modules will allow new computational intelligence applications for ADAS to be developed and implemented and new solutions for sensor fusion to be found.
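As one illustration of the remote-monitoring idea, the sketch below forwards a processed measurement over WiFi as a JSON-encoded UDP datagram. The address, port and payload structure are assumptions made only for this example.

# Minimal sketch: publishing a processed measurement over WiFi for remote
# monitoring as a JSON-encoded UDP broadcast. Address, port and payload
# structure are illustrative assumptions.

import json
import socket
import time

MONITOR_ADDR = ("192.168.0.255", 5005)  # hypothetical broadcast address and port


def publish(measurement: dict) -> None:
    """Send one measurement record to the monitoring station."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(json.dumps(measurement).encode("utf-8"), MONITOR_ADDR)


if __name__ == "__main__":
    publish({"sensor": "SRL", "channel": 1, "distance_m": 3.0, "timestamp": time.time()})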
5 Conclusions
Traffic has become more complex in recent years. Humans alone are sometimes no
longer able to cope with the traffic situation. Increased mobility and therefore an
increasing number of road users, confusing traffic and signage, time pressure
and the desire for continuous availability, have made support for drivers a necessity.
Rapid technical developments and innovative advances in recent years have enabled the
development of Control Units to enhance the safety of passengers and pedestrians. Many
ADAS are on the market and are no longer a luxury item for upscale vehicles but are
now standard equipment in low-cost cars. These control units are based on different
working principles such as radar, lidar or camera techniques. All of these sensors have
their advantages and disadvantages, which means that the most efficient applications use a combination of the different types of sensors. The technical developments are
already quite mature but there are still many development opportunities for improving
road safety. The further development and the creation of new applications are essential
for the future. The design of the hardware and software architecture of the sensor data
fusion is a determining factor. Via the CAN interface, the Raspberry Pi can communicate with the ADAS modules. It is therefore now possible to use standard ADAS solutions not only in normal vehicles but also to control a mobile platform using simple processing units.
Acknowledgements. This work was supported by the European Union through the FP7-
PEOPLE-2013-IAPP AutoUniMo project “Automotive Production Engineering Unified
Perspective based on Data Mining Methods and Virtual Factory Model” (grant agreement no:
612207) and by research work financed from the funds for science in the years 2016-2017 allocated to an international co-financed project.
References
1. Winner, H., Hakuli, S., Wolf, G.: Handbuch Fahrerassistenzsysteme. Springer Vieweg,
Wiesbaden (2012)
2. Brookhuis, K.A., de Waard, D., Janssen, W.H.: Behavioural impacts of Advanced Driver
Assistance Systems–an overview. TNO Human Factors Soesterberg; The Netherlands
3. Piao, J., McDonald, M.: Advanced driver assistance systems from autonomous to cooperative
approach. Trans. Rev. 28, 659–684 (2008)
4. Schneider, J.H.: Modellierung und Erkennung von Fahrsituationen und Fahrmanövern für
sicherheitsrelevante Fahrerassistenzsysteme. Fakultät für Elektrotechnik und
Informationstechnik, TU Chemnitz (2009)
5. Bertozzi, M., Broggi, A., Carletti, M., Fascioli, A., Graf, T., Grisleri, P., Meinecke, M.: IR
pedestrian detection for advanced driver assistance systems. In: Michaelis, B., Krell, G. (eds.)
DAGM 2003. LNCS, vol. 2781, pp. 582–590. Springer, Heidelberg (2003)
6. Continental Automotive, ‘Advanced Driver Assistance Systems’. http://www.continental-
automotive.com/www/automotive_de_en/themes/passenger_cars/chassis_safety/adas/
7. Boodlal, L., Chiang, K.-H.: Study of the Impact of a Telematics System on Safe and Fuel-
efficient Driving in Trucks. U.S. Department of Transportation Federal Motor Carrier Safety
Administration Office of Analysis, Research and Technology (2014)
8. Keller, C.G., Dang, T., Fritz, H., Joos, A., Rabe, C., Gavrila, D.M.: Active pedestrian safety by automatic braking and evasive steering. IEEE Trans. Intell. Transp.
Syst. 12, 1292–1304 (2011)
9. Tewolde, G.S.: Sensor and network technology for intelligent transportation systems.
Presented at the May (2012)
10. Bengler, K., Dietmayer, K., Farber, B., Maurer, M., Stiller, C., Winner, H.: Three decades of
driver assistance systems: review and future perspectives. IEEE Intell. Trans. Syst. Mag. 6,
6–22 (2014)
11. Vollrath, M., Briest, S., Schiessl, C., Drewes, K., Becker, U.: Ableitung von Anforderungen
an Fahrerassistenzsysteme aus Sicht der Verkehrssicherheit. Berichte der Bundesanstalt für
Straßenwesen. Bergisch Gladbach: Wirtschaftsverlag NW (2006)
12. Fildes, B., Keall, M., Thomas, P., Parkkari, K., Pennisi, L., Tingvall, C.: Evaluation of the
benefits of vehicle safety technology: The MUNDS study. Accid. Anal. Prev. 55, 274–281
(2013)
13. David, K., Flach, A.: CAR-2-X and pedestrian safety. IEEE Veh. Technol. Mag. 5, 70–76
(2010)
14. Horter, M.H., Stiller, C., Koelen, C.: A hardware and software framework for automotive
intelligent lighting. Presented at the June (2009)
15. Hegeman, G., Brookhuis, K., Hoogendoorn, S.: Opportunities of advanced driver assistance
systems towards overtaking. Eur. J. Trans. Infrastruct. Res. EJTIR 5(4), 281 (2005)
16. Lu, M., Wevers, K., Heijden, R.V.D.: Technical feasibility of advanced driver assistance
systems (ADAS) for road traffic safety. Trans. Planning Technol. 28, 167–187 (2005)
17. NXP - Automotive Radar Millimeter-Wave Technology. http://www.nxp.com/pages/
automotive-radar-millimeter-wave-technology:AUTRMWT
18. TDA2x - Texas Instruments Wiki. http://processors.wiki.ti.com/index.php/TDA2x
19. Bosch Mobility Solutions. http://www.bosch-mobility-solutions.com/en/
20. AutoUniMo: FP7-PEOPLE-2013-IAPP AutoUniMo project “Automotive Production
Engineering Unified Perspective based on Data Mining Methods and Virtual Factory Model”
(grant agreement no: 612207). http://autounimo.aei.polsl.pl/
21. Continental: Industrial Sensors. http://www.conti-online.com/www/industrial_sensors_de_de/
22. Kaempchen, N., Dietmayer, K.C.J.: Fusion of laserscanner and video for ADAS. IEEE Trans.
Intell. Transp. Syst. TITS 16(5), 1–12 (2015)
23. Błachuta, M., Czyba, R., Janusz, W., Szafrański, G.: Data fusion algorithm for the altitude
and vertical speed estimation of the VTOL platform. J. Intell. Rob. Syst. 74, 413–420 (2014)
24. Budzan, S., Kasprzyk, J.: Fusion of 3D laser scanner and depth images for obstacle recognition
in mobile applications. Opt. Lasers Eng. 77, 230–240 (2016)
25. Sandblom, F., Sorstedt, J.: Sensor data fusion for multiple configurations. In: Presented at the
June (2014)
26. Grzechca, D., Wrobel, T., Bielecki, P.: Indoor location and identification of objects with video surveillance system and WiFi module. In: Presented at the September (2014)
27. Tokarz, K., Czekalski, P., Sieczkowski, W.: Integration of ultrasonic and inertial methods in
indoor navigation system. Theor. Appl. Inform. 26, 107–117 (2015)
28. Pamuła, D., Ziębiński, A.: Securing video stream captured in real time. Przegląd
Elektrotechniczny. R. 86(9), 167–169 (2010)
29. Ziebinski, A., Swierc, S.: Soft core processor generated based on the machine code of the
application. J. Circ. Syst. Comput. 25, 1650029 (2016)
30. Behere, S., Törngren, M.: A functional architecture for autonomous driving. In: Proceedings of the First International Workshop on Automotive Software Architecture (2015)
31. Cupek, R., Ziebinski, A., Franek, M.: FPGA based OPC UA embedded industrial data server
implementation. J. Circ. Syst. Comp. 22, 18 (2013)