A Survey of ADAS Technologies for the Future Perspective of Sensor Fusion

Adam Ziebinski¹(✉), Rafal Cupek¹, Hueseyin Erdogan², and Sonja Waechter²

¹ Institute of Informatics, Silesian University of Technology, Gliwice, Poland
{Adam.Ziebinski,Rafal.Cupek}@polsl.pl
² Conti Temic microelectronic GmbH, Ingolstadt, Germany
{Hueseyin.Erdogan,Sonja.Wachter}@continental-corporation.com

Abstract. Traffic has become more complex in recent years and therefore the expectations that are placed on automobiles have also risen sharply. Support for drivers and the protection of the occupants of vehicles and of other persons involved in road traffic have become essential. Rapid technical developments and innovative advances in recent years have enabled the development of many Advanced Driver Assistance Systems that are based on different working principles such as radar, lidar or camera techniques. Some systems only warn the driver of a danger via a visual, audible or haptic signal. Other systems actively engage in the control of a vehicle in emergency situations. Although the technical development is already quite mature, there are still many development opportunities for improving road safety. The further development of current applications and the creation of new applications that are based on sensor fusion are essential for the future. This paper presents a short summary of the capabilities of ADAS and of selected ADAS modules. The review is oriented toward the future perspective of sensor fusion applied on an autonomous mobile platform.

Keywords: ADAS · Radar sensor · Lidar sensor · Camera sensor · Sensor fusion

1 Introduction

The expectations that are placed on automobiles have changed continuously in recent years and have risen sharply. Nowadays, safety in particular plays an enormous role, in addition to the performance and comfort of a vehicle. The protection of a vehicle’s occupants and of other persons involved in road traffic has become essential. In today’s traffic, a driver has to perform a variety of manoeuvres, which can easily lead to excessive demands. These manoeuvres range from navigating to driving and stabilizing a vehicle. With the expansion of infrastructure and ever-evolving technology, the technical possibilities in the field of automobile production have also improved continuously. To this end, vehicles are equipped with numerous electronic components that provide support for a driver. However, the application of these systems differs. Some systems only warn a driver of a danger via a visual, audible or haptic signal. Other systems actively engage in the control of a vehicle in emergency situations, which is intended to avoid accidents that result from mistakes and carelessness in complex traffic. Furthermore, most of these systems contribute to more fuel-efficient and relaxed driving. These complex components can be divided according to their intended use as follows [1–5]:
• Human Machine Interfaces form the basis for communication between humans and machines. Presenting information on a display is much more visual and therefore provides a driver with a better understanding. For example, navigation tools are optimised by such visualisations. Furthermore, the outputs of a request about a vehicle’s status are more transparent. Regulating the heating, ventilation and radio station settings is clearer and therefore easier. For this reason, a driver is less distracted and can concentrate more on driving the car. These components mainly increase the level of comfort during driving while, at the same time, they also increase safety by reducing distractions from the road traffic [6].
• Safety Telematics Systems are solutions for processing information via telecommunications that enable car-to-car or car-to-environment communication. These solutions can be grouped as systems for intelligent driving. They provide more comfortable and relaxed driving and help to handle confusing situations in traffic, for example via navigation and the announcement of traffic jams and road closures. A driver can therefore focus more on the traffic situation and does not have to search for the right route. One application area is, for example, the avoidance of traffic jams or the use of navigation systems that take over or assist a driver in the task of navigation and target acquisition. Although at first sight this is a comfort-enhancing system, it also contributes to safety because a driver who does not have to navigate can focus more on the driving task [7].
• Vehicle Surrounding Sensors are used to prevent accidents by monitoring a vehicle’s environment. Through various sensors, a car’s environment is checked for obstacles or other cars that are not always visible to a driver. The sensors warn a driver of a potential collision in different ways or work together with active sensors to actively support a driver in critical or dangerous situations. The known technical benefits that are realised through the use of these sensors are, for example, parking assistance and lane change assistance [6].
• Another group, Active Safety Sensors, actively supports a driver in dangerous situations. These systems take control of a vehicle, for example via Electronic Stability Control or Brake Assist. They extend the impact of a driver’s actions and therefore reduce the negative effects of a slow human response. Thus, traffic accidents can be reduced [8].
• Passive Safety Systems act immediately after an accident has happened in order to reduce the consequences of the accident and to protect passengers and pedestrians from more severe injuries. The most well-known example is the airbag. An example of pedestrian protection is a pressure sensor in the bumper that can reliably detect accidents involving people and activate safety systems in a vehicle’s body to lessen the injuries [6].
The in-vehicle elements presented above, together with roadway sensors and vehicular communication, create a new perspective for intelligent transportation systems [9]. However, a full and comprehensive analysis of all of the components and technologies that are leading the way in the realization of intelligent transportation systems is beyond the scope of this paper. Instead, the authors focus on selected application examples (Sect. 2) and the sensors used in contemporary Advanced Driver Assistance Systems (Sect. 3). The level of detail presented should be sufficient to compare the areas of application of the different ADAS solutions and sensors. This analysis does not cover the whole range of possible options but only shows selected representative examples of the given technologies in order to find the proper coverage for the sensor fusion discussed in Sect. 4. The final conclusions are presented in Sect. 5.

2 Various Applications of ADAS

The aim of Advanced Driver Assistance Systems (ADAS) is to reduce the consequences of accidents, to prevent traffic accidents and, in the near future, to facilitate fully autonomous driving. The development of driver assistance systems began with the Anti-lock Braking System (ABS), which was introduced into serial production in the late 1970s. A comprehensive description of the evolution of driver assistance systems as well as the expected development directions can be found in [10]. The main steps on the development path of driver assistance systems can be classified as: proprioceptive sensors, exteroceptive sensors, and sensor networks. Proprioceptive sensors were able to detect and respond to dangerous situations by analysing the behaviour of the vehicle. Exteroceptive sensors such as ultrasonic, radar, lidar, infrared and vision sensors are able to respond at an earlier stage and to predict possible dangers. Further improvements are expected from the application of multi-sensor platforms and traffic sensor networks. The aim of this section is to survey the capabilities of contemporary exteroceptive sensors and of the ADAS that are based on them.
ADAS do not act autonomously but provide additional information about the traffic situation to support a driver and assist in implementing critical actions. The synchronization of a driver’s actions with the information from the environment, and furthermore the recognition of the current situation and the possible vehicle manoeuvres, is essential for the efficient performance of the various ADAS applications [10]. Some ADAS application examples are described below.
Blind Spot Detection (BSD) monitors the area next to a vehicle. The function of a Blind Spot Detection system is to warn a driver with a visual signal, such as a sign in the side-view mirror [6], or with an audible signal when there are objects in the blind spots. The aim of this system is to avoid potential accidents, especially during lane change manoeuvres in heavy traffic [11].
The Rear Cross Traffic Alert (RCTA) can help to avoid accidents when reversing out of a parking space, which can often lead to serious accidents with pedestrians or cyclists that involve personal injuries. For this function, the environment behind a vehicle is monitored and checked for objects. In the event that an object is detected in the reverse driving direction, a driver receives an audible and a visible warning [6].
The Intelligent Headlamp Control (IHC) regulates the lights of a vehicle automatically according to the environmental conditions. This application optimizes the changes between full-beam and dipped headlights during night-time drives. Driving at night or through tunnels is therefore more comfortable and safer. Moreover, the drivers of oncoming vehicles are no longer blinded by a vehicle’s lights [6].
Another safety application is the Traffic Sign Assist (TSA). This system automatically recognises traffic signs (including the signs of different countries) and can process the information they contain. Therefore, a driver is able to receive important information such as the legal speed limit or the current priority rules. By providing such information, the Traffic Sign Assist enables more relaxed and also safer driving [11].
The Lane Departure Warning (LDW) scans the sides of the road and detects when a vehicle is leaving the lane or the road. By monitoring the steering movements, the system is able to evaluate whether a lane change is intentional. The system warns a driver of an inadvertent lane change with a visual or a haptic warning such as steering wheel vibrations. Traffic accidents that are caused by vehicles leaving the road or collisions with passing or parked cars can thus be reduced [6].
The Emergency Brake Assist (EBA) enhances driving safety via active braking
support and by automatically braking in dangerous situations. Rear-end collisions can
therefore be avoided entirely. Furthermore, the consequences of accidents are reduced
because of the reduction in the impact speed and impact energy. The emergency brake
assist is also a possible interface for pre-crash applications and restraint systems or
pedestrian protection [11].
The Adaptive Cruise Control with Stop & Go functions (ACC + S&G) controls the
distance to the vehicle in front, even in stop-and-go situations. It either warns a driver
or actively slows a vehicle’s speed if the relative distance becomes too small. This
application supports a driver, particularly in congested traffic and tailback situations.
One consequence is more comfortable and stress-free driving with the flow of traffic.
Furthermore, safety is enhanced due to a pre-defined distance and a warning when emergency braking is needed [11].
In addition to those mentioned above, many new applications [10] are being developed and optimized continuously to enhance the safety of passengers [12] and of pedestrians and animals [13, 14] and also to provide more comfortable and economical driving.

3 Advanced Driver Assistance Systems

Advanced Driver Assistance Systems have become indispensable in today’s vehicles. Due to the increasing need for mobility, traffic has become more and more complex and therefore a greater challenge for all road users. ADAS are essential in order to avoid accidents and any concomitant injuries or possible fatalities [15]. Furthermore, they provide solutions for comfortable, economical and intelligent driving. In the last few years, more and more different types of complex control units have been developed and integrated into vehicles [9, 16]. These systems differ in their operating principles and application areas. ADAS use surrounding sensors such as radar, infrared, video or ultrasound to monitor and analyse a vehicle’s environment.

Fig. 1. ADAS and their vehicle position [6]

Various companies such as Bosch, Continental, Delphi Automotive, Freescale, Texas Instruments and many other suppliers provide different types of ADAS solutions for end-user applications. They offer microcontrollers, microprocessors or complete sensors for ADAS. Freescale, for example, creates embedded processing solutions for the automotive industry and provides solutions with basic rear-, smart rear- and surround-view cameras as well as a 77 GHz radar system [17]. With its ADAS application processors, Texas Instruments offers Original Equipment and Design Manufacturers a fully integrated mixed processor solution. The processors are highly integrated, programmable platforms whose benefits are the combination of high processing performance, low power usage, a smaller footprint and a flexible and robust operating system. They enable embedded automobile technology by including a front camera, rear camera, surround view, radar and fusion on a single, heterogeneous and scalable architecture [18]. Other suppliers such as Continental or Bosch [19] provide complete sensors for ADAS that are also based, for example, on a camera or radar. Although the ADAS solutions of the various suppliers differ in their construction, the main working principle, the mounting placement in the car and their application for ADAS are similar. Because the equipment from Continental is used in the AutoUniMo project [20], this review was prepared based on examples of their ADAS products (Fig. 1) [6].

3.1 Radar Sensor

A Short Range Radar (SRR) detects objects and can measure their relative distance and velocity. This radar sensor is integrated at the four corners of a car behind the bumpers. Two sensors work for the forward detection of an object and two for the reverse detection. The sensor can be mounted in a vertical mounting window of 0.5 m up to 1.2 m from the ground. The sensor is based on Pulse Compression Modulation, which means that pulsed electromagnetic waves, which propagate at the speed of light, are emitted by the sensor. The operating frequency is 24 GHz. The emitted waves are reflected when they hit an object. The reflection depends on the material, size and shape of the object. Due to special antennas, the electromagnetic waves can be bundled in a particular direction. Thus, it is possible to determine the exact angular coordinates of the reflecting object. By measuring the time between the transmission and the reflection, the distance and speed of the object can be detected in a field of view of ±75° up to 50 m [21]. These reflected radar signals are combined into clusters according to their position and movement. The cluster information is transmitted via CAN every cycle. The position of the object is calculated relative to the sensor and the output is in a Cartesian coordinate system. In this way, the sensor simultaneously measures the distance to an object, the relative velocity and the angular relationship between two or more objects in real-time. The sensor provides two values for the distance and two values for the velocity. The relative distance between the sensor and an object is measured in the longitudinal and lateral directions with a resolution of up to 0.1 m. Additionally, the velocity is given for the lateral and longitudinal directions in a measurement range of up to 35 m/s with a resolution of 0.2 m/s in the longitudinal and 0.25 m/s in the lateral direction. Because it has the functions of object detection and distance measuring, the sensor is used for several safety applications such as Blind Spot Detection and Rear Cross Traffic Alert. Different sensor types are available for several measurement ranges of radar systems. Long-range radar is designed for long-distance applications, for example, Emergency Brake Assist and Adaptive Cruise Control [21]. With an operating frequency of 77 GHz, the sensor has a measurement range of up to 250 m.
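As a rough illustration of this working principle, the range follows from the pulse round-trip time and the Cartesian output from the measured angle. The following is a minimal sketch, not Continental's implementation; the timing and angle values are illustrative assumptions:

import math

C = 299_792_458.0  # propagation speed of the radar pulse (m/s)

def echo_range_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting object from the pulse round-trip time."""
    return C * round_trip_time_s / 2.0

def to_cartesian(range_m: float, azimuth_deg: float) -> tuple[float, float]:
    """Convert the measured range and angle to sensor-relative
    (longitudinal, lateral) Cartesian coordinates, as reported on CAN."""
    a = math.radians(azimuth_deg)
    return range_m * math.cos(a), range_m * math.sin(a)

# An echo received ~0.33 us after transmission puts the object near the
# SRR's 50 m limit; at 30 degrees off boresight it lies ~42.8 m ahead
# and ~24.7 m to the side.
r = echo_range_m(0.33e-6)      # ~49.5 m
x, y = to_cartesian(r, 30.0)   # (~42.8, ~24.7)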

3.2 Multi Function Camera


The Multi-Function Camera (MFC) [6] is mounted in the vicinity of the rear-view mirror behind the windscreen at a height of 1–1.45 m above the ground with a maximum lateral shift of ±0.1 m. The optical path can be cleaned by the wipers. The camera is available in a mono or a stereo version. Due to the perspective differences between the two camera images, the stereo camera can detect objects in front of the car and measure their distance within a range of 20–30 m. The redundancy of the second camera also provides a higher degree of reliability. The camera can furthermore detect whether and where an object is moving. A grayscale image is captured and processed by the camera, which operates with a monochrome CMOS image sensor. It provides an image every 40 ms with a resolution of 640 × 496 pixels, a pixel size of 7.5 × 7 µm and a colour depth of 8 bits. A detection angle of approximately 44° is achieved at a focal length of 6 mm. Plausibility algorithms and filters are used to gather further information on the environment. Visible edges in the video image, for example, can represent vehicle contours, a lane marking or the headlights of other cars. A typical processing frequency is 12 Hz.
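As a rough illustration of the stereo measuring principle (a minimal sketch, not the MFC's actual algorithm; the baseline and disparity values are assumptions), the distance follows from the disparity between the two images as Z = f·b/d for a rectified pair:

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * b / d for a rectified stereo pair (f in pixels)."""
    if disparity_px <= 0:
        raise ValueError("object at infinity or invalid match")
    return focal_px * baseline_m / disparity_px

# With an assumed 0.2 m baseline and the 6 mm lens on 7.5 um pixels
# (f = 6e-3 / 7.5e-6 = 800 px), a 5-pixel disparity places the object
# at 32 m, near the 20-30 m stereo measuring range given above.
print(stereo_depth_m(800.0, 0.2, 5.0))  # 32.0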
With these characteristics, the camera can be used for different advanced driver assistance functions such as Intelligent Headlamp Control [21]. For this application, the camera provides information about the angle and distance to the next light point. The sensor is also used for the Lane Departure Warning and therefore measures the distance to the left and right edges of the roadway and the allowable distance before an alarm is triggered. This function is possible up to a road width of 2.5–4.6 m and at a distance of 7.5–90 m. Another application of the camera is the Traffic Sign Assist. The camera sensor provides information about the current speed limit, gives feedback about no-passing bans and warns when a street is entered in the wrong driving direction. All of the system outputs are transmitted via a CAN bus [6, 21].

3.3 Surround View Camera


ADAS are continually being developed in order to improve their efficiency and to increase the number of their applications. The Multi-Function Camera that has already been described is a very useful sensor and is used in a large number of safety applications, although its field of view is limited. An essential prerequisite for autonomous driving, however, is knowledge of all of the relevant information from the vehicle’s entire environment. An innovative system that fulfils these requirements is the Surround View Camera [6]. A Surround View Camera provides a complete three-dimensional 360° view via a “fisheye” camera and also provides the possibility for augmented reality. Therefore, the surround view sensor can be used for several applications. One of these functions is Blind Spot Detection, which enables cyclists and pedestrians to be easily tracked. The camera provides stress-free vehicle manoeuvrability. Parking by a driver as well as automated parking is especially easy and trouble-free. Moreover, the camera ensures safer lane changes and overtaking.

Furthermore, the surround view camera is a very small sensor and can therefore be installed in a very space-saving manner, for example in the ceiling of a car or on the side-view mirrors. Size is another challenge in development and production. Sensors have to be smaller while still integrating new and complex technology that has to be of high quality and reliable. The dimensions and tolerance deviations have become smaller, which requires ever more precise manufacturing [1, 6].

3.4 Lidar

A Short Range Lidar (SRL) sensor [6] is installed in the area of the rear-view mirror of a vehicle and is useful for a measurement range of 1 to 10 m. A lidar sensor is used for the Emergency Brake Assist. The distance to objects in front of a vehicle, in proportion to the vehicle’s speed, can be measured with an SRL sensor. The sensor can distinguish between obstacles for which a vehicle needs to stop, such as pedestrians who are crossing a street, and those for which a vehicle need not stop, such as rising exhaust emissions, road damage, etc. These parameters, including the force of braking, differ for each customer. Infrared sensors operate by transmitting energy from a laser diode. The reflected energy is focused onto a detector consisting of an array of pixels. The measured data is then processed using various signal-processing methods. These sensors can operate both night and day. Their disadvantages are their sensitivity to inclement weather conditions and ambient light. This sensor uses three independent channels. A field of view of 27° in the horizontal and 11° in the vertical direction is achieved via the three laser pulses [21]. By measuring the time between the emitted and received pulses, the relative distance to the reflecting object can be determined with an accuracy of ±0.1 m. The results for the relative distance and speed are given in CAN frames. One value for the distance and one value for the speed are measured by each of the three independent laser beams. Thus, the sensor provides three distance frames as well as three speed frames. All of the frames have the same structure. The measured data are communicated via a High-Speed CAN bus [21].
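As an illustration of how such output might be consumed, a minimal python-can receiver could map the three distance frames and three speed frames to per-channel readings. The arbitration IDs and the 0.01 m and 0.01 m/s scalings below are placeholder assumptions for this sketch, not the sensor's documented protocol:

import can  # python-can

DIST_IDS = {0x310: 0, 0x311: 1, 0x312: 2}   # hypothetical distance frame IDs
SPEED_IDS = {0x320: 0, 0x321: 1, 0x322: 2}  # hypothetical speed frame IDs

def decode(msg: can.Message):
    # First two payload bytes carry the signed raw value (assumed layout).
    raw = int.from_bytes(msg.data[0:2], "big", signed=True)
    if msg.arbitration_id in DIST_IDS:
        return ("distance_m", DIST_IDS[msg.arbitration_id], raw * 0.01)
    if msg.arbitration_id in SPEED_IDS:
        return ("speed_mps", SPEED_IDS[msg.arbitration_id], raw * 0.01)
    return None

with can.Bus(channel="can0", interface="socketcan") as bus:
    for msg in bus:
        reading = decode(msg)
        if reading:
            kind, channel, value = reading
            print(f"laser {channel}: {kind} = {value:.2f}")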

3.5 Multi Function Lidar


A Multi-Function Lidar sensor [6] is a combination of an infrared and a camera sensor. It can be used for the same applications as a camera and a lidar sensor, e.g. Emergency Brake Assist, Lane Departure Warning, Intelligent Headlamp Control or Traffic Sign Assist, but it achieves this with a much smaller range. The Multi-Function Lidar has a compact one-box design with a colour CMOS imager of 1024 × 605 pixels and a 905 nm infrared laser. A combination of several different sensors is necessary to enable the maximum benefits and an efficient safety function. Space-saving sensors are also required by automobile manufacturers. New innovative systems show that this technical trend is currently being pursued. One step in the further development of ADAS is the integration of several different techniques into one sensor. The Multi-Function Lidar is a smart sensor that has the advantages of a lidar as well as those of a camera sensor in a space-saving manner [6, 11, 22].

4 Sensor Fusion

The different ADAS all have their advantages and disadvantages and can be used for different functions and in different application areas (Fig. 2). A camera, for example, has better performance in object detection: it can not only detect an object but can also provide specific information about what kind of object has been detected. A camera also has better lateral and elevation resolution. However, a camera does not perform very well at night or in bad weather conditions, while a safety sensor is supposed to work in all conditions. In such cases, radar and lidar perform much better. These two sensor types also offer a reliable principle for measuring the distance and relative velocity of an object. A camera is only able to measure distance in its stereo version. Each sensor is suitable for different applications and use cases. The aim of this section is to outline how the ADAS devices specified in Sect. 3 can be joined in order to gain additional advantages through sensor fusion.
A good solution is to merge the data from multiple environment-detecting sensors in order to improve the efficiency of environmental detection. One possibility is to use sensors that combine several operating principles, such as the Multi-Function Lidar sensor that was introduced above. This sensor combines the benefits of a camera and of an infrared sensor, and each working principle provides additional information. For example, due to the higher lateral resolution of the camera, the lateral position and lateral speed of an object can be determined more precisely. This complementary information helps to determine whether an emergency manoeuvre has to be carried out.
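One simple way to exploit such complementary accuracies is inverse-variance weighting, a textbook fusion rule shown here as a minimal sketch (the sigma values are illustrative assumptions, not sensor specifications):

def fuse(x_cam: float, sigma_cam: float, x_radar: float, sigma_radar: float):
    """Inverse-variance fusion of two measurements of the same quantity."""
    w_cam, w_radar = 1.0 / sigma_cam**2, 1.0 / sigma_radar**2
    x_fused = (w_cam * x_cam + w_radar * x_radar) / (w_cam + w_radar)
    sigma_fused = (w_cam + w_radar) ** -0.5  # fused estimate is more certain
    return x_fused, sigma_fused

# The camera sees an object 0.9 m to the left (sigma 0.1 m); the radar
# says 1.2 m (sigma 0.3 m). The fused value stays close to the more
# precise camera while reducing the overall uncertainty.
print(fuse(-0.9, 0.1, -1.2, 0.3))  # (-0.93, 0.095)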
Another example is an image-based approach for fast and robust vehicle tracking from a moving platform based on a stereo-vision camera system. The position, orientation and full motion state, including the velocity, acceleration and yaw rate, of a detected vehicle are estimated from a tracked rigid 3-D point cloud. This point cloud represents a 3-D object model and is computed by analyzing image sequences in both space and time, i.e., by the fusion of stereo vision and tracked image features [23, 24].
The risk of false triggering can be reduced by the more accurate calculations that such fusion makes possible. ADAS use information from a vehicle’s environment. Individual solutions calculate this information independently and are partially redundant. With central environment modelling, synergy effects can be exploited, thus resulting in an economically better technical feasibility. Moreover, increased system performance can be achieved via the fusion of the information from different environmental sensors [25], which can also be used for indoor location, the identification of objects and navigation systems [26, 27]. A basic question for the design process is what type of hardware and software architecture is required for such sensor data fusion. As yet, no unified architecture has been determined for the automobile industry. The targeted development of a unified architecture and algorithms is desirable.

Fig. 2. Technological comparison of different advanced driver assistance systems

A fusion of several sensors with different working principles such as radar, infrared and camera is possible using an embedded system [28, 29]. Autonomous driving requires a functional architecture that allows the signals from all of the ADAS to be used for control [30]. The use of multiple sensor systems to realize several active safety applications implies a number of other problems associated with merging their signals [25]. In this case, a problem occurs with assessing the reliability of the information provided by each of the sensors. While in the case of a single sensor its result is used directly, merging several sensors that may deliver overlapping and possibly conflicting information requires additional output information from the devices that provide the component data [25]. Consequently, when standard devices are merged, algorithms must be developed with the functionality to allow the reliability of the information provided to be assessed.
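As one illustration of such a reliability check, a generic consistency gate (a minimal sketch, not a method prescribed by [25]) can test whether two sensors' overlapping measurements agree before they are merged:

def consistent(a: float, sigma_a: float, b: float, sigma_b: float,
               gate: float = 3.0) -> bool:
    """True if |a - b| lies within `gate` combined standard deviations."""
    return abs(a - b) <= gate * (sigma_a**2 + sigma_b**2) ** 0.5

# Radar and lidar agree on ~20 m, so the pair passes and may be merged;
# a conflicting camera estimate of 26 m would be rejected or down-weighted.
print(consistent(20.1, 0.1, 19.9, 0.1))  # True
print(consistent(20.1, 0.1, 26.0, 1.0))  # False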
Such solutions are being realized in the AutoUniMo project, which deals with these new challenges for intelligent driving. A concept for a mobile platform is being developed, including the implementation of real-time CAN communication between several different ADAS, in order to create autonomous and intelligent driving. ADAS that are used in actual vehicle applications were chosen for this project. Some of the sensors described above will be implemented on a mobile platform that can additionally be used in indoor environments. In order to increase the number of advanced control units in a vehicle, the development of efficient solutions for combining these different units is essential. The system will be controlled using a Raspberry Pi (RPI), which will receive and process the measurement frames from the connected sensors. The RPI will be connected to the electronic sensors and ADAS modules by several interfaces (I2C, SPI, UART, CAN). According to the technical requirements, the ADAS sensors will be connected by four separate CAN buses. These four CAN buses will allow two SRRs on the front, two SRRs on the back, the SRL and the MFC to be connected. The requirements for the preparation of communication with the ADAS modules using CAN buses are presented in Fig. 3.

Fig. 3. The requirements for the preparation of communication with the ADAS modules
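A minimal sketch of how the RPI side might poll the four buses with python-can follows; the SocketCAN channel names and the bus-to-sensor assignment are assumptions based on the description above, not the project's actual configuration:

import can  # python-can on a Raspberry Pi with SocketCAN interfaces

SENSOR_BUSES = {
    "can0": "front SRR pair",
    "can1": "rear SRR pair",
    "can2": "SRL",
    "can3": "MFC",
}

buses = {ch: can.Bus(channel=ch, interface="socketcan") for ch in SENSOR_BUSES}

try:
    while True:
        for channel, bus in buses.items():
            msg = bus.recv(timeout=0.0)  # non-blocking poll of each bus
            if msg is not None:
                print(f"{SENSOR_BUSES[channel]} (id=0x{msg.arbitration_id:x}): "
                      f"{msg.data.hex()}")
finally:
    for bus in buses.values():
        bus.shutdown()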

The capability of working in real-time is the essential basis for autonomous driving. Fast data processing and artificial intelligence functions will be realised using an RPI or, for some of the functions, partially in a Field Programmable Gate Array [31]. The RPI solution allows for fast prototyping. Many common applications are available for the RPI that are useful for the rapid implementation of new features: remote monitoring of the mobile platform using WiFi; viewing, processing and analysing measurements; communication with the ADAS modules; etc. An RPI solution on a mobile platform with connected ADAS modules will make it possible in the future to develop and implement new computational intelligence applications for ADAS and to find new solutions for sensor fusion.

5 Conclusions

Traffic has become more complex in recent years. Humans alone are sometimes no longer able to cope with the traffic situation. Increased mobility and therefore an increasing number of road users, confusing traffic situations and signage, time pressure and the desire for continuous availability have made support for drivers a necessity. Rapid technical developments and innovative advances in recent years have enabled the development of control units that enhance the safety of passengers and pedestrians. Many ADAS are on the market and are no longer a luxury item for upscale vehicles but are now standard equipment in low-cost cars. These control units are based on different working principles such as radar, lidar or camera techniques. All of these sensors have their advantages and disadvantages, which means that the most efficient application is a combination of the different types of sensors. The technical developments are already quite mature but there are still many development opportunities for improving road safety. The further development of current applications and the creation of new ones are essential for the future. The design of the hardware and software architecture of the sensor data fusion is a determining factor. Via the CAN interface, the Raspberry Pi can communicate with the ADAS modules. It is therefore now possible to use standard ADAS solutions not only in normal vehicles but also to control a mobile platform using simple processing units.

Acknowledgements. This work was supported by the European Union through the FP7-PEOPLE-2013-IAPP AutoUniMo project “Automotive Production Engineering Unified Perspective based on Data Mining Methods and Virtual Factory Model” (grant agreement no: 612207) and by research work financed from the funds for science in the years 2016–2017 that were allocated to a co-financed international project.

References

1. Winner, H., Hakuli, S., Wolf, G.: Handbuch Fahrerassistenzsysteme. Springer Vieweg, Wiesbaden (2012)
2. Brookhuis, K.A., de Waard, D., Janssen, W.H.: Behavioural impacts of Advanced Driver Assistance Systems – an overview. TNO Human Factors, Soesterberg, The Netherlands
3. Piao, J., McDonald, M.: Advanced driver assistance systems from autonomous to cooperative
approach. Trans. Rev. 28, 659–684 (2008)
4. Schneider, J.H.: Modellierung und Erkennung von Fahrsituationen und Fahrmanövern für
sicherheitsrelevante Fahrerassistenzsysteme. Fakultät für Elektrotechnik und
Informationstechnik, TU Chemnitz (2009)
5. Bertozzi, M., Broggi, A., Carletti, M., Fascioli, A., Graf, T., Grisleri, P., Meinecke, M.: IR
pedestrian detection for advanced driver assistance systems. In: Michaelis, B., Krell, G. (eds.)
DAGM 2003. LNCS, vol. 2781, pp. 582–590. Springer, Heidelberg (2003)
6. Continental Automotive: Advanced Driver Assistance Systems. http://www.continental-automotive.com/www/automotive_de_en/themes/passenger_cars/chassis_safety/adas/
7. Boodlal, L., Chiang, K.-H.: Study of the Impact of a Telematics System on Safe and Fuel-
efficient Driving in Trucks. U.S. Department of Transportation Federal Motor Carrier Safety
Administration Office of Analysis, Research and Technology (2014)
8. Keller, C.G., Dang, T., Fritz, H., Joos, A., Rabe, C., Gavrila, D.M.: Active pedestrian safety by automatic braking and evasive steering. IEEE Trans. Intell. Transp. Syst. 12, 1292–1304 (2011)
9. Tewolde, G.S.: Sensor and network technology for intelligent transportation systems. Presented in May 2012
10. Bengler, K., Dietmayer, K., Farber, B., Maurer, M., Stiller, C., Winner, H.: Three decades of
driver assistance systems: review and future perspectives. IEEE Intell. Trans. Syst. Mag. 6,
6–22 (2014)
11. Vollrath, M., Briest, S., Schiessl, C., Drewes, K., Becker, U.: Ableitung von Anforderungen
an Fahrerassistenzsysteme aus Sicht der Verkehrssicherheit. Berichte der Bundesanstalt für
Straßenwesen. Bergisch Gladbach: Wirtschaftsverlag NW (2006)
12. Fildes, B., Keall, M., Thomas, P., Parkkari, K., Pennisi, L., Tingvall, C.: Evaluation of the
benefits of vehicle safety technology: The MUNDS study. Accid. Anal. Prev. 55, 274–281
(2013)
13. David, K., Flach, A.: CAR-2-X and pedestrian safety. IEEE Veh. Technol. Mag. 5, 70–76
(2010)
14. Horter, M.H., Stiller, C., Koelen, C.: A hardware and software framework for automotive intelligent lighting. Presented in June 2009
15. Hegeman, G., Brookhuis, K., Hoogendoorn, S.: Opportunities of advanced driver assistance
systems towards overtaking. Eur. J. Trans. Infrastruct. Res. EJTIR 5(4), 281 (2005)
16. Lu, M., Wevers, K., Heijden, R.V.D.: Technical feasibility of advanced driver assistance
systems (ADAS) for road traffic safety. Trans. Planning Technol. 28, 167–187 (2005)
17. NXP: Automotive Radar Millimeter-Wave Technology. http://www.nxp.com/pages/automotive-radar-millimeter-wave-technology:AUTRMWT
18. TDA2x - Texas Instruments Wiki. http://processors.wiki.ti.com/index.php/TDA2x
19. Bosch Mobility Solutions. http://www.bosch-mobility-solutions.com/en/
20. AutoUniMo: FP7-PEOPLE-2013-IAPP AutoUniMo project “Automotive Production
Engineering Unified Perspective based on Data Mining Methods and Virtual Factory Model”
(grant agreement no: 612207). http://autounimo.aei.polsl.pl/
21. Continental Industrial Sensors. http://www.conti-online.com/www/industrial_sensors_de_de/
22. Kaempchen, N., Dietmayer, K.C.J.: Fusion of laserscanner and video for ADAS. IEEE Trans.
Intell. Transp. Syst. TITS 16(5), 1–12 (2015)
23. Błachuta, M., Czyba, R., Janusz, W., Szafrański, G.: Data fusion algorithm for the altitude
and vertical speed estimation of the VTOL platform. J. Intell. Rob. Syst. 74, 413–420 (2014)
24. Budzan, S., Kasprzyk, J.: Fusion of 3D laser scanner and depth images for obstacle recognition
in mobile applications. Opt. Lasers Eng. 77, 230–240 (2016)
25. Sandblom, F., Sorstedt, J.: Sensor data fusion for multiple configurations. Presented in June 2014
26. Grzechca, D., Wrobel, T., Bielecki, P.: Indoor location and identification of objects with video surveillance system and WiFi module. Presented in September 2014
27. Tokarz, K., Czekalski, P., Sieczkowski, W.: Integration of ultrasonic and inertial methods in
indoor navigation system. Theor. Appl. Inform. 26, 107–117 (2015)
28. Pamuła, D., Ziębiński, A.: Securing video stream captured in real time. Przegląd
Elektrotechniczny. R. 86(9), 167–169 (2010)
29. Ziebinski, A., Swierc, S.: Soft core processor generated based on the machine code of the
application. J. Circ. Syst. Comput. 25, 1650029 (2016)
30. Behere, S., Törngren, M.: A functional architecture for autonomous driving. In: Proceedings of the First International Workshop on Automotive Software Architecture (2015)
31. Cupek, R., Ziebinski, A., Franek, M.: FPGA based OPC UA embedded industrial data server
implementation. J. Circ. Syst. Comp. 22, 18 (2013)
