How Advanced Driver Assistance Systems Are Speeding Up Safety and Innovation in the Automotive Industry

Advanced Driver Assistance Systems (ADAS) have been enhancing cars and the driving experience for several years. From lane detection and reversing assistants to cruise control, the technology has advanced safety on the road, as well as comfort inside cars.

In fact, ADAS has the potential to prevent more than 20,800 deaths per year. These systems have unsurprisingly played a big role in the recent EU Vehicle General Safety Regulation, which mandates ADAS features to enhance road safety and provides a framework for a future of fully autonomous vehicles in the region. Meanwhile, in the United States, over 92% of new vehicles have at least one ADAS component built in.

With ADAS being so influential in the automotive industry, how is the technology developing and responding to current challenges? And how will it continue to shape the next generation of driving?

Denijel discusses below.

How ADAS/AD Systems Are Composed Today

ADAS/AD systems are primarily composed of sensors, electronic control units (ECUs), software algorithms, actuators, as well as mapping and localization. Each of these has real-world effects on the automation and safety of cars.

Sensors

Modern driver assistance systems use a variety of sensor technologies to precisely record the vehicle's surroundings. The main challenges include optimizing sensor packaging size to meet tough installation space requirements, protecting external sensors from environmental influences, and optimizing sensor covers like windscreens and chassis to preserve signal quality. Additionally, choosing sensor setups that scale for low, mid, and high-end platforms is crucial – for instance, deciding whether sensors should send raw data (satellite sensors) or pre-processed signals (intelligent sensors) to central processing units. There is also a focus on ensuring functional safety, cost-effective longevity, and maintaining sensor performance over a wide environmental temperature range, from -40 °C to 125 °C.
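
To make the raw-versus-pre-processed distinction concrete, here is a minimal Python sketch that models the two sensor styles as simple data classes. The class names, fields, and the placeholder processing function are illustrative assumptions, not a real automotive interface.

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch only: names and fields are assumptions, not a production interface.

@dataclass
class RawRadarFrame:
    """A 'satellite' sensor sends raw measurements; a central ECU does the heavy processing."""
    timestamp_us: int
    range_bins: List[float]      # unprocessed range samples
    doppler_bins: List[float]    # unprocessed velocity samples

@dataclass
class DetectedObject:
    """An 'intelligent' sensor pre-processes its data and sends an object list instead."""
    object_id: int
    distance_m: float
    relative_speed_mps: float
    azimuth_deg: float

def to_object_list(frame: RawRadarFrame) -> List[DetectedObject]:
    """Placeholder for the signal processing a central ECU would run on raw frames."""
    ...  # peak detection, clustering, and tracking would live here
```

The trade-off sketched here is exactly the scaling question mentioned above: satellite sensors keep the sensor cheap but demand a powerful central ECU, while intelligent sensors shift cost and compute into the sensor itself.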

Processors and Electronic Control Units (ECUs)

ECUs host the ADAS/AD software algorithms. Compliance with functional safety standards according to ASIL and cost-effective cooling concepts are among the current development challenges. Integrating highly specialized software components from different suppliers on a single ECU requires efficient management of suppliers and their delivery artifacts. Processing ever-increasing data streams while meeting the power budget is another major challenge, as the hosted software is often cutting-edge, making it difficult to predict computing power needs in advance. Companies that develop their systems-on-chip (SoCs) specifically to serve their software algorithms often show superior performance compared to those that develop hardware first without software considerations.

Software Algorithms

Software is the backbone of ADAS – its algorithms interpret data provided by sensors and ECUs and make decisions based on the interpretation of the surrounding environment. Since new algorithms are used with every series introduction and since these systems have not matured over decades, SOTIF (Safety Of The Intended Functionality) according to ISO 21448 plays a central role. SOTIF aims to minimize the safety risks that could arise from limitations of an ADAS/AD function.

Actuators

Actuators convert the ADAS software's digital decisions into physical actions such as braking, accelerating, and steering.
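
As a minimal illustration of this decision-to-actuation chain, the Python sketch below turns a detected obstacle distance into a brake request. The thresholds, function names, and values are assumptions chosen for readability, not figures from a real system.

```python
# Minimal, illustrative sketch of the decide -> actuate chain.
# Thresholds and names are assumptions, not values from a production ADAS stack.

def decide_brake_request(distance_m: float, ego_speed_mps: float) -> float:
    """Return a brake request between 0.0 (none) and 1.0 (full) based on time-to-collision."""
    if ego_speed_mps <= 0.0:
        return 0.0
    time_to_collision_s = distance_m / ego_speed_mps
    if time_to_collision_s < 1.0:    # imminent collision: full braking
        return 1.0
    if time_to_collision_s < 2.5:    # warning zone: partial braking
        return 0.5
    return 0.0

def apply_brake(request: float) -> None:
    """Stand-in for the actuator interface that turns the digital request into hydraulic pressure."""
    print(f"Brake actuator command: {request:.0%}")

apply_brake(decide_brake_request(distance_m=20.0, ego_speed_mps=15.0))  # ~1.3 s TTC -> partial braking
```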

Mapping and localization

Mapping and localization combine high-resolution maps with detailed GPS data and environmental information. As a result, AD systems can precisely register a vehicle’s location and movement.
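
As a toy illustration of the map-matching idea, the sketch below snaps a noisy GPS fix to the nearest point of a pre-built HD-map lane centerline to refine the estimated position. The coordinates and names are hypothetical; a real system fuses many more signals (odometry, camera features, LiDAR landmarks).

```python
import math
from typing import List, Tuple

# Toy map-matching sketch: snap a noisy GPS fix to the nearest HD-map centerline point.
# Map points and coordinates are hypothetical example values.

def snap_to_map(gps_fix: Tuple[float, float], centerline: List[Tuple[float, float]]) -> Tuple[float, float]:
    """Return the map point closest to the GPS fix (a crude stand-in for localization)."""
    return min(centerline, key=lambda point: math.dist(point, gps_fix))

lane_centerline = [(0.0, 0.0), (5.0, 0.1), (10.0, 0.3), (15.0, 0.6)]
print(snap_to_map((9.2, 1.4), lane_centerline))  # -> (10.0, 0.3)
```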

In the race for the most advanced AD systems, the companies that can generate the most detailed and accurate HD maps covering the majority of the world's routes will have a decisive advantage.

Sensor Technologies Put Autonomous Driving in a New Lane

ADAS sensors have been installed in cars for more than a decade, and over time, their function has evolved from basic speed control to more advanced obstacle detection and environmental awareness.

For example, front and corner radar sensors have been used to support Adaptive Cruise Control (to adjust vehicles’ speed), Rear Cross Traffic Alert (for reversing out of a parking space), and Automatic Emergency Braking (to activate brakes in a possible collision scenario).

There are two main trends in current radar development: cost-effective radar systems and high-priced imaging radar systems. Cost-effective radar systems enhance detection performance through the use of machine learning in signal processing, while high-priced imaging radar systems are increasingly serving as primary front radar solutions from level 3 onwards. Imaging radars offer better object detection and separation but are significantly more expensive compared to established radar solutions. As with classic radars, the performance of imaging radars also depends on the material properties of the objects detected, their size and angle of incidence, as well as the number of unwanted echoes (“cluttered scenarios”).

Elsewhere, ADAS cameras are indispensable for detecting the environment on public roads and are crucial for any automation efforts. They are the only source of color information about the environment, which is essential for recognizing traffic signs, traffic lights, and road markings. Their strength lies in object classification and more precise angle determination compared to radars. However, they face challenges in distance measurement and sensitivity in poor lighting and adverse weather conditions.

LiDAR systems are active optical sensors that use invisible light pulses to scan the environment and measure the reflection time of these pulses to calculate distances, creating detailed 3D models (point clouds) of the surroundings. LiDAR sensors precisely determine object sizes and distances, making them essential for higher levels of automation. Certain scenarios, such as detecting protruding cargo, are difficult for cameras or radars to handle. However, LiDAR systems are more vulnerable to adverse weather conditions and significantly more expensive, which is why they are primarily used in systems from level 3 onwards, such as the Mercedes-Benz Drive Pilot.
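
The underlying range measurement is a simple time-of-flight calculation: distance = (speed of light × round-trip time) / 2. The short sketch below applies it to an assumed pulse return time.

```python
# Time-of-flight range calculation as used, in principle, by LiDAR sensors.
# The 200 ns return time below is an assumed example value.

SPEED_OF_LIGHT_MPS = 299_792_458

def distance_from_round_trip(round_trip_s: float) -> float:
    """The pulse travels to the target and back, so halve the round-trip path."""
    return SPEED_OF_LIGHT_MPS * round_trip_s / 2

print(f"{distance_from_round_trip(200e-9):.1f} m")  # a 200 ns echo corresponds to ~30 m
```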

In addition to the aforementioned sensor technologies, other sensors are also used for specific ADAS/AD functions, particularly for low-speed use cases such as valet parking or parking assistance.

These include Birds-Eye-View cameras, which generate a 360-degree, top-down view of a vehicle's surroundings and detect obstacles. These camera sensors can assist with parking, lane departure warnings, and obstacle detection. Likewise, ultrasonic sensors can recognize objects at close range, making them ideal for low-speed maneuvers.

Going Up a Gear in Autonomous Driving Levels

There are six levels of automation when it comes to autonomous driving, from level 0 to level 5:

  • Level 0: No automation

  • Level 1: Assisted driving

  • Level 2: Partial automation

  • Level 3: Conditionally automated driving

  • Level 4: Highly automated driving

  • Level 5: Fully automated driving

Since 2015, premium OEMs and some ambitious Tier 1 suppliers have repeatedly announced the imminent series production of level 3 systems. In reality, end customers can currently purchase very few such systems, and only with very limited Operational Design Domains (ODDs). Instead, the industry has shifted back towards level 2+ systems, which offer level 3-like functions such as autonomous highway pilots but still require the driver's attention and monitoring. This approach requires less development effort and keeps the sales price affordable for end customers. Once the desired sales have been achieved with level 2+ systems, the focus will shift back to level 3 and higher automation levels.

There is no universal concept for a minimal sensor set from level 3 onwards. For instance, functions such as valet parking do not require sensors with long-range detection capabilities, whereas a highway pilot system with a lane change function needs forward and rear-facing sensors with high resolution and long-range detection capabilities.

From level 3 onwards, all the sensor technologies listed above are combined to ensure comprehensive detection of the vehicle's surroundings. Compensating for system failures demands additional sensor and ECU redundancies. In level 3 automation, if several critical sensors fail simultaneously (for example, due to dirt on the windshield or fender blocking a camera or radar) or a critical system failure occurs (e.g., failure of the primary energy supply), the system must maintain functionality during the defined handover time to the driver, which is several seconds.
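
A rough sketch of that degraded-mode requirement, under the assumption of a hypothetical monitoring loop: once a critical fault is detected, the system keeps driving on the remaining redundant components and counts down a fixed handover window before it must reach a safe state. The window length, function names, and fallback behavior are illustrative assumptions.

```python
import time

# Illustrative degraded-mode handover sketch; window length and names are assumptions.

def run_handover(driver_has_taken_over, window_s: float = 10.0) -> str:
    """Keep fallback operation active until the driver takes over or the handover window expires."""
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        if driver_has_taken_over():
            return "driver in control"
        # fallback driving on the remaining redundant sensors/ECUs would run here
        time.sleep(0.1)
    return "minimal risk maneuver"  # e.g., a controlled stop if the driver never responds

print(run_handover(lambda: False, window_s=0.5))  # shortened window for the demo
```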

Level 4 requires a significant increase in the number of sensors, resulting in substantially higher costs. The target group primarily comprises fleet operators investing in business models such as taxi services or future delivery services. Expensive sensors are currently being used here, such as a rotating LiDAR on the vehicle roof, which costs between 15,000 and well over 25,000 euros.

Level 5 ultimately marks the transition to full automation. The exact configuration of sensor technologies and their interaction for this level of automation remains a challenge for future research and development—at least as far as vehicles with public street approval are concerned.

From level 2+ onwards, the use and further development of simulation solutions to validate the entire AD stack will have a particularly positive impact on the time to series introduction.
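
As a hedged illustration of what such scenario-based validation means in practice, the sketch below replays a handful of synthetic situations against a trivial planner and checks the expected reaction. Real simulation toolchains model sensors, vehicle dynamics, and traffic in far more detail, but the closed-loop principle is the same; all scenarios, names, and thresholds here are invented for the example.

```python
# Minimal scenario-based validation sketch; scenarios, planner, and expectations are invented.

def plan(distance_m: float, ego_speed_mps: float) -> str:
    """A trivial stand-in planner: brake when the time-to-collision drops below 2.5 s."""
    return "brake" if ego_speed_mps > 0 and distance_m / ego_speed_mps < 2.5 else "keep_lane"

scenarios = [
    {"name": "slow traffic ahead", "distance_m": 12.0, "ego_speed_mps": 10.0, "expected": "brake"},
    {"name": "clear highway",      "distance_m": 120.0, "ego_speed_mps": 30.0, "expected": "keep_lane"},
]

for s in scenarios:
    result = plan(s["distance_m"], s["ego_speed_mps"])
    status = "PASS" if result == s["expected"] else "FAIL"
    print(f'{s["name"]}: {result} ({status})')
```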

At the same time, standards such as ISO 21448 (SOTIF) must mature to minimize the likelihood of critical risks. Regarding sensors and perception software, the focus will continue to be on high data quality, long durability, minimizing the effects of environmental influences on performance, minimizing sensor size, and cost efficiency.

Want to learn more about the move into the digital fast lane? intive will be attending Automobil-Elektronik Kongress 2024. Book to speak with one of our experts at the event, and learn about our cutting-edge engineering services and insights for the automotive sector.

