Life-saving sensors: How semiconductors are changing car safety

As the world’s population grows and more vehicles are on the road, improvements in traffic safety can’t come soon enough. Although the death rate has more than halved since 2000, from 135 to 64 per 100,000 vehicles, the total number of deaths continues to climb. Today, 94% of accidents are attributable to driver behavior.

Cars are now safer than ever. Improvements in airbags, mandatory seat belts, and vehicle structural and functional design have made it far more likely that drivers and passengers will walk away from a crash. Better braking and steering subsystems, as well as now-common innovations such as anti-lock braking systems (ABS) and electronic stability control (ESC), rely on precise sensors to improve safety, while advanced driver assistance systems (ADAS) make accidents less likely in the first place. As cars become more autonomous, the aim is to reduce this risk even further. The ultimate goal is fully autonomous driving, or “Level 5,” which effectively eliminates human error. Vision Zero is a multinational initiative that envisages no road traffic accidents resulting in death or serious injury. One of the major factors in improving car safety is the growing level of electronics in the car: for example, there are currently more than 230 ON Semiconductor devices per vehicle on average globally.

If we are to achieve Vision Zero, there is no room for complacency, and there is much work to be done. Despite safer cars, road traffic injuries remain the leading cause of death among people aged 5 to 29, with 1.35 million people killed in road traffic crashes each year, according to the World Health Organization (WHO).

Sensor integration

Twenty years ago, any sensors in your car were pretty simple. You had a fuel gauge that measured the level in the tank and an engine temperature gauge. Along with the speedometer, some warning lights, and maybe a tachometer, that was probably it.

Today, numerous electronic sensors help keep you safe. Cameras and image sensors, for example, have many uses, including ADAS, rear view for safe reversing and parking, and in-vehicle monitoring. This means it is important for automakers, OEMs and Tier 1s to work with suppliers offering a wide range of products, so they can choose the best sensor for each application. Sensors should also be designed for mission-critical use and be capable of operating over an extended temperature range.

Typical camera and image sensor applications around the vehicle include: blind-spot detection, backup (rear-view) camera, dash cam (car DVR), driver monitoring, lane departure warning, pedestrian/object detection, collision mitigation, adaptive cruise control, smart headlight/mirror, and night vision.

Performance is also important—sensors must have high enough resolution to capture sufficient detail for ADAS and other systems, and must provide excellent image quality to deal with darkness, inclement weather, glare, and other challenges. For example, excellent dynamic range can dramatically alter the image sent from the sensor to the ADAS system processor (see Figure 2). It is no exaggeration to say that this can be the difference between life and death, if it means the car can identify a hazard ahead sooner.
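To make the dynamic-range point concrete, a sensor’s dynamic range is often quoted in decibels as 20·log10 of the ratio between its full-well capacity and its read noise. The short sketch below uses purely illustrative numbers, not taken from any ON Semiconductor datasheet, to show how that ratio maps to dB:

```python
import math

def dynamic_range_db(full_well_electrons: float, read_noise_electrons: float) -> float:
    """Image sensor dynamic range in decibels: DR = 20 * log10(full well / read noise)."""
    return 20.0 * math.log10(full_well_electrons / read_noise_electrons)

# Illustrative values only, not taken from any datasheet.
print(f"{dynamic_range_db(10_000, 2.0):.1f} dB")   # ~74 dB, typical single-exposure sensor
print(f"{dynamic_range_db(100_000, 1.0):.1f} dB")  # ~100 dB, closer to automotive HDR territory
```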

In addition to image sensors, radar and LiDAR are essential tools in today’s cars. Radar can be used for short-, medium- and long-range applications such as evasive steering, junction assist and adaptive cruise control, looking up to 250 m ahead. For each use case, choosing the right radar transceiver ensures optimum performance.
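One reason the transceiver choice matters: for an FMCW radar, range resolution is set by the chirp bandwidth via delta_R = c / (2B). The sketch below uses generic bandwidth values for illustration, not the specification of any particular device:

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_resolution_m(chirp_bandwidth_hz: float) -> float:
    """FMCW radar range resolution: delta_R = c / (2 * B).
    A wider chirp bandwidth separates targets that are closer together in range."""
    return C / (2.0 * chirp_bandwidth_hz)

# Generic bandwidths for illustration, not tied to any specific transceiver.
for bandwidth_hz in (200e6, 1e9, 4e9):
    print(f"{bandwidth_hz / 1e6:6.0f} MHz chirp -> "
          f"{fmcw_range_resolution_m(bandwidth_hz) * 100:5.1f} cm resolution")
```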

LiDAR complements radar with photon detectors capable of generating images as well as 3D maps based on time-of-flight (ToF) measurements. This gives LiDAR high-resolution depth data, enabling object detection capabilities not possible with radar or cameras alone.
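The time-of-flight principle itself is simple: the distance to a target is the speed of light multiplied by the measured round-trip time, divided by two. A minimal sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance from a time-of-flight measurement: d = c * t / 2.
    The factor of two accounts for the pulse travelling out and back."""
    return C * round_trip_time_s / 2.0

# A return after roughly 667 ns corresponds to a target about 100 m away.
print(f"{tof_distance_m(667e-9):.1f} m")
```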

In practice, the best solution is often a combination of sensing modalities in one vehicle: imaging, radar, LiDAR and ultrasonic sensing. By combining multiple types of sensors, their strengths complement one another and redundancy is built into the system.
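The redundancy idea can be illustrated with a deliberately simplified sketch: only treat an object as confirmed when more than one independent modality reports it at roughly the same range. Real fusion stacks use probabilistic tracking rather than this kind of simple agreement check, and all the names below are hypothetical:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    modality: str    # "camera", "radar", "lidar" or "ultrasonic"
    range_m: float   # estimated distance to the object, in metres
    confidence: float

def confirm_object(detections: List[Detection],
                   min_modalities: int = 2,
                   range_tolerance_m: float = 2.0) -> bool:
    """Confirm an object only if several independent sensor modalities
    report it at roughly the same range (a toy stand-in for redundancy)."""
    if not detections:
        return False
    nearest = min(d.range_m for d in detections)
    agreeing = {d.modality for d in detections
                if abs(d.range_m - nearest) <= range_tolerance_m}
    return len(agreeing) >= min_modalities

# Camera and radar both see something ~40 m ahead, so the object is confirmed.
print(confirm_object([Detection("camera", 40.2, 0.8),
                      Detection("radar", 39.5, 0.9)]))
```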

Put into practice – 100 million times

Let’s look at an example system: SUBARU’s EyeSight driver assistance system, which uses ON Semiconductor’s 1.2-megapixel AR0132AT CMOS image sensor. The EyeSight system was first installed in SUBARU’s Levorg model in 2014 and has since been offered in Legacy, Forester, Impreza and SUBARU XV models.

EyeSight uses image sensors in its stereo camera system for safety features including adaptive cruise control, lane keep assist and sway warning, pre-collision braking and pre-collision throttle management. The system has received numerous awards, including the highest rating for Advanced Safety Vehicle Triple Plus (ASV+++) from the Japan New Car Assessment Program (JNCAP).

ADAS is now mainstream technology, not just protection for high-end cars. In fact, ON Semiconductor has shipped more than 100 million AR0132AT image sensors for driver assistance applications, including EyeSight. No other vendor appears to have reached this number (and in 2018 ON Semiconductor held an 81% market share in perception cameras for driver assistance), which shows the scale at which the technology has been adopted.

Dozing off at the wheel is another major cause of traffic accidents. Technology can help by monitoring driver performance and issuing alerts or warnings if the car appears to be driven erratically. Another option is to use a camera-based system to observe the driver directly and trigger an alert if it detects signs of fatigue, such as closed eyes or a drooping head. For example, a recent demonstration system integrates multiple image sensors, including ON Semiconductor’s AR0144AT 1-megapixel sensor, to provide images to in-vehicle systems running artificial intelligence (AI) software.
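One widely used drowsiness cue is the fraction of recent frames in which the eyes are scored as closed (a PERCLOS-style measure). The sketch below is a generic illustration of that idea, not ON Semiconductor’s or the demonstration system’s actual algorithm; the per-frame eye-openness score is assumed to come from an upstream vision model:

```python
from collections import deque

class DrowsinessMonitor:
    """Toy PERCLOS-style check: flag drowsiness when the eyes are scored as
    closed in a large fraction of recent frames. The per-frame eye-openness
    score (0 = shut, 1 = wide open) is assumed to come from a vision model."""

    def __init__(self, window_frames: int = 900,
                 closed_threshold: float = 0.2,
                 perclos_alarm: float = 0.3):
        self.window = deque(maxlen=window_frames)  # ~30 s of history at 30 fps
        self.closed_threshold = closed_threshold   # openness below this counts as "closed"
        self.perclos_alarm = perclos_alarm         # alert if >30% of frames are closed

    def update(self, eye_openness: float) -> bool:
        """Feed one frame's eye-openness score; return True when an alert is due."""
        self.window.append(eye_openness < self.closed_threshold)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history yet
        return sum(self.window) / len(self.window) > self.perclos_alarm

monitor = DrowsinessMonitor()
# In a real system, update() would be called once per camera frame.
```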

The future of safe driving

Today, 28% of traffic accidents in the United States alone could be prevented by ADAS, and ON Semiconductor’s sensors save more than 81,000 lives each year. That’s good, but it has to get better, and we will continue to improve our sensors and work with partners to make cars and roads safer. Cost also matters; traffic accidents are much more common in low-income countries, so any safety innovation should be widely affordable.

Going forward, we must also ensure that safety systems not only protect drivers and passengers, but also help reduce casualties among pedestrians, motorcyclists and cyclists. Regulations and standards are beginning to recognize this, such as the European New Car Assessment Programme (Euro NCAP 2020), and sensor technology can play an important role in protecting these vulnerable road users.

The long-term goal of the industry must be to work towards zero fatalities, even zero injuries and zero accidents. There is still some way to go, but we are working hard to make it happen as soon as possible.
