Understanding Safety & Security: Making Sense of It All

Giri Venkat, who works in image sensor technical marketing at On Semiconductor, discusses the importance of functional safety for automotive image sensors. As advanced driver-assistance system (ADAS) features such as lane keeping, adaptive cruise control, and collision-avoidance braking evolve toward full autonomy, more cameras are being integrated into production vehicles.

In nearly all ADAS setups, the image sensor is the primary sensor. As these systems transition from simply assisting drivers to fully automating driving tasks, the safe operation of the vehicle relies heavily on the reliability of the imaging subsystem. An essential element of this safety is ensuring that the image sensor functions correctly within ADAS and autonomous systems.

The ISO 26262 standard defines automotive safety integrity levels (ASILs), ranging from ASIL-A (lowest) to ASIL-D (highest). An ASIL is assigned based on three considerations: the severity of the potential harm, the probability of exposure to the driving situation in which a failure could cause that harm, and the controllability of the failure’s effects. Key performance metrics for a safety mechanism include what fraction of failures it can detect, how quickly it detects them, how efficiently it operates, and how well it limits the impact of the failure.
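
To make the classification concrete, the short Python sketch below reproduces the ISO 26262-3 risk graph, which combines severity (S1-S3), probability of exposure (E1-E4), and controllability (C1-C3) into an ASIL. The function name and the example ratings are assumptions chosen for illustration, not part of the standard or of any vendor tool.

```python
# Illustrative sketch of the ISO 26262-3 risk graph: severity (S1-S3),
# probability of exposure (E1-E4), and controllability (C1-C3) combine
# into an ASIL. The sum-based shortcut below reproduces the standard's
# lookup table; the function name and structure are just an example.

def determine_asil(severity: int, exposure: int, controllability: int) -> str:
    """Map (S, E, C) ratings to QM or ASIL-A through ASIL-D."""
    if not (1 <= severity <= 3 and 1 <= exposure <= 4 and 1 <= controllability <= 3):
        raise ValueError("expected S1-S3, E1-E4, C1-C3")
    total = severity + exposure + controllability
    # Only S3/E4/C3 reaches ASIL-D; lower-risk combinations step down to QM.
    return {10: "ASIL-D", 9: "ASIL-C", 8: "ASIL-B", 7: "ASIL-A"}.get(total, "QM")

# Hypothetical example: an undetected failure of a forward camera used for
# emergency braking on a highway might be rated S3/E4/C3.
print(determine_asil(3, 4, 3))  # -> ASIL-D
print(determine_asil(2, 2, 2))  # -> QM
```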

Image sensors are at the heart of ADAS, serving as the primary data source for the vehicle’s vision system. They capture the raw data the system needs to analyze the environment and make operational decisions. Other sensors, such as radar and lidar, can complement the cameras, but image sensors remain the central data providers. ADAS also includes components for image processing, analysis, and decision-making.

The number of image sensors in ADAS is rapidly increasing. From a single forward-facing camera, vehicles have progressed to full surround-view systems, sometimes with more than ten cameras in total. The impact of a sensor failure depends on its nature, ranging from minor to critical, and a system’s ability to detect, protect against, and correct individual sensor failures significantly affects overall safety and reliability.

A CMOS image sensor is essentially a grid of photosensitive pixels arranged in rows and columns. These pixels convert light into electrical signals, which are then converted into digital values, usually row by row. Digital logic enables the data to be stored, processed, and transmitted to other system components for further processing and analysis.
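
As a deliberately tiny illustration of that readout path, the NumPy sketch below digitizes a grid of light intensities one row at a time. The array size, bit depth, and unity-gain conversion are arbitrary assumptions, not the behavior of any particular sensor.

```python
import numpy as np

# Toy model of a CMOS image sensor readout (illustrative values only):
# photosensitive pixels produce an analog signal proportional to incident
# light, which an ADC quantizes into digital codes, one row at a time.

ROWS, COLS = 8, 12       # tiny array for illustration; real sensors have millions of pixels
BIT_DEPTH = 10           # assumed ADC resolution
FULL_SCALE = 2**BIT_DEPTH - 1

rng = np.random.default_rng(0)
incident_light = rng.uniform(0.0, 1.0, size=(ROWS, COLS))  # normalized illumination

frame = np.zeros((ROWS, COLS), dtype=np.uint16)
for row in range(ROWS):
    analog_row = incident_light[row] * FULL_SCALE                                # photodiode + gain (simplified)
    frame[row] = np.clip(np.round(analog_row), 0, FULL_SCALE).astype(np.uint16)  # row-wise ADC conversion

print(frame.shape, frame.dtype, frame.max())
```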

The data captured by image sensors in ADAS applications helps the system make decisions that influence vehicle operation. As ADAS becomes more complex, these decisions evolve from simple driver alerts to actions such as braking, acceleration, and steering, eventually leading to fully autonomous driving.

Failures in image sensors can be defined conservatively as any output that differs from a fault-free model. This definition covers errors at the pixel level or higher, such as row, column, and frame errors. Such faults can arise from issues within the device itself, in either its analog or digital functions, or from problems in transmitting data from the sensor to the rest of the system. Video faults can be static (fixed) or dynamic (changing over time), and they can vary both spatially and temporally.
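
One way to picture this definition is to compare a captured frame against a fault-free reference and classify the deviations. In the hypothetical sketch below, the threshold, the injected stuck column, and the hot pixel are all assumptions for illustration, not a description of an actual safety mechanism.

```python
import numpy as np

# Illustrative fault model: a failure is any deviation of the sensor output
# from a fault-free reference frame. Here we inject a stuck column and a hot
# pixel, then classify the deviations. Threshold and fault values are assumed.

def classify_faults(reference: np.ndarray, captured: np.ndarray, threshold: int = 4):
    diff = np.abs(captured.astype(np.int32) - reference.astype(np.int32)) > threshold
    faulty_rows = np.where(diff.all(axis=1))[0]   # every pixel in the row deviates
    faulty_cols = np.where(diff.all(axis=0))[0]   # every pixel in the column deviates
    pixel_faults = int(diff.sum())                # total number of deviating pixels
    return pixel_faults, faulty_rows, faulty_cols

reference = np.full((8, 12), 512, dtype=np.uint16)  # fault-free model of a flat test frame
captured = reference.copy()
captured[:, 5] = 0       # static column fault (e.g. broken column readout)
captured[2, 9] = 1023    # single hot pixel

pixels, rows, cols = classify_faults(reference, captured)
print(f"{pixels} deviating pixels, faulty rows {rows.tolist()}, faulty columns {cols.tolist()}")
```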

Understanding and managing these potential failures is crucial for maintaining the safety and reliability of ADAS and autonomous vehicles.
