Sensor & Processor Technology for Autonomous Vehicles and Automated Driving Systems

February 21, 2024

Autonomous vehicles and autonomous driving have the potential to fundamentally change the transportation industry. Some of the immediate benefits include an increase in the overall safety of vehicles, more mobility for many different population segments like the elderly, and more efficient roadways with less traffic. Beyond safety and increased efficiency, the industry will ultimately have tremendous financial implications as well. A 2023 McKinsey & Company report notes that “Passenger car advanced driver-assistance systems and autonomous-driving systems could create $300 billion to $400 billion in revenues by 2035.”

Even short of the ultimate goal of a fully autonomous vehicle, the work of developing that technology improves the components that make up Advanced Driver Assistance Systems (ADAS). The movement toward autonomous vehicles is best viewed as a series of incremental steps, rather than one or two giant leaps.

SAE Levels of Driving Automation

SAE International has developed a six-level classification system for driving automation. The classification system, which has been adopted by the U.S. National Highway Traffic Safety Administration (NHTSA), ranges from level 0 (no automation at all) to level 5 (fully automated driving).

Sensor Systems for Achieving Automation

To reach level 5 driving automation, vehicles must be able to accurately observe their immediate environment in order to navigate safely. To do so, autonomous vehicles gather data from a wide variety of sensors to generate a model of the local environment, mapping the road, traffic, traffic controls, and other observable objects, along with their relative motion. Some of the most common sensor systems rely on lidar, radar, cameras, GPS, and inertial navigation.


Lidar is a type of remote sensing technology that uses pulsed laser light to measure distances and to create detailed three-dimensional images of terrain or objects. Beyond general navigation and pathing functionality, lidar systems allow the detection and tracking of pedestrians, obstacles, and other vehicles, ensuring safe movement in complex environments. Lidar is often paired with other sensor technologies, such as cameras and radar, to provide a comprehensive and robust system for autonomous vehicles.
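The core lidar measurement is simple time-of-flight geometry: distance is the round-trip time of a laser pulse multiplied by the speed of light, divided by two. A minimal sketch (the function name and example timing are illustrative, not from any particular lidar API):

```python
# Minimal sketch: converting a lidar pulse's round-trip time to distance.
# The factor of 2 accounts for the pulse traveling to the target and back.

SPEED_OF_LIGHT_M_S = 299_792_458  # m/s

def lidar_distance(round_trip_time_s: float) -> float:
    """Return target distance in meters from a pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A pulse that returns after 200 nanoseconds indicates a target ~30 m away.
print(round(lidar_distance(200e-9), 2))
```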


Radar systems emit radio waves that bounce off objects or terrain and return to the radar antenna. Measuring the time it takes for the radio waves to return to the sensor provides information about the speed and location of the targeted object or terrain. In principle, lidar and radar operate similarly, with one using light and the other using radio waves. However, the differences between the characteristics of light and radio waves lead to different applications. Light has significantly shorter wavelengths than radio waves, making lidar much better at identifying small objects and capturing fine details. Radar, on the other hand, has a significantly longer range than lidar and can penetrate most weather conditions. For autonomous vehicles, radar sensors are best thought of as supplemental devices, providing data in low-visibility environments like poor weather or night driving.
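Radar measures speed as well as range: a moving target Doppler-shifts the returned signal, and the shift maps to radial speed via v = f_d · c / (2 · f_carrier). A minimal sketch, using the 77 GHz carrier common to automotive radar (function name and the example shift value are illustrative):

```python
# Minimal sketch: estimating a target's radial speed from the Doppler
# shift of a returned radar signal (v = f_d * c / (2 * f_carrier)).

SPEED_OF_LIGHT_M_S = 299_792_458  # m/s

def radial_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Return target radial speed in m/s from the measured Doppler shift."""
    return doppler_shift_hz * SPEED_OF_LIGHT_M_S / (2 * carrier_hz)

# A 10 kHz shift at 77 GHz corresponds to roughly 19.5 m/s closing speed.
print(round(radial_speed(10_000), 2))
```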


Autonomous vehicles also rely on video cameras. Whether a design uses one camera or several, most autonomous vehicles maintain a continuous 360° view of the immediate vicinity. These cameras, coupled with the processors and software of the overall system, act as a machine vision system. Simply put, machine vision encompasses all applications in which machines automatically capture, process, and interpret visual information from the world around them to make decisions.
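The capture-process-interpret loop can be illustrated with a toy example: threshold a small grayscale frame (0-255 pixel values) and decide whether a bright region, such as a headlight or reflective sign, is present. The frame, threshold, and region size below are invented for illustration; real pipelines use far more sophisticated detection:

```python
# Toy sketch of a machine-vision decision: threshold a grayscale frame
# and report whether enough bright pixels form a region of interest.

def detect_bright_region(frame, threshold=200, min_pixels=3):
    """Return True if at least min_pixels exceed the brightness threshold."""
    bright = sum(1 for row in frame for px in row if px > threshold)
    return bright >= min_pixels

toy_frame = [
    [20,  30,  25, 28],
    [22, 240, 250, 26],
    [21, 245, 255, 24],
    [19,  27,  23, 22],
]
print(detect_bright_region(toy_frame))  # four pixels exceed 200 -> True
```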

The three sensor types above are far and away the most used in current autonomous vehicle designs and testing. That said, there are a few additional sensor types that should be mentioned.

GPS, Inertial Navigation, and Other Sensors

Several inertial navigation devices are used in autonomous vehicle design. These devices include accelerometers to track motion, gyroscopes to track rotation, and magnetometers to track the orientation of the vehicle. Inertial navigation sensors and devices can be used to calculate a vehicle’s location, velocity, and orientation without the need for any external references.
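The "no external references" property comes from dead reckoning: integrating accelerometer readings over time yields velocity, and integrating again yields position. A one-dimensional sketch (sample rate and acceleration values are invented for illustration):

```python
# Minimal sketch of inertial dead reckoning in one dimension: integrate
# accelerometer samples to estimate velocity and position with no
# external reference, assuming the vehicle starts at rest.

def dead_reckon(accel_samples, dt):
    """Integrate acceleration (m/s^2) sampled every dt seconds.

    Returns (velocity in m/s, position in m).
    """
    velocity = position = 0.0
    for a in accel_samples:
        velocity += a * dt         # v = v0 + a*dt
        position += velocity * dt  # x = x0 + v*dt
    return velocity, position

# Ten samples of 2 m/s^2 at 10 Hz: one second of steady acceleration.
v, x = dead_reckon([2.0] * 10, dt=0.1)
print(round(v, 2), round(x, 2))
```

In practice, small sensor errors accumulate with each integration step, which is one reason inertial data is fused with GPS and other sensors rather than trusted alone.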

Some autonomous vehicles also utilize GPS to precisely determine location. While GPS, in theory, could be used to perform other driving and sensing functions, it is mostly used for route planning and navigation at this point.

Beyond everything listed above, some autonomous vehicles include a variety of different environmental sensors. These come in all shapes and sizes, from humidity sensors to thermometers, to vibration sensors, to microphones. Ultimately, the goal of these auxiliary sensors is to provide a fuller, more robust understanding of the current road conditions.

Sensor Fusion and Processing

Many autonomous vehicle sensors, and even whole sensor systems, have overlapping functionality. By design, this approach provides redundancy, meaning that even if one or more sensors fail, the vehicle can still operate safely. Combining real-time data from a variety of sensors and sensor types through sensor fusion also reduces the uncertainty of the data set as a whole.

Much like the brain processes real-time sensory data and information, the autonomous vehicle must be able to make sense of a constant flow of data from a wide variety of sensors. Although there isn't a standard configuration, generally all of an autonomous vehicle's sensors send data directly to a central computer. This computer combines and filters all the sensor data to make driving decisions. Rather than relying on one type of sensor data, sensor fusion provides the computer with critical data from multiple sources, which greatly improves both reliability and redundancy.
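One common fusion idea is an inverse-variance weighted average: independent estimates of the same quantity are combined so that noisier sensors count for less, and the fused estimate has lower variance than any single input. A minimal sketch (the specific distance and variance values are invented for illustration, not measured sensor characteristics):

```python
# Minimal sketch of sensor fusion via inverse-variance weighting:
# noisier estimates (higher variance) receive lower weight, and the
# fused variance is smaller than any individual sensor's variance.

def fuse(estimates):
    """Fuse (value, variance) pairs; return (fused value, fused variance)."""
    weights = [1.0 / var for _, var in estimates]
    fused_value = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Lidar reports 50.0 m with low noise; radar reports 52.0 m with more noise.
value, variance = fuse([(50.0, 0.1), (52.0, 0.4)])
print(round(value, 2), round(variance, 3))
```

The same weighting appears in the update step of a Kalman filter, which is a standard tool for fusing sensor streams over time.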

One of the most important things to consider when developing any product for ADAS applications is I/O. That is to say: how do all the disparate sensors communicate with each other and with the central processor? As more sensors and peripheral devices are added to ADAS platforms, it is crucial that the system can handle and manage the traffic from the new devices. Recently, Sealevel partnered with a leading provider of embedded computing solutions for ADAS technology to do just that. The ADAS technology provider needed to integrate a large number of peripherals, but did not have enough connection points. After evaluating the platform, Sealevel recommended the Embedded Rugged SuperSpeed 7-Port USB 3.1 Hub, which the ADAS provider ultimately standardized into their products.