Technology Frontier: On-board Cameras
2016-11-25
In-vehicle cameras: The eyes of autonomous driving
In-vehicle cameras are often referred to as the “eyes of autonomous driving” and are considered core sensing devices for ADAS and autonomous-vehicle technology. These cameras use lenses and image sensors to collect visual information, enabling 360-degree perception of the surrounding environment. They compensate for radar’s limitations in object recognition, making them the sensors that most closely resemble human vision. In-vehicle cameras have a wide range of applications in the automotive industry. Initially used for recording driving footage, providing reverse-view images, and assisting with parking, their functions have gradually expanded to include behavior recognition within intelligent cockpits and ADAS-assisted driving, resulting in an increasingly diverse range of use cases.
Currently, the top three companies in the global in-vehicle camera market account for 41% of total market share, and the top ten together hold 96%, indicating a highly concentrated industry. The US Highway Loss Data Institute (HLDI) predicts that by 2030, nearly 50% of vehicles will be equipped with ADAS technology. According to ICVTank, China’s in-vehicle camera market is expected to reach 23 billion yuan by 2025, a compound annual growth rate of 30% over five years. Globally, the market is expected to grow from $11.2 billion in 2019 to $27 billion by 2025, a CAGR of 15.8% over the same period.
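As a quick sanity check on the global figures cited above, the 15.8% CAGR follows directly from the $11.2 billion (2019) and $27 billion (2025) endpoints; the sketch below just applies the standard CAGR formula to those numbers:

```python
# CAGR check on the global in-vehicle camera market figures cited in the text:
# $11.2B in 2019 growing to $27B in 2025, i.e. over 6 years.
start_bn, end_bn = 11.2, 27.0
years = 2025 - 2019

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end_bn / start_bn) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # CAGR: 15.8%
```

The same formula applied to the China figures (30% CAGR over five years ending at 23 billion yuan in 2025) implies a starting market of roughly 6 billion yuan, a base-year value the text does not state explicitly.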
Autonomous driving encompasses perception, decision-making, and execution, and perception is the foundation of the entire process and a crucial component of autonomous driving systems. During vehicle operation, the perception system uses sensors to collect real-time information about the surrounding environment, effectively acting as the “eyes” of the autonomous vehicle and enabling it to observe its surroundings much as a human driver does. The perception system is typically composed of cameras, millimeter-wave radars, and lidars (the last optional, if only to avoid arguments with Tesla enthusiasts). As primary environmental perception sensors, cameras provide 360-degree visibility and compensate for radar’s weakness in object recognition, which is why in-vehicle cameras are one of the key components in the field of autonomous driving.
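The perceive → decide → execute loop described above can be sketched as a toy pipeline. This is purely illustrative, assuming made-up names (`Detection`, `perceive`, `decide`, `execute`) and a trivially simplified fusion step in which the camera supplies object labels and radar supplies range; real ADAS stacks are far more involved:

```python
from dataclasses import dataclass

# Toy sketch of the perceive -> decide -> execute loop.
# All names here are illustrative assumptions, not any real ADAS API.

@dataclass
class Detection:
    label: str         # object class, as a camera classifier would provide
    distance_m: float  # range to the object, as radar would provide

def perceive(camera_labels, radar_ranges_m):
    # Naive "fusion": pair each camera label with a radar range reading.
    return [Detection(lbl, rng) for lbl, rng in zip(camera_labels, radar_ranges_m)]

def decide(detections, brake_distance_m=10.0):
    # Minimal policy: brake if anything is inside the braking distance.
    if any(d.distance_m < brake_distance_m for d in detections):
        return "brake"
    return "cruise"

def execute(command):
    # Stand-in for sending the command to vehicle actuators.
    print(f"actuating: {command}")

# Usage: a pedestrian 8 m ahead triggers braking; a car 40 m ahead does not.
detections = perceive(["pedestrian", "car"], [8.0, 40.0])
execute(decide(detections))  # actuating: brake
```

The division of labor in the sketch mirrors the point in the text: the camera is what identifies *what* an object is, while radar contributes the accurate distance measurement the camera alone lacks.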