With a greater number of lasers and detectors, more accurate 3D images can be produced, improving the resolution of the sensor image. Using billions of photons of light, LiDAR can recognize both fixed and moving objects, protecting drivers, occupants, and other road users. One solution AV developers use to overcome the challenges LiDAR faces in rainy conditions is sensor fusion: combining data from different types of sensors, such as cameras, radar, ultrasonic sensors, or GPS, to create a more complete and robust representation of the environment. Sensor fusion can compensate for the weaknesses and limitations of each individual sensor and improve the accuracy and reliability of perception and decision-making processes.
For example, radar provides better range and penetration in rain than LiDAR, while cameras provide richer information on color and texture. By merging data from LiDAR, radar, and cameras, an AV can achieve a more complete and accurate understanding of its environment in wet conditions.
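The weighting idea behind this kind of fusion can be illustrated with a minimal sketch of inverse-variance weighting, one common strategy for combining independent range estimates (it is the static special case of a Kalman-filter update). The function name, sensor readings, and variance values below are illustrative assumptions, not the API of any particular AV stack.

```python
def fuse_ranges(measurements):
    """Inverse-variance weighted fusion of independent range estimates.

    measurements: list of (range_m, variance) tuples, one per sensor.
    Returns (fused_range, fused_variance).
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(r * w for (r, _), w in zip(measurements, weights)) / total
    # The fused variance is smaller than any single sensor's variance,
    # reflecting the gain from combining independent measurements.
    return fused, 1.0 / total

# In heavy rain, LiDAR returns degrade (larger variance), so the fused
# estimate automatically leans toward the radar reading.
lidar = (10.0, 1.0)   # illustrative LiDAR range in rain (metres, variance)
radar = (10.4, 0.25)  # radar range is less affected by precipitation
fused_range, fused_var = fuse_ranges([lidar, radar])
```

With these example numbers the fused range lands much closer to the radar reading (10.32 m) because radar carries four times the weight, while the fused variance (0.2) drops below either sensor's own uncertainty.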