LIDAR Helps Self-Driving Cars Run Well Even In Bad Weather

A LiDAR technology has been developed that allows self-driving cars to operate reliably even in bad weather. The National Research Foundation of Korea announced on the 28th that a research team led by Professor Kim Chang-seok of the Department of Optical and Mechatronics Engineering at Pusan National University developed a LiDAR technology that can image its surroundings even in bad weather, through joint industry-academic research with the electromagnetic energy materials research team at Hyundai Motor Company's Basic Materials Research Center.

LiDAR, called the 'eyes of self-driving cars,' is a device that emits laser pulses, receives the light reflected from surrounding objects, measures the distance to those objects, and accurately depicts the surroundings. It is considered a key technology that must be improved before self-driving cars can be commercialized in earnest.
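In essence, each laser return yields one point: a measured range along a known beam direction. As a minimal sketch of how such returns build a picture of the surroundings, the snippet below converts one return into a 3D point; the coordinate convention and function name are illustrative assumptions, not details from the article:

```python
import math

def lidar_point_to_xyz(range_m, azimuth_rad, elevation_rad):
    """Convert one LiDAR return (range + beam angles) into a 3D point.

    Assumes a simple spherical-to-Cartesian convention: azimuth measured
    in the horizontal plane, elevation measured up from that plane.
    """
    horizontal = range_m * math.cos(elevation_rad)
    x = horizontal * math.cos(azimuth_rad)   # forward
    y = horizontal * math.sin(azimuth_rad)   # left
    z = range_m * math.sin(elevation_rad)    # up
    return x, y, z

# Example: an object 25 m away, 10 degrees to the left, 2 degrees up.
print(lidar_point_to_xyz(25.0, math.radians(10), math.radians(2)))
```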

Most existing self-driving car LiDARs are ToF (time-of-flight) LiDARs. The ToF method builds an image by firing laser pulses and measuring the round-trip time of the light reflected back from surrounding objects. The problem is that this type of LiDAR is sensitive to sunlight and suffers severe interference between vehicles, and in fog, snow, or rain it cannot produce clear images.
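The ToF relation itself is simple: the distance is the speed of light times the round-trip time, divided by two. A minimal sketch (the function and example timing are illustrative, not taken from any specific sensor):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s):
    """Distance from a ToF LiDAR pulse: the light travels out and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a pulse that returns after about 200 nanoseconds
# corresponds to an object roughly 30 m away.
print(tof_distance(200e-9))  # ~29.98 m
```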

The LiDAR developed by the research team is an FMCW (frequency-modulated continuous-wave) LiDAR. The FMCW method continuously modulates the frequency of the emitted laser and measures the waveform that returns. In particular, this LiDAR incorporates color modulation technology, a technique that fires the laser while changing its wavelength across various colors such as red, green, and blue.
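To make the FMCW idea concrete, the beat frequency between the outgoing frequency sweep and the delayed echo is proportional to range. The sketch below assumes a linear chirp; the bandwidth, chirp duration, and beat-frequency values are illustrative assumptions, not specifications of the research team's device:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def fmcw_range(beat_frequency_hz, chirp_bandwidth_hz, chirp_duration_s):
    """Range from an FMCW beat frequency.

    The laser frequency is swept by chirp_bandwidth_hz over chirp_duration_s.
    The echo is delayed by 2R/c, so mixing it with the outgoing chirp yields
    a beat frequency f_b = (B/T) * (2R/c), i.e. R = c * T * f_b / (2B).
    """
    slope = chirp_bandwidth_hz / chirp_duration_s  # Hz per second
    delay = beat_frequency_hz / slope              # round-trip time
    return SPEED_OF_LIGHT * delay / 2.0

# Example: a 1 GHz sweep over 10 microseconds and a 20 MHz beat
# correspond to a target roughly 30 m away.
print(fmcw_range(20e6, 1e9, 10e-6))  # ~29.98 m
```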

Professor Kim explained, "When lasers of various wavelengths are fired, the different wavelengths can take turns capturing images that a single wavelength failed to recognize, so it provides a good picture of the surroundings even in bad weather."
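One way to read this is as a simple per-direction fusion rule: if one wavelength's return is washed out by fog or rain, another wavelength's return is used instead. The sketch below is a loose illustration of that idea, not the team's actual processing pipeline:

```python
def fuse_multiwavelength_returns(returns_by_color):
    """Combine per-wavelength LiDAR returns for one scan direction.

    returns_by_color maps a color name (e.g. 'red', 'green', 'blue') to a
    measured range in meters, or None if that wavelength got no valid echo.
    This simple sketch keeps the first valid return, so the wavelengths
    effectively 'take turns' covering what another wavelength missed.
    """
    for color, range_m in returns_by_color.items():
        if range_m is not None:
            return color, range_m
    return None, None  # no wavelength saw the target

# Example: fog washed out the red channel, but green still got an echo.
print(fuse_multiwavelength_returns({"red": None, "green": 27.4, "blue": 27.6}))
```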

In addition, this LiDAR scans by steering the laser in two dimensions, up-down and left-right, while simultaneously measuring three-dimensional distance information and one-dimensional speed information for the target object. The information needed for self-driving is rendered as real-time images across four dimensions combining space and time.
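The simultaneous distance-plus-speed measurement is a characteristic strength of FMCW sensing: with an up-chirp and a down-chirp, the range-induced beat and the Doppler shift of a moving target can be separated. A minimal sketch under common textbook conventions (all parameter values are illustrative, and sign conventions vary by setup):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def fmcw_range_and_velocity(f_up_hz, f_down_hz,
                            chirp_bandwidth_hz, chirp_duration_s,
                            wavelength_m):
    """Recover range and radial speed from triangular-chirp FMCW beats.

    For a radially moving target, the up-chirp and down-chirp beat
    frequencies split as f_up = f_range - f_doppler and
    f_down = f_range + f_doppler (sign conventions vary by setup), so:
      f_range   = (f_up + f_down) / 2  -> range
      f_doppler = (f_down - f_up) / 2  -> speed = f_doppler * wavelength / 2
    """
    f_range = (f_up_hz + f_down_hz) / 2.0
    f_doppler = (f_down_hz - f_up_hz) / 2.0
    slope = chirp_bandwidth_hz / chirp_duration_s
    range_m = SPEED_OF_LIGHT * f_range / (2.0 * slope)
    speed_mps = f_doppler * wavelength_m / 2.0
    return range_m, speed_mps

# Example: 1 GHz sweep over 10 us, 1550 nm laser, beats of 19 and 21 MHz.
print(fmcw_range_and_velocity(19e6, 21e6, 1e9, 10e-6, 1550e-9))
```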

Professor Kim said, "This technology overcomes the limitations of existing autonomous driving, which has been restricted to road demonstrations in limited environments such as clear weather and solo driving."
