Lidar (light detection and ranging) technology uses laser light to measure distances. The reflections produced when laser pulses bounce off objects yield precise information about the surrounding area. While these weak laser pulses help autonomous vehicles judge distance and detect obstacles, lidar lasers can potentially damage human eyes and certain kinds of cameras.
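The distance measurement itself rests on timing: a pulsed lidar clocks how long a laser pulse takes to bounce back and converts that round-trip time into a distance. The short Python sketch below illustrates the standard time-of-flight relation; the 610-nanosecond round trip is an illustrative example, not a figure from the article or from any particular sensor.

```python
# Minimal sketch of pulsed-lidar ranging: distance = speed of light * round-trip time / 2.
# The 610-nanosecond round trip below is an illustrative example only.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, in meters."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

if __name__ == "__main__":
    # A pulse returning after about 610 ns corresponds to roughly 91 m (about 300 feet).
    print(f"{distance_from_round_trip(610e-9):.1f} m")
```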
Lidar is used for a wide range of purposes. The
National Oceanic and Atmospheric Administration uses it to map out Earth's elevation and depth, while some autonomous vehicle manufacturers use it to make self-driving cars "see" buildings, pedestrians and other things on the road.
Jim Park wrote
a March 11 piece for TruckingInfo.com examining the safety of lasers used in lidar scanners. He noted that lidars used in self-driving light vehicles typically emit laser light at a wavelength of 905 nanometers. Because that light can cause eye damage at certain intensities, authorities limit the lidar's power output, which restricts its effective range to about 200 to 300 feet.
A range of 200 to 300 feet is adequate for city driving at 32 to 48 kilometers per hour (20 to 30 miles per hour), but not for highway speeds. Ranges of 600 to 1,000 feet are needed on a highway to identify hazards in time and provide a safe stopping distance. Because of this limitation, some lidar companies have turned to 1,550-nanometer lasers, said to be 40 times more powerful and capable of reaching up to 1,000 feet. This new laser light is also safer for human eyes.
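To put those range figures in perspective, the rough Python sketch below converts detection range and speed into the time a vehicle has before it reaches an object. The detection ranges and the 20 to 30 mph city speeds come from the article; the 65 mph highway speed is an assumed example, and this is a simple time-to-cover calculation, not a full stopping-distance model.

```python
# Rough illustration of why 200-300 feet of range is tight at highway speed.
# The 65 mph highway speed is an assumed example; braking distance, reaction
# time and road conditions are ignored.

def seconds_to_cover(range_feet: float, speed_mph: float) -> float:
    """Time for a vehicle at the given speed to travel the lidar's detection range."""
    speed_feet_per_second = speed_mph * 5280 / 3600
    return range_feet / speed_feet_per_second

print(f"{seconds_to_cover(300, 30):.1f} s")    # ~6.8 s of warning in city driving
print(f"{seconds_to_cover(300, 65):.1f} s")    # ~3.1 s at an assumed 65 mph
print(f"{seconds_to_cover(1000, 65):.1f} s")   # ~10.5 s with a 1,000-foot lidar
```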
Laser expert Jeff Hecht told
TruckingInfo.com that lidar pulses at 1,550 nanometers cannot penetrate the human eye deeply enough to reach the retina, where they could cause blindness. "The ocular fluid [inside the eye] is largely clear at visible wavelengths and out, to about 1,300 or 1,400 nanometers. [But it] becomes nearly opaque … at 1,550 nanometers," he remarked. Hecht continued: "On the other hand, 905-nanometer light does reach the retina and could cause eye damage." (Related:
Behold the Cytophone: Researchers unveil a new kind of laser that can find and ZAP cancer cells.)
Lidar can cause significant damage to cameras
While Hecht surmised that 1,550-nanometer lidar is safe for human eyes, an incident in January 2019 showed that it can cause significant damage to certain kinds of cameras. During the CES 2019 trade show in Las Vegas, an attendee claimed that a lidar sensor attached to a vehicle damaged his nearly $2,000 camera.
Ars Technica reported that autonomous vehicle engineer Jit Ray Chowdhury took pictures of a
car equipped with a sensor developed by lidar startup AEye. Unfortunately, all subsequent pictures he took with the camera had two spots in the same area. "I noticed that all my pictures were having that spot," Chowdhury remarked. He continued: "I covered up the camera with the lens cap and the spots are there – [they are] burned into the sensor." The engineer, who works at mobility startup Ridecell, said AEye has offered to buy him a new camera.
A BBC report later said Chowdhury was satisfied with AEye's response, but he noted that a warning should have been clearly placed near the vehicle. "Lidar companies should test how camera-safe they are," the engineer commented. According to Chowdhury, other lidar systems he tested and took pictures of at close range did not damage his camera.
In an email to the technology news website, AEye Founder and CEO Luis Dussan acknowledged that the company's lidars can damage camera sensors. He explained: "Cameras are up to 1,000 times more sensitive to lasers than eyeballs." Dussan nevertheless insisted that lidars pose no danger to human eyes. (Related:
New laser-based system can locate small methane leaks in an area of several square miles.)
The International Laser Display Association agreed with Dussan's remarks on the possible effect of lasers on certain cameras. It warned: "Camera sensors in general are known to be more susceptible to damage than the human eye." The group continued that "[the] extent of damage can vary widely, depending on [the] distance from the source, beam direction and power."
Several factors play a role in how lidar affects nearby cameras, such as the intensity and duration of the laser pulse. While shorter pulses can measure distance more accurately, they require a higher peak intensity, which can negatively affect camera sensors.
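As a rough illustration of that trade-off, the Python sketch below uses the standard range-resolution relation (speed of light times pulse duration, divided by two) and assumes a fixed pulse energy. The 1-microjoule energy and the pulse durations are illustrative assumptions, not figures from the article.

```python
# Sketch of the pulse-duration trade-off: a shorter pulse gives finer range
# resolution (c * tau / 2) but, for the same pulse energy, a higher peak power.
# The 1-microjoule pulse energy and the durations below are illustrative assumptions.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def range_resolution_m(pulse_seconds: float) -> float:
    """Smallest range difference two targets can have and still be told apart."""
    return SPEED_OF_LIGHT_M_PER_S * pulse_seconds / 2

def peak_power_w(pulse_energy_joules: float, pulse_seconds: float) -> float:
    """Peak power needed to deliver the given energy within one pulse."""
    return pulse_energy_joules / pulse_seconds

for tau in (10e-9, 5e-9, 1e-9):  # 10 ns, 5 ns and 1 ns pulses
    print(f"{tau * 1e9:.0f} ns pulse: resolution ≈ {range_resolution_m(tau):.2f} m, "
          f"peak power ≈ {peak_power_w(1e-6, tau):.0f} W for a 1 µJ pulse")
```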
Lidar safety has been a point of contention ever since the CES incident involving Chowdhury, with observers still debating whether it was a fluke or a sign of a real problem with such systems. Hecht also pointed out that cheaper lidar units could leak some light in the lower 900-nanometer band, where it can damage cameras and permanently blind people.
Visit
FutureScienceNews.com for more news about new technologies and their potential risks.
Sources include:
CleanTechnica.com
OceanService.NOAA.gov
TruckingInfo.com
ArsTechnica.com
BBC.com