LIDAR, which stands for Light Detection and Ranging, is a remote sensing technology that measures distances by illuminating targets with laser light and analyzing the reflected pulses. Often described as “radar using light,” LIDAR creates high-resolution 3D maps of environments in real time. The concept was first introduced in the 1960s, following the invention of the laser, and it gained significant attention in the 1970s during NASA’s Apollo 15 mission, which utilized a laser altimeter to map the Moon’s surface. Today, miniaturized and affordable LIDAR sensors are used in a variety of applications, ranging from autonomous vehicles to consumer smartphones.
How LIDAR Works
A LIDAR system consists of three main components: a laser emitter, a scanner, and a detector. The emitter sends out rapid pulses of near-infrared laser light, typically at wavelengths of 905 nm or 1550 nm, which are invisible to the human eye. These pulses travel at the speed of light, approximately 300,000 km/s, until they hit an object and bounce back; the detector times each round trip, and the distance to the target is simply half the round-trip time multiplied by the speed of light.
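The round-trip timing above reduces to a one-line calculation; a minimal sketch (function name and example timing are illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_s: float) -> float:
    """Distance to target: the pulse travels out and back, so halve the path."""
    return C * round_trip_s / 2.0

# A pulse returning after ~667 nanoseconds hit something roughly 100 m away.
print(range_from_tof(667e-9))
```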
Scanners direct the laser beam using rotating mirrors, micro-electromechanical systems (MEMS), or solid-state phased arrays. Flash LIDAR skips the scanning process entirely, illuminating the whole scene at once with a diffused laser pulse, similar to a camera flash. The resolution of a LIDAR system depends on the pulse frequency, scanner precision, and detector sensitivity, with top systems achieving millimeter accuracy at distances of up to 200 meters.
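Scanner precision matters because the angular step between pulses sets how far apart neighboring points land at range; a hedged sketch of that relationship (the 0.1° step is a hypothetical value):

```python
import math

def point_spacing(range_m: float, angular_step_deg: float) -> float:
    """Lateral distance between adjacent beam samples at a given range
    (small-angle approximation: arc length = radius * angle)."""
    return range_m * math.radians(angular_step_deg)

# A 0.1° angular step leaves ~35 cm between neighboring points at 200 m.
print(point_spacing(200.0, 0.1))
```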
LIDAR in Autonomous Vehicles

Most self-driving car programs depend on LIDAR technology for perception and localization. Companies like Waymo and Cruise have installed rooftop “spinning bucket” LIDAR units that rotate 10 to 30 times per second, generating 1 to 2 million data points every second. (Tesla is a notable holdout, relying on cameras instead.) These units create a 360-degree 3D model that is updated 10 to 20 times per second.
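Those point-rate and rotation figures determine the angular detail of each sweep; a quick sketch (the 64-channel configuration is a hypothetical example, not a specific product):

```python
def horizontal_resolution_deg(points_per_s: float, channels: int,
                              rev_per_s: float) -> float:
    """Angular spacing between successive fires of one laser channel
    within a single 360-degree sweep."""
    points_per_channel_per_rev = points_per_s / channels / rev_per_s
    return 360.0 / points_per_channel_per_rev

# A hypothetical 64-channel unit: 1.5 M points/s at 10 Hz gives ~0.15° spacing.
print(horizontal_resolution_deg(1_500_000, 64, 10))
```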
LIDAR outperforms cameras and radar in several key areas:
- Low light: Unlike cameras, LIDAR can function effectively in complete darkness.
- Direct depth measurement: While radar provides a rough estimate of range, LIDAR offers precise geometric data.
- Object classification: The point clouds generated by LIDAR reveal shapes that help differentiate a pedestrian from an object like a trash bag.
In sensor fusion, LIDAR data is combined with camera images (for color and texture) and radar data (for measuring velocity in rain or fog). The vehicle’s computer uses this integrated information to predict trajectories and plan safe paths. For instance, Waymo’s fifth-generation LIDAR can detect a child’s ball rolling into the street from 100 meters away, prompting the car to brake before cameras confirm the presence of the child.
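A basic building block of this fusion is projecting each 3D LIDAR point into the camera image so depth can be attached to pixels. A minimal pinhole-camera sketch, with hypothetical intrinsics (fx, fy, cx, cy) and assuming the point is already in the camera frame:

```python
def project_to_image(point_xyz, fx, fy, cx, cy):
    """Project a 3D point (camera frame, z pointing forward) onto the
    image plane using the pinhole model: u = fx*x/z + cx, v = fy*y/z + cy."""
    x, y, z = point_xyz
    if z <= 0:
        return None  # point is behind the camera
    return (fx * x / z + cx, fy * y / z + cy)

# Hypothetical 640x480 camera: a point 2 m ahead and 0.5 m to the right
# lands at pixel (445, 240).
print(project_to_image((0.5, 0.0, 2.0), fx=500, fy=500, cx=320, cy=240))
```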
LIDAR in iPhones and Consumer Devices
Apple introduced LIDAR to the masses with the iPad Pro (2020) and iPhone 12 Pro. The rear “LiDAR Scanner” is a vertical-cavity surface-emitting laser (VCSEL) array paired with a single-photon avalanche diode (SPAD) receiver. It projects a sparse grid of a few hundred infrared dots and measures time of flight (ToF) up to 5 meters with ~1 cm accuracy.
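That ~1 cm accuracy figure implies remarkably fine timing, which is why SPAD receivers are used: resolving a centimeter of range means resolving tens of picoseconds of round-trip time. A quick back-of-the-envelope check:

```python
C = 299_792_458.0  # speed of light, m/s

def timing_precision_for(accuracy_m: float) -> float:
    """Round-trip timing precision (seconds) needed for a given range
    accuracy: the pulse covers the error distance twice (out and back)."""
    return 2.0 * accuracy_m / C

# ~1 cm of range accuracy requires resolving about 67 picoseconds.
print(timing_precision_for(0.01) * 1e12, "ps")
```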
Unlike automotive LIDAR, the iPhone’s version prioritizes augmented reality (AR) and photography:
- Instant scene understanding: AR apps place virtual objects on real surfaces without “drift.”
- Low-light portrait mode: Depth maps enable precise background blur even in darkness.
- 3D scanning: Apps like Polycam capture room models in seconds.
The sensor consumes minimal power (~100 mW) and fits within a 2×3 mm module. Apple’s ARKit framework processes point clouds on the A-series Neural Engine, achieving 30–60 fps depth maps.
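Turning a per-pixel depth map into a point cloud is the inverse of the pinhole projection; a minimal sketch of that back-projection (the toy depth map and intrinsics are illustrative, not ARKit’s actual API):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) into an N x 3 point cloud
    by inverting the pinhole model: x = (u - cx) * z / fx, etc."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]          # per-pixel row (v) and column (u)
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy 2x2 depth map of a flat wall 1 m away, with hypothetical intrinsics.
pts = depth_to_points(np.full((2, 2), 1.0), fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(pts.shape)  # (4, 3)
```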
Beyond iPhones, LIDAR is used in drones (DJI Mavic 3 Enterprise), robotics (Boston Dynamics Spot), and even robot vacuum cleaners, many of which carry a small spinning LIDAR turret for SLAM (Simultaneous Localization and Mapping).
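The mapping half of SLAM can be illustrated with a tiny occupancy-grid sketch: each LIDAR return is converted from range-and-angle into world coordinates and marks a grid cell as occupied (names and numbers here are illustrative, and real SLAM also estimates the pose itself):

```python
import math

def mark_hits(grid, pose_xy, ranges, angles, cell_size=0.1):
    """Mark grid cells where LIDAR returns land, given a known sensor pose.
    grid is a dict mapping (ix, iy) cell indices to occupancy."""
    px, py = pose_xy
    for r, a in zip(ranges, angles):
        x = px + r * math.cos(a)   # polar -> Cartesian, in world frame
        y = py + r * math.sin(a)
        cell = (int(round(x / cell_size)), int(round(y / cell_size)))
        grid[cell] = 1             # occupied
    return grid

# Two returns: 1 m straight ahead and 2 m to the left, from the origin.
grid = mark_hits({}, pose_xy=(0.0, 0.0),
                 ranges=[1.0, 2.0], angles=[0.0, math.pi / 2])
print(sorted(grid))  # [(0, 20), (10, 0)]
```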
Limitations and the Future
LIDAR struggles in heavy rain, fog, or smoke, as droplets scatter laser pulses. Direct sunlight can saturate detectors. And while costs have plummeted, high-resolution units remain pricier than cameras.
The industry is shifting toward solid-state LIDAR (no moving parts) and frequency-modulated continuous wave (FMCW) systems that measure velocity directly, as radar does. Startups like Aeva and Aurora are integrating 4D LIDAR (range plus velocity) onto a single chip.
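In an FMCW system, range falls out of the beat frequency between the outgoing chirp and its echo, while radial velocity falls out of the Doppler shift. A hedged sketch of both relationships (the chirp slope and frequencies below are hypothetical example values):

```python
C = 299_792_458.0      # speed of light, m/s
WAVELENGTH = 1.55e-6   # 1550 nm, a typical FMCW laser wavelength

def range_from_beat(f_beat_hz: float, chirp_slope_hz_per_s: float) -> float:
    """Range from the beat frequency of a linear chirp: f_beat = 2*R*slope/c."""
    return f_beat_hz * C / (2.0 * chirp_slope_hz_per_s)

def velocity_from_doppler(f_doppler_hz: float) -> float:
    """Radial velocity from the Doppler shift: f_d = 2*v/wavelength."""
    return f_doppler_hz * WAVELENGTH / 2.0

# With a hypothetical 1 THz/ms (1e12 Hz/s) chirp slope, a 667 kHz beat
# corresponds to ~100 m; a 12.9 MHz Doppler shift corresponds to ~10 m/s.
print(range_from_beat(667e3, 1e12))
print(velocity_from_doppler(12.9e6))
```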
In consumer tech, expect LIDAR in AR glasses (Meta Orion prototype) and home robots. By 2030, LIDAR may become as universal as GPS, quietly mapping our world, one laser pulse at a time.

