Abstract
Lidar has become a cornerstone sensing modality for 3D vision, especially for large outdoor scenarios and autonomous driving. Conventional lidar sensors are capable of providing centimeter-accurate distance information by emitting laser pulses into a scene and measuring the time-of-flight (ToF) of the reflection. However, the polarization of the received light, which depends on the surface orientation and material properties, is usually not considered. As such, the polarization modality has the potential to improve scene reconstruction beyond distance measurements. In this work, we introduce a novel long-range polarization wavefront lidar sensor (PolLidar) that modulates the polarization of the emitted and received light. Departing from conventional lidar sensors, PolLidar allows access to the raw time-resolved polarimetric wavefronts. We leverage polarimetric wavefronts to estimate normals, distance, and material properties in outdoor scenarios with a novel learned reconstruction method. To train and evaluate the method, we introduce a simulated and real-world long-range dataset with paired raw lidar data, ground-truth distance, and normal maps. We find that the proposed method improves normal and distance reconstruction by 53% in mean angular error and 41% in mean absolute error compared to existing shape-from-polarization (SfP) and ToF methods. Code and data are open-sourced at https://light.princeton.edu/pollidar/.
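The abstract refers to two conventional quantities that PolLidar builds on: ToF ranging from a time-resolved return, and the linear-polarization descriptors used by shape-from-polarization baselines. The sketch below is only a minimal illustration of those textbook quantities, not the paper's learned reconstruction method; the function names, the single-peak waveform, and the four-analyzer measurement setup are illustrative assumptions.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s


def tof_distance(waveform, bin_width_s):
    """Conventional ToF ranging: distance = c * t_peak / 2.

    `waveform` is a time-resolved return histogram; this baseline only
    uses the strongest peak, unlike the paper's learned method, which
    operates on the full polarimetric wavefront.
    """
    t_peak = np.argmax(waveform) * bin_width_s  # time of strongest return
    return C * t_peak / 2.0


def linear_polarization_features(i0, i45, i90, i135):
    """Degree and angle of linear polarization from four analyzer intensities.

    SfP methods relate these descriptors to surface orientation; how
    PolLidar maps its raw wavefronts to normals is learned and not shown here.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity (Stokes S0)
    s1 = i0 - i90                       # Stokes S1
    s2 = i45 - i135                     # Stokes S2
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)  # degree of linear polarization
    aolp = 0.5 * np.arctan2(s2, s1)     # angle of linear polarization
    return dolp, aolp


# Example: a synthetic return peaking in time bin 400 with 1 ns bins
waveform = np.zeros(1024)
waveform[400] = 1.0
print(tof_distance(waveform, 1e-9))  # ~59.96 m
```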
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 21241-21250 |
| Number of pages | 10 |
| Journal | Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition |
| DOIs | |
| State | Published - 2024 |
| Event | 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024 - Seattle, United States. Duration: Jun 16 2024 → Jun 22 2024 |
All Science Journal Classification (ASJC) codes
- Software
- Computer Vision and Pattern Recognition
Keywords
- 3D Scene Reconstruction
- Lidar
- Polarization