TY - JOUR
T1 - Learned hardware-in-the-loop phase retrieval for holographic near-eye displays
AU - Chakravarthula, Praneeth
AU - Tseng, Ethan
AU - Srivastava, Tarun
AU - Fuchs, Henry
AU - Heide, Felix
N1 - Publisher Copyright:
© 2020 Copyright held by the owner/author(s). Publication rights licensed to ACM.
PY - 2020/11/26
Y1 - 2020/11/26
N2 - Holography is arguably the most promising technology to provide wide field-of-view, compact, eyeglasses-style near-eye displays for augmented and virtual reality. However, the image quality of existing holographic displays is far from that of current-generation conventional displays, effectively making today's holographic display systems impractical. This gap stems predominantly from the severe deviations of the "unknown" light transport in a real holographic display from the idealized approximations used for computing holograms. In this work, we depart from such approximate "ideal" coherent light transport models for computing holograms. Instead, we learn the deviations of the real display from the ideal light transport from images measured using a display-camera hardware system. After this unknown light propagation is learned, we use it to compensate for severe aberrations in real holographic imagery. The proposed hardware-in-the-loop approach is robust to spatial, temporal, and hardware deviations, and improves the image quality of existing methods both qualitatively and quantitatively, in SNR and perceptual quality. We validate our approach on a holographic display prototype and show that the method can fully compensate for unknown aberrations and for erroneous and non-linear SLM phase delays, without explicitly modeling them. As a result, the proposed method significantly outperforms existing state-of-the-art methods in simulation and experimentation, simply by observing captured holographic images.
AB - Holography is arguably the most promising technology to provide wide field-of-view, compact, eyeglasses-style near-eye displays for augmented and virtual reality. However, the image quality of existing holographic displays is far from that of current-generation conventional displays, effectively making today's holographic display systems impractical. This gap stems predominantly from the severe deviations of the "unknown" light transport in a real holographic display from the idealized approximations used for computing holograms. In this work, we depart from such approximate "ideal" coherent light transport models for computing holograms. Instead, we learn the deviations of the real display from the ideal light transport from images measured using a display-camera hardware system. After this unknown light propagation is learned, we use it to compensate for severe aberrations in real holographic imagery. The proposed hardware-in-the-loop approach is robust to spatial, temporal, and hardware deviations, and improves the image quality of existing methods both qualitatively and quantitatively, in SNR and perceptual quality. We validate our approach on a holographic display prototype and show that the method can fully compensate for unknown aberrations and for erroneous and non-linear SLM phase delays, without explicitly modeling them. As a result, the proposed method significantly outperforms existing state-of-the-art methods in simulation and experimentation, simply by observing captured holographic images.
KW - artificial intelligence
KW - augmented reality
KW - computational displays
KW - computer generated holography
KW - deep learning
KW - machine learning
KW - near-eye displays
KW - virtual reality
UR - http://www.scopus.com/inward/record.url?scp=85097367340&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85097367340&partnerID=8YFLogxK
U2 - 10.1145/3414685.3417846
DO - 10.1145/3414685.3417846
M3 - Article
AN - SCOPUS:85097367340
SN - 0730-0301
VL - 39
JO - ACM Transactions on Graphics
JF - ACM Transactions on Graphics
IS - 6
M1 - 186
ER -