Computational imaging with multi-camera time-of-flight systems

Shikhar Shrestha, Felix Heide, Wolfgang Heidrich, Gordon Wetzstein

Research output: Contribution to journal › Conference article › peer-review


Abstract

Depth cameras are a ubiquitous technology used in a wide range of applications, including robotics and machine vision, human-computer interaction, autonomous vehicles, and augmented and virtual reality. In this paper, we explore the design and applications of phased multi-camera time-of-flight (ToF) systems. We develop a reproducible hardware system that allows the exposure times and waveforms of up to three cameras to be synchronized. Using this system, we analyze waveform interference between multiple light sources in ToF applications and propose simple solutions to this problem. Building on the concept of orthogonal frequency design, we demonstrate state-of-the-art results for instantaneous radial velocity capture via Doppler time-of-flight imaging, and we explore new directions for optically probing global illumination, for example by de-scattering dynamic scenes and by non-line-of-sight motion detection via frequency gating.
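The orthogonal frequency design mentioned in the abstract can be illustrated with a short simulation: if two cameras modulate their light sources at frequencies that are distinct integer multiples of the inverse exposure time, each camera's homodyne correlation rejects the other camera's illumination. The Python sketch below is a minimal, idealized illustration under an assumed sinusoidal continuous-wave model; the chosen frequencies, the four-bucket phase recovery, and all variable names are illustrative assumptions, not the authors' hardware implementation.

# Minimal sketch: interference rejection between two continuous-wave ToF
# cameras whose modulation frequencies are orthogonal over the exposure.
# All frequencies, names, and the idealized sinusoidal model are assumptions
# for illustration only.
import numpy as np

T = 1e-3                                   # exposure (integration) time, seconds
t = np.linspace(0.0, T, 200_000, endpoint=False)

f1, f2 = 30e6, 31e6                        # both are integer multiples of 1/T,
                                           # hence orthogonal over the exposure

def homodyne_sample(f_ref, f_light, phase, psi):
    # Correlate incoming light modulated at f_light (with travel phase `phase`)
    # against the camera's reference at f_ref, offset by psi.
    light = 0.5 * (1.0 + np.cos(2 * np.pi * f_light * t + phase))
    ref = np.cos(2 * np.pi * f_ref * t + psi)
    return np.mean(light * ref)

# Camera 1 sees its own source (at f1) plus light from camera 2's source
# (at f2). Four-bucket sampling recovers the phase of the f1 component.
true_phase = 0.7
buckets = []
for psi in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2):
    own = homodyne_sample(f1, f1, true_phase, psi)
    cross = homodyne_sample(f1, f2, 1.3, psi)   # ~0 by frequency orthogonality
    buckets.append(own + cross)

b0, b90, b180, b270 = buckets
est_phase = np.arctan2(b90 - b270, b0 - b180) % (2 * np.pi)
print(f"true phase: {true_phase:.3f} rad, estimated: {est_phase:.3f} rad")

In this toy model the cross term contributed by the second light source integrates to approximately zero over the exposure, so the recovered phase matches the phase of camera 1's own signal despite the concurrent illumination.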

Original language: English (US)
Article number: a33
Journal: ACM Transactions on Graphics
Volume: 35
Issue number: 4
DOIs
State: Published - Jul 11, 2016
Externally published: Yes
Event: ACM SIGGRAPH 2016 - Anaheim, United States
Duration: Jul 24, 2016 - Jul 28, 2016

All Science Journal Classification (ASJC) codes

  • Computer Graphics and Computer-Aided Design

Keywords

  • Computational photography
  • Light fields
  • Time-of-flight
