Efficiently synthesizing virtual video

Richard J. Radke, Peter J. Ramadge, Sanjeev R. Kulkarni, Tomio Echigo

Research output: Contribution to journal › Article


Abstract

Given a set of synchronized video sequences of a dynamic scene taken by different cameras, we address the problem of creating a virtual video of the scene from a novel viewpoint. A key aspect of our algorithm is a method for recursively propagating dense and physically accurate correspondences between two video sources. By exploiting temporal continuity and suitably constraining the correspondences, we provide an efficient framework for synthesizing realistic virtual video. The stability of the propagation algorithm is analyzed, and experimental results are presented.
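The recursive propagation step described in the abstract can be sketched as follows. This is a hypothetical minimal version for illustration only: it predicts the next frame's dense correspondence map between two views by warping the current map with each view's motion field. The function name, array layout, and the use of per-view forward flow are assumptions; the paper's actual recursion additionally constrains the correspondences (e.g., geometrically) and analyzes stability.

```python
import numpy as np

def propagate_correspondence(corr, flow_a, flow_b):
    """Predict the dense correspondence map for the next frame pair.

    corr:   (H, W, 2) array; corr[y, x] = matching (y', x') in view B at time t.
    flow_a: (H, W, 2) forward motion field in view A from t to t+1.
    flow_b: (H, W, 2) forward motion field in view B from t to t+1.
    (Hypothetical interface; not the paper's actual formulation.)
    """
    H, W = corr.shape[:2]
    pred = corr.copy()  # fall back to the old map where nothing lands
    for y in range(H):
        for x in range(W):
            # Where does pixel (y, x) of view A move to at time t+1?
            ya = int(np.clip(round(y + flow_a[y, x, 0]), 0, H - 1))
            xa = int(np.clip(round(x + flow_a[y, x, 1]), 0, W - 1))
            # Its previous match in view B moves with view B's flow.
            yb, xb = corr[y, x]
            iy = int(np.clip(round(yb), 0, H - 1))
            ix = int(np.clip(round(xb), 0, W - 1))
            pred[ya, xa] = (yb + flow_b[iy, ix, 0],
                            xb + flow_b[iy, ix, 1])
    return pred
```

Because the prediction reuses the previous frame's matches rather than re-solving correspondence from scratch, each frame only needs a cheap refinement pass, which is the source of the efficiency the abstract claims.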

Original language: English (US)
Pages (from-to): 325-337
Number of pages: 13
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Volume: 13
Issue number: 4
State: Published - Apr 1 2003

All Science Journal Classification (ASJC) codes

  • Media Technology
  • Electrical and Electronic Engineering

Keywords

  • Correspondence
  • Image-based rendering
  • View synthesis
  • Virtual video
  • Virtual views

