Efficiently synthesizing virtual video

Richard J. Radke, Peter J. Ramadge, Sanjeev R. Kulkarni, Tomio Echigo

Research output: Contribution to journal › Article › peer-review

4 Scopus citations


Given a set of synchronized video sequences of a dynamic scene taken by different cameras, we address the problem of creating a virtual video of the scene from a novel viewpoint. A key aspect of our algorithm is a method for recursively propagating dense and physically accurate correspondences between two video sources. By exploiting temporal continuity and suitably constraining the correspondences, we provide an efficient framework for synthesizing realistic virtual video. The stability of the propagation algorithm is analyzed, and experimental results are presented.

Original language: English (US)
Pages (from-to): 325-337
Number of pages: 13
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Issue number: 4
State: Published - Apr 2003

All Science Journal Classification (ASJC) codes

  • Media Technology
  • Electrical and Electronic Engineering


Keywords

  • Correspondence
  • Image-based rendering
  • View synthesis
  • Virtual video
  • Virtual views


