Viewing direction estimation in cryo-EM using synchronization

Yoel Shkolnisky, Amit Singer

Research output: Contribution to journal › Article › peer-review

62 Scopus citations


A central task in recovering the structure of a macromolecule from cryo-electron microscopy (cryo-EM) images is to determine a three-dimensional model of the macromolecule given many of its two-dimensional projection images. The direction from which each image was taken is unknown, and the images are small and extremely noisy. The goal is to determine the direction from which each image was taken and then to combine the images into a three-dimensional model of the molecule. We present an algorithm for determining the viewing direction of all cryo-EM images at once, which is robust to high levels of noise. The algorithm is based on formulating the problem as a synchronization problem; that is, we estimate the relative spatial configuration of pairs of images and then estimate a global assignment of orientations that maximizes the number of satisfied pairwise relations. Information about the spatial relation between pairs of images is extracted from common lines between triplets of images. These noisy pairwise relations are combined into a single consistent assignment of orientations by constructing a matrix whose entries encode the pairwise relations. This matrix is shown to have rank 3, and its nontrivial eigenspace is shown to reveal the projection orientation of each image. In particular, we show that the nontrivial eigenvectors encode the rotation matrix that corresponds to each image.
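The synchronization step described above can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: it assumes the pairwise relative rotations R_i R_j^T have already been estimated (in the paper these come from common lines between triplets of images; here they are supplied directly), builds the 3n × 3n block matrix whose (i, j) block encodes the pairwise relation, and reads the rotations off the rank-3 eigenspace. The function and variable names are illustrative.

```python
import numpy as np

def random_rotation(rng):
    """Sample a random 3x3 rotation matrix via QR decomposition."""
    Q, R = np.linalg.qr(rng.standard_normal((3, 3)))
    Q = Q @ np.diag(np.sign(np.diag(R)))  # make the factorization unique
    if np.linalg.det(Q) < 0:
        Q[:, 2] *= -1.0  # force det +1 so Q lies in SO(3)
    return Q

def synchronize(relative, n):
    """Recover rotations (up to one global rotation) from pairwise estimates.

    `relative[(i, j)]` holds a (possibly noisy) estimate of R_i @ R_j.T.
    """
    # Build the 3n x 3n synchronization matrix: block (i, j) is R_i R_j^T.
    S = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(n):
            S[3*i:3*i+3, 3*j:3*j+3] = relative[(i, j)]
    # Noise-free, S = H H^T with H the 3n x 3 stack of the R_i, so S has
    # rank 3 and its top eigenspace spans the columns of H.
    vals, vecs = np.linalg.eigh(S)
    H = vecs[:, -3:] * np.sqrt(np.maximum(vals[-3:], 0.0))
    # H is determined only up to a global 3x3 orthogonal factor; if that
    # factor has det -1, flip one column so every block lands in SO(3).
    if np.linalg.det(H[0:3, :]) < 0:
        H[:, 2] *= -1.0
    rotations = []
    for i in range(n):
        # Project each 3x3 block to the nearest rotation (cleans up noise).
        U, _, Vt = np.linalg.svd(H[3*i:3*i+3, :])
        rotations.append(U @ Vt)
    return rotations
```

With exact pairwise inputs the recovered rotations reproduce every relative rotation R_i R_j^T; the per-image rotations themselves are only fixed up to one global rotation, which is inherent to the problem.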

Original language: English (US)
Pages (from-to): 1088-1110
Number of pages: 23
Journal: SIAM Journal on Imaging Sciences
Issue number: 3
State: Published - 2012

All Science Journal Classification (ASJC) codes

  • Applied Mathematics
  • General Mathematics


Keywords
  • Angular reconstitution
  • Cryo-electron microscopy
  • Synchronization
  • Tomography
