Cross-Scale Residual Network: A General Framework for Image Super-Resolution, Denoising, and Deblocking

Yuan Zhou, Xiaoting Du, Mingfei Wang, Shuwei Huo, Yeda Zhang, Sun Yuan Kung

Research output: Contribution to journal › Article › peer-review

21 Scopus citations


In general, image restoration involves mapping low-quality images to their high-quality counterparts. Such an optimal mapping is usually nonlinear and learnable by machine learning. Recently, deep convolutional neural networks have proven promising for this learning process. It is desirable for an image processing network to support three vital tasks well, namely: 1) super-resolution; 2) denoising; and 3) deblocking. It is commonly recognized that these tasks have strong correlations, which enables the design of a general framework supporting all of them. In particular, the selection of feature scales is known to significantly impact performance on these tasks. To this end, we propose the cross-scale residual network to exploit scale-related features shared among the three tasks. The proposed network can extract spatial features across different scales and establish cross-temporal feature reuse, so as to handle different tasks in a general framework. Our experiments show that the proposed approach outperforms state-of-the-art methods in both quantitative and qualitative evaluations on multiple image restoration tasks.
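The core idea in the abstract, extracting features at multiple spatial scales and fusing them through a residual connection, can be illustrated with a minimal toy sketch. This is not the paper's architecture: the names `avg_pool2`, `upsample2`, and `cross_scale_residual_block`, the two-scale design, and the equal-weight fusion are all illustrative assumptions, standing in for the learned convolutional branches of the actual network.

```python
import numpy as np

def avg_pool2(x):
    """Downsample a 2-D feature map by 2x via average pooling."""
    h, w = x.shape
    return x[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample2(x):
    """Upsample a 2-D feature map by 2x via nearest-neighbor repetition."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def cross_scale_residual_block(x, transform):
    """Toy cross-scale residual block: apply the same transform at full and
    half resolution, fuse the two branches, and add a skip connection.
    In the real network, `transform` would be a learned convolutional stage."""
    fine = transform(x)                            # fine-scale branch
    coarse = upsample2(transform(avg_pool2(x)))    # coarse-scale branch
    return x + 0.5 * (fine + coarse)               # residual fusion

# Example: with an identity transform, the output blends the input with
# its own coarse (block-averaged) version on top of the skip connection.
x = np.arange(16, dtype=float).reshape(4, 4)
y = cross_scale_residual_block(x, lambda t: t)
```

The residual (skip) connection lets each block learn only a correction to its input, which is what makes very deep restoration networks of this kind trainable in practice.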

Original language: English (US)
Pages (from-to): 5855-5867
Number of pages: 13
Journal: IEEE Transactions on Cybernetics
Issue number: 7
State: Published - Jul 1 2022

All Science Journal Classification (ASJC) codes

  • Software
  • Information Systems
  • Human-Computer Interaction
  • Electrical and Electronic Engineering
  • Control and Systems Engineering
  • Computer Science Applications


Keywords

  • Convolutional neural network (CNN)
  • general framework
  • image processing


