A Neural Network-Prepended GLRT Framework for Signal Detection Under Nonlinear Distortions

Rajeev Sahay, Swaroop Appadwedula, David J. Love, Christopher G. Brinton

Research output: Contribution to journal › Article › peer-review

1 Scopus citation


Many communications and sensing applications hinge on the detection of a signal in a noisy, interference-heavy environment. Signal processing theory yields techniques such as the generalized likelihood ratio test (GLRT) to perform detection when the received samples correspond to a linear observation model. Numerous practical applications exist, however, where the received signal has passed through a nonlinearity, causing significant performance degradation of the GLRT. In this work, we propose prepending the GLRT detector with a neural network classifier capable of identifying the particular nonlinear time samples in a received signal. We show that pre-processing received nonlinear signals using our trained classifier to eliminate excessively nonlinear samples (i) improves the detection performance of the GLRT on nonlinear signals and (ii) retains the theoretical guarantees provided by the GLRT on linear observation models for accurate signal detection.
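The abstract describes prepending a classifier to the GLRT so that excessively nonlinear samples are discarded before detection. The sketch below illustrates the general idea only, not the paper's implementation: the trained neural-network classifier is replaced by a hypothetical clipping-based mask, and the detector is a standard matched-filter GLRT for a known signal template with unknown amplitude in white Gaussian noise.

```python
import numpy as np

rng = np.random.default_rng(0)

def glrt_statistic(x, s, sigma2, mask=None):
    """Matched-filter GLRT statistic on the samples selected by `mask`.

    Model: H1: x = A*s + w, H0: x = w, with w ~ N(0, sigma2*I) and
    amplitude A unknown. Maximizing the likelihood over A yields
    T(x) = (s^T x)^2 / (sigma2 * s^T s), chi-squared(1) under H0.
    """
    if mask is not None:
        x, s = x[mask], s[mask]
    return (s @ x) ** 2 / (sigma2 * (s @ s))

n, sigma2, A = 256, 1.0, 0.5
s = np.cos(2 * np.pi * 0.05 * np.arange(n))      # known signal template
x = A * s + rng.normal(scale=np.sqrt(sigma2), size=n)

# Hypothetical stand-in for the trained classifier: apply a clipping
# nonlinearity to the received signal, then flag the saturated samples
# as "excessively nonlinear" and exclude them from the GLRT.
clip = 1.5
x_clipped = np.clip(x, -clip, clip)              # nonlinear distortion
linear_mask = np.abs(x_clipped) < clip           # keep unsaturated samples

T_all = glrt_statistic(x_clipped, s, sigma2)
T_masked = glrt_statistic(x_clipped, s, sigma2, mask=linear_mask)
# A detection threshold for false-alarm rate alpha would come from the
# chi-squared(1) distribution, e.g. scipy.stats.chi2.ppf(1 - alpha, df=1).
```

Because the masked statistic is computed only on samples that still follow the linear observation model, the usual GLRT false-alarm analysis applies to it unchanged, which is the property the paper's framework is designed to preserve.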

Original language: English (US)
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Communications Letters
State: Published - Sep 1 2022
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
  • Computer Science Applications
  • Modeling and Simulation


Keywords

  • Antenna arrays
  • Detectors
  • Generalized likelihood ratio test
  • Interference
  • Nonlinear distortion
  • Signal detection
  • Testing
  • Training data
  • dense neural network
  • nonlinear signal processing
  • wireless communications


