A Neural Network-Prepended GLRT Framework for Signal Detection Under Nonlinear Distortions

Rajeev Sahay, Swaroop Appadwedula, David J. Love, Christopher G. Brinton

Research output: Contribution to journal › Article › peer-review

Abstract

Many communications and sensing applications hinge on the detection of a signal in a noisy, interference-heavy environment. Signal processing theory yields techniques such as the generalized likelihood ratio test (GLRT) to perform detection when the received samples correspond to a linear observation model. In many practical applications, however, the received signal has passed through a nonlinearity, causing significant performance degradation of the GLRT. In this work, we propose prepending the GLRT detector with a neural network classifier that identifies the time samples of a received signal that have been excessively distorted by the nonlinearity. We show that pre-processing received nonlinear signals with our trained classifier to eliminate these excessively nonlinear samples (i) improves the detection performance of the GLRT on nonlinear signals and (ii) retains the theoretical guarantees that the GLRT provides on linear observation models for accurate signal detection.
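The pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the GLRT here is the standard statistic for a known template with unknown amplitude in white Gaussian noise, the nonlinearity is modeled as amplifier clipping, and the `clip_mask` function is a hypothetical stand-in for the paper's trained dense neural network classifier that flags distorted samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def glrt_statistic(x, s, sigma2):
    """GLRT statistic for detecting a known template s with unknown
    amplitude in white Gaussian noise of variance sigma2:
        T(x) = (s^T x)^2 / (sigma2 * s^T s).
    Under H0 (noise only), T follows a chi-squared distribution
    with one degree of freedom."""
    return (s @ x) ** 2 / (sigma2 * (s @ s))

def clip_mask(x, limit):
    """Stand-in for the trained per-sample classifier (the paper uses
    a dense neural network): flag samples that a saturating amplifier
    would have clipped. Returns True for samples to keep."""
    return np.abs(x) < limit

# Known template, noise variance, and a saturating nonlinearity (clipping).
n, sigma2, limit = 256, 0.1, 1.0
s = np.cos(2 * np.pi * 0.05 * np.arange(n))
x_clean = 2.0 * s + rng.normal(scale=np.sqrt(sigma2), size=n)
x_nl = np.clip(x_clean, -limit, limit)   # received signal after nonlinearity

# GLRT on all distorted samples vs. GLRT restricted to the screened
# subset, where the linear observation model (and hence the GLRT's
# theoretical guarantees) still holds.
t_full = glrt_statistic(x_nl, s, sigma2)
keep = clip_mask(x_nl, limit - 1e-9)
t_screened = glrt_statistic(x_nl[keep], s[keep], sigma2)
```

Restricting the test to the unclipped samples means every sample entering the GLRT obeys the linear model, which is the intuition behind point (ii) of the abstract; the detector's null distribution on the retained subset is the usual chi-squared one.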

Original language: English (US)
Pages (from-to): 2161-2165
Number of pages: 5
Journal: IEEE Communications Letters
Volume: 26
Issue number: 9
DOIs
State: Published - Sep 1 2022
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Modeling and Simulation
  • Computer Science Applications
  • Electrical and Electronic Engineering

Keywords

  • Generalized likelihood ratio test
  • dense neural network
  • nonlinear signal processing
  • wireless communications
