Sensor networks are an indispensable part of the Internet of Things (IoT): sensors perform data acquisition and information processing to obtain the parameters of interest, so that IoT-based monitoring, diagnosis, and other systems can respond quickly to changing conditions, instantaneous faults, and similar events. Distributed estimation algorithms are usually employed to estimate the parameters of interest in these IoT-based applications. However, when sensor networks face highly correlated input signals and nonstationary behavior in which the parameters of interest are time-varying, conventional distributed estimation algorithms suffer severely degraded learning performance due to the large eigenvalue spread of the input-signal covariance matrix and the random perturbation of the parameters of interest. To address these problems, this paper proposes two diffusion Bayesian subband adaptive filter (DBSAF) algorithms from a Bayesian learning perspective. Because the highly correlated input signal is whitened in a multiband structure and an estimate of the uncertainty in the parameters of interest is obtained through Bayesian inference, the proposed DBSAF algorithms achieve better learning performance than competing diffusion algorithms. The transient and steady-state mean square error performance of the proposed DBSAF algorithms is analyzed and verified by numerical simulations. A lower bound on the time-varying step-size is derived to maintain optimal steady-state performance in nonstationary scenarios. A new method for estimating the noise variance is also proposed. Numerical simulations demonstrate the excellent learning performance of the proposed algorithms in comparison with benchmark algorithms.
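The DBSAF algorithms themselves are not specified in this abstract. As a rough illustration of the ingredients it mentions (whitening a highly correlated input in a multiband structure, plus a diffusion step that mixes neighbor estimates), the following sketch implements a conventional adapt-then-combine diffusion filter with a two-band Haar analysis and a normalized per-subband update. The network size, combination matrix, step-size, and signal model are all illustrative assumptions, not the authors' method; in particular, the Bayesian step-size adaptation and noise-variance estimation are omitted.

```python
import numpy as np

# Illustrative sketch only: adapt-then-combine diffusion with a 2-band Haar
# analysis and a normalized per-subband (NLMS-like) update. All sizes and
# weights below are assumptions, not taken from the paper.
rng = np.random.default_rng(0)

K, M, T = 3, 8, 4000                 # nodes, filter length, iterations (blocks)
w_true = rng.standard_normal(M)      # parameter of interest (held fixed here)
A = np.array([[0.6, 0.2, 0.2],       # combination matrix: row k holds the
              [0.2, 0.6, 0.2],       # weights node k gives to its neighbours
              [0.2, 0.2, 0.6]])
mu, eps = 0.5, 1e-3                  # step-size, regularization

W = np.zeros((K, M))                 # per-node estimates
U = np.zeros((K, M))                 # per-node sliding regressors
x_state = np.zeros(K)                # AR(1) state -> highly correlated input
msd0 = np.mean(np.sum((W - w_true) ** 2, axis=1))

for t in range(T):
    Psi = np.empty_like(W)
    for k in range(K):
        # Draw two consecutive regressors of colored (AR(1)) input.
        regs, des = [], []
        for _ in range(2):
            x_state[k] = 0.9 * x_state[k] + rng.standard_normal()
            U[k] = np.roll(U[k], 1)
            U[k][0] = x_state[k]
            regs.append(U[k].copy())
            des.append(regs[-1] @ w_true + 0.01 * rng.standard_normal())
        (u0, u1), (d0, d1) = regs, des
        # Haar analysis: low/high subband regressors and desired responses.
        ul, uh = (u0 + u1) / np.sqrt(2), (u0 - u1) / np.sqrt(2)
        dl, dh = (d0 + d1) / np.sqrt(2), (d0 - d1) / np.sqrt(2)
        w = W[k].copy()
        for ui, di in ((ul, dl), (uh, dh)):   # normalized subband update
            e = di - ui @ w
            w = w + mu * e * ui / (ui @ ui + eps)
        Psi[k] = w
    W = A @ Psi                       # combine step: mix neighbour estimates

msd = np.mean(np.sum((W - w_true) ** 2, axis=1))
print(f"network MSD: {msd0:.3f} -> {msd:.2e}")
```

The normalization by each subband's energy is what counteracts the large eigenvalue spread of correlated input, and the combine step (`W = A @ Psi`) is what distinguishes diffusion estimation from independent per-node adaptation.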
All Science Journal Classification (ASJC) codes
- Electrical and Electronic Engineering
Keywords
- Adaptive signal processing
- distributed estimation algorithm
- multiband structure
- sensor networks