Beamspace Channel Estimation for Wideband Millimeter-Wave MIMO: A Model-Driven Unsupervised Learning Approach

Abstract:

Millimeter-wave (mmWave) communication is a promising technology for future wireless networks that must support a wide range of data-demanding applications. To compensate for the large channel attenuation in the mmWave band and to avoid high hardware cost, we consider a lens-based beamspace massive multiple-input multiple-output (MIMO) system. However, the spatial-wideband effect in wideband mmWave systems makes channel estimation very challenging, especially when the receiver is equipped with a limited number of radio-frequency (RF) chains. Furthermore, real channel data cannot be obtained before the mmWave system is deployed in a new environment, which makes it impossible to train a deep learning (DL)-based channel estimator on a real dataset beforehand. To address this problem, we propose a model-driven unsupervised learning network, named the learned denoising-based generalized expectation consistent (LDGEC) signal recovery network. By utilizing Stein's unbiased risk estimator (SURE) as the loss function, the LDGEC network can be trained using only the limited measurements corresponding to the pilot symbols, instead of the real channel data. Although designed for unsupervised learning, the LDGEC network can also be trained in a supervised manner with real channel data in a denoiser-by-denoiser fashion. Numerical results demonstrate that the LDGEC-based channel estimator significantly outperforms state-of-the-art compressive sensing-based algorithms when the receiver is equipped with a small number of RF chains and low-resolution analog-to-digital converters (ADCs).
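As a rough illustration of the unsupervised training idea described above (not the paper's exact formulation), the following Python/PyTorch-style sketch computes a Monte-Carlo estimate of a SURE loss for a generic denoiser applied to noisy observations y = x + n with n ~ N(0, sigma^2 I). The names denoiser, sigma, and eps are illustrative assumptions; the actual LDGEC loss is defined over the quantized pilot measurements within the GEC iterations rather than this plain additive-Gaussian model.

    import torch

    def mc_sure_loss(denoiser, y, sigma, eps=1e-3):
        # Monte-Carlo Stein's unbiased risk estimate of the denoiser's MSE,
        # computed from the noisy observation y alone (no clean signal needed).
        n = y.numel()
        x_hat = denoiser(y)
        # Data-fidelity term minus the known noise variance.
        fidelity = torch.sum((x_hat - y) ** 2) / n - sigma ** 2
        # Monte-Carlo estimate of the divergence term using a random probe vector.
        b = torch.randn_like(y)
        div = torch.sum(b * (denoiser(y + eps * b) - x_hat)) / (eps * n)
        return fidelity + 2.0 * sigma ** 2 * div

Minimizing a loss of this form over the received pilot measurements is what allows the network to be trained without access to the true channel.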