Abstract:
Machine learning and, more precisely, data-driven models are providing solutions where physics-based models are intractable. This article discusses the use of deep learning models to characterize the intricate effects of multipath propagation on GNSS correlation outputs. In particular, we aim to replace standard correlation schemes, which are optimal under single-ray Gaussian noise assumptions, with neural network (NN)-based correlation schemes that are able to learn multipath channels that are otherwise challenging to model. The article shows that deep neural networks (DNNs), when applied to tracking loops, can outperform standard correlation schemes in two respects: 1) in line-of-sight (LOS) scenarios, they filter out more noise, thanks to the strong prior regularization imposed during training through knowledge of the correlation characteristics and the Gaussian noise; and 2) in multipath scenarios, they adjust their behavior to better disentangle multipath signals from LOS signals. This article provides results showing the superiority of the proposed DNN-based models, with a focus on time-delay tracking in a variety of realistic scenarios.
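As an illustration of the general idea only (the abstract does not specify the authors' architecture), the following is a minimal, hypothetical PyTorch sketch of an NN-based correlation scheme: a small network that maps a bank of correlator outputs to a time-delay estimate, which could drive a tracking loop in place of a standard early-minus-late discriminator. The class name, layer sizes, and tap count are illustrative assumptions, not the paper's design.

```python
# Hypothetical sketch (not the authors' architecture): a neural network that
# maps multi-tap GNSS correlator outputs to a code-delay estimate, replacing
# the standard discriminator inside a delay-tracking loop.
import torch
import torch.nn as nn

class CorrelatorNet(nn.Module):
    """Maps complex correlation samples at several code-delay taps to a
    scalar time-delay residual (in fractions of a chip)."""
    def __init__(self, num_taps: int = 11):
        super().__init__()
        # Input: real and imaginary parts of `num_taps` correlator outputs.
        self.net = nn.Sequential(
            nn.Linear(2 * num_taps, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, 1),  # scalar delay residual
        )

    def forward(self, corr: torch.Tensor) -> torch.Tensor:
        # corr: (batch, num_taps) complex-valued correlator outputs.
        x = torch.cat([corr.real, corr.imag], dim=-1)
        return self.net(x).squeeze(-1)

if __name__ == "__main__":
    # Placeholder batch; in practice the network would be trained on
    # synthetic correlations under Gaussian noise, with and without
    # multipath rays, which acts as the prior regularization described
    # in the abstract.
    model = CorrelatorNet(num_taps=11)
    fake_corr = torch.randn(4, 11, dtype=torch.cfloat)
    print(model(fake_corr).shape)  # torch.Size([4])
```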