Dynamic Recurrent Routing via Low Rank Regularization in Recurrent Neural Networks

Abstract:

Recurrent neural networks (RNNs) continue to deliver outstanding performance on sequence learning tasks such as language modeling, but training them on long sequences remains difficult. The main challenges are complex long-range dependencies, vanishing or exploding gradients, and the need for low resource consumption when models are deployed. To address these challenges, we propose dynamic recurrent routing neural networks (DRRNets), which 1) shorten recurrent path lengths by dynamically allocating recurrent routes to different dependencies, and 2) significantly reduce the number of parameters by imposing low-rank constraints on the fully connected layers. We develop a novel optimization algorithm based on low-rank constraints and sparsity projection to train the network. We verify the effectiveness of the proposed method by comparing it with multiple competitive approaches on several popular sequence learning tasks, including language modeling and speaker recognition. The results under different criteria demonstrate the superiority of our method.
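
To make the parameter-reduction idea in point 2) concrete, the sketch below shows a generic low-rank parameterization of a vanilla RNN cell in PyTorch. This is an illustrative assumption, not the authors' DRRNet implementation: all names (LowRankRNNCell, U, V, rank) are hypothetical. Factorizing the n-by-n hidden-to-hidden matrix as W ≈ UV with rank r cuts its parameter count from n² to 2nr.

```python
import torch
import torch.nn as nn

class LowRankRNNCell(nn.Module):
    """Illustrative sketch (not the authors' code): a vanilla RNN cell
    whose hidden-to-hidden weight matrix is factorized as W ~= U @ V
    with rank r << n, shrinking its parameters from n*n to 2*n*r."""

    def __init__(self, input_size: int, hidden_size: int, rank: int):
        super().__init__()
        self.W_ih = nn.Linear(input_size, hidden_size)
        # Low-rank factors replacing the full n x n recurrent matrix.
        self.U = nn.Parameter(torch.randn(hidden_size, rank) * 0.01)
        self.V = nn.Parameter(torch.randn(rank, hidden_size) * 0.01)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # h_t = tanh(W_ih x_t + U (V h_{t-1}))
        # Computed as (h V^T) U^T, which equals h (U V)^T row-wise.
        return torch.tanh(self.W_ih(x) + (h @ self.V.t()) @ self.U.t())

# Example usage: with hidden_size=512 and rank=32 the recurrent matrix
# needs 2 * 512 * 32 = 32,768 parameters instead of 512^2 = 262,144.
cell = LowRankRNNCell(input_size=128, hidden_size=512, rank=32)
h = torch.zeros(1, 512)
for x_t in torch.randn(10, 1, 128):  # a toy 10-step sequence
    h = cell(x_t, h)
```

In practice such factors can be trained directly, as above, or enforced during optimization via low-rank projection; the paper's combination of low-rank constraints with sparsity projection is described in the method section.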