Digital Implementation of Radial Basis Function Neural Networks Based on Stochastic Computing

Abstract:

Internet of Things (IoT) and mobile systems increasingly rely on Machine Learning based solutions, which demand high computational throughput at low energy consumption. This has revived interest in unconventional hardware computing methods capable of implementing both linear and nonlinear functions with less hardware overhead than conventional fixed-point and floating-point alternatives. In particular, this work proposes a novel Radial Basis Function Neural Network (RBF-NN) hardware implementation based on Stochastic Computing (SC), which applies probabilistic laws to conventional digital gates. Several designs of the complex functions required to implement an RBF-NN are presented and theoretically analyzed, such as the squared Euclidean distance and the stochastic Gaussian kernel that measures the similarity between input samples and prototypes. The efficiency and performance of the methodology are evaluated on well-known pattern recognition tasks, including the MNIST dataset. The results show a methodology with low cost in terms of logic resources and power, along with an inherent capability to implement complex functions in a simple way. This methodology enables the implementation of massively parallel, large-scale RBF-NNs with relatively low hardware requirements while maintaining 96.20% accuracy, which is nearly the same as that of the floating-point and fixed-point models (96.4% and 96.25%, respectively).
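
As context for the techniques named above, the following is a minimal software sketch, not the paper's hardware design. It assumes unipolar SC coding, in which a value in [0, 1] is encoded as the probability of observing a '1' in a random bitstream, so that a single AND gate over two independent streams approximates their product; it also shows the exact Gaussian kernel exp(-||x - c||^2 / (2*sigma^2)) that each RBF neuron targets. The bitstream length, random seed, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative software model of unipolar Stochastic Computing (SC):
# a value p in [0, 1] is encoded as a bitstream whose probability of '1' is p,
# so a bitwise AND of two independent streams estimates their product.

N = 4096                      # bitstream length (illustrative choice)
rng = np.random.default_rng(0)

def to_bitstream(p, n=N):
    """Encode a probability p in [0, 1] as a unipolar stochastic bitstream."""
    return rng.random(n) < p

def from_bitstream(bits):
    """Decode a bitstream by counting the fraction of ones."""
    return bits.mean()

# SC multiplication realized with one AND gate per bit position.
a, b = 0.6, 0.3
prod_sc = from_bitstream(to_bitstream(a) & to_bitstream(b))
print(f"SC product ~ {prod_sc:.3f}  (exact: {a * b:.3f})")

# Reference (non-stochastic) RBF neuron: Gaussian kernel of the squared
# Euclidean distance between an input sample x and a prototype c.
def rbf_kernel(x, c, sigma):
    d2 = np.sum((x - c) ** 2)                # squared Euclidean distance
    return np.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian similarity in (0, 1]

x = np.array([0.2, 0.7, 0.5])
c = np.array([0.25, 0.65, 0.4])
print(f"Gaussian kernel phi(x, c) = {rbf_kernel(x, c, sigma=0.5):.3f}")
```

In the proposed hardware, the squared Euclidean distance and the Gaussian kernel are evaluated directly in the stochastic domain with digital gates operating on such bitstreams, rather than by computing exp() explicitly as in this reference sketch.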