Abstract:
Stochastic computing (SC) has become a promising approximate computing solution due to its negligible resource occupancy and ultralow energy consumption. As a potential replacement for exact multiplication, SC can dramatically mitigate the power consumption of deep neural networks (DNNs). However, current SC multipliers exhibit highly imbalanced accuracy across the product space, i.e., negligible noise for large products but significant noise for small ones, which conflicts with the product distribution produced by the sparse matrices in neural computing. In this article, we present a heterogeneous SC multiplier that heuristically selects among three approximate multiplication schemes, “set-to-0,” “look-up-table,” and “low-discrepancy SC,” to provide appropriate precision over the whole product space. Because popular DNN models do not agree on the boundaries between these operations, a training-involved method is proposed to determine the settings with limited overhead. In this way, the models progressively learn the characteristics of the SC operations and exhibit a marked improvement in network accuracy. Experiments show that, for a single multiplication, the product noise is reduced by 36.86% on average, and across multiple network models, the accuracy improvement reaches 5.5% on average. Furthermore, a group of proposed logic-reduction techniques improves energy efficiency by 65% in the system-level evaluation.
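
To make the three-mode selection concrete, the following is a minimal illustrative sketch in Python. The thresholds and sizes (T_ZERO, LUT_BITS, N_SC) and all function names are hypothetical placeholders; in the article the mode boundaries are determined per model by the training-involved procedure, not fixed constants as here.

    # Hypothetical sketch of the heterogeneous multiplier's mode selection.
    # T_ZERO, LUT_BITS, and N_SC are illustrative assumptions, not the
    # trained boundaries described in the article.

    def van_der_corput(i, base=2):
        """i-th element of the van der Corput low-discrepancy sequence."""
        x, denom = 0.0, 1.0
        while i > 0:
            denom *= base
            i, rem = divmod(i, base)
            x += rem / denom
        return x

    T_ZERO   = 1.0 / 256   # products below this are set to 0 (assumed)
    LUT_BITS = 4           # operand width of the exact look-up table (assumed)
    N_SC     = 256         # SC bitstream length (assumed)

    # Exact products of all short (LUT_BITS-wide) operand pairs, stored once.
    LUT = {(a, b): a * b
           for a in range(1 << LUT_BITS) for b in range(1 << LUT_BITS)}

    def hetero_mul(a, b, frac_bits=8):
        """Multiply two unipolar fixed-point operands in [0, 1)."""
        av, bv = a / (1 << frac_bits), b / (1 << frac_bits)
        if av * bv < T_ZERO:              # tiny product: "set-to-0"
            return 0.0
        if a < (1 << LUT_BITS) and b < (1 << LUT_BITS):
            return LUT[(a, b)] / (1 << (2 * frac_bits))  # "look-up-table"
        # "low-discrepancy SC": two Halton-style sequences (bases 2 and 3)
        # drive the bitstream generators; the product estimate is the
        # ones ratio at the output of an AND gate.
        ones = sum((van_der_corput(i, 2) < av) and (van_der_corput(i, 3) < bv)
                   for i in range(1, N_SC + 1))
        return ones / N_SC

    print(hetero_mul(200, 150))   # compare with exact 200/256 * 150/256 ≈ 0.4578

The low-discrepancy branch reflects the standard unipolar SC construction: each operand is compared against a quasi-random sequence to form a bitstream, and an AND gate yields a stream whose ones density approximates the product, with low-discrepancy sequences suppressing the random fluctuation of conventional SC.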