Scalable stochastic-computing accelerator for convolutional neural networks

Title: Scalable stochastic-computing accelerator for convolutional neural networks
Publication Type: Conference Paper
Year of Publication: 2017
Authors: Sim, H., Nguyen, D., Lee, J., Choi, K.
Conference Name: 2017 22nd Asia and South Pacific Design Automation Conference (ASP-DAC)
Keywords: Arrays, Biological neural networks, computational complexity, ConvNets, convolution, convolutional neural network, convolutional neural networks, neural nets, Neural Network Resilience, pubcrawl, Recognition accuracy, resilience, Resiliency, SC-based neural networks, scalable stochastic-computing accelerator, Stochastic computing, system-on-chip, Weight parameter retraining

Stochastic Computing (SC) is an alternative design paradigm that is particularly useful for applications where cost is critical. SC has been applied to neural networks, which are known for their high computational complexity. However, previous work in this area has critical limitations, such as the assumption of a fully-parallel architecture, which prevent it from being applied to recent networks such as convolutional neural networks (ConvNets). This paper presents the first SC architecture for ConvNets and shows its feasibility with detailed analyses of implementation overheads. Our SC-ConvNet is a hybrid of SC and conventional binary design, a marked difference from earlier SC-based neural networks. Though this might seem like a compromise, it is a novel feature driven by the need to support modern ConvNets at scale, which commonly have many large layers. Our proposed architecture also features hybrid layer composition, which helps achieve very high recognition accuracy. Our detailed evaluation results, based on functional simulation and RTL synthesis, suggest that SC-ConvNets are indeed competitive with conventional binary designs, even without considering the inherent error resilience of SC.
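As background on the paradigm the abstract refers to (not taken from the paper itself): in unipolar stochastic computing, a value p in [0, 1] is encoded as a random bitstream whose fraction of 1s equals p, so multiplication of two independent streams reduces to a single bitwise AND gate. A minimal illustrative sketch, with all function names being our own:

```python
import random

def sc_encode(p, n, rng):
    # Encode probability p as a length-n stochastic bitstream:
    # each bit is 1 with probability p.
    return [1 if rng.random() < p else 0 for _ in range(n)]

def sc_multiply(a_bits, b_bits):
    # Unipolar SC multiplication: a bitwise AND of two
    # independent bitstreams yields a stream encoding a*b.
    return [a & b for a, b in zip(a_bits, b_bits)]

def sc_decode(bits):
    # Decode by taking the fraction of 1s in the stream.
    return sum(bits) / len(bits)

rng = random.Random(0)
n = 1 << 16
a = sc_encode(0.5, n, rng)
b = sc_encode(0.25, n, rng)
est = sc_decode(sc_multiply(a, b))
# est approximates 0.5 * 0.25 = 0.125, with sampling error on the
# order of 1/sqrt(n); longer streams trade latency for accuracy.
```

This cost profile, where an expensive arithmetic operation becomes one logic gate at the price of long bitstreams and approximation error, is what motivates both SC's appeal for neural networks and the hybrid SC/binary design the abstract describes.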

Citation Key: sim_scalable_2017