Evaluating Data Resilience in CNNs from an Approximate Memory Perspective

Title: Evaluating Data Resilience in CNNs from an Approximate Memory Perspective
Publication Type: Conference Paper
Year of Publication: 2017
Authors: Chen, Yuanchang; Zhu, Yizhe; Qiao, Fei; Han, Jie; Liu, Yuansheng; Yang, Huazhong
Conference Name: Proceedings of the Great Lakes Symposium on VLSI 2017
Conference Location: New York, NY, USA
ISBN: 978-1-4503-4972-7
Keywords: approximate memory, convolutional neural network, data resilience evaluation, Neural Network Resilience, pubcrawl, resilience, Resiliency
Abstract: Due to the large volumes of data that need to be processed, efficient memory access and data transmission are crucial for high-performance implementations of convolutional neural networks (CNNs). Approximate memory is a promising technique to achieve efficient memory access and data transmission in CNN hardware implementations. To assess the feasibility of applying approximate memory techniques, we propose a framework for the data resilience evaluation (DRE) of CNNs and verify its effectiveness on a suite of prevalent CNNs. Simulation results show that a high degree of data resilience exists in these networks. By scaling the bit-width of the first five dominant data subsets, the data volume can be reduced by 80.38% on average with a 2.69% loss in relative prediction accuracy. For approximate memory with random errors, all the synaptic weights can be stored in the approximate part when the error rate is less than 10⁻⁴, while 3 MSBs must be protected if the error rate is fixed at 10⁻³. These results indicate a great potential for exploiting approximate memory techniques in CNN hardware design.
Citation Key: chen_evaluating_2017
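
The error model described in the abstract, where low-order bits of a weight live in an approximate memory region while a few MSBs are protected, can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's DRE framework: the 8-bit word width, the independent per-bit flip model, and the default of 3 protected MSBs are taken only as illustrative values suggested by the abstract.

```python
import random

def inject_bit_errors(word, bit_width=8, error_rate=1e-3, protected_msbs=3, rng=None):
    """Flip each bit in the approximate (unprotected) part with probability error_rate.

    Illustrative sketch only: the bit width, the independent-bit-flip error
    model, and the number of protected MSBs are assumptions based on the
    abstract, not the paper's exact evaluation setup.
    """
    rng = rng or random.Random()
    # The low (bit_width - protected_msbs) bits are stored approximately;
    # the top protected_msbs bits are assumed to be in reliable memory.
    for bit in range(bit_width - protected_msbs):
        if rng.random() < error_rate:
            word ^= 1 << bit  # flip this unprotected bit
    return word
```

With `error_rate=0.0` (or with all bits protected) the word is returned unchanged, which corresponds to storing the weight entirely in reliable memory; raising the error rate degrades only the approximate low-order bits.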