Switched by Input: Power Efficient Structure for RRAM-based Convolutional Neural Network

Title: Switched by Input: Power Efficient Structure for RRAM-based Convolutional Neural Network
Publication Type: Conference Paper
Year of Publication: 2016
Authors: Xia, Lixue, Tang, Tianqi, Huangfu, Wenqin, Cheng, Ming, Yin, Xiling, Li, Boxun, Wang, Yu, Yang, Huazhong
Conference Name: Proceedings of the 53rd Annual Design Automation Conference
Conference Location: New York, NY, USA
ISBN Number: 978-1-4503-4236-0
Keywords: analogical transfer, analogies, Human Behavior, pubcrawl

Convolutional Neural Networks (CNNs) are a powerful technique widely used in computer vision, but they demand far more computation and memory than traditional solutions. The emerging metal-oxide resistive random-access memory (RRAM) and RRAM crossbar have shown great potential for neuromorphic applications with high energy efficiency. However, the interfaces between analog RRAM crossbars and digital peripheral functions, namely Analog-to-Digital Converters (ADCs) and Digital-to-Analog Converters (DACs), consume most of the area and energy of an RRAM-based CNN design because of the large amount of intermediate data in a CNN. In this paper, we propose an energy efficient structure for RRAM-based CNNs. Based on an analysis of the data distribution, a quantization method is proposed to reduce the intermediate data to 1 bit and eliminate the DACs. An energy efficient structure that uses the input data as selection signals is proposed to reduce the ADC cost of merging the results of multiple crossbars. The experimental results show that the proposed method and structure save 80% of the area and more than 95% of the energy while maintaining the same or comparable CNN classification accuracy on MNIST.
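The 1-bit quantization of intermediate data described in the abstract can be sketched as follows. This is a minimal illustration assuming a simple threshold binarization (the function name `binarize` and the fixed threshold are hypothetical; the paper derives its quantization from the observed data distribution):

```python
def binarize(activations, threshold=0.0):
    """Quantize intermediate activations to 1 bit.

    A 1-bit value can drive a crossbar row directly as a digital
    selection signal, so no DAC is needed on the input side.
    (Hypothetical threshold binarization for illustration only.)
    """
    return [1 if a > threshold else 0 for a in activations]

print(binarize([0.7, -0.2, 0.0, 1.3]))  # -> [1, 0, 0, 1]
```

Because each quantized input is either 0 or 1, it can act as an on/off switch selecting which crossbar outputs contribute to the merged result, which is the intuition behind the "switched by input" structure.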

Citation Key: xia_switched_2016