Abdolzadeh, Vida (2020) Efficient Implementation of Recurrent Neural Network Accelerators. [Tesi di dottorato]


Item Type: PhD thesis
Language: English
Title: Efficient Implementation of Recurrent Neural Network Accelerators
Abdolzadeh, Vida <vida.abdolzadeh@unina.it>
Date: 13 March 2020
Number of Pages: 85
Institution: Università degli Studi di Napoli Federico II
Department: Ingegneria Elettrica e delle Tecnologie dell'Informazione
PhD programme: Information technology and electrical engineering
PhD cycle: 32
PhD programme coordinator: Riccio, Daniele <daniele.riccio@unina.it>
Uncontrolled Keywords: Long Short-Term Memory (LSTM); Recurrent Neural Networks (RNN)
MIUR scientific-disciplinary sector: Area 09 - Ingegneria industriale e dell'informazione > ING-INF/01 - Elettronica
Date Deposited: 05 Apr 2020 15:17
Last Modified: 05 Nov 2021 12:53
URI: http://www.fedoa.unina.it/id/eprint/13225


In this dissertation, we propose an accelerator for the implementation of the Long Short-Term Memory (LSTM) layer in Recurrent Neural Networks. We analyze the effect of quantization on the accuracy of the network and derive an architecture that improves the throughput and latency of the accelerator. The proposed technique requires only one training process, reducing the design time. We present implementation results for the proposed accelerator; its performance compares favorably with other solutions presented in the literature. A further goal of this thesis is to select the circuit that best approximates the activation functions in terms of precision, area, and timing. To verify that the chosen circuit works correctly as an activation function, it is described in C, synthesized with Vivado HLS, and integrated into an LSTM layer. A speech-recognition application has been used to test the system; the results are compared with those computed by the same layer in Matlab to measure the accuracy and to decide whether the precision of the non-linear functions is sufficient.

