Abdolzadeh, Vida (2020) Efficient Implementation of Recurrent Neural Network Accelerators. [PhD thesis]

Full text: Abdolzadeh_Vida_32.pdf — Download (2MB)
Document type: PhD thesis
Language: English
Title: Efficient Implementation of Recurrent Neural Network Accelerators
Authors: Abdolzadeh, Vida (vida.abdolzadeh@unina.it)
Date: 13 March 2020
Number of pages: 85
Institution: Università degli Studi di Napoli Federico II
Department: Ingegneria Elettrica e delle Tecnologie dell'Informazione
PhD programme: Information technology and electrical engineering
PhD cycle: 32
PhD programme coordinator: Riccio, Daniele (daniele.riccio@unina.it)
Tutor: Petra, Nicola (email not specified)
Keywords: Long Short-Term Memory (LSTM) layer; Recurrent Neural Networks (RNN)
MIUR scientific-disciplinary sectors: Area 09 - Ingegneria industriale e dell'informazione > ING-INF/01 - Elettronica
Deposited on: 05 Apr 2020 15:17
Last modified: 05 Nov 2021 12:53
URI: http://www.fedoa.unina.it/id/eprint/13225

Abstract

In this dissertation, we propose an accelerator for the implementation of the Long Short-Term Memory (LSTM) layer in Recurrent Neural Networks. We analyze the effect of quantization on the accuracy of the network and derive an architecture that improves the throughput and latency of the accelerator. The proposed technique requires only one training process, hence reducing the design time. We present the implementation results of the proposed accelerator; its performance compares favorably with other solutions presented in the literature. The goal of this thesis is to choose which circuit is best in terms of precision, area, and timing. In addition, to verify that the chosen circuit works correctly as the activation functions, it is implemented in Vivado HLS using C and then integrated into an LSTM layer. A speech recognition application has been used to test the system. The results are compared with those computed using the same layer in MATLAB to obtain the accuracy and to decide whether the precision of the non-linear functions is sufficient.
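As context for the kind of activation-function circuit the abstract describes, the following C sketch shows piecewise-linear ("hard") approximations of the LSTM sigmoid and tanh in a Q4.12 fixed-point format, of the sort commonly written in C for Vivado HLS synthesis. This is purely illustrative: the bit-widths, the Q4.12 format, and the hard-sigmoid/hard-tanh approximations are assumptions, not the circuit actually selected in the thesis.

```c
/*
 * Illustrative sketch (not the thesis's actual circuit): piecewise-linear
 * approximations of the LSTM activation functions in 16-bit Q4.12 fixed
 * point, written in plain C as one would for Vivado HLS.
 */
#include <stdint.h>
#include <stdio.h>

#define FRAC_BITS 12                    /* assumed Q4.12 fixed-point format */
#define ONE       (1 << FRAC_BITS)      /* 1.0 in Q4.12                     */

/* Hard sigmoid: clamp(0.25*x + 0.5, 0, 1) */
static int16_t hard_sigmoid(int16_t x)
{
    int32_t y = (x >> 2) + (ONE >> 1);  /* 0.25*x + 0.5                     */
    if (y < 0)   y = 0;
    if (y > ONE) y = ONE;
    return (int16_t)y;
}

/* Hard tanh: clamp(x, -1, 1) */
static int16_t hard_tanh(int16_t x)
{
    if (x < -ONE) return -ONE;
    if (x >  ONE) return  ONE;
    return x;
}

int main(void)
{
    /* Evaluate both approximations on a few sample inputs. */
    const double samples[] = { -3.0, -1.0, 0.0, 1.0, 3.0 };
    for (int i = 0; i < 5; ++i) {
        int16_t x = (int16_t)(samples[i] * ONE);
        printf("x=%5.2f  sigmoid~%.4f  tanh~%.4f\n",
               samples[i],
               hard_sigmoid(x) / (double)ONE,
               hard_tanh(x)    / (double)ONE);
    }
    return 0;
}
```

In a fixed-point flow of this kind, the outputs of such approximations would be compared against a double-precision MATLAB reference of the same LSTM layer to judge whether the chosen precision of the non-linear functions is sufficient, as the abstract describes.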
