Giampaolo, Fabio (2023) Learning paradigms for neural networks: from backpropagation to locally backpropagated Forward-Forward. [Doctoral thesis]

Document type: Doctoral thesis
Language: English
Title: Learning paradigms for neural networks: from backpropagation to locally backpropagated Forward-Forward
Authors: Giampaolo, Fabio (fabio.giampaolo@unina.it)
Date: 12 December 2023
Number of pages: 314
Institution: Università degli Studi di Napoli Federico II
Department: Matematica e Applicazioni "Renato Caccioppoli"
PhD programme: Matematica e Applicazioni
PhD cycle: 36
PhD programme coordinator: Moscariello, Gioconda (gioconda.moscariello@unina.it)
Tutors: Mercaldo, Anna; Piccialli, Francesco; Cuomo, Salvatore
Keywords: Deep Learning, Neural Networks, Learning Paradigms, Time Series, Forward-Forward algorithm
MIUR scientific-disciplinary sectors: Area 01 - Mathematical and computer sciences > INF/01 - Computer science
Deposited on: 19 Dec 2023 18:12
Last modified: 12 Mar 2026 11:13
URI: http://www.fedoa.unina.it/id/eprint/15645

Abstract

This thesis examines neural networks, a cornerstone of modern artificial intelligence, exploring their challenges and recent developments. It provides a comprehensive analysis of their fundamental components and learning processes, arguing that a deep understanding of these elements is essential to improving their practical utility. The work covers the intricacies of neural network learning, dissecting these systems to understand the dynamics of their training and operation; this insight is pivotal for adapting neural networks to challenging tasks and for driving advances in various fields. A major focus is the development of a robust forecasting framework for time series prediction, demonstrating the practical applicability of neural learning. The thesis also explores learning paradigms beyond traditional methods, highlighting strategies that modify the foundational training mechanism. In particular, it investigates the Forward-Forward algorithm as an alternative to backpropagation, showing its relevance and promise, especially in addressing federated learning challenges. In conclusion, the thesis underscores the necessity of understanding and innovating within the neural network domain: as the technology evolves, this knowledge becomes critical in redefining what is achievable, with potentially significant impact across scientific and technological fields.
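The Forward-Forward algorithm named in the title (introduced by Hinton) trains each layer with a purely local objective instead of backpropagating errors through the whole network: a layer's "goodness" (the sum of squared activations) is pushed above a threshold for positive data and below it for negative data. The following NumPy sketch illustrates that idea for a single layer; it is not the thesis's implementation, and the layer sizes, threshold, learning rate, and toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class FFLayer:
    """One Forward-Forward layer: updated with a local gradient only,
    no error signal is propagated from later layers."""

    def __init__(self, n_in, n_out, theta=2.0, lr=0.03):
        self.W = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_out, n_in))
        self.b = np.zeros(n_out)
        self.theta = theta  # goodness threshold
        self.lr = lr

    def forward(self, x):
        return np.maximum(self.W @ x + self.b, 0.0)

    def train_step(self, x, positive):
        z = self.W @ x + self.b
        h = np.maximum(z, 0.0)
        g = np.sum(h ** 2)                     # "goodness" of the activity
        y = 1.0 if positive else 0.0
        dg = sigmoid(g - self.theta) - y       # d(logistic loss)/d(goodness)
        dz = dg * 2.0 * h * (z > 0)            # chain rule, local to this layer
        self.W -= self.lr * np.outer(dz, x)
        self.b -= self.lr * dz
        return h

# Toy demo (an assumption for illustration): positive samples cluster
# near the +1 vector, negative samples near the -1 vector.
layer = FFLayer(8, 16)
for _ in range(200):
    layer.train_step(rng.normal(1.0, 0.3, 8), positive=True)
    layer.train_step(rng.normal(-1.0, 0.3, 8), positive=False)

g_pos = np.sum(layer.forward(np.ones(8)) ** 2)
g_neg = np.sum(layer.forward(-np.ones(8)) ** 2)
print(f"goodness(positive)={g_pos:.2f}  goodness(negative)={g_neg:.2f}")
```

Because each layer optimizes only its own goodness, no backward pass crosses layer boundaries; this locality is what makes the approach attractive in settings such as federated learning, where full backpropagation is costly or impractical.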

