Emerging memory devices & systems for biologically plausible neuromorphic computing
Milozzi, Alessandro
2023/2024
Abstract
Nowadays, the volume of data produced in our society is growing exponentially, as are its variety and complexity. Automotive, smart cities, and Industry 4.0 are just a few examples of trends that are accelerating the demand for greater computational and storage capacity. Having emerged from its "second winter", Artificial Intelligence (AI) today offers a tool to address these new challenges. However, in its practical implementations, AI tends to overload computing infrastructures and systems, leading to an even more dramatic increase in computational demand. The underlying issue is rooted in the architecture of modern computers, proposed by von Neumann in 1945. The von Neumann architecture, which separates the computing unit from the memory, has allowed us to build flexible, general-purpose machines over the past 70 years. However, this architecture now shows significant inefficiencies, primarily due to the need to move data between the two units. The problem is exacerbated by the asymmetry in the rate of technological improvement between memory and processing units, and by the approaching physical limits of CMOS scaling. In this context, Neuromorphic Computing emerges: originating in the seminal work of Carver Mead in the late 1980s, this paradigm draws inspiration from biological neural structures to emulate the functioning and efficiency of the brain at the hardware level. In the brain there is no separation between computing and memory: the two fundamental units, neurons and synapses, work in synergy, and the concepts of memory and computing merge. Beyond this shift towards in-memory computing, biological networks are characterized by plasticity: the ability to continuously adapt to stimuli.
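As an illustrative sketch (an assumption for this page, not material from the thesis itself), the merging of memory and computing can be pictured with a resistive crossbar array: each device stores a conductance, and applying read voltages to the rows produces column currents that directly realize a matrix-vector product via Ohm's and Kirchhoff's laws, so the "computation" happens inside the memory array rather than after a data transfer.

```python
import numpy as np

# Toy model of an in-memory crossbar multiply (illustrative only; the
# conductance values and array size are arbitrary assumptions).
# The device at row i, column j stores a conductance G[i, j]; applying
# voltages V to the rows yields column currents I = G^T @ V in a single
# physical step (Ohm's law per device, Kirchhoff's current law per column).
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # device conductances (siemens)
V = np.array([0.1, 0.2, 0.0, 0.3])         # read voltages (volts)

I = G.T @ V  # column currents: the matrix-vector product emerges in-memory
print(I)
```

In a von Neumann machine the same product would require fetching every element of G from memory; here the weights never move, which is the efficiency argument behind in-memory computing.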
It is neuronal plasticity that underlies our ability to remember, learn, and adapt, through a myriad of complex plasticity mechanisms that together yield an energy efficiency and computational capability unimaginable in today's artificial counterparts. However, new paradigms usually require new technologies, and this is where resistive switching devices come into play. These emerging memory devices are not only viable candidates for sustaining future technological scaling but can potentially implement mechanisms that emulate the plasticity of the human brain. Expanding and investigating this set of plasticity and learning mechanisms is the open challenge of neuromorphic computing, on the path towards scalable, efficient, and biologically plausible systems. This doctoral thesis focuses on expanding the plasticity mechanisms achievable through the dynamic properties of memristive devices and on their use in biologically plausible neuromorphic systems, the latter being crucial for bridging the gap between computational neuroscience models and hardware for AI. The approach of this work is to move computation inside the device by exploiting the intrinsic dynamics that arise from its physical properties. A framework is presented that defines the boundary between static and dynamic memory in neuromorphic systems and shows how this boundary affects the emulation of biological mechanisms. This maps onto three structural areas of the work, analogous to properties of biological neural networks: external factors that modify plasticity, internal dynamic factors that act on plasticity, and stochasticity.
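To make the static-versus-dynamic distinction concrete, here is a generic toy model (an assumption of this summary, not the specific devices characterized in the thesis): a volatile memristive synapse whose conductance is potentiated by voltage pulses and then relaxes exponentially towards a resting state. The intrinsic decay is what makes the memory "dynamic" and lets the device itself emulate short-term plasticity.

```python
import math

# Toy volatile memristive synapse (illustrative assumption; parameter
# values are arbitrary, not measured device data).
# A pulse potentiates the conductance; between pulses it relaxes
# exponentially towards G_MIN with time constant TAU, so the device
# "forgets" on its own -- the hallmark of dynamic memory.
G_MIN, G_MAX = 1e-6, 1e-4   # conductance bounds (siemens)
TAU = 0.05                  # retention time constant (seconds)
DG = 2e-5                   # conductance increment per pulse (siemens)

def step(g, dt, pulse):
    """Advance the device state by dt seconds, optionally applying a pulse."""
    g = G_MIN + (g - G_MIN) * math.exp(-dt / TAU)  # volatile relaxation
    if pulse:
        g = min(G_MAX, g + DG)                     # bounded potentiation
    return g

g = G_MIN
for t_ms in range(100):                      # 100 ms simulation, 1 ms steps
    g = step(g, dt=1e-3, pulse=(t_ms < 20))  # a 20 ms burst of pulses
print(g)  # partially decayed back towards G_MIN after the burst
```

A non-volatile (static) device would keep its programmed conductance indefinitely; in this sketch the state after the burst depends on how much time has elapsed, which is exactly the kind of intrinsic dynamics the thesis exploits to move plasticity computation inside the device.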
File | Access | Size | Format
---|---|---|---
PhD Thesis Milozzi Alessandro.pdf | authorized users only, from 09/07/2025 | 72.05 MB | Adobe PDF
Documents in POLITesi are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/10589/224412