TFHE-based floating-point neural network training
Nicoletti, Emanuele
2024/2025
Abstract
In recent years, the use of Machine Learning (ML) models has expanded not only within the research community but also in our daily lives. Despite these advances, concerns regarding privacy are growing, particularly in contexts where data contains sensitive information, such as the medical and financial fields. Privacy-Preserving ML addresses this problem by guaranteeing privacy throughout the entire process. Among the different approaches, Homomorphic Encryption (HE) allows mathematical operations to be carried out directly on encrypted data, without the need to decrypt it. In particular, Torus Fully Homomorphic Encryption (TFHE) supports an unlimited number of homomorphic operations, making it suitable for deep and complex tasks. Despite this potential, combining TFHE with the training of Deep Neural Networks (DNNs) remains only partially explored due to its computational complexity. This thesis presents, for the first time, a solution for training neural networks, both Multi-Layer Perceptrons (MLPs) and Convolutional Neural Networks (CNNs), under the TFHE scheme using floating-point arithmetic. The proposed approach relies on an integer reinterpretation of floating-point numbers, combined with approximate multiplication and division algorithms that significantly reduce the computational overhead. The entire framework is implemented in Rust, resulting in a system where standard deep learning techniques can be seamlessly applied in the encrypted domain by switching to encrypted parameters and operations. This work demonstrates the feasibility and flexibility of floating-point arithmetic for encrypted training under TFHE, paving the way for new privacy-preserving deep learning solutions.
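The abstract does not spell out what "integer reinterpretation of floating-point numbers with approximate multiplication and division" looks like. The sketch below shows the classic log-domain bit-manipulation variant of that idea in plain (unencrypted) Rust, purely to illustrate the kind of arithmetic involved: reinterpreting a positive IEEE-754 float's bits as an integer yields roughly a scaled-and-shifted log2 of its value, so integer addition of bit patterns approximates multiplication and integer subtraction approximates division (Mitchell's method). The constants and error behavior shown are properties of this well-known approximation, not values taken from the thesis.

```rust
/// Bit pattern of 1.0_f32: the exponent bias (127) shifted into the
/// exponent field. Subtracting/adding it re-centers the log-domain sums.
const ONE_BITS: i64 = 0x3F80_0000;

/// Approximate product of two positive, normal f32 values via integer
/// addition of their bit patterns (Mitchell-style log-domain multiply).
fn approx_mul(a: f32, b: f32) -> f32 {
    let (ia, ib) = (a.to_bits() as i64, b.to_bits() as i64);
    f32::from_bits((ia + ib - ONE_BITS) as u32)
}

/// Approximate quotient via integer subtraction of the bit patterns.
fn approx_div(a: f32, b: f32) -> f32 {
    let (ia, ib) = (a.to_bits() as i64, b.to_bits() as i64);
    f32::from_bits((ia - ib + ONE_BITS) as u32)
}

fn main() {
    // Exact when a mantissa field is zero, otherwise off by a bounded
    // relative error on the order of 10%, which gradient-based training
    // can often absorb.
    println!("3.0 * 3.0 ≈ {}", approx_mul(3.0, 3.0)); // prints 8 (exact: 9)
    println!("1.0 / 3.0 ≈ {}", approx_div(1.0, 3.0)); // prints 0.375 (exact: 0.333...)
}
```

Similarly, the claim that standard deep learning code can be reused in the encrypted domain "by switching to encrypted parameters and operations" suggests an arithmetic abstraction layer. The following is a minimal sketch of what such a layer could look like in Rust; the `Scalar` trait, its methods, and the `dense` helper are hypothetical names for illustration, not identifiers from the thesis or from any TFHE library.

```rust
use std::ops::{Add, Mul};

/// Hypothetical abstraction: any type supporting the arithmetic a layer
/// needs can back the network, be it a plain f32 or a TFHE ciphertext.
trait Scalar: Add<Output = Self> + Mul<Output = Self> + Copy {
    fn zero() -> Self;
}

impl Scalar for f32 {
    fn zero() -> Self { 0.0 }
}

/// A dense layer written once, generic over the scalar type. With a
/// ciphertext type implementing `Scalar` (via homomorphic add/mul),
/// the same code would run in the encrypted domain.
fn dense<S: Scalar>(weights: &[Vec<S>], bias: &[S], input: &[S]) -> Vec<S> {
    weights
        .iter()
        .zip(bias)
        .map(|(row, &b)| {
            row.iter()
                .zip(input)
                .fold(b, |acc, (&w, &x)| acc + w * x)
        })
        .collect()
}

fn main() {
    // Plaintext use; an encrypted run would differ only in the scalar type.
    let w = vec![vec![0.5_f32, -1.0], vec![2.0, 0.25]];
    let b = vec![0.1_f32, -0.2];
    println!("{:?}", dense(&w, &b, &[1.0, 2.0])); // [-1.4, 2.3]
}
```

A real ciphertext type would likely be `Clone` rather than `Copy` and its operations vastly more costly, but the shape of the code, one network definition shared by plaintext and encrypted execution, is the point of the sketch.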
| File | Description | Size | Format | Access |
|---|---|---|---|---|
| 2025_12_Nicoletti_Tesi.pdf | Thesis text | 2.87 MB | Adobe PDF | Authorized users only, from 10/11/2026 |
| 2025_12_Nicoletti_Executive_Summary.pdf | Executive summary | 1.09 MB | Adobe PDF | Authorized users only, from 10/11/2026 |
Documents in POLITesi are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/10589/245897