Quantum circuits for information set decoding: quantum cryptanalysis of code-based cryptosystems
PERRIELLO, SIMONE
2023/2024
Abstract
The emergence of quantum computing represents a profound challenge to the security of widely adopted public-key cryptographic systems, which rely on the computational complexity of tasks such as factoring large integers or solving discrete logarithms. To confront this challenge, organizations such as the U.S. National Institute of Standards and Technology (NIST), the Chinese Association for Cryptologic Research (CACR), and the European Telecommunications Standards Institute (ETSI) are actively engaged in the formulation of cryptographic primitives capable of withstanding both classical and quantum attacks. These novel cryptographic systems, collectively termed post-quantum cryptosystems, are at the forefront of standardization efforts. Among the leading contenders in this standardization endeavor, linear code-based cryptosystems, which derive their strength from the computational complexity of the Syndrome Decoding Problem (SDP), have gained significant recognition. The SDP is the task of retrieving an error vector given the parity-check matrix of a randomly generated linear block error-correction code and the syndrome of the error computed through that matrix. Classically, the most effective technique for solving the SDP is the Information Set Decoding (ISD) method, which exhibits exponential complexity in the parameters of the cryptosystems. Current quantum approaches to the SDP, on the other hand, do not surpass the quadratic speedup offered by adapting Grover’s algorithm to the ISD technique, and provide only asymptotic estimates of their computational cost, potentially hiding non-trivial constant and polynomial factors. The central focus of this study is the precise evaluation of the computational complexity of quantum solvers for the SDP, tailored to cryptography-grade code parameters.
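As a toy-scale illustration of the problem statement above (all parameters and error positions below are assumptions chosen for readability, not values from the thesis), the following Python sketch builds a random SDP instance and solves it by exhaustive search over all low-weight candidates — the exponential-time baseline that both ISD and its quantum variants improve upon:

```python
from itertools import combinations
import random

random.seed(0)
n, r, w = 12, 6, 2                        # toy code length, redundancy, error weight
# Parity-check matrix H: each of the r rows is stored as an n-bit integer mask.
H = [random.getrandbits(n) for _ in range(r)]

def syndrome(e_mask):
    # s_i = <H_i, e> mod 2, computed as the parity of the AND of the two bitmasks.
    return tuple(bin(row & e_mask).count("1") & 1 for row in H)

secret = (1 << 3) | (1 << 9)              # secret weight-2 error vector
s = syndrome(secret)                      # public syndrome s = H e^T over GF(2)

# Exhaustive SDP solver: tries every weight-w vector, so it runs in time
# binomial(n, w) -- the exponential hardness code-based cryptosystems rely on.
found = next(
    m for m in (sum(1 << i for i in supp) for supp in combinations(range(n), w))
    if syndrome(m) == s
)
```

Note that the solver returns *a* weight-w vector matching the syndrome; at cryptographic parameter sizes the solution is unique with overwhelming probability.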
Our approach introduces quantum circuits designed for universal gate-based quantum computing models, built upon the foundations laid by classical ISD techniques. Our scrutiny extends both to complete quantum solutions to the SDP and to hybrid methodologies that effectively partition the computational load between classical and quantum computing resources. In our investigation, the solver derived from Prange’s variant of the ISD technique stands out, as it displays a substantial enhancement in computational efficiency. Notably, it reduces both the depth of the quantum circuits and the depth-times-width metric by factors ranging from 2¹² to 2²⁴ for concrete cryptography-grade parameters. Surprisingly, our findings reveal that the gains achieved through the approach inspired by Lee and Brickell’s ideas, which materializes as a hybrid classical-quantum algorithm, are more modest. These enhancements range from 2¹⁰ to 2²⁰ for the same cryptographic parameters, a result contrary to expectations based on the classical counterparts, where Lee and Brickell’s approach prevails over Prange’s. However, the hybrid approach substantially reduces the size and depth of the quantum circuits, rendering the estimates more realistic and facilitating parallel execution on separate quantum computing platforms. Our quantitative analysis of computational costs brings forth a significant conclusion: all the code-based cryptosystems under scrutiny by organizations such as NIST, namely BIKE, HQC, and Classic McEliece, surpass the predefined threshold for computational hardness. Put simply, breaking them proves to be computationally more demanding than breaking a corresponding symmetric cipher with an appropriately sized key. However, a critical vulnerability in the Classic McEliece cryptosystem is also unveiled.
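For readers unfamiliar with Prange’s technique, the following classical Python sketch (toy parameters of our own choosing, not the cryptography-grade ones analyzed in the thesis) shows the core loop that the quantum circuits accelerate: guess a random information set, run Gauss-Jordan elimination over GF(2) on the complementary columns, and succeed when the error happens to be supported entirely outside the information set:

```python
import random

random.seed(1)
n, r, w = 14, 7, 2                        # toy code length, redundancy, error weight
H = [random.getrandbits(n) for _ in range(r)]   # rows of H as n-bit integer masks

def syndrome(e_mask):
    return [bin(row & e_mask).count("1") & 1 for row in H]

secret = (1 << 2) | (1 << 11)             # toy weight-2 error (assumed positions)
s = syndrome(secret)

def prange_isd(max_iters=10_000):
    """One Prange loop per iteration: permute the columns, try to reduce the
    first r columns to the identity, and read off the error if the reduced
    syndrome already has weight w. Expected iterations grow exponentially."""
    for _ in range(max_iters):
        perm = list(range(n))
        random.shuffle(perm)
        # Column-permuted H with the syndrome bit appended as column n.
        rows = [sum(((h >> perm[j]) & 1) << j for j in range(n)) | (b << n)
                for h, b in zip(H, s)]
        for col in range(r):              # Gauss-Jordan on the first r columns
            piv = next((i for i in range(col, r) if (rows[i] >> col) & 1), None)
            if piv is None:
                break                     # singular choice of columns: retry
            rows[col], rows[piv] = rows[piv], rows[col]
            for i in range(r):
                if i != col and (rows[i] >> col) & 1:
                    rows[i] ^= rows[col]
        else:
            s2 = [(rows[i] >> n) & 1 for i in range(r)]
            if sum(s2) == w:              # lucky guess: error sits on pivot columns
                return sum(1 << perm[i] for i in range(r) if s2[i])
    return None

e_found = prange_isd()
```

The quantum variants replace the outer guessing loop with a Grover-style amplitude-amplification search, which is where the quadratic speedup comes from.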
Parallelizing this quantum solver across multiple quantum processing units erodes its security, pushing it below the targeted security threshold by a factor of 16. An ancillary contribution of this research is the development of a set of quantum circuits for common algebraic and algorithmic problems, including Gauss-Jordan elimination over finite fields, bit-string sorting, and Hamming weight computation, which may be of independent interest in the field of quantum computing.
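As a flavor of one such building block, a Hamming-weight computation can be organized as a balanced tree of adders, the logarithmic-depth structure that a reversible or quantum adder-tree circuit would mirror. The sketch below is a classical illustration of that structure, not the circuit from the thesis:

```python
def hamming_weight_tree(bits):
    """Sum a list of bits with a balanced tree of pairwise additions.

    Each while-loop pass corresponds to one layer of adders, so the depth is
    O(log n) -- the same layering a quantum adder-tree circuit would use.
    """
    layer = list(bits)
    while len(layer) > 1:
        nxt = [layer[i] + layer[i + 1] for i in range(0, len(layer) - 1, 2)]
        if len(layer) % 2:                # odd element carries to the next layer
            nxt.append(layer[-1])
        layer = nxt
    return layer[0] if layer else 0
```

For example, `hamming_weight_tree([1, 0, 1, 1, 0, 1])` sums the six bits in three layers instead of five sequential additions.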
Documents in POLITesi are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/10589/220713