Lattice theory evolved from 19th-century geometry of numbers into one of the most active areas of modern cryptography. The computational pivot began with the 1982 LLL algorithm, and reached its cryptographic apex through Ajtai's 1996 worst-case hardness result and Regev's 2005 LWE problem — both of which now underpin the world's first NIST-standardized post-quantum cryptographic algorithms.

Part I — Foundational Mathematical Works

Minkowski, Geometry of Numbers (1896)

Author: Hermann Minkowski
Link: Wikipedia overview

Hermann Minkowski initiated the formal study of lattice points with his 1896 monograph Geometrie der Zahlen. His two celebrated convex body theorems establish that any centrally symmetric convex body of sufficient volume must contain a nonzero lattice point, and provide tight bounds on the product of successive minima of a lattice. This work is the mathematical ancestor of every lattice reduction and shortest vector algorithm developed over the following century.

See also: Geometry of Numbers
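The two results named above can be stated precisely. The following is a standard formulation for a full-rank lattice L in R^n (notation mine, not from the source):

```latex
% Minkowski's first (convex body) theorem:
\[
  K \subset \mathbb{R}^n \text{ convex, symmetric about } 0,\quad
  \operatorname{vol}(K) > 2^n \det(L)
  \;\Longrightarrow\;
  K \cap L \neq \{0\}.
\]
% Applying it to a suitably scaled ball gives the classical bound
% on the shortest nonzero lattice vector:
\[
  \lambda_1(L) \le \sqrt{n}\,(\det L)^{1/n}.
\]
```

Every lattice-reduction algorithm below can be read as an attempt to find, constructively, vectors whose existence this theorem guarantees.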

Birkhoff, Lattice Theory (1940; 3rd ed. 1967)

Author: Garrett Birkhoff
Link: Wolfram MathWorld

Birkhoff's series of papers in 1933–1937 and the textbook Lattice Theory unified and extended the algebraic lattice structure (partially ordered sets under meet/join operations). While distinct from the geometric/integer lattices used in cryptography, this foundational framework gave rigorous language to lattice structures across modern algebra, projective geometry, and logic.
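A concrete instance of Birkhoff's abstract definition may help: the divisors of an integer, ordered by divisibility, form an algebraic lattice with gcd as meet and lcm as join. The sketch below (my illustration, not from Birkhoff) checks the absorption laws that characterize lattices:

```python
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def is_lattice(elems, meet, join):
    """Check closure and the absorption laws a ∧ (a ∨ b) = a, a ∨ (a ∧ b) = a."""
    closed = all(meet(a, b) in elems and join(a, b) in elems
                 for a in elems for b in elems)
    absorb = all(meet(a, join(a, b)) == a and join(a, meet(a, b)) == a
                 for a in elems for b in elems)
    return closed and absorb

# The divisors of 12 under divisibility: meet = gcd, join = lcm.
divisors = [1, 2, 3, 4, 6, 12]
```

Note that this order-theoretic notion is unrelated to the geometric lattices (discrete subgroups of R^n) used everywhere else in this survey; only the name is shared.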

Grätzer, General Lattice Theory (1978; 2nd ed. 1998); Lattice Theory: Foundation (2011)

Author: George Grätzer
Link: Google Books

Since its first edition, General Lattice Theory has been the standard graduate reference for algebraic lattice theory, with nearly 900 exercises and 193 research problems. The 2011 Foundation volume focuses on distributivity, congruences, modularity, varieties, and free products, covering the mathematical core needed to understand lattice structures across computer science and algebra.

Part II — Lattice Reduction: Key Algorithms

Hermite Reduction (1850)

Author: Charles Hermite
Link: Hermite's Constant and Lattice Algorithms (Nguyen)

Hermite published the first reduction algorithm for arbitrary dimension in 1850, providing an upper bound on the length of the shortest vector in a lattice. The resulting Hermite constant — bounding the ratio of the shortest vector length to the lattice determinant — remained a central object in reduction theory for over a century. Though Hermite proved existence without producing an efficient algorithm, his constant directly motivates the quality metrics used in LLL and BKZ today.

See also: Hermite's Constant | Lattice Reduction
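The constant mentioned above has a standard definition, which can be stated as follows (for a full-rank lattice L in R^n with shortest nonzero vector length λ₁(L)):

```latex
% Hermite's constant in dimension n:
\[
  \gamma_n \;=\; \sup_{L}\; \frac{\lambda_1(L)^2}{(\det L)^{2/n}},
\]
% Hermite's 1850 inequality bounds it by
\[
  \gamma_n \;\le\; \left(\tfrac{4}{3}\right)^{(n-1)/2}.
\]
```

The approximation factors quoted for LLL and BKZ below are, in essence, statements about how close an efficiently computable basis comes to this bound.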

LLL Algorithm (1982)

Authors: Arjen K. Lenstra, Hendrik W. Lenstra Jr., László Lovász
Paper: Factoring polynomials with rational coefficients (1982)
Links: Wikipedia | Expository paper (arXiv 2024)

The most important practical lattice reduction algorithm ever devised — a polynomial-time method that outputs a "reduced" basis of relatively short, nearly orthogonal vectors. Its primary motivation was polynomial factorization, but it immediately found applications in breaking knapsack-based cryptosystems, solving integer programs in fixed dimension, and Diophantine approximation. The algorithm achieves an approximation factor of 2^((n-1)/2) for the shortest vector in dimension n. It remains the workhorse of lattice cryptanalysis.

See also: LLL Algorithm
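For intuition, here is a minimal textbook LLL in exact rational arithmetic (a pedagogical sketch of the published algorithm, not an optimized implementation — production work uses floating-point variants such as those in fplll):

```python
from fractions import Fraction

def dot(u, v):
    return sum(Fraction(a) * Fraction(b) for a, b in zip(u, v))

def gram_schmidt(basis):
    """Gram-Schmidt orthogonalization and the mu coefficients, exactly."""
    ortho, mu = [], [[Fraction(0)] * len(basis) for _ in basis]
    for i, b in enumerate(basis):
        v = [Fraction(x) for x in b]
        for j in range(i):
            mu[i][j] = dot(b, ortho[j]) / dot(ortho[j], ortho[j])
            v = [a - mu[i][j] * w for a, w in zip(v, ortho[j])]
        ortho.append(v)
    return ortho, mu

def lll(basis, delta=Fraction(3, 4)):
    """Textbook LLL: returns a delta-reduced basis of the same lattice."""
    basis = [list(b) for b in basis]
    n, k = len(basis), 1
    while k < n:
        ortho, mu = gram_schmidt(basis)
        # size-reduce b_k against b_{k-1}, ..., b_0
        for j in range(k - 1, -1, -1):
            q = round(mu[k][j])
            if q:
                basis[k] = [a - q * b for a, b in zip(basis[k], basis[j])]
        ortho, mu = gram_schmidt(basis)
        # Lovász condition: advance if satisfied, else swap and step back
        if dot(ortho[k], ortho[k]) >= (delta - mu[k][k - 1] ** 2) * dot(ortho[k - 1], ortho[k - 1]):
            k += 1
        else:
            basis[k - 1], basis[k] = basis[k], basis[k - 1]
            k = max(k - 1, 1)
    return basis
```

Recomputing the full Gram-Schmidt decomposition each iteration keeps the sketch short at the cost of efficiency; the original paper updates it incrementally.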

Babai's Nearest Plane / Closest Vertex Algorithm (1986)

Author: László Babai
Link: Techniques in Lattice Basis Reduction (arXiv)

Building on LLL, Babai showed in 1986 that an LLL-reduced basis can be used to efficiently approximate the Closest Vector Problem (CVP) within an exponential factor of 2(2/√3)^n. The Nearest Plane Algorithm and the Closest Vertex Algorithm became the standard tools for approximate CVP solving and are used in cryptanalytic attacks on lattice-based schemes.
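The nearest-plane procedure itself is short: project the target onto the last Gram-Schmidt direction, round to the nearest hyperplane, subtract, and recurse. A self-contained sketch in exact arithmetic (my illustration of the published method, quality depends on how reduced the input basis is):

```python
from fractions import Fraction

def dot(u, v):
    return sum(Fraction(a) * Fraction(b) for a, b in zip(u, v))

def gram_schmidt(basis):
    ortho = []
    for b in basis:
        v = [Fraction(x) for x in b]
        for o in ortho:
            c = dot(b, o) / dot(o, o)
            v = [a - c * w for a, w in zip(v, o)]
        ortho.append(v)
    return ortho

def nearest_plane(basis, target):
    """Babai's nearest-plane algorithm: a lattice point close to target."""
    ortho = gram_schmidt(basis)
    t = [Fraction(x) for x in target]
    result = [Fraction(0)] * len(target)
    # walk the basis from last vector to first, peeling off the
    # nearest translated hyperplane at each step
    for b, o in zip(reversed(basis), reversed(ortho)):
        c = round(dot(t, o) / dot(o, o))
        t = [a - c * x for a, x in zip(t, b)]
        result = [r + c * x for r, x in zip(result, b)]
    return [int(x) for x in result]
```

With the standard basis of Z^n this degenerates to coordinate-wise rounding; the interesting behavior appears with skewed bases, which is exactly the setting of the GGH attack discussed in Part IV.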

BKZ Algorithm (Block Korkin–Zolotarev, 1987)

Author: Claus P. Schnorr
Paper: A hierarchy of polynomial time lattice basis reduction algorithms (1987)
Link: Block KZ Bases and Successive Minima (Schnorr)

Schnorr introduced the BKZ algorithm as a refinement of LLL, using a block-size parameter β to interpolate between LLL (block size 2) and the optimal but exponential Korkin–Zolotarev reduction. Larger block sizes yield shorter output vectors at the cost of runtime exponential in β. BKZ and its improved variants (BKZ 2.0, progressive BKZ) are the state-of-the-art tools for attacking lattice-based cryptosystems and set practical security parameters for the NIST standards.

See also: BKZ Algorithm

Korkin–Zolotarev (HKZ) Basis Reduction (1877 / Algorithm 1983)

Authors: Aleksandr Korkin, Yegor Ivanovich Zolotarev (definition, 1877); Ravi Kannan (algorithm, 1983)
Link: Wikipedia

The KZ-reduced basis achieves an orthogonality defect of at most n^O(n), far better than LLL's 2^O(n²), but computing it takes time exponential in the dimension. It is theoretically important as the "gold standard" of reduction and is useful when solving multiple CVP instances in the same lattice, where the amortized cost is lower. BKZ is a practical family of algorithms parameterized to approximate KZ quality.

See also: Lattice Reduction

Part III — Lattice Hardness & Cryptographic Foundations

Ajtai's Worst-Case to Average-Case Reduction (1996)

Author: Miklós Ajtai
Paper: Generating Hard Instances of Lattice Problems (STOC 1996)
Link: EMS Press chapter

Arguably the most important theoretical result in lattice-based cryptography. Ajtai showed that solving certain average-case lattice problems (the Short Integer Solution / SIS problem) is at least as hard as solving worst-case instances of lattice problems such as approximating the Shortest Vector Problem (SVP). This established the first cryptographic hardness guarantee resting on worst-case rather than average-case assumptions: any algorithm that breaks random SIS instances yields an algorithm that approximates SVP on every lattice.
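Ajtai's result yields a collision-resistant compression function almost for free: f_A(x) = Ax mod q for a random public matrix A and a short (e.g. binary) input x. A collision f_A(x) = f_A(x') gives a short nonzero z = x − x' with Az ≡ 0 mod q, i.e. an SIS solution. A toy sketch with illustrative (cryptographically meaningless) parameters:

```python
import random

def sis_matrix(n, m, q, seed=0):
    """Random public matrix A in Z_q^{n x m} (toy parameters, seeded for reproducibility)."""
    rng = random.Random(seed)
    return [[rng.randrange(q) for _ in range(m)] for _ in range(n)]

def ajtai_hash(A, x, q):
    """Ajtai-style compression f_A(x) = A x mod q for a short input vector x."""
    return tuple(sum(row[j] * x[j] for j in range(len(x))) % q for row in A)
```

The function is linear mod q, which is precisely why collisions translate into short kernel vectors.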

Regev's LWE (2005/2009)

Author: Oded Regev
Paper: On Lattices, Learning with Errors, Random Linear Codes, and Cryptography — STOC 2005; JACM 2009
Links: arXiv | PDF

Regev introduced the Learning With Errors (LWE) problem — the task of recovering a secret vector from noisy linear equations over Z_q. He proved a quantum reduction from worst-case GapSVP and SIVP to LWE, meaning LWE is at least as hard as these lattice problems in the quantum setting. This paper (cited over 6,000 times) also presented an efficient LWE-based public-key encryption scheme. Regev received the 2018 Gödel Prize for this work. LWE is now the foundation of ML-KEM (Kyber) and ML-DSA (Dilithium), the NIST post-quantum standards.

See also: LWE & Ring-LWE
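An LWE instance is easy to generate, which is part of its appeal: sample uniform vectors a_i and publish (a_i, ⟨a_i, s⟩ + e_i mod q) for a secret s and small errors e_i. A toy generator with illustrative parameters (real schemes use discrete Gaussian noise and far larger dimensions):

```python
import random

def lwe_samples(n, m, q, secret, noise_bound, rng):
    """m samples (a_i, <a_i, s> + e_i mod q) with uniform a_i and |e_i| <= noise_bound."""
    samples = []
    for _ in range(m):
        a = [rng.randrange(q) for _ in range(n)]
        e = rng.randint(-noise_bound, noise_bound)
        b = (sum(ai * si for ai, si in zip(a, secret)) + e) % q
        samples.append((a, b))
    return samples
```

Without the errors e_i the secret falls to Gaussian elimination in polynomial time; the noise is the entire source of hardness.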

Micciancio & Goldwasser, Complexity of Lattice Problems (2002)

Authors: Daniele Micciancio, Shafi Goldwasser
Links: UCSD page | Google Books

This Springer textbook provides the definitive self-contained treatment of the computational complexity of lattice problems for cryptography. It covers the strongest inapproximability results for SVP, relations between computational lattice problems (SVP, CVP, SIVP), and proofs that cryptographic functions can be built from worst-case hardness assumptions. Essential reading for anyone entering lattice cryptography research.

Micciancio & Regev, Lattice-based Cryptography (2009)

Authors: Daniele Micciancio, Oded Regev
Link: PDF

This influential survey chapter covers the main constructions in lattice-based cryptography up to 2008, including hash functions, one-way functions, and public-key encryption from SIS and LWE. It explains the worst-case hardness guarantees, implementation efficiency, and simplicity advantages over classical cryptographic schemes, and argues convincingly for lattice-based cryptography as the best post-quantum candidate.

Ring-LWE: Lyubashevsky, Peikert, Regev (2010)

Authors: Vadim Lyubashevsky, Chris Peikert, Oded Regev
Paper: On Ideal Lattices and Learning with Errors over Rings — Eurocrypt 2010; JACM 2013
Link: Semantic Scholar

Ring-LWE introduces an algebraic variant of LWE over polynomial rings (ideal lattices), retaining the worst-case hardness guarantees of LWE while dramatically improving efficiency. The reduction goes from worst-case approx-SVP on ideal lattices to Ring-LWE via a quantum algorithm. Ring-LWE and its module generalization, Module-LWE, are the mathematical foundation for CRYSTALS-Kyber (ML-KEM) and CRYSTALS-Dilithium (ML-DSA) — two of the three NIST-approved post-quantum standards.

See also: Ring-LWE
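The efficiency gain comes from replacing matrix-vector products with multiplication in the quotient ring Z_q[x]/(x^n + 1) (for power-of-two n). The reduction rule x^n ≡ −1 makes this a "negacyclic" convolution, sketched below in schoolbook form (real implementations use the NTT for quasi-linear time):

```python
def negacyclic_mul(f, g, q, n):
    """Multiply polynomials in Z_q[x]/(x^n + 1): coefficients wrapping past
    degree n come back with a sign flip, since x^n = -1."""
    h = [0] * n
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            k = i + j
            if k < n:
                h[k] = (h[k] + fi * gj) % q
            else:
                h[k - n] = (h[k - n] - fi * gj) % q
    return h
```

One such multiplication replaces what would be an n×n matrix product in plain LWE, which is the source of the "dramatic" efficiency improvement cited above.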

Peikert, A Decade of Lattice Cryptography (2016)

Author: Chris Peikert
Paper: Foundations and Trends in Theoretical Computer Science, 2016
Links: ACM Digital Library | Author page

This monograph surveys the major developments in lattice cryptography from 2005–2015, covering SIS/LWE and their variants, ring-based constructions, and practical implementations such as BLISS, NewHope, and Frodo. It provides a thorough treatment of hardness reductions, structured lattices, and the transition from theory to practice — essential for anyone working in post-quantum cryptography.

Part IV — Cryptographic Schemes

GGH Cryptosystem (1997) — and its Cryptanalysis (1999)

Authors: Oded Goldreich, Shafi Goldwasser, Shai Halevi (GGH); Phong Nguyen (cryptanalysis)
Links: Wikipedia | Cryptanalysis (Semantic Scholar)

The GGH cryptosystem (1997) was the first lattice-based public-key encryption scheme with a formal hardness argument, based on the difficulty of CVP. The public key is a "bad" (highly non-orthogonal) basis; the private key is a "good" (nearly orthogonal) basis, and decryption uses Babai's algorithm. Nguyen demonstrated a critical flaw in 1999: ciphertexts leak enough information to enable lattice attacks even in practical dimensions, making the scheme cryptographically broken. Its historical importance is as the first serious proof-of-concept of lattice cryptography.

NTRU Cryptosystem (1996/1998)

Authors: Jeffrey Hoffstein, Jill Pipher, Joseph H. Silverman
Paper: NTRU: A Ring-Based Public Key Cryptosystem (ANTS 1998)
Link: Original paper PDF

NTRU was the first practical lattice-based cryptosystem, announced at the Crypto 1996 rump session. It operates in a polynomial ring and achieves small key sizes and high speed relative to RSA/ECC. Security rests on the hardness of finding short vectors in NTRU lattices. NTRU's ring structure inspired the later Ring-LWE framework, and NTRU lattices are directly used in Falcon (FN-DSA), one of the NIST post-quantum signature standards.

See also: NTRU
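NTRU's ring is Z_q[x]/(x^N − 1), where multiplication is a plain cyclic convolution (coefficients wrap around with no sign change, unlike the Ring-LWE ring above). A minimal sketch of that core operation:

```python
def convolve(f, g, q, N):
    """Multiply polynomials in Z_q[x]/(x^N - 1): cyclic convolution,
    since x^N = 1 wraps exponents around modulo N."""
    h = [0] * N
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            h[(i + j) % N] = (h[(i + j) % N] + fi * gj) % q
    return h
```

All NTRU key generation, encryption, and decryption reduce to a handful of these convolutions with small-coefficient polynomials, which is why the scheme is so fast relative to RSA/ECC.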

Gentry's Fully Homomorphic Encryption (2009)

Author: Craig Gentry
Paper: Fully Homomorphic Encryption Using Ideal Lattices (STOC 2009)
Links: IBM Research | PDF (CMU)

Gentry's 2009 paper (cited over 9,500 times) solved the 30-year-old open problem of constructing a fully homomorphic encryption (FHE) scheme — one allowing arbitrary computation on encrypted data without decryption. The construction uses ideal lattices, exploiting their additive and multiplicative homomorphism modulo a public-key ideal in a polynomial ring. The "bootstrapping" technique that makes the scheme fully (rather than somewhat) homomorphic remains the blueprint for all subsequent FHE research.

See also: Fully Homomorphic Encryption
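Gentry's ideal-lattice construction is too involved to sketch here, but the core idea — ciphertexts that support addition and multiplication until accumulated noise overwhelms decryption — is visible in the much simpler integer scheme of van Dijk, Gentry, Halevi, and Vaikuntanathan (2010). The toy below (illustrative parameters, wildly insecure) is that scheme, not Gentry's:

```python
import random

def keygen(rng, bits=24):
    """Secret key: a random integer p (toy size; real parameters are enormous)."""
    return rng.randrange(1 << (bits - 1), 1 << bits) | 1

def encrypt(m, p, rng):
    """Ciphertext c = p*q + 2*e + m for a bit m; 2*e + m is the 'noise'."""
    q = rng.randrange(1 << 10, 1 << 11)
    e = rng.randrange(0, 8)
    return p * q + 2 * e + m

def decrypt(c, p):
    """Correct while the accumulated noise stays below p."""
    return (c % p) % 2
```

Adding ciphertexts adds noises; multiplying them multiplies noises, so only shallow circuits decrypt correctly ("somewhat" homomorphic). Gentry's bootstrapping — homomorphically evaluating the decryption circuit to refresh noise — is what lifts this to fully homomorphic.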

CRYSTALS-Kyber (ML-KEM) and CRYSTALS-Dilithium (ML-DSA) — NIST Standards (2024)

Authors: Bos, Ducas, Kiltz, Lepoint, Lyubashevsky, Schwabe, Seiler, Stehlé and collaborators
Standard: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA) — approved August 2024
Links: NIST announcement | CWI news

In August 2024, NIST approved FIPS 203 (Module-Lattice-Based Key-Encapsulation Mechanism, derived from CRYSTALS-Kyber) and FIPS 204 (Module-Lattice-Based Digital Signature Standard, derived from CRYSTALS-Dilithium) as the world's first post-quantum cryptographic standards. Both are based on the hardness of Module-LWE, a generalization bridging plain LWE and Ring-LWE. The algorithms were developed by an international team of academic and industry researchers, including members of IBM Research and CWI.

See also: Kyber / ML-KEM | Dilithium / ML-DSA

FALCON (FN-DSA) — NIST Standard (2024/2025)

Authors: Pierre-Alain Fouque, Jeffrey Hoffstein, Paul Kirchner, Vadim Lyubashevsky, Thomas Pornin, Thomas Prest, Thomas Ricosset, Gregor Seiler, William Whyte, Zhenfei Zhang
Paper: Fast-Fourier Lattice-Based Compact Signatures over NTRU
Links: falcon-sign.info | Paper PDF | Wikipedia

Falcon is a hash-and-sign lattice-based signature scheme built on NTRU lattices, instantiating the Gentry–Peikert–Vaikuntanathan (GPV) framework with fast Fourier sampling for signature generation. It achieves the smallest combined key + signature sizes of any standardized lattice signature scheme. Security is based on the hardness of SIS over NTRU lattices, for which no efficient classical or quantum algorithm is known. It was selected by NIST in 2022 for future standardization (FN-DSA, planned for FIPS 206).

See also: Falcon Signatures | Lattice Signatures

Summary Table

Work | Year | Authors | Category | Key Contribution
--- | --- | --- | --- | ---
Geometry of Numbers | 1896 | Minkowski | Foundation | Convex body theorems, lattice point bounds
Lattice Theory | 1940 | Birkhoff | Foundation | Algebraic lattice theory textbook
General Lattice Theory | 1978 | Grätzer | Foundation | Standard graduate reference
LLL Algorithm | 1982 | Lenstra, Lenstra, Lovász | Reduction | First polynomial-time lattice reduction
Babai's CVP Algorithms | 1986 | Babai | Reduction | Nearest Plane / Closest Vertex for approx-CVP
BKZ Algorithm | 1987 | Schnorr | Reduction | Block-based reduction, better than LLL
Ajtai Hardness (SIS) | 1996 | Ajtai | Hardness | Worst-case to average-case reduction
GGH Cryptosystem | 1997 | Goldreich, Goldwasser, Halevi | Crypto | First lattice PKE (later broken)
NTRU | 1998 | Hoffstein, Pipher, Silverman | Crypto | First practical lattice crypto
Complexity of Lattice Problems | 2002 | Micciancio, Goldwasser | Reference | SVP/CVP complexity, inapproximability
LWE & Cryptography | 2005 | Regev | Hardness | LWE problem, quantum reduction, PKE
Lattice-based Cryptography Survey | 2009 | Micciancio, Regev | Survey | Lattice crypto landscape
FHE using Ideal Lattices | 2009 | Gentry | Crypto | First fully homomorphic encryption
Ring-LWE | 2010 | Lyubashevsky, Peikert, Regev | Hardness | Efficient LWE on ideal lattices
A Decade of Lattice Crypto | 2016 | Peikert | Survey | Comprehensive survey 2005–2015
Falcon / FN-DSA | 2017–2024 | Fouque et al. | Crypto | NIST signature standard, NTRU lattices
ML-KEM / ML-DSA | 2024 | NIST et al. | Standard | First approved PQC standards