

Scientific scope


It has been known since 1994 that the public key systems in use today are vulnerable to attack by quantum computers. This vulnerability was only theoretical until recently, when IBM, Google and others began to build working quantum computers. These computers are not yet large enough to be a threat, but the NSA has advised American businesses not to rely on current methods to protect data more than thirty years into the future.

Thirty years is not a long time in which to establish a new, secure post-quantum public key infrastructure; many practical problems need to be solved. NIST and ETSI are currently developing relevant standards and evaluating proposed protocols.

Modern public key cryptography is largely based on heuristic assumptions, and many foundational questions have been open for decades. For example, a fundamental assumption of cryptography is that there is no polynomial-time algorithm for solving NP-hard problems. A consequence of this assumption is that there are cryptographically interesting problems that are hard to solve in the quantum setting. Identifying such problems seems to be very difficult. Instead of confronting such problems directly, we propose to bring together cryptographers, complexity theorists, and mathematicians to investigate related problems which may be easier to solve and which may yield useful insights into the foundational questions. For example, many of the foundational questions about polynomial-time computation have analogs at the level of computability.

Our program will cover both practical post-quantum cryptography and related, more abstract areas. The principal topics are listed below.

Code-based cryptography. Cryptographic primitives based on the hardness of decoding random linear codes are historically the first post-quantum systems. Since the late 1970s, schemes like McEliece encryption have withstood a long series of cryptanalytic attacks. In order to embed a trapdoor that enables decryption, one transforms a structured code with good decoding capabilities, such as a Goppa code, by linear transformations into a "random-looking" code C. An attacker then faces the problem of either distinguishing C from a purely random code using the properties of the underlying structured code, or decoding C directly. In general, the latter is the best (generic) attack known. Recently, there has been significant progress on decoding binary linear codes C of dimension n. A new trend in code-based cryptography is to use linear codes other than the Goppa codes initially proposed by McEliece (MDPC codes, rank-metric codes, quasi-cyclic codes, and others). This is motivated by the need to decrease the size of the public key, and it raises many new algorithmic challenges.
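To make the trapdoor idea concrete, the following is a minimal sketch in Python over GF(2). It substitutes the tiny [7,4] Hamming code (which corrects a single error) for the Goppa codes used in real McEliece, and the names S (scrambler) and Perm (permutation) are our own; it illustrates the scramble-and-permute construction only and is in no way a secure implementation.

```python
# Toy McEliece-style key generation, encryption and decryption over GF(2).
# The [7,4] Hamming code (t = 1 correctable error) stands in for a Goppa code;
# parameters are far too small for security and serve only to show the mechanics.
import numpy as np

rng = np.random.default_rng(0)

# Structured code: generator G = [I | P] and parity-check H = [P^T | I].
G = np.array([[1,0,0,0, 1,1,0],
              [0,1,0,0, 1,0,1],
              [0,0,1,0, 0,1,1],
              [0,0,0,1, 1,1,1]])
H = np.array([[1,1,0,1, 1,0,0],
              [1,0,1,1, 0,1,0],
              [0,1,1,1, 0,0,1]])

def gf2_inv(M):
    """Invert a square binary matrix over GF(2) by Gauss-Jordan elimination."""
    n = M.shape[0]
    A = np.concatenate([M % 2, np.eye(n, dtype=int)], axis=1)
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r, col])  # StopIteration if singular
        A[[col, pivot]] = A[[pivot, col]]
        for r in range(n):
            if r != col and A[r, col]:
                A[r] = (A[r] + A[col]) % 2
    return A[:, n:]

# Private key: the structured code, a random invertible scrambler S, a permutation Perm.
while True:
    S = rng.integers(0, 2, size=(4, 4))
    try:
        S_inv = gf2_inv(S)
        break
    except StopIteration:
        pass
Perm = np.eye(7, dtype=int)[rng.permutation(7)]

# Public key: the "random-looking" generator matrix of the disguised code.
G_pub = (S @ G @ Perm) % 2

def encrypt(m):
    """Encode the message with the public code and add one random error bit."""
    e = np.zeros(7, dtype=int)
    e[rng.integers(7)] = 1
    return (m @ G_pub + e) % 2

def decrypt(c):
    """Undo the permutation, correct the error by syndrome decoding, unscramble."""
    c = (c @ Perm.T) % 2                   # Perm is orthogonal, so Perm^{-1} = Perm^T
    s = (H @ c) % 2
    if s.any():                            # nonzero syndrome = column of H at error position
        pos = next(j for j in range(7) if np.array_equal(H[:, j], s))
        c[pos] ^= 1
    return (c[:4] @ S_inv) % 2             # first 4 coordinates give m*S since G = [I | P]

m = np.array([1, 0, 1, 1])
assert np.array_equal(decrypt(encrypt(m)), m)
```

The attacker sees only G_pub and must either detect the hidden structured code behind it or decode a seemingly random code directly, mirroring the two attack avenues described above.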

Lattice-based cryptography. Lattice-based cryptography is the generic term for constructions of cryptographic primitives that involve lattices, either in the construction itself or in the security proof. Lattice-based constructions are currently important candidates for post-quantum cryptography. The first lattice-based cryptographic construction was introduced by Miklos Ajtai in 1996. Fundamentally, Ajtai's result was a worst-case to average-case reduction: he showed that a certain average-case lattice problem, known as Short Integer Solution (SIS), is at least as hard to solve as a worst-case lattice problem. Also in 1996, Jeffrey Hoffstein, Jill Pipher, and Joseph H. Silverman introduced a lattice-based public-key encryption scheme known as NTRU; however, breaking their scheme is not known to be at least as hard as solving a worst-case lattice problem. The first lattice-based public-key encryption scheme with a proof of security was introduced by Oded Regev in 2005, together with the Learning with Errors (LWE) problem. Since then, much follow-up work has focused on strengthening Regev's security proof and improving the efficiency of the original scheme, and much more has been devoted to constructing additional cryptographic primitives based on LWE and related problems. In 2009, Craig Gentry introduced the first fully homomorphic encryption scheme, which was based on a lattice problem. Lattice-based constructions are now considered the leading candidates for post-quantum public key cryptography, and we will explore the most recent trends in lattice-based cryptography during our program.
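As an illustration of how LWE supports encryption, here is a minimal sketch of Regev-style encryption of a single bit in Python. The parameters n, m and q are toy values chosen for readability, and the uniform noise range is our own simplification of the Gaussian noise used in the actual scheme; this is a sketch of the idea, not a secure instantiation.

```python
# Toy Regev-style encryption of a single bit built on the LWE problem.
# Parameters are illustrative only; real instantiations use much larger n and q
# and Gaussian-distributed noise.
import numpy as np

rng = np.random.default_rng(1)
n, m, q = 16, 64, 4093          # secret dimension, number of samples, modulus

# Key generation: secret s; public key is (A, b = A*s + e mod q) with small noise e.
s = rng.integers(0, q, size=n)
A = rng.integers(0, q, size=(m, n))
e = rng.integers(-3, 4, size=m)
b = (A @ s + e) % q

def encrypt(bit):
    """Sum a random subset of the LWE samples and shift by q/2 to encode the bit."""
    r = rng.integers(0, 2, size=m)       # random 0/1 subset-selection vector
    return (r @ A) % q, (r @ b + bit * (q // 2)) % q

def decrypt(u, v):
    """v - <u, s> equals (small subset noise) + bit*q/2, so round to 0 or q/2."""
    d = (v - u @ s) % q
    return int(min(d, q - d) > q // 4)

for bit in (0, 1):
    assert decrypt(*encrypt(bit)) == bit
```

The message is hidden behind noisy inner products of public LWE samples; recovering it without the secret s amounts to solving an LWE-type problem, which is where the worst-case lattice hardness enters.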

Multivariate cryptography. Multivariate cryptography is usually defined as the set of cryptographic schemes whose security relies on the computational hardness of the Polynomial System Solving problem over a finite field. Solving systems of multivariate polynomial equations over a finite field is NP-hard (and the associated decision problem is NP-complete), which is why these schemes are often considered good candidates for post-quantum cryptography. The first multivariate scheme, named C*, was introduced by Matsumoto and Imai in 1988. They introduced not only a particular scheme but a general principle for designing public-key cryptosystems from multivariate equations. There are now many proposals based on this principle; they are attractive because they offer very short asymmetric signatures and require only a small amount of resources on embedded devices. After an intense period of cryptanalysis, a few schemes emerged as the most robust solutions: HFE (Hidden Field Equations) and UOV (Unbalanced Oil and Vinegar), both developed by J. Patarin in the late 1990s. Variants of these schemes have been submitted to the post-quantum standardization process organized by NIST. A major question in this area is to understand the security of these schemes in the quantum setting.
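The underlying hard problem can be stated very concretely. The sketch below (with toy dimensions of our own choosing) builds a random system of quadratic equations over GF(2) with a planted solution and recovers it by exhaustive search; the 2^n cost of such generic solving is what multivariate schemes rely on.

```python
# The MQ / Polynomial System Solving problem over GF(2): find x making every
# quadratic polynomial in a random system vanish.  Toy sizes only.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n_vars, n_eqs = 12, 12

# Each equation is x^T Q_k x + L_k x + c_k over GF(2).
Q = rng.integers(0, 2, size=(n_eqs, n_vars, n_vars))
L = rng.integers(0, 2, size=(n_eqs, n_vars))
planted = rng.integers(0, 2, size=n_vars)          # plant a known solution
c = np.array([(planted @ Q[k] @ planted + L[k] @ planted) % 2 for k in range(n_eqs)])

def residuals(x):
    """Evaluate every equation at x; a solution makes all of them vanish."""
    return np.array([(x @ Q[k] @ x + L[k] @ x + c[k]) % 2 for k in range(n_eqs)])

# Exhaustive search over all 2^n assignments -- the exponential cost that makes
# random MQ instances hard (a random system may also have roots other than the
# planted one).
for bits in itertools.product((0, 1), repeat=n_vars):
    x = np.array(bits)
    if not residuals(x).any():
        print("solution:", x, "| planted solution recovered:", np.array_equal(x, planted))
        break
```

In an actual multivariate scheme such as HFE or UOV, the public key is exactly such a quadratic map, obtained by composing an easily invertible central map with secret affine changes of variables; the legitimate party inverts it through the trapdoor rather than by search.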

Group-based cryptography.

In addition, we want to consider related areas which may cast some light on unanswered problems in post-quantum cryptography. Below we list some avenues to explore.

Complexity theory. A key generation algorithm is essentially a way of generating an instance of a computational problem which is hard to solve in the absence of some secret information (the private key). Ordinary complexity theory has definite disadvantages when it comes to analyzing the frequency and distribution of such hard instances. Other approaches include the following.

Generic-case complexity. Analysis in terms of the time necessary to generate a hard instance, i.e., as a challenger-solver problem.

Statistical algebra. Many NP-hard problems have analogs over various algebraic structures. Techniques from computational algebra afford insights into the distribution of hard instances.