Looks like the name “CRYSTALS-KYBER” is a Star Wars reference (kyber crystals). At least one of the authors of CRYSTALS-KYBER (Peter Schwabe) published an earlier PQC algorithm called “NewHope”, another Star Wars reference.
And “CRYSTALS-DILITHIUM” is, obviously, a Star Trek reference. :)
Not sure what browser you use, but in most you can select what you wanna search for, click "Search on $searchEngine for $term" and there you go! For PQC, I get Wikipedia link with "PQC can refer to: Post-quantum cryptography" in the description as the 3rd result on Google.
Not sure what classifies as "fair amount", but for me it took about 1-2 seconds to find the Wikipedia link ;)
I agree that it is not a lot of work for a single person, but if you add it up over 100,000 readers (just a ballpark guess, I am sure dang can tell us exact numbers), it sums up to at least 2 days of cumulative wasted time, assuming that everyone looked it up. Obviously, not everyone will look it up because of laziness or indifference, but those readers will not understand what the title is about, which is not an ideal situation either.
A similar situation arises with trains, where it is often better to not hold open the door for someone who is late by a few seconds, as the cumulative delay for everyone else in the train exceeds the waiting period of the single person for the next train.
ssh(1), sshd(8): use the hybrid Streamlined NTRU Prime + x25519 key
exchange method by default ("sntrup761x25519-sha512@openssh.com").
The NTRU algorithm is believed to resist attacks enabled by future
quantum computers and is paired with the X25519 ECDH key exchange
(the previous default) as a backstop against any weaknesses in
NTRU Prime that may be discovered in the future. The combination ensures that the hybrid exchange offers at least as good security
as the status quo.
We are making this change now (i.e. ahead of cryptographically-
relevant quantum computers) to prevent "capture now, decrypt
later" attacks where an adversary who can record and store SSH
session ciphertext would be able to decrypt it once a sufficiently
advanced quantum computer is available.
I personally think NIST should be disregarded, but you can disregard NIST and still end up with CRYSTALS-KYBER as your default PQC KEM, on its own merits, which can include the fact that NIST's standardization spurs so much implementation of CRYSTALS-KYBER that it becomes a de facto standard in addition to a de jure standard. (Same for signatures, and so on).
People with qualms about NIST might also reasonably have qualms about AES. And there is a common cipher that people use outside of AES --- Chapoly. But it would be downright weird to use, like, Serpent or Twofish; it would be the cryptography equivalent of a "code smell". Chapoly and AES are the de facto standards, and OpenSSH supports both.
Again though: my question is just, what does this (frankly weird) Bernstein complaint have to do with any of it? Bernstein himself is a NISTPQC participant; he's on one of the (large) winning signature teams.
(I think all the technical details here are super interesting, but not especially motivating; I'm not a cryptographer and you should disregard me as well, but my basic take on QC crypto attacks is "Rodents of unusual size? I don't think they exist.")
I don't wish NIST had conducted the proceedings more professionally, not because I'm a nihilist about standards but because I don't know enough to critique how they ran this. I've read the whole post upthread (by the way: if you're scratching your head, the trick is to read the red text, across several pages, all the way through, and then come back and pay attention to the rebuttals you think are interesting) and don't feel any more equipped to say anything about it. What I will say is that a significant chunk of all the world's public key encryption expertise got sunk into this event.
One reason KYBER got standardized quickly is that PQC KEMs are time-sensitive if you believe the quantum attack threat is plausibly material within the next 10-15 years. Your adversary in these attacks will almost certainly be state signals intelligence groups, and the expense involved in building attack hardware dwarfs the expense of collecting traffic today to decrypt in 2034. If you're a PQC believer, you want something out the door soon.
I don't understand the special sway you think Bernstein has, versus all the other cryptographers that participate in NISTPQC, with the OpenSSH team. I worry that people believe stuff like this because they know who Bernstein is and what OpenSSH is, and don't off the top of their head know who Tancrède Lepoint is. Note also that the KYBER team includes Peter Schwabe, whose name you should definitely know if you're a Bernstan.
Again, you're not really being asked to trust NIST here, so much as you are the CRYSTALS team. If you think the CRYSTALS team has been subverted by NSA, you're pretty far outside of the mainstream of cryptography thinkers; notably, this isn't a claim Bernstein has made, or is likely ever to make, unless someone dares him to†.
Bernstein being "involved in never-ending drama" is the reason it's legal to export strong cryptography from the US today and the reason much of this PQC work got done at all. He's clearly a person who often fights in cases where almost everyone else surrendered instead, which is presumably what you mean by "the problem is him," but I don't see why you describe it as a "problem". His inclination to tell hard truths, even when faced with corruption and intimidation, has frequently served the public interest.
It was often a huge problem for the people who he was fighting with, though. Are you one of them?
It doesn't seem reasonable to say that Bernstein is the reason much of this PQC work got done at all.
He was one of the earliest PQC popularizers and probably coined the term. But asserting that he enabled everyone else's work is a little like saying that the person who coined "misuse-resistant authenticated encryption" enabled all the different misuse-resistant schemes; the underlying issue was plainly evident, and people were obviously going to work on it.
Your last sentence falls afoul of the HN guidelines, and your comment would be far stronger without it. Which is unfortunate, since there's an interesting and curious conversation to be had about the significance of Bernstein's role in PQC.
There are some technical details that I'm not good enough to summarize, but a large part of the gist seems to be that NISTPQC went back on its word about being transparent throughout the standardization process: it solicited private input after rounds 2 and 3 and used that unpublished input to make claims about the strength of at least one contender for standardization. From what DJB brings up, the way they've done this appears to reek of Dual EC style manipulation, at least as far as how the process is working. I don't believe he's claiming that there are any NSA back doors, but he is alluding to a private party steering things in ways that might not be good.
Along with, apparently, some remarks about DJB possibly doing something wrong as well (I couldn't tell from this what that was; I haven't done a complete reading yet).
Been waiting on this announcement for a while. As someone who took a crypto class in college, but isn't a crypto expert (just knows the basics and basic theory), this is very neat to see. I'm looking forward to never implementing it myself :)
> In addition, NIST has engaged with third parties that own various patents directed to cryptography, and NIST acknowledges cooperation of ISARA, Philippe Gaborit, Carlos Aguilar Melchor, the laboratory XLIM, the French National Center for Scientific Research (CNRS), the University of Limoges, and Dr. Jintai Ding. NIST and these third parties are finalizing agreements such that the patents owned by the third parties will not be asserted against implementers (or end-users) of a standard for the selected cryptographic algorithm
> NIST expects to execute the various agreements prior to publishing the standard. If the agreements are not executed by the end of 2022, NIST may consider selecting NTRU instead of KYBER. NTRU was proposed in 1996, and U.S. patents were dedicated to the public in 2007.
NIST is going the proper route to ensure that any standards they publish can be freely implemented without implementers having to pay patent royalties. That's the reason for your second quote - if KYBER patent holders don't want to agree, they should know that NIST won't choose them for the standard.
Just to clarify: My understanding is that the authors of Kyber aren't the patent holders in question-- rather a third party has patents which may read on Kyber and several other of the NIST finalists.
It's really unfortunate that the licensing terms weren't announced at the same time: Depending on how they're written the result may still be unattractive to use, and since they've already announced the selection NIST probably just lost some amount of negotiating leverage.
(As the obvious negotiation would be "agree to these terms we find reasonable, or we just select NTRU prime")
Key part here is "are finalizing". It's still possible for at least some of the deals to fall through. I guess NTRU is the backup plan just in case and/or a method to apply pressure by letting the public know there's a plan B. I expect this passage implies that at least one negotiation has been going poorly.
It would probably be interesting to look up who of these people also has patents outside of the USA. If there really is someone being particularly stubborn, one might reasonably expect them to enforce the non-US patent variant outside of the USA.
> Overall assessment. One important feature of NTRU is that because it has been around for longer, its IP situation is more clearly understood. The original designers put their patents into the public domain, in addition to most of them having expired.
> As noted by the submitters, NTRU may not be the fastest or smallest among the lattice KEM finalists, and for most applications and use cases, the performance would not be a problem. Nonetheless, as NIST has selected KYBER for standardization, NTRU will therefore not be considered for standardization in the fourth round.
I've been following this space for a while and this is a good question, but I think the answer is really "ranges from 10 years to never".
There's a lot of investment currently in the quantum computer space (+ a lot of hype and scams). Yet this is still all very early research and far away from any practical use. The challenges to really build a QC that can break cryptography are enormous - and it is absolutely a possibility that they're too big to overcome.
This article asserts that D-Wave and other quantum annealing devices will be able to mount attacks long before a machine exists that can run Shor's algorithm with error-corrected qubits in sufficient quantity.
To second what the sibling comment has said, "quantum annealing" claims by DWave are considered fairly overblown (on some rare occasions even misleading/scammy). If the claims of this article held, they would have been much better known in the field and published in much more popular venues.
The record for factoring using a quantum computer is 21. Don't read that as 21 bits. 3*7. This has been the record for 12 years and that is arguably a result that is "cheating" with a priori knowledge of the factors.
There are some other examples of people factoring special-form composites that are particularly easy to factor on quantum computers, but those are basically stunts with no impact.
To threaten RSA, quantum computers need to increase the number of qubits by 6 orders of magnitude and improve error correction by at least 2 orders of magnitude. Check out this blog post for an illustration of where we are at: https://sam-jaques.appspot.com/quantum_landscape
I think what you are asking may better be answered by ignoring PQC and following CNSA recommendations for up to TOP SECRET. The crypto is likely what you already use, but it defines how to get enough bits of security from an algorithm.
There is a table of transition algorithms on the second or third page, depending on your screen size. 
Sure. If that 1 expert bothered to post a falsifiable prediction like “x% likely this’ll happen by year y”, the rest of us could read their argument and update our predictions.
Unfortunately that's pretty uncommon, so everyone has to go by base rates (crypto algorithms seem to last x years historically) and vague guesses (quantum computer capabilities seem to be doubling every x years, so I dunno, maybe enough qubits by 2050)
It is generally accepted that elliptic curve cryptography is a bit easier to break with Shor's algorithm than RSA. Something like half as hard, but it probably would not make any real difference in practice. So the paper is directly applicable to elliptic curves to the extent that it is applicable to anything.
I want to see these actually being implemented in current software ASAP (layered with traditional crypto). As-is it's possible to capture encrypted traffic out of the air, store it for however many decades are needed, and then decrypt it in future.
It's worth noting that the relevant timeframe to implement PQC isn't just when quantum computers become sufficiently fast to break current crypto (assuming the answer isn't never). It can take a decade or longer to re-encrypt data and/or to update cryptographic infrastructure.
Given that (varied) expert opinion on when quantum computing will be able to break current public key cryptography seems to mostly fall in the 10-20 year range, there is some, at least mild, urgency to start using PQC for the most sensitive data relatively soon.
Well, good thing that Falcon, created by some of the cryptographers who developed Algorand, is now considered 'standardised' for post-quantum digital signature use cases.
This tells me that Algorand is one of the more serious blockchain projects out there with top cryptographers as evidenced by Falcon.
10-20 years. As soon as we have atomically precise manufacturing, there are multiple approaches to making stable, scalable quantum computers that work. I see APM being possible on that time horizon. One company, Zyvex, has already prototyped those capabilities in the lab.
Because of communication cost considerations, the lattice candidates use problems small enough that another substantial improvement in attacks could leave them vulnerable (no shock that they use small problems: if you're really not communication-cost constrained, use McEliece and don't worry about it).
If you do use lattice key agreement, be sure to use it in a hybrid configuration (combined with ECC like ed25519 or Curve448) to avoid the (small but hard to assess) risk that your security upgrade could actually be a security downgrade.
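To make the combiner idea concrete, here's a toy sketch in Python (using the `cryptography` package for X25519; the PQC shared secret is just a random placeholder rather than a real KEM, and real protocols also bind the handshake transcript). The point of the design is that an attacker has to break both components to recover the session key:

    # Toy sketch of a hybrid key agreement: combine a (placeholder) PQC KEM
    # shared secret with an X25519 shared secret, so the derived key stays
    # safe as long as EITHER component remains unbroken.
    import hashlib
    import os
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    # Classical part: ordinary X25519 ECDH.
    alice_ec = X25519PrivateKey.generate()
    bob_ec = X25519PrivateKey.generate()
    ecdh_shared = alice_ec.exchange(bob_ec.public_key())

    # Post-quantum part: placeholder for the KEM's shared secret
    # (in practice this would come from e.g. a Kyber or NTRU Prime encapsulation).
    pqc_shared = os.urandom(32)

    # Combiner: derive the session key from both secrets (plus a label).
    session_key = hashlib.sha256(b"hybrid-kex-demo" + pqc_shared + ecdh_shared).digest()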
Well, most modern cryptography is based on assumptions that cannot be proven, so having different standards based on different assumptions is probably the only way to safeguard against one of those assumptions being proven false in the future.
To nitpick, afaik, it's not that they cannot be proven, it's that they have not been, and look very hard to prove, which is slightly different (not my area of expertise, but I assume this would be tied to P vs NP)
I was thinking, if you could definitively prove these assumptions are hard, that would prove P != NP, because if P=NP that would imply there would be an algorithm to solve these types of problems, since they are the type of thing that can be solved in polynomial time with the key, but cannot without a key. (I'm a bit out of my depth here)
Hash functions too. If P=NP then you can reverse a hash in polynomial time.
NP is the set of all problems for which you can verify a solution in polynomial time, and a solution to the inverse-hash problem is just a plaintext that hashes to the right value; obviously you can check whether a plaintext is right in polynomial time by just hashing it and comparing the hashes. Thus reversing a hash function is in NP, so if P=NP it's in P.
There's some subtlety here in that "reversing" a hash function really just means coming up with a plaintext that generates the right hash, not the original one, but you can put any polynomial-time set of constraints on the plaintext and finding a plaintext that satisfies those constraints (and hashes to the right value) is still in NP, so the subtlety really doesn't save you much.
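To make that asymmetry concrete, here's a toy Python sketch (the hash is truncated to 2 bytes so the brute force actually finishes): verifying a candidate preimage is one hash call, while finding one is exhaustive search.

    # Toy illustration: verifying a preimage is one cheap hash call (that's
    # why inverting a hash is in NP), while *finding* one is brute force.
    # Truncated to 2 bytes so the search finishes in well under a second.
    import hashlib
    import itertools

    def truncated_hash(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()[:2]

    def verify(candidate: bytes, target: bytes) -> bool:
        # Polynomial time: one hash and a comparison.
        return truncated_hash(candidate) == target

    def brute_force(target: bytes) -> bytes:
        # Exponential in the output length: try preimages until one verifies.
        for n in itertools.count():
            candidate = n.to_bytes(8, "big")
            if verify(candidate, target):
                return candidate

    target = truncated_hash(b"hello")
    print(brute_force(target))  # finds *a* preimage, not necessarily b"hello"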
Side point, but since we're talking quantum, we should really be saying BQP=NP not P=NP, BQP being the problems solvable in polynomial time on a quantum computer, it's a superset of P and a subset of NP, but we don't know if it's equal to either or both. I.e. P=NP implies BQP=NP, BQP != NP implies P != NP, BQP != P implies P != NP, but the reverse of all of those statements isn't known to be true.
Maybe it safeguards them from looking like they've screwed this up, but in terms of providing a concrete recommendation to system implementers, how does this safeguard anything? How does offering multiple algorithms in the PQC category help me make systems safer? What am I actually supposed to do here (how do I reflect this hedge in a system design)?
They didn't feel the need to provide multiple recommendations during the AES, or the SHA-3 process, even though Rijndael and Keccak used different constructions relative to RC6/TwoFish and SHA-2/Blake2. Why now?
The recommendations look clear to me: you should use CRYSTALS-Dilithium (unless you need smaller signatures, in which case use FALCON), but you should also be prepared to switch to SPHINCS+ on short notice if someone breaks CRYSTALS-Dilithium (or structured lattices in general).
So best practice would seem to be to implement both CRYSTALS-Dilithium and SPHINCS+, set CRYSTALS-Dilithium as the default, and provide a switch (config setting, whatever) to switch to SPHINCS+. If you have long-term keys, you should have both forms set up & ready to use.
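Roughly what that switch might look like in code (a sketch only; `dilithium_sign` and `sphincs_sign` here are hypothetical placeholders standing in for whatever PQC library you actually use, not a real API):

    # Sketch of signature-algorithm agility: pick the signer from config so
    # a deployment can switch from Dilithium to SPHINCS+ without code changes.
    from typing import Callable, Dict

    def dilithium_sign(key: bytes, msg: bytes) -> bytes:   # placeholder
        raise NotImplementedError("call your Dilithium library here")

    def sphincs_sign(key: bytes, msg: bytes) -> bytes:     # placeholder
        raise NotImplementedError("call your SPHINCS+ library here")

    SIGNERS: Dict[str, Callable[[bytes, bytes], bytes]] = {
        "dilithium": dilithium_sign,     # default
        "sphincsplus": sphincs_sign,     # fallback if structured lattices break
    }

    def sign(msg: bytes, key: bytes, algorithm: str = "dilithium") -> bytes:
        # `algorithm` would come from a config file or environment setting.
        return SIGNERS[algorithm](key, msg)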
> They didn't feel the need to provide multiple recommendations during the AES, or the SHA-3 process, even though Rijndael and Keccak used different constructions relative to RC6/TwoFish and SHA-2/Blake2. Why now?
SHA-3 was explicitly an alternative recommendation. The entire point was to come up with something that was not based on SHA-2, because they were worried that the attacks on MD5/SHA-1 could be extended to SHA-2 (which didn't really happen the way people were worried about). Even to this day, the general advice is not to use SHA-3.
Less clear cut for AES, but at the time of standardization (and even now, afaik), Triple DES was considered secure, so it's not like there wasn't a secure alternative.
These standards aren't meant as implementation guides. You still need cryptography knowledge to use them securely.
Life's hard and the world is uncertain. If NIST could make an algorithm that they could prove was 100% safe from any possible future cryptanalytic breakthrough, I am sure they would, but that is beyond the current state of the art.
You mean like a one-time pad? I'm sure the folks at NIST know about it; it is completely unbreakable and has been around for a while. Use is not really practical though, so it is typically reserved for very specific use cases.
One time pads fall into the symmetrical encryption category. There is no huge issue with symmetrical encryption with respect to the possibility someone might invent a quantum computer. The things people are working on for a post quantum world and NIST is attempting to standardize are in the asymmetrical encryption category.
This is a silly nitpick. I think it's pretty well understood from context that I meant a practical quantum-safe key agreement algorithm. A one-time pad cannot be used for that purpose at all, let alone practically.
The selection criteria for SHA-3 included internal state being greater than the output size. SHA-1 and SHA-2 both repeat this mistake of MD5. SHA-2 has variants that don't have this problem, but SHA-256 and SHA-512 aren't among them.
I'm having trouble finding it now but I recall someone complaining about the constants for 512 leaving something to be desired.
This isn’t a mistake per se, it’s how that class of hash functions—and really, almost every hash function ever—is implemented. It’s called the Merkle-Damgård construction. It adds some very good properties and is the basis for how hash functions can be used in hash tree constructions and such.
But proving that the input state is evenly mixed among the output state is THE hard thing to prove (the hash function equivalent of the difficulty of factoring integers), so for the sake of ecosystem diversity NIST chose a hash function based on different principles for SHA-3. It’s not a criticism of SHA-2 that the difference was called out.
The constants are the fractional bits of the cube roots of successive primes. This is effectively a nothing-up-my-sleeve number selection. If there were problems with this, that in itself would be a serious cryptographic result.
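You can check that claim yourself in a few lines of Python; taking FIPS 180-4's published values for the first four SHA-256 round constants, they fall right out of the cube roots of the first primes (double precision is enough for these first few):

    # Check the nothing-up-my-sleeve claim: SHA-256's round constants are the
    # first 32 bits of the fractional parts of the cube roots of the first
    # primes.
    K_FIRST_FOUR = [0x428a2f98, 0x71374491, 0xb5c0fbcf, 0xe9b5dba5]  # FIPS 180-4

    for prime, expected in zip([2, 3, 5, 7], K_FIRST_FOUR):
        frac = prime ** (1 / 3) % 1          # fractional part of the cube root
        derived = int(frac * 2 ** 32)        # take the first 32 bits
        assert derived == expected, (prime, hex(derived))
        print(f"cbrt({prime}) -> {derived:#010x}")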
A point rendering the choice even more curious: Germany and the Netherlands have recommended the use of encryption not relying on the shortest vector problem. The two suggestions of FrodoKEM (relying on hardness of the learning with errors problem) and Classic McEliece (relying on hardness of decoding random codes?) aren't lattice-based, apparently.
LWE's hardness is based on SVP (ignoring issues of tightness, which aren't unique to FrodoKEM). The difference between FrodoKEM and Kyber/Saber isn't whether they rely on SVP (they all essentially do), but whether they rely on LWE over structured lattices or not.
At a very high level, all of the three rely on an n x n matrix at a certain point. The "structured lattice" schemes (Kyber/Saber) make structural assumptions about this matrix, say that each row is a cyclic shift of the previous row. This turns an O(n^2) object into an O(n) object, giving many performance improvements. The downside is that the additional structure can plausibly be used for attacks (but the best attacks ignore the structure, so this is a "potential issue", not a current issue).
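A toy illustration of the storage difference (plain Python; real schemes like Kyber work in negacyclic polynomial rings mod q, so this only shows the structure point, not their actual arithmetic):

    # Toy illustration of why structured lattices are cheaper: a cyclic matrix
    # is fully determined by its first row, so you store n numbers instead of
    # n*n, and can multiply by a vector straight from that one row.
    n = 4
    first_row = [3, 1, 4, 1]          # O(n) storage

    # Unstructured view: materialize the full n x n matrix (O(n^2) storage),
    # where each row is a cyclic shift of the previous one.
    full_matrix = [[first_row[(j - i) % n] for j in range(n)] for i in range(n)]

    # Structured view: multiply by a vector using only the first row.
    def cyclic_matvec(row, vec):
        n = len(row)
        return [sum(row[(j - i) % n] * vec[j] for j in range(n)) for i in range(n)]

    vec = [1, 2, 3, 4]
    naive = [sum(full_matrix[i][j] * vec[j] for j in range(n)) for i in range(n)]
    assert cyclic_matvec(first_row, vec) == naive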
Something that worries me: if someone cracks our current encryption using quantum computers, couldn't they be logging everything we say right now, meaning everything we say right now is actually insecure against someone 10 years in the future?
Yes. That's, for instance, why people say the KEM problem has more urgency than the signature problem; a PQC KEM is what you need today if you're worried that someone's archiving your TLS sessions so they can break them with the quantum computer their government promised them for Christmas in 2034. Even if your KEX involves a signature, your adversary can't time-travel back to 2022 to break it with their 2034 scooty-puff quantum edition. But if all you've got is classical ECC and RSA, you're in trouble.
If you assume the PQC KEM doesn't interact with classical ECDH, you might want to get some kind of PQC KEM rolled out as quickly as you can, in a dual construction with ECDH; the worst that happens is, your new KEM isn't quantum-safe (or anything-safe), but your ECDH holds up. But that's (if you believe in quantum attacks on crypto) still better than no PQC KEM at all.
Yes. This is why the work is being done now, and there will be an urgency in moving PQC algorithms from academia to commercial use. Everything that has been stolen in data breaches up until then will be broken once QCs are viable.
Good news is that we are likely more than 10 years away from QCs being useful enough to do this.
Here's the warning: Lattice-based cryptography is much more risky than commonly acknowledged. This applies, in particular, to lattice KEMs under consideration within the NIST Post-Quantum Cryptography Standardization Project (NISTPQC) as of October 2021. The above document...
Yes, many. I believe he's on the SPHINCS+ team (was standardized), Classic McEliece (round 3, not standardized), and NTRU Prime (round 3, passed over for Kyber). Perhaps more, but he has significant skin in the game.
From skimming it, his main argument is that Kyber relies on many constructions (e.g. cyclotomic polynomials) that are actively under attack: researchers have been successfully chipping away at them and show no signs of stopping.
He also alleges that NIST has been moving the goalposts to favor Kyber, and that they've been duplicitous in their narrative.
Cyclotomic polynomials are incredibly standard in the field. The only researcher I know of who has issues with them is DJB, and there have not been significant advances in cryptanalysis due to the usage of cyclotomics (with the exception of problems not used by the NIST candidates, meaning the whole Soliloquy thing)
Thanks for taking the time to write this up. But, woof, it's a bit more than ELI5. :) The Python code makes it a little more clear since I'm not familiar with some of the notation. However, it does seem kind of magic that 'e' is derived during the encryption and then sort of vanishes. I also don't quite get the bounded vs uniform vector sampling calls (one for s and the other for chi). But this at least greases the wheels so to speak, so thanks!
Thanks for the feedback! Roughly speaking, that all has to do with making e vanish later, so perhaps I need to revisit that section.
Quickly (cause I probably won't for a few days): (q//2)*m can be seen as a form of error correction. You can check (either pen+paper or programmatically) that, provided |e| < q/4, if noisy_m = (q//2)*m + e, then round(noisy_m / (q//2)) = m. So e vanishes because it is bounded (not uniform), plus we encode m as (q//2)*m (i.e. in the "most significant bits" of the number).
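If you want to check it programmatically, here's a quick Python sketch under those assumptions (single-bit m, |e| < q/4, and no mod-q wraparound, to keep the illustration simple):

    # Programmatic check: encoding a bit m as (q//2)*m and adding bounded
    # noise e with |e| < q/4 still lets you recover m by rounding.
    import random

    q = 3329  # Kyber's modulus, but any similar q works for the demo
    for _ in range(10_000):
        m = random.randint(0, 1)
        e = random.randint(-(q // 4) + 1, q // 4 - 1)   # |e| < q/4
        noisy_m = (q // 2) * m + e
        recovered = round(noisy_m / (q // 2))
        assert recovered == m, (m, e, noisy_m, recovered)
    print("decoding succeeded for all trials")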