NIST announces first PQC algorithms to be standardized


182 points | by isido 44 days ago


  • elromulous 44 days ago
    PQC = post quantum cryptography
    • cpeterso 43 days ago
      Looks like the name “CRYSTALS-KYBER” is a Star Wars reference (kyber crystals). At least one of the authors of CRYSTALS-KYBER (Peter Schwabe) published an earlier PQC algorithm called “NewHope”, another Star Wars reference.

      And “CRYSTALS-DILITHIUM” is, obviously, a Star Trek reference. :)

    • haggy 44 days ago
      Ah thank you. I figured the 'Q' stood for quantum but you saved me a fair amount of googling :)
      • capableweb 43 days ago
        "fair amount of googling"?

        Not sure what browser you use, but in most you can select what you wanna search for, click "Search on $searchEngine for $term" and there you go! For PQC, I get Wikipedia link with "PQC can refer to: Post-quantum cryptography" in the description as the 3rd result on Google.

        Not sure what qualifies as a "fair amount", but for me it took about 1-2 seconds to find the Wikipedia link ;)

        • johndough 43 days ago
          I agree that it is not a lot of work for a single person, but if you add it up over 100,000 readers (just a ballpark guess, I am sure dang can tell us exact numbers), it sums up to at least 2 days of cumulative wasted time, assuming that everyone looked it up. Obviously, not everyone will look it up because of laziness or indifference, but those readers will not understand what the title is about, which is not an ideal situation either.

          A similar situation arises with trains, where it is often better to not hold open the door for someone who is late by a few seconds, as the cumulative delay for everyone else in the train exceeds the waiting period of the single person for the next train.

  • chasil 44 days ago
    OpenSSH has already chosen NTRU-Prime. Will there be a retrofit of CRYSTALS-KYBER? Or has the market already chosen?

    DJB is an author on the SPHINCS+ team; glad to see that his work will be part of the standard.

    • layer8 44 days ago
      OpenSSH has merely chosen that as its current default. Surely multiple algorithms will be supported in the future as they have in the past.
      • chasil 44 days ago
        There was considerable strife for Daniel J. Bernstein during this competition.

        It would not surprise me if OpenSSH only chooses to add SPHINCS+ and refuses the others.

        • tptacek 43 days ago
          What does Bernstein's process complaint about NIST's process complaint about him have to do with which ciphersuites OpenSSH will support?
          • chasil 43 days ago
            OpenSSH appears to have disregarded NIST, and made their own determination on a pq-kex.

            Should NIST be disregarded?

            NTRU-prime is not a finalist, but OpenSSH has decided that the NIST designation is irrelevant.


            ssh(1), sshd(8): use the hybrid Streamlined NTRU Prime + x25519 key exchange method by default (""). The NTRU algorithm is believed to resist attacks enabled by future quantum computers and is paired with the X25519 ECDH key exchange (the previous default) as a backstop against any weaknesses in NTRU Prime that may be discovered in the future. The combination ensures that the hybrid exchange offers at least as good security as the status quo.

            We are making this change now (i.e. ahead of cryptographically-relevant quantum computers) to prevent "capture now, decrypt later" attacks where an adversary who can record and store SSH session ciphertext would be able to decrypt it once a sufficiently advanced quantum computer is available.
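The hybrid construction described in those release notes can be sketched in a few lines (a toy illustration of the general combiner idea, not OpenSSH's actual key derivation; the function name and the stand-in byte strings are mine):

```python
import hashlib

def combine_shared_secrets(ntru_secret: bytes, x25519_secret: bytes) -> bytes:
    # Hash both KEM outputs together: an attacker must break BOTH
    # primitives to predict the session key, so the hybrid is at
    # least as strong as the stronger of the two.
    return hashlib.sha512(ntru_secret + x25519_secret).digest()

# Stand-in values; in a real handshake these come from the
# Streamlined NTRU Prime KEM and the X25519 ECDH exchange.
k1 = b"\x01" * 32
k2 = b"\x02" * 32
session_key = combine_shared_secrets(k1, k2)
assert len(session_key) == 64
```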

            • tptacek 43 days ago
              I personally think NIST should be disregarded, but you can disregard NIST and still end up with CRYSTALS-KYBER as your default PQC KEM, on its own merits, which can include the fact that NIST's standardization spurs so much implementation of CRYSTALS-KYBER that it becomes a de facto standard in addition to a de jure standard. (Same for signatures, and so on).

              People with qualms about NIST might also reasonably have qualms about AES. And there is a common cipher that people use outside of AES --- Chapoly. But it would be downright weird to use, like, Serpent or Twofish; it would be the cryptography equivalent of a "code smell". Chapoly and AES are the de facto standards, and OpenSSH supports both.

              Again though: my question is just, what does this (frankly weird) Bernstein complaint have to do with any of it? Bernstein himself is a NISTPQC participant; he's on one of the (large) winning signature teams.

              (I think all the technical details here are super interesting, but not especially motivating; I'm not a cryptographer and you should disregard me as well, but my basic take on QC crypto attacks is "Rodents of unusual size? I don't think they exist.")

              • chasil 43 days ago
                I agree with your sentiment. I wish NIST had conducted the proceedings more professionally, and this collapse in confidence is their own fault.

                The OpenSSH decision to promote NTRU-prime from an experimental feature to the preferred key exchange was breathtakingly rapid, and final. It is a tacit assertion that NIST is no longer relevant.

                DJB was on several teams, and I think that OpenSSH would lend greater credence to him than to any other counsel, deservedly so.

                We might end up with SPHINCS+, but I will be surprised if KYBER is adopted.

                This moved very fast.

                • tptacek 43 days ago
                  I don't wish NIST had conducted the proceedings more professionally, not because I'm a nihilist about standards but because I don't know enough to critique how they ran this. I've read the whole post upthread (by the way: if you're scratching your head, the trick is to read the red text, across several pages, all the way through, and then come back and pay attention to the rebuttals you think are interesting) and don't feel any more equipped to say anything about it. What I will say is that a significant chunk of all the world's public key encryption expertise got sunk into this event.

                  One reason KYBER got standardized quickly is that PQC KEMs are time-sensitive if you believe the quantum attack threat is plausibly material within the next 10-15 years. Your adversary in these attacks will almost certainly be state signals intelligence groups, and the expense involved in building attack hardware dwarfs the expense of collecting traffic today to decrypt in 2034. If you're a PQC believer, you want something out the door soon.

                  I don't understand the special sway you think Bernstein has, versus all the other cryptographers that participate in NISTPQC, with the OpenSSH team. I worry that people believe stuff like this because they know who Bernstein is and what OpenSSH is, and don't off the top of their head know who Tancrède Lepoint is. Note also that the KYBER team includes Peter Schwabe, whose name you should definitely know if you're a Bernstan.

                  • chasil 43 days ago
                    The major question will be what ends up in TLS.

                    Aside from adherence to DJB, the question will be what can be trusted?

                    We have been down this road before.


                    • tptacek 43 days ago
                      Again, you're not really being asked to trust NIST here, so much as you are the CRYSTALS team. If you think the CRYSTALS team has been subverted by NSA, you're pretty far outside of the mainstream of cryptography thinkers; notably, this isn't a claim Bernstein has made, or is likely ever to make, unless someone dares him to†.


        • simcop2387 43 days ago
          I can't seem to find anything on google, did NISTPQC ever reply?
        • google234123 43 days ago
          Bernstein seems to be involved in never ending drama. Maybe the problem is him?
          • kragen 43 days ago
            Bernstein being "involved in never-ending drama" is the reason it's legal to export strong cryptography from the US today and the reason much of this PQC work got done at all. He's clearly a person who often fights in cases where almost everyone else surrendered instead, which is presumably what you mean by "the problem is him," but I don't see why you describe it as a "problem". His inclination to tell hard truths, even when faced with corruption and intimidation, has frequently served the public interest.

            It was often a huge problem for the people who he was fighting with, though. Are you one of them?

            • tptacek 43 days ago
              It doesn't seem reasonable to say that Bernstein is the reason much of this PQC work got done at all.

              He was one of the earliest PQC popularizers and probably coined the term. But asserting that he enabled everyone else's work is a little like saying that the person who coined "misuse-resistant authenticated encryption" enabled all the different misuse-resistant schemes; the underlying issue was plainly evident, and people were obviously going to work on it.

              Your last sentence falls afoul of the HN guidelines, and your comment would be far stronger without it. Which is unfortunate, since there's an interesting and curious conversation to be had about the significance of Bernstein's role in PQC.

              • chasil 43 days ago
                Bernstein's original lawsuit in the 90s resulted in the lifting of ITAR restrictions on strong cryptography.


                "The ruling in the case declared that software was protected speech under the First Amendment, which contributed to regulatory changes reducing controls on encryption."

                • tptacek 43 days ago
                  I'm talking about PQC, not his suit against the government.
              • kragen 30 days ago
          • TedDoesntTalk 43 days ago
            Can you summarize? That’s a PDF I can’t read.
            • simcop2387 43 days ago
              There are some technical details that I'm not qualified to summarize, but the gist seems to be that NISTPQC went back on its word about being transparent through the standardization process: it solicited private input after rounds 2 and 3, and used that unpublished input to make claims about the strength of at least one contender. The way this was done, at least procedurally, reeks of Dual EC-style manipulation, according to DJB. I don't believe he's claiming there are any NSA back doors, but he is alluding to a private party steering things in ways that might not be good.

              There are also apparently some remarks about DJB himself doing something wrong (I couldn't tell from this what it was; I haven't done a complete reading yet).

    • bwesterb 44 days ago
      NTRU-Prime, NTRU, Kyber and SABER are all great KEMs. NIST could've chosen any one of them. NIST never standardised Ed25519 and OpenSSH still uses it, which is perfectly fine.
    • CircleSpokes 44 days ago
      I imagine they would add support for the standardized algorithms and still support the ones they are currently using too.
  • baby 43 days ago
    Shameless plug: I wrote about all these schemes in Chapter 14 on post-quantum cryptography of Real-World Cryptography

    These are meant as a gentle introduction to the ideas and intuitions behind the schemes. The book is recent but some of that stuff (hash-based signatures) I started writing back in 2015 and is available on my blog:

    At the time, the schemes had not yet been chosen; fortunately I picked the right ones :) so I don't have to rewrite that chapter (yet).

  • jjice 44 days ago
    Been waiting on this announcement for a while. As someone who took a crypto class in college, but isn't a crypto expert (just knows the basics and basic theory), this is very neat to see. I'm looking forward to never implementing it myself :)
  • ENOTTY 44 days ago
    What's up with this?

    > In addition, NIST has engaged with third parties that own various patents directed to cryptography, and NIST acknowledges cooperation of ISARA, Philippe Gaborit, Carlos Aguilar Melchor, the laboratory XLIM, the French National Center for Scientific Research (CNRS), the University of Limoges, and Dr. Jintai Ding. NIST and these third parties are finalizing agreements such that the patents owned by the third parties will not be asserted against implementers (or end-users) of a standard for the selected cryptographic algorithm


    > NIST expects to execute the various agreements prior to publishing the standard. If the agreements are not executed by the end of 2022, NIST may consider selecting NTRU instead of KYBER. NTRU was proposed in 1996, and U.S. patents were dedicated to the public in 2007.

    • hn_throwaway_99 44 days ago
      NIST is going the proper route to ensure that any standards they publish can be freely implemented without implementers having to pay patent royalties. That's the reason for your second quote - if KYBER patent holders don't want to agree, they should know that NIST won't choose them for the standard.
      • nullc 44 days ago
        Just to clarify: My understanding is that the authors of Kyber aren't the patent holders in question-- rather a third party has patents which may read on Kyber and several other of the NIST finalists.

        It's really unfortunate that the licensing terms weren't announced at the same time: depending on how they're written, the result may still be unattractive to use, and since they've already announced the selection, NIST probably just lost some amount of negotiating leverage.

        (As the obvious negotiation would be "agree to these terms we find reasonable, or we just select NTRU prime")

    • rdpintqogeogsaa 44 days ago
      Key part here is "are finalizing". It's still possible for at least some of the deals to fall through. I guess NTRU is the backup plan just in case, and/or a way to apply pressure by letting the public know there's a plan B. I expect this passage implies at least one negotiation has been going poorly.

      It would probably be interesting to look up who of these people also has patents outside of the USA. If there really is someone being particularly stubborn, one might reasonably expect them to enforce the non-US patent variant outside of the USA.

    • madars 43 days ago
      > If the agreements are not executed by the end of 2022, NIST may consider selecting NTRU instead of KYBER.

      It is especially interesting that neither NTRU (nor NTRU Prime, a different proposal) is advancing to the 4th round. Wouldn't you want to encourage more analysis of your (implied) runner-up?

      • willglynn 43 days ago
        Not only that, NIST says:

        > Overall assessment. One important feature of NTRU is that because it has been around for longer, its IP situation is more clearly understood. The original designers put their patents into the public domain [113], in addition to most of them having expired.

        > As noted by the submitters, NTRU may not be the fastest or smallest among the lattice KEM finalists, and for most applications and use cases, the performance would not be a problem. Nonetheless, as NIST has selected KYBER for standardization, NTRU will therefore not be considered for standardization in the fourth round.

        "NTRU is obviously legal and perfectly suitable, but we're not picking it." I find this to be a baffling position given the as-yet-unsolved patent issues with KYBER.

        • silverspoonin 43 days ago
          >NTRU may not be the fastest or smallest.

          "It's slower and uses more memory" goes a long way in encouraging the evaluation of other options.

    • formerly_proven 44 days ago
      Crypto that requires royalties won't be widely implemented, so you basically don't need to bother standardizing it.
  • chrispeel 44 days ago
  • isido 44 days ago
  • RcouF1uZ4gsC 44 days ago
    HN Crypto and Quantum Experts.

    What is your prediction for when classical public key encryption using elliptic curve cryptography becomes practically vulnerable to quantum computers, such that we would need these PQC algorithms?

    10 years out? 20 years out? 50 years out? 100 years out?

    • hannob 44 days ago
      I've been following this space for a while and this is a good question, but I think the answer is really a "ranges from 10 years to never".

      There's a lot of investment currently in the quantum computer space (+ a lot of hype and scams). Yet this is still all very early research and far away from any practical use. The challenges to really build a QC that can break cryptography are enormous - and it is absolutely a possibility that they're too big to overcome.

      • chasil 44 days ago
        This article asserts that D-Wave and other quantum annealing devices will be able to mount attacks long before a machine exists that can run Shor's algorithm with error-corrected qubits in sufficient quantity.

        • latenightcoding 43 days ago
          Quantum annealing is not a threat to cryptography. You can safely dismiss this sort of article.
        • krastanov 43 days ago
          To second what the sibling comment has said, "quantum annealing" claims by DWave are considered fairly overblown (on some rare occasions even misleading/scammy). If the claims of this article held, they would have been much better known in the field and published in much more popular venues.
    • sweis 43 days ago
      The record for factoring using a quantum computer is 21. Don't read that as 21 bits: 3×7. This has been the record for 12 years, and it is arguably a result that "cheats" with a priori knowledge of the factors.

      There are some other examples of people factoring special-form composites that are particularly easy to factor on quantum computers, but those are basically stunts with no impact.

      To threaten RSA, quantum computers need to increase the number of qubits by 6 orders of magnitude and improve error correction by at least 2 orders of magnitude. Check out this blog post for an illustration of where we are at:

    • IncRnd 44 days ago
      I think what you are asking may be better answered by ignoring PQC and following the CNSA recommendations for up to TOP SECRET. The crypto is likely what you already use, but it defines how to get enough bits of security from each algorithm.

      There is a table of transition algorithms on the second or third page, depending on your screen size. [1]


    • NavinF 44 days ago
      "When will 256 bit ECC become insecure?" :

      The community prediction is 22% by 2032 which seems way too high IMO. I predict 5% due to advances in automated algorithm search and 0% due to quantum computers in that time frame.

      • marcosdumay 44 days ago
        Why would a crowdsourcing site know that? This is the kind of question where 1 expert will fare better than the average of 90% of the people.
        • NavinF 43 days ago
          Sure. If that 1 expert bothered to post a falsifiable prediction like “x% likely this’ll happen by year y”, the rest of us could read their argument and update our predictions.

          Unfortunately that’s pretty uncommon so everyone has to go by base rates (crypto algorithms seem to last x years historically) and vague guesses (quantum computer capabilities seem to be doubling every x years so I dunno maybe enough qbits by 2050)

          • marcosdumay 43 days ago
            > quantum computer capabilities seem to be doubling every x years so I dunno maybe enough qbits by 2050

            Ok, let's get an attempt from a mildly informed person, who is also probably better than the 90% average...

            The number of qubits seems to be growing linearly, at about 7 qubits every 2 years. Extending that trend says that none of us will ever see a quantum computer break 256-bit ECC.
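For what it's worth, the arithmetic behind that claim, with my own round-number assumptions about today's qubit counts and what an attack would need:

```python
# Back-of-the-envelope check of the linear-trend argument above.
# Assumptions are mine and deliberately rough: ~100 physical qubits
# today, on the order of 10 million needed for a cryptographically
# relevant machine, growth of ~7 qubits every 2 years.
current_qubits = 100
needed_qubits = 10_000_000
rate_per_year = 7 / 2

years_to_attack = (needed_qubits - current_qubits) / rate_per_year
print(f"{years_to_attack:,.0f} years")  # roughly 2.9 million years
```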

            But I really doubt the trend will hold. Quantum computing seems prone to surprise gains, and those are unpredictable by their nature.

            About this:

            > crypto algorithms seem to last x years historically

            I don't think we have enough data to decide on an average, but the distribution surely looks fat-tailed, so any summary statistic you compute from it will be useless.

            If history tells anything, it is that algorithms that have minor attacks will be broken quickly, and algorithms that don't have minor attacks will survive for very long.

    • bwesterb 44 days ago
      It's hard to say. Here is a great paper that tries to answer this question.

      See Figure 11. Optimistically 15 years. Pessimistically 35 years. But anything can happen.

      • Zamicol 43 days ago
        The linked study is about RSA, not elliptic curve cryptography.
        • upofadown 43 days ago
          It is generally accepted that elliptic curve cryptography is a bit easier to break with Shor's algorithm than RSA (something like half as hard), but it probably would not make any real difference in practice. So the paper is as directly applicable to elliptic curves as it is to anything.
        • krastanov 43 days ago
          Does that matter? Both are based on some hidden subgroup problem and both are breakable in a similar way.
    • Asraelite 44 days ago
      I want to see these actually being implemented in current software ASAP (layered with traditional crypto). As-is it's possible to capture encrypted traffic out of the air, store it for however many decades are needed, and then decrypt it in future.
    • ghaff 44 days ago
      It's worth noting that the relevant timeframe to implement PQC isn't just when quantum computers become sufficiently fast to break current crypto (assuming the answer isn't never). It can take a decade or longer to re-encrypt data and/or to update cryptographic infrastructure.

      Given that (varied) expert opinion on quantum computing being able to break current public key cryptography seems to mostly fall in the 10-20 year range, there is some, at least mild, urgency to start using PQC for the most sensitive data relatively soon.

    • upofadown 43 days ago
      We have not been able to implement even a single logical qubit of the sort required to run Shor's algorithm (we would need thousands). It is impossible to extrapolate from zero.
    • danuker 44 days ago
      I expect you'd see a large increase in Bitcoin Days Destroyed, perhaps unrelated to market volume, should someone break ECDSA.

      Bitcoin uses ECDSA to validate whether coins were spent by the owner of an address.

      • rvz 44 days ago
        Well, good thing that Falcon [0][1], created in part by the cryptographers who developed Algorand, has been selected for standardisation for post-quantum digital signature use cases.

        This tells me that Algorand is one of the more serious blockchain projects out there, with top cryptographers on board, as evidenced by Falcon.



      • jleahy 44 days ago
        The modern wallets don’t publish the public key though, so this is not likely to help.
        • baby 43 days ago
          That's only Bitcoin though
          • jleahy 43 days ago
            He did say bitcoin specifically.
    • adastra22 44 days ago
      10-20 years. As soon as we have atomically precise manufacturing, there are multiple approaches to making stable, scalable quantum computers that work. I see APM being possible on that time horizon. One company, Zyvex, has already prototyped those capabilities in the lab.
    • kvathupo 44 days ago
      Not an expert, but you should upgrade now to prevent attackers from stealing your encrypted data today, and decrypting it later. That said, you'll have to determine if your data is worth stealing.
    • baby 43 days ago
      I'm betting on never
    • bawolff 44 days ago

      Any predictions on these time scales are pretty much pointless.

  • oconnore 44 days ago
    > Additionally, SPHINCS+ will be standardized to avoid only relying on the security of lattices for signatures

    > Both BIKE and HQC are based on structured codes, and either would be suitable as a general-purpose KEM that is not based on lattices

    What's up with this caveat? Why would the standard require algorithms not based on lattices assuming there is confidence in the lattice based approach?

    Is this a security concern, or is there some performance (ops/sec or size) related trade-off?

    • nullc 44 days ago
      The security story for lattices hasn't been very stable.

      Consider the graph in the Classic McEliece marketing materials, showing the exponent in the attack costs for lattice-based crypto:

      Because of communication-cost considerations, the lattice candidates use problems small enough that another substantial improvement in attacks could leave them vulnerable. (No shock that they use small problems: if you're really not communication-cost constrained, use McEliece and don't worry about it.)

      If you do use lattice key agreement, be sure to use it in a hybrid configuration (combined with ECC like ed25519 or Curve448) to avoid the (small but hard to assess) risk that your security upgrade could actually be a security downgrade.

    • bawolff 44 days ago
      Presumably to hedge their bets. If suddenly someone finds a major problem with lattices, it's good to have an alternative waiting in the wings.

      See also sha-3 vs sha-256

      • oconnore 44 days ago
        If NIST feels the need to hedge their bets, why are they publishing at all? The whole point of these recommendations is so that I, a non-expert, don't have to reason about cryptographic bets.
        • kickling 44 days ago
          Well, most modern cryptography is based on assumptions that cannot be proven, so having different standards based on different assumptions is probably the only way to safeguard against one of them being proven false in the future.
          • bawolff 44 days ago
            To nitpick, afaik, it's not that they cannot be proven, it's that they have not been, and look very hard to prove, which is slightly different (not my area of expertise, but I assume this would be tied to P vs NP)
            • adastra22 44 days ago
              It is not tied to P vs NP as far as I'm aware. But it is the same sort of situation: number theory assumptions that are completely unproven despite many attempts.
              • pezezin 43 days ago
                According to Wikipedia, all the proposed PQC schemes are proven to be NP-Hard, so you could say that their security depends on P != NP:
              • bawolff 43 days ago
                I was thinking: if you could definitively prove these assumptions are hard, that would prove P != NP, because P = NP would imply an algorithm to solve these types of problems, since they are the type of thing that can be solved in polynomial time with the key but cannot be without it. (I'm a bit out of my depth here)
                • adastra22 43 days ago
                  For the stuff underlying asymmetric keys, yes. The hash function stuff doesn’t have backdoors.
                  • gpm 43 days ago
                    Hash functions too. If P=NP then you can reverse a hash in polynomial time.

                    NP is the set of all problems whose solutions you can verify in polynomial time, and a solution to the inverse-hash problem is just a plaintext that hashes to the right value; obviously you can check whether a plaintext is right in polynomial time by just hashing it and comparing the hashes. Thus reversing a hash function is in NP, so if P=NP it's in P.

                    There's some subtlety here in that "reversing" a hash function really just means coming up with a plaintext that generates the right hash, not the original one, but you can put any polynomial-time set of constraints on the plaintext and finding a plaintext that satisfies those constraints (and hashes to the right value) is still in NP, so the subtlety really doesn't save you much.


                    Side point, but since we're talking quantum, we should really be saying BQP=NP, not P=NP. BQP is the set of problems solvable in polynomial time on a quantum computer; it's a superset of P and a subset of NP, but we don't know if it's equal to either or both. I.e. P=NP implies BQP=NP, BQP != NP implies P != NP, and BQP != P implies P != NP, but the reverse of each of those statements isn't known to be true.
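The verify-easy/invert-hard asymmetry that puts hash inversion in NP can be made concrete. Here is a toy search over a deliberately tiny (2-byte) slice of a SHA-256 digest, so the brute force actually finishes; all names are mine:

```python
import hashlib
import itertools

def verify(candidate: bytes, target: bytes) -> bool:
    # Checking a proposed preimage is a single hash: cheap.
    return hashlib.sha256(candidate).digest()[:2] == target

def invert(target: bytes) -> bytes:
    # Finding one is exhaustive search, exponential in the number
    # of constrained output bits (only 16 here, so it's quick).
    for n in itertools.count():
        candidate = n.to_bytes(8, "big")
        if verify(candidate, target):
            return candidate

target = hashlib.sha256((12345).to_bytes(8, "big")).digest()[:2]
preimage = invert(target)
assert verify(preimage, target)
```

Note that, as the comment above says, `preimage` merely hashes to the right truncated value; it need not be the original input.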

          • oconnore 44 days ago
            Maybe it safeguards them from looking like they've screwed this up, but in terms of providing a concrete recommendation to system implementers, how does this safeguard anything? How does offering multiple algorithms in the PQC category help me make systems safer? What am I actually supposed to do here (how do I reflect this hedge in a system design)?

            They didn't feel the need to provide multiple recommendations during the AES, or the SHA-3 process, even though Rijndael and Keccak used different constructions relative to RC6/TwoFish and SHA-2/Blake2. Why now?

            • gdavisson 43 days ago
              The recommendations look clear to me: you should use CRYSTALS-Dilithium (unless you need smaller signatures, in which case use FALCON), but you should also be prepared to switch to SPHINCS+ on short notice if someone breaks CRYSTALS-Dilithium (or structured lattices in general).

              So best practice would seem to be to implement both CRYSTALS-Dilithium and SPHINCS+, set CRYSTALS-Dilithium as the default, and provide a switch (config setting, whatever) to switch to SPHINCS+. If you have long-term keys, you should have both forms set up & ready to use.
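That kind of algorithm agility can be as simple as a lookup keyed by one config value. A toy sketch (the signer stubs below are placeholders I made up, not real Dilithium/SPHINCS+ bindings):

```python
# Hypothetical registry: in practice each entry would wrap a real
# library binding; these stubs just tag the message so the
# dispatch is visible.
SIGNERS = {
    "dilithium": lambda msg: b"dilithium:" + msg,
    "sphincs+": lambda msg: b"sphincs+:" + msg,
}

# One config knob to flip if structured lattices fall.
DEFAULT_ALGORITHM = "dilithium"

def sign(message: bytes, algorithm: str = DEFAULT_ALGORITHM) -> bytes:
    return SIGNERS[algorithm](message)

assert sign(b"hi").startswith(b"dilithium:")
assert sign(b"hi", algorithm="sphincs+").startswith(b"sphincs+:")
```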

            • bawolff 44 days ago
              > They didn't feel the need to provide multiple recommendations during the AES, or the SHA-3 process, even though Rijndael and Keccak used different constructions relative to RC6/TwoFish and SHA-2/Blake2. Why now?

              SHA-3 was explicitly an alternative recommendation. The entire point was to come up with something that was not based on SHA-2, because they were worried that the attacks on MD5/SHA-1 could be extended to SHA-2 (which didn't really happen the way people feared). Even to this day, the general advice is not to use SHA-3.

              Less clear cut for AES, but at the time of standardization (and even now, afaik), Triple DES was considered secure, so it's not like there wasn't a secure alternative.

              These standards aren't meant as implementation guides. You still need cryptography knowledge to use them securely.

        • bawolff 44 days ago
          Life's hard and the world is uncertain. If NIST could make an algorithm that they could prove was 100% safe, with no possibility of future cryptanalytic breakthroughs, I am sure they would, but that is beyond the current state of the art.
          • Beldin 43 days ago
            You mean like a one-time pad? I'm sure the folks at NIST know about it; it is completely unbreakable and has been around for a while. It's not really practical to use, though, so it's typically reserved for very specific use cases.
            • upofadown 43 days ago
              One-time pads fall into the symmetric encryption category. There is no huge issue with symmetric encryption with respect to the possibility that someone might invent a quantum computer. The things people are working on for a post-quantum world, and that NIST is attempting to standardize, are in the asymmetric encryption category.
            • bawolff 43 days ago
              This is a silly nitpick. I think it's pretty well understood from context that I meant a practical quantum-safe key agreement algorithm. A one-time pad cannot be used for that purpose at all, let alone practically.
        • adastra22 44 days ago
          Non-cryptographers should not be implementing NIST standards. You should be using higher level APIs written by cryptographers which do employ NIST standards in the details.
          • astrange 43 days ago
            Implementing them for fun might turn you into a cryptographer, though, especially (or only?) if you manage to find everything you get wrong.
          • oconnore 43 days ago
            If you're including cryptography in a system design, you are almost certainly relying on NIST standards to select algorithms.
        • runjake 44 days ago
          They aren't publishing until 2024 at the earliest. This is just a heads-up that they will be publishing in the future.

          Presumably, they'll have a better idea by then.

        • Spooky23 43 days ago
          Perhaps they want industry to start working on software or hardware to leverage these algorithms?

          Cryptography is driven by defense applications. For all us civilian types know, these algorithms have been around for 30 years.

      • hinkley 44 days ago
        Particularly SHA-3 vs. SHA-512, which turned out to have issues.
        • adastra22 44 days ago
          SHA-512 doesn’t have any issues.
          • hinkley 43 days ago
            The selection criteria for SHA-3 included an internal state larger than the output size. SHA-1 and SHA-2 both repeat this mistake of MD5, which is what makes length-extension attacks possible. SHA-2 has variants that don't have this problem (the truncated ones, e.g. SHA-512/256), but SHA-256 and SHA-512 aren't among them.

            I'm having trouble finding it now, but I recall someone complaining that the constants for SHA-512 left something to be desired.

            • adastra22 43 days ago
              This isn’t a mistake per se, it’s how that class of hash functions—and really, almost every hash function ever—is implemented. It’s called the Merkle-Damgård construction. It adds some very good properties and is the basis for how hash functions can be used in hash tree constructions and such.

              But proving that the input state is evenly mixed among the output state is THE hard thing to prove (the hash function equivalent of the difficulty of factoring integers), so for the sake of ecosystem diversity NIST chose a hash function based on different principles for SHA-3. It’s not a criticism of SHA-2 that the difference was called out.

              The constants are the fractional bits of the cube roots of the first primes. This is effectively a nothing-up-my-sleeve random number selection. If there were a problem with this, that in itself would be a serious cryptographic result.
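              (This is straightforward to check. A Python sketch using exact integer arithmetic to recover the first SHA-512 round constant from the cube root of 2, the first prime:)

```python
def icbrt(n: int) -> int:
    """Integer cube root: the largest x with x**3 <= n."""
    x = 1 << -(-n.bit_length() // 3)     # initial guess >= cbrt(n)
    while True:
        y = (2 * x + n // (x * x)) // 3  # Newton's method over the integers
        if y >= x:
            break
        x = y
    while x ** 3 > n:                    # settle exactly on the floor
        x -= 1
    while (x + 1) ** 3 <= n:
        x += 1
    return x

# Shift 2 left by 3*64 bits so the integer cube root carries 64 bits
# below the binary point, then keep only those fractional bits.
k0 = icbrt(2 << 192) & ((1 << 64) - 1)
assert k0 == 0x428A2F98D728AE22  # the first SHA-512 round constant in FIPS 180-4
```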

    • kvathupo 44 days ago
      A point rendering the choice even more curious: Germany and the Netherlands have recommended the use of encryption not relying on the shortest vector problem [1]. The two suggestions of FrodoKEM (relying on hardness of the learning with errors problem) and Classic McEliece (relying on hardness of decoding random codes?) aren't lattice-based apparently.

      Perhaps NIST knows something we don't ; ^ )

      [1] -

      • markschultz 43 days ago
        LWE's hardness is based on SVP (ignoring issues of tightness, which aren't unique to FrodoKEM). The difference between FrodoKEM and Kyber/Saber isn't relying on SVP or not (they all essentially do), but relying on LWE over structured lattices or not.

        At a very high level, all of the three rely on an n x n matrix at a certain point. The "structured lattice" schemes (Kyber/Saber) make structural assumptions about this matrix, say that each row is a cyclic shift of the previous row. This turns an O(n^2) object into an O(n) object, giving many performance improvements. The downside is that the additional structure can plausibly be used for attacks (but the best attacks ignore the structure, so this is a "potential issue", not a current issue).
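        (A toy numpy sketch of the storage difference described above; the parameters are illustrative only, and Kyber's actual structure is negacyclic multiplication in a polynomial ring rather than a plain circulant matrix:)

```python
import numpy as np

n, q = 8, 97                      # toy sizes; real schemes use n >= 256 and a larger q
rng = np.random.default_rng(0)

# Unstructured LWE (FrodoKEM-style): the public matrix A is n*n values.
A_frodo = rng.integers(0, q, (n, n))

# Structured lattice (the Kyber/Saber idea): only the first row is stored
# or transmitted; every other row is a cyclic shift of it.
row = rng.integers(0, q, n)       # O(n) storage instead of O(n^2)
A_structured = np.stack([np.roll(row, i) for i in range(n)])

# Either way, an LWE sample has the shape b = A s + e mod q.
s = rng.integers(0, q, n)         # secret vector
e = rng.integers(-2, 3, n)        # small bounded error
b = (A_structured @ s + e) % q
```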

        • kvathupo 43 days ago
          Ah, thanks for the clarification!
    • latenightcoding 43 days ago
      Some people believe you can generalize Shor's algorithm to attack lattice-based cryptography.
  • forty 44 days ago
    Coincidentally, we have just published this today, if you want to play with PQ crypto in JavaScript
    • buu700 44 days ago
      Similarly, I just published this a few days ago:

      Edit: lol, actually it looks like you guys borrowed some of my code for that. (Which is totally fine and part of the point of open source!)

      • forty 43 days ago
        Apparently yes! I'm told we did use your other older project ntru.js as mentioned in the readme :) thanks for sharing your code!
  • bioemerl 43 days ago
    Something that worries me: if someone can crack our current encryption using quantum computers, couldn't they be logging everything we say right now? Everything we send today would then actually be insecure against someone 10 years in the future.
    • tptacek 43 days ago
      Yes. That's, for instance, why people say the KEM problem has more urgency than the signature problem; a PQC KEM is what you need today if you're worried that someone's archiving your TLS sessions so they can break them with the quantum computer their government promised them for Christmas in 2034. Even if your KEX involves a signature, your adversary can't time-travel back to 2022 to break it with their 2034 scooty-puff quantum edition. But if all you've got is classical ECC and RSA, you're in trouble.

      If you assume the PQC KEM doesn't interact with classical ECDH, you might want to get some kind of PQC KEM rolled out as quickly as you can, in a dual construction with ECDH; the worst that happens is, your new KEM isn't quantum-safe (or anything-safe), but your ECDH holds up. But that's (if you believe in quantum attacks on crypto) still better than no PQC KEM at all.

    • snapetom 43 days ago
      Yes. This is why the work is being done now, and why there will be urgency in moving PQC algorithms from academia to commercial use. Everything that has been stolen in data breaches up until then will be broken once QCs are viable.

      Good news is that we are likely more than 10 years away from QCs being useful enough to do this.

  • dsp 44 days ago
    • code_biologist 44 days ago
      Here's the warning: Lattice-based cryptography is much more risky than commonly acknowledged. This applies, in particular, to lattice KEMs under consideration within the NIST Post-Quantum Cryptography Standardization Project (NISTPQC) as of October 2021. The above document...

      There's a linked PDF paper with more detail.

    • mixedmath 44 days ago
      What does "djb" mean here?
    • kzrdude 43 days ago
      Is djb involved in any of the standardized algorithms here by the way?
      • markschultz 43 days ago
        Yes, many. I believe he's on the SPHINCS+ team (standardized), Classic McEliece (round 3, not standardized), and NTRU Prime (round 3, passed over for Kyber). Perhaps more, but he has significant skin in the game.
    • bawolff 44 days ago
      Isn't that the point of having "hybrid" mode?
      • api 44 days ago
        HMAC(pqc_shared_secret, ecc_shared_secret)
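        (A minimal Python sketch of that combiner; the session key stays secret as long as either input secret does, assuming HMAC-SHA-256 behaves as a PRF. Real protocols, e.g. the TLS hybrid key-exchange drafts, typically concatenate both secrets into a KDF instead:)

```python
import hashlib
import hmac

def hybrid_key(pqc_shared_secret: bytes, ecc_shared_secret: bytes) -> bytes:
    # The PQC secret keys the HMAC and the classical ECDH secret is the
    # message; an attacker needs both inputs to compute the output.
    return hmac.new(pqc_shared_secret, ecc_shared_secret, hashlib.sha256).digest()
```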
    • forty 44 days ago
      What's the "obligatory djb warnings"? Something like "any crypto that's not mine isn't great"? ;)
      • sterlind 44 days ago
        from skimming it, his main argument is that Kyber relies on many constructions (e.g. cyclotomic polynomials) that are actively under attack - researchers have been successfully chipping away at them and show no signs of stopping.

        he also alleges that NIST have been moving the goal posts to favor Kyber, and they've been duplicitous in their narrative.

        he favors NTRU, which iirc isn't his.

        • markschultz 43 days ago
          Cyclotomic polynomials are incredibly standard in the field. The only researcher I know of who has issues with them is DJB, and there have not been significant advances in cryptanalysis due to the use of cyclotomics (with the exception of problems not used by NIST candidates, i.e. the whole SOLILOQUY thing).
        • mti 43 days ago
          NTRU also relies on cyclotomic rings, so if distrust in cyclotomics was a good reason to reject Kyber, it would apply to NTRU too.
        • forty 43 days ago
          My understanding is that he worked on NTRU Prime, which would somehow have benefited from NTRU being chosen.
    • 0des 44 days ago
      should really be higher up.
  • carride 43 days ago
  • kragen 43 days ago
    Presumably, since Dual_EC_DRBG, it is counterproductive to rely on NIST's recommendations for secure cryptography. What should we rely on instead?
  • sbf501 44 days ago
    Waiting for the ELI5 sites to explain Kyber and LWE. :)
    • markschultz 44 days ago
      I wrote up an introduction to a version of FrodoKEM (severely unoptimized, for pedagogical purposes)

      It's the same base scheme as Saber/Kyber, although as Saber/Kyber are over algebraically structured lattices they are significantly more efficient.

      • sbf501 43 days ago
        Thanks for taking the time to write this up. But, woof, it's a bit more than ELI5. :) The Python code makes it a little clearer, since I'm not familiar with some of the notation. However, it does seem kind of magical that 'e' is derived during encryption and then sort of vanishes. I also don't quite get the bounded vs. uniform vector sampling calls (one for s and the other for chi). But this at least greases the wheels, so to speak. Thanks!
        • markschultz 43 days ago
          Thanks for the feedback! Roughly speaking, that all has to do with making e vanish later, so perhaps I need to revisit that section.

          Quickly (because I probably won't get to it for a few days): (q//2)*m can be seen as a form of error correction. You can check (either with pen and paper or programmatically) that, provided |e| < q/4, if noisy_m = (q//2)*m + e, then round(noisy_m / (q/2)) mod 2 = m. So e vanishes because it is bounded (not uniform), plus we encode m as (q//2)*m (i.e. in the "most significant bits" of the number).
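          (The pen-and-paper check, done programmatically; q = 3329, Kyber's modulus, is just a concrete choice for illustration:)

```python
q = 3329  # Kyber's modulus, used here only as an example

def encode(m: int, e: int) -> int:
    # put the bit m in the "most significant bits", then add bounded noise
    return ((q // 2) * m + e) % q

def decode(noisy_m: int) -> int:
    # round to the nearest multiple of q/2, then reduce mod 2
    return round(noisy_m / (q / 2)) % 2

# every bounded error |e| < q/4 gets rounded away, recovering the bit m
assert all(decode(encode(m, e)) == m
           for m in (0, 1)
           for e in range(-(q // 4) + 1, q // 4))
```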

    • baby 43 days ago
      I wrote a chapter containing explanations on these here: