Advancing Our Bet on Asymmetric Cryptography

  • Interesting. It does seem that being more agile in PKI deployment is going to be a requirement in the next few years as we grapple with rolling out a potentially interesting variety of PQ signatures and hybrids.

    Especially considering exploding PQ signature and key sizes, this looks increasingly like a data synchronization problem between the server and clients. I wonder if we could kill two birds with one stone by using trust expressions consisting of a set of certificate indexes against a trust store database, instead of trust store versions and exclusion labels. In that model, a trust store is just a centrally managed list where each certificate is assigned a unique 64-bit index.

    For example, a client says "I use trust store database XYZ with certificate indexes: <ordered, integer compressed 64-bit index list, maybe a couple hundred bytes>". The server constructs (or pulls a cached copy of) a trust chain from one of the listed roots and sends it to the client. Intermediate certificates may also be stored in the trust store database - and cached on the client. In subsequent requests, the client may include those intermediate indexes in its request, allowing the server to respond with a shorter chain. Clients with an old, long trust chain might have a long first exchange, but after caching intermediates they can have a much faster/shorter negotiation. As certificates expire, they are removed from both the trust store database and the client's cache, naturally moving the 'working window' of certificates forward over time.

    This shifts a bit of work onto the server, but dramatically reduces complexity on the client. The client just states which certificates it has and what algorithms it supports, and the onus is placed on the server to return the "shortest" chain that the client can use as a proof. Rough sketch of what I mean below.
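
    Here is that sketch, in Python; everything in it (the names, the delta+varint encoding, the chain-selection rule) is my own strawman, not anything from the article:

      from dataclasses import dataclass

      def encode_indexes(indexes: list[int]) -> bytes:
          """Delta-encode a sorted list of 64-bit trust-store indexes, then
          varint-pack it, so a list of a couple hundred indexes stays compact."""
          out = bytearray()
          prev = 0
          for idx in sorted(set(indexes)):
              delta = idx - prev
              prev = idx
              while True:                        # LEB128-style varint
                  byte = delta & 0x7F
                  delta >>= 7
                  out.append(byte | (0x80 if delta else 0))
                  if not delta:
                      break
          return bytes(out)

      @dataclass
      class CandidateChain:
          certs: list[bytes]       # leaf first, in issuing order
          issuer_ids: list[int]    # trust-store index of the cert that issued certs[i]

      def select_chain(chains: list[CandidateChain], client_ids: set[int]) -> list[bytes] | None:
          """Pick the chain that lets the server send the fewest certificates:
          walk each chain and stop at the first issuer the client already has."""
          best: list[bytes] | None = None
          for chain in chains:
              for depth, issuer in enumerate(chain.issuer_ids):
                  if issuer in client_ids:                  # client already trusts or has cached this one
                      candidate = chain.certs[: depth + 1]  # send only what the client is missing
                      if best is None or len(candidate) < len(best):
                          best = candidate
                      break
          return best

      # e.g. a client advertising three certificate indexes it holds:
      wire = encode_indexes([41, 42, 1337])   # 4 bytes on the wire

    Delta+varint is just one plausible way to keep the index list small; a range or bitmap encoding against a centrally assigned index space would work too.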

  • One perpetual source of concern that I have is how this will work in practice. NIST has not standardized algorithms as of yet. The NSA has come out in opposition to hybrid schemes (note that the NSA is also a big fan of CSfC, which uses two entirely separate layers of crypto; that could be how they end up doing a hybrid scheme, one layer classical, one layer PQC. Will they? I have no clue). But this protocol is still a draft.

    OpenSSH has adopted its own algorithm, one that afaik was on the NIST shortlist for PQC but not a final candidate. That isn't standardized either.

    Given that Govt (which mandates encryption requirements via blunt tools, like saying it will only purchase things that meet its requirements) and Industry are going in two different directions, and industry is doing whatever it thinks best without waiting for standardization, it feels like this is going to be a source of headaches to support properly in the future due to the diversity of schemes.

    I am actually in favor of what Google / OpenSSH are doing; enabling new things shouldn't break stuff and should just be a net positive in their own bubbles, but the govt opposition and foot-dragging make this harder.

  • Doubling down on my bet nobody reading this will live to see a quantum computer that breaks year 2000 era crypto.

  • Hmm... this is about PQ cryptography, while I was expecting a status update on Ed25519 in WebCrypto, which, sadly, is still available only behind an experimental platform flag: https://caniuse.com/mdn-api_subtlecrypto_verify_ed25519

  • I just finished the Security Cryptography Whatever episode [0] about this, and when Eric Rescorla was going on about how they almost threw the Web PKI overboard for a blockchain but it was too slow, I was like, "just use a layer 2 like Lightning! It's fast, like Lightning!" But then they described SCTs and I was like, "well OK, way worse name, but they got there."

    [0]: https://securitycryptographywhatever.com/2024/05/25/ekr/

  • This is a really well written article.

  • This has been causing a number of issues with proxies. We use nginx, and we have started to see problems with Chrome users and handshakes not working properly.

  • If this lowers performance, can we just turn it off and forget about it?

  • Inside Google "Advancing our bet" is a euphemism for shutting down (hat tip to Fiber). I'm deeply surprised that an article came out with that title that is actually true, given how negatively that phrase is seen.

  • To tl;dr for people:

    - As we've known for years, cryptographically-relevant quantum computers (CRQC) likely could wreck digital security pretty massively

    - For HTTPS, 2 out of its 3 uses of cryptography are vulnerable to a CRQC

    - The currently accepted algorithms that fix these vulnerabilities transmit 30+ times the data of current solutions, which under less reliable network conditions (like mobile) can increase latency by as much as 40%

    - Because attackers could store data now and decrypt it later with a CRQC, some applications need to deploy a solution now, so Chromium has enabled Kyber (aka ML-KEM) for those willing to accept that cost (rough sketch of the hybrid key exchange idea after this list)

    - However, other algorithms are being worked on to reduce that data size; the catch is that server operators can generally only serve one certificate at the moment, and older clients like smart TVs, kiosks, etc. are unlikely to support certificates using the new algorithms

    - So they're advocating for "trust anchor negotiation": letting clients and servers negotiate which certificate to use, so servers can offer multiple at the same time
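
    For the Kyber point: as I understand it, what Chrome actually ships is Kyber hybridized with X25519, with the two shared secrets mixed into one key so an attacker has to break both. A minimal conceptual sketch in Python; the KEM functions below are deliberately fake stand-ins (zero security, they exist only to make the combiner runnable), and the real TLS key schedule is more involved:

      import hashlib
      import os

      def fake_kem_keygen() -> tuple[bytes, bytes]:
          """Placeholder for X25519 or ML-KEM key generation. NOT real cryptography."""
          sk = os.urandom(32)
          pk = hashlib.sha256(b"pk" + sk).digest()
          return pk, sk

      def fake_kem_encaps(pk: bytes) -> tuple[bytes, bytes]:
          """Placeholder encapsulation: returns (ciphertext, shared secret)."""
          ct = os.urandom(32)
          return ct, hashlib.sha256(pk + ct).digest()

      def fake_kem_decaps(sk: bytes, ct: bytes) -> bytes:
          """Placeholder decapsulation: recomputes the same shared secret."""
          pk = hashlib.sha256(b"pk" + sk).digest()
          return hashlib.sha256(pk + ct).digest()

      def combine(classical_ss: bytes, pq_ss: bytes) -> bytes:
          """Hybrid combiner: the session key depends on BOTH shared secrets, so
          recording traffic now and breaking only the classical half later gains nothing."""
          return hashlib.sha256(classical_ss + pq_ss).digest()

      # Client generates one keypair per component and sends both public keys.
      c_pk, c_sk = fake_kem_keygen()   # stands in for X25519
      q_pk, q_sk = fake_kem_keygen()   # stands in for ML-KEM / Kyber
      # Server encapsulates against both public keys and derives the hybrid key.
      c_ct, c_ss = fake_kem_encaps(c_pk)
      q_ct, q_ss = fake_kem_encaps(q_pk)
      server_key = combine(c_ss, q_ss)
      # Client decapsulates both ciphertexts and arrives at the same key.
      client_key = combine(fake_kem_decaps(c_sk, c_ct), fake_kem_decaps(q_sk, q_ct))
      assert client_key == server_key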

    Honestly really impressively written article. I've understood the risk that a cryptographically-relevant quantum computer would pose for years, but I didn't really know/understand what was being done about it, or the current state of things.

  • Is "advancing our amazing bet" a nod to the Google Fiber turndown?

  • As far as I understand it, the tech is already there (lattice-based algorithms, etc.), but nobody has bothered to deploy it yet.

    Probably a similar issue to IPv6.