One of our favorite blog posts is our “crypto right answers” post. It’s intended to be an easy-to-use guide to help engineers pick the best cryptography choices without needing to go too far down a rabbit hole. With post-quantum cryptography (PQC) recently transitioning from an academic research topic to a more practical cryptography concern we figured it’s time for an update of our cryptography recommendations.

One thing that makes recommending PQC challenging is that historically, we’ve been able to provide “better” answers for classical cryptography. Faster and stronger hashes, stronger password KDFs, easier-to-use primitives… These things all have the same fundamental “shape”: you can take an existing design and drop in something else to make it better. MD5 and BLAKE3 are not comparable in strength, but you can just use BLAKE3 in place of MD5 and get something far better with minimal API changes.

Occasionally we get safer designs and features that make primitives faster or easier to use, like built-in nonces or hashes with MAC modes. We also get some novel building blocks like XOFs, but unless you’re a protocol designer you probably aren’t impacted. Furthermore, we’ve typically packaged even the potentially dangerous elements in ways that make them difficult to misuse most of the time.

Unfortunately, things get way more complicated when we extend this to PQC. Firstly, not all new things have the same “shape” as old things. Secondly, there are far more tradeoffs. We have classical signatures that are fast, small, safe, easy to use, et cetera; but once you go PQC, you have to pick some properties and lose others. That means a more complex conversation about what you’re trying to accomplish. Finally, a lot of PQC systems, even really promising ones, have turned out to be completely busted, so it’s harder to make recommendations that we’re confident will stand the test of time. Our previous Crypto Right Answers post from 2018 builds on top of stuff Ptacek and Percival did before that, and some of our recommendations have been completely fine over more than a decade. PQC is significantly more fickle.

Because this is such a different beast, we’re splitting up this post. We still want to be able to help with the “cheat sheet” use case where we just make recommendations for people who’ve heard the spiel before (if that’s you jump to the right answers), but unlike the previous version, we need to give you way more of a spiel. Strap in.

### Why now?

The drum banging comes from the realization that quantum computers large enough to pose a threat to current cryptographic algorithms could be realized in the next 20 years, which by Mosca's Theorem means we should start migrating about now. This led several governments, including the US, to legislate the need to plan a migration to systems that resist quantum cryptanalysis.

Secrets can be long lived. Governments, for example, tend to work with secrets with life cycles over 25 years. On top of that, there are also grounds for “store now, decrypt later” attacks, meaning that an adversary might store encrypted messages in the present, to decrypt them once a powerful enough quantum computer is available. This type of attack has historical precedent, so it’s not an unrealistic concern.

Part of the migration plan is a coordinated effort from federal agencies to inventory technologies that might be vulnerable, and from NIST to standardize post-quantum cryptography. This led NIST to launch a contest in search of algorithms that are secure against both classical and quantum adversaries. The contest is still going, but in 2022 we caught a glimpse of four early winners, and the standardization of three of these was completed and released on August 13, 2024.

Unsurprisingly, the migration plan also found it necessary to mobilize private organizations and push for wider adoption of these standards. Migrations can take considerable time (it took 12 years to leave Python 2 behind) so a head start can help a lot.

So, let’s do some straight-to-the-point reconnaissance work and have a look at the protagonist of the story.

### Why are quantum computers different?

A quantum computer operates over qubits. A single ideal qubit is a superposition of the bases $∣0⟩$ and $∣1⟩$, which represent their classical counterparts, bits $0$ and $1$, but are in fact orthogonal vectors that form the basis of a linear vector space. What that means is that a single qubit can express any linear combination of the vectors $∣0⟩$ and $∣1⟩$. So, how is that helpful?

Well, one thing is that a single qubit can encode both $0$ and $1$… at the same time. Let’s look at the composition of a qubit, $φ$.

$$∣φ⟩ = α∣0⟩ + β∣1⟩$$

Although by definition $α$ and $β$ are what’s called probability amplitudes
(thus the squares of their absolute values have to add up to $1$), the qubit
allows a state of $(∣0⟩ + ∣1⟩)/\sqrt{2}$, which encodes that it is $0$ and $1$
with equal probability. More importantly, this property scales. Two qubits can
encode $00$, $01$, $10$ and $11$ simultaneously, and so on as the number of
qubits grows. This allows for *mathemagical* algorithms that can take these
ambiguous states and make them converge to a valid answer, which for simplicity
we can think of as processing all states in parallel, and that’s where most of
the gains of quantum computing come from.
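For instance, the two-qubit claim can be written out explicitly: the uniform superposition over all four basis states is

$$∣ψ⟩ = \tfrac{1}{2}\left(∣00⟩ + ∣01⟩ + ∣10⟩ + ∣11⟩\right)$$

where each amplitude is $1/2$, so the four squared amplitudes sum to $1$ and each outcome is equally likely upon measurement.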

Of course, it’s not as simple as that: this is the definition of an ideal qubit, and dealing with the non-ideal woes of real life poses a significant engineering challenge. Quantum computers are noisy, errors accumulate over long computations, and runs can quite simply fail to produce results.

At this point, it’s probably not sounding that bad, but let’s see what we can do with one.

### Quantum algorithms

As with classical computers, research on how quantum computers could be used started as soon as they were theorized. And lo and behold, in the 90s, the following decade, Peter Shor and Lov Grover independently concocted quantum algorithms that improved upon the cryptanalysis of classical cryptographic algorithms, making cryptography the most notable collateral damage of a functioning quantum computer. The algorithms, named after their inventors as Shor’s and Grover’s algorithm, attack asymmetric and symmetric cryptography, respectively.

Shor’s algorithm (it’s actually a couple of algorithms, but because they’re mathematically almost the same and didn’t come up at the same time, it became a thing to just say “Shor’s algorithm”) can efficiently solve problems such as the factorization of large numbers and the discrete logarithm problem. Grover’s algorithm, meanwhile, quadratically reduces the time complexity of searching unstructured databases, consequently increasing the effectiveness of brute-force attacks against symmetric keys and hashes. Grover’s algorithm is difficult to implement and can be easily mitigated by using 256-bit keys. Basically, a 128-bit key can be broken in $2^{64}$ steps by Grover’s, while a 256-bit key would require $2^{128}$ computations, which is a comfortable margin.
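To make the quadratic speedup concrete, here’s a quick sanity check of those numbers (a sketch: Grover’s cost is roughly $\sqrt{N}$ oracle queries for an $N$-key space, which simply halves the exponent of the key size):

```python
import math

# Grover's algorithm searches an unstructured space of N keys in ~sqrt(N) steps.
grover_steps_128 = math.isqrt(2 ** 128)  # attacking a 128-bit key
grover_steps_256 = math.isqrt(2 ** 256)  # attacking a 256-bit key

assert grover_steps_128 == 2 ** 64   # worryingly small margin
assert grover_steps_256 == 2 ** 128  # still a comfortable security margin
```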

The more pressing challenge arises from Shor’s algorithm, which poses a significant threat to current public-key cryptographic systems. Take RSA, for example: legitimate use already has cubic complexity with respect to the modulus size, and Shor’s has polynomial complexity with respect to the modulus size, so increasing key sizes doesn’t scale well here. We need new algorithms, and the NIST contest already brought us some.

### Post-quantum algorithms

Announced in 2022, with further additions to be made later, were Dilithium, Falcon, and SPHINCS+ for digital signatures, and Kyber for key exchange. These algorithms were picked from a large pool of applicants, and are now standardized as ML-DSA (Dilithium), SLH-DSA (SPHINCS+), and ML-KEM (Kyber). From the 2022 selection, only FN-DSA (Falcon) is still waiting to be standardized.

In the signature realm, NIST quite explicitly shows a preference for ML-DSA, as
it’s easier to implement safely and doesn’t use as many resources. FN-DSA was
chosen to satisfy applications that require smaller public key and signature
sizes, even though it’s difficult to implement, especially on constrained
devices, given the amount of resources it consumes and its propensity to
side-channel attacks. SLH-DSA was standardized because it’s hash-based and well
understood, while both ML-DSA and FN-DSA are lattice-based; lattice-based
algorithms being somewhat novel, NIST didn’t want to rely solely on them. This
*seems* like a contradiction, given that ML-KEM, the only key exchange winner
announced, is lattice-based.

Lattice-based cryptography started in the 90s and relies on the presumed difficulty of solving certain mathematical problems involving lattices, such as the Shortest Vector Problem (SVP), the Closest Vector Problem (CVP), or any of the related problems, like the Short Basis Problem (SBP), the Approximate Shortest Vector Problem (apprSVP), the Approximate Closest Vector Problem (apprCVP) and more. To simplify greatly: a given lattice can be defined by different bases. Some bases are “good” (short and almost orthogonal), which makes the problems listed above easy to solve; given a “bad” basis (long and almost parallel), the problems are hard. So you can construct a public-key cryptosystem by making the bad basis the public key, which allows anyone to encode data to a point in the lattice, while decoding it is hard enough that only the owner of the good basis can do it efficiently.

This is just the basic idea behind lattices; there are several different cryptosystems that use lattices, with different trade-offs, key sizes, easier proofs, among others. Learning with errors (LWE), one of the more popular ones, alone has three variations: LWE, Ring-LWE and Module-LWE. For an intuitive look into the subject you can watch these Chalk Talk videos, or if you’re feeling adventurous, you can get an intro from one of the pioneers in this recorded summer class.

However, that’s where the future additions come into play. The contest for key exchange is not over: while Kyber is an early winner, NIST still has a fourth round going in order to find alternative key exchange algorithms. On the signature side of things there’s been a similar development: a lot of people were dissatisfied with the signature choices because of their sizes and performance, prompting NIST to almost immediately launch a call for additional post-quantum signature alternatives.

In fact, an important characteristic of the PQ algorithms is that their artifacts—keys, signatures and ciphertexts—are rather large compared to their classical counterparts. You’ll see that a part of the issues of migrating to PQ algorithms is dealing with the larger artifacts, since suddenly they don’t fit in an unfragmented IPv6 packet.

We’ve also lost the ability to Diffie-Hellman. While we don’t think about it much as it’s not used frequently in this way, DH provides non-interactive key exchange. This means that a party can exchange a key with another without interaction, for example between an online client and an offline client. This DH property is often used as a form of authentication that does away with the burden of sending signatures back and forth. Efforts to find schemes that can serve as a PQ substitute to DH in protocols that depend on this property are ongoing.

The novelty of lattices creates another issue: they have not been subjected to the same extensive cryptanalysis as their classical counterparts. To leverage decades of cryptanalytic research while still enhancing security against quantum attacks, developers of post-quantum cryptographic (PQC) algorithms themselves recommend employing a hybrid approach that combines classical and PQC algorithms. So, let’s have a look at two popular hybrid schemes. Keep in mind that you don’t actually have to understand the next section, but we think it’s productive to at least show some of the complexity that underlies hybrid schemes.

### Make way for the hybrids

The hybrid signature protocol is a simple scheme that combines an elliptic curve signature using Ed25519 and a post-quantum signature using the lattice-based ML-DSA-65. The hybrid signature of a message is a concatenation of the elliptic curve signature and the post-quantum signature of the same message, using the sender’s elliptic curve and post-quantum secret keys. To verify the signature, the verifier splits the signature and verifies each part separately with the sender’s public keys, such that the signature is only valid if both signatures are valid. For a forgery in this protocol, the adversary needs to forge both signatures, so this protocol is at least as secure as the stronger of the two primitives.
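To illustrate the combiner logic (not the primitives), here’s a minimal sketch in Python. Since Ed25519 and ML-DSA-65 bindings aren’t in the standard library, the two component schemes below are HMAC-based stand-ins, an assumption made purely so the example runs; the splitting works because the classical scheme, like real Ed25519, produces fixed-length signatures:

```python
import hmac
import hashlib

CLASSICAL_SIG_LEN = 64  # Ed25519 signatures are always 64 bytes, so splitting is trivial

# Stand-ins: a real implementation would call Ed25519 and ML-DSA-65 libraries here.
def classical_sign(sk: bytes, msg: bytes) -> bytes:
    return hmac.new(sk, b"classical:" + msg, hashlib.sha512).digest()  # 64 bytes

def pq_sign(sk: bytes, msg: bytes) -> bytes:
    return hmac.new(sk, b"pq:" + msg, hashlib.sha512).digest()

def hybrid_sign(classical_sk: bytes, pq_sk: bytes, msg: bytes) -> bytes:
    # Concatenate the classical and post-quantum signatures over the same message.
    return classical_sign(classical_sk, msg) + pq_sign(pq_sk, msg)

def hybrid_verify(classical_sk: bytes, pq_sk: bytes, msg: bytes, sig: bytes) -> bool:
    # Split on the classical scheme's fixed signature length and check each half:
    # the hybrid signature is valid only if BOTH component signatures verify.
    classical_sig, pq_sig = sig[:CLASSICAL_SIG_LEN], sig[CLASSICAL_SIG_LEN:]
    ok_classical = hmac.compare_digest(classical_sig, classical_sign(classical_sk, msg))
    ok_pq = hmac.compare_digest(pq_sig, pq_sign(pq_sk, msg))
    return ok_classical and ok_pq
```

A forger would need to break both halves, which is the whole point of the construction.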

This hybrid key-exchange protocol combines elliptic curve cryptography with a quantum-resistant lattice-based public-key algorithm. The protocol is designed such that an attack able to recover the derived key must break both the classical and the post-quantum primitive. For the classical part of the protocol, you run an ephemeral elliptic curve Diffie-Hellman using X25519: given the receiver’s elliptic curve public key $PK^{ec}_{r}$, generate an ephemeral key pair ($SK^{ec}_{eph}$, $PK^{ec}_{eph}$), then compute the elliptic curve shared secret $ss_{ec} = X25519(SK^{ec}_{eph}, PK^{ec}_{r})$. For the post-quantum part, we use ML-KEM-768: given the receiver’s post-quantum public key $PK^{pq}_{r}$, compute the ciphertext $C_{pq}$ and the post-quantum shared secret $ss_{pq}$ as $C_{pq}, ss_{pq} = ENC(PK^{pq}_{r})$, where $ENC$ is the ML-KEM-768 key encapsulation function.

With both shared secrets, we compute the shared 256-bit key $K_{E}$ using a KDF to construct a dual PRF that takes the shared secrets as keys. A dual PRF is a function that takes two keys and an input, such that if an adversary controls one of these keys the output is still indistinguishable from random. There are several ways to construct a dual PRF from a KDF; this is just one of them. So, $K_{E} = KDF_{ss_{ec}, ss_{pq}} (PK^{pq}_{r} || C_{pq} || PK^{ec}_{r} || PK^{ec}_{eph})$. The protocol returns the key $K_{E}$, the ephemeral elliptic curve public key $PK^{ec}_{eph}$ and the post-quantum encapsulation ciphertext $C_{pq}$. $K_{E}$ is then used to derive the key and nonce for encryption of a plaintext, and it is necessary to send the receiver $PK^{ec}_{eph}$ and $C_{pq}$ so they can recover $K_{E}$ for decryption.

The receiver recovers the key $K_{E}$ in three steps. First, they execute the elliptic curve Diffie-Hellman using their secret key $SK^{ec}_{r}$ and the received ephemeral public key $PK^{ec}_{eph}$, obtaining the first shared secret $ss_{ec} = X25519(SK^{ec}_{r}, PK^{ec}_{eph})$. Then, using the ciphertext $C_{pq}$ and their private key $SK^{pq}_{r}$, they recover the second shared secret $ss_{pq} = DEC(C_{pq}, SK^{pq}_{r})$, where $DEC$ is the ML-KEM-768 key decapsulation function. With both shared secrets, the key can be obtained: $K_{E} = KDF_{ss_{ec}, ss_{pq}}(PK^{pq}_{r} || C_{pq} || PK^{ec}_{r} || PK^{ec}_{eph})$.

Note that we also tie in the post-quantum public key and the ciphertext. The reasoning is that key encapsulation is vulnerable to single key compromise impersonation, and tying the derived key to the receiver’s public key solves that. Kyber has protections against it, but this way we don’t need to trust them directly, and we get to reuse old proofs about generic key encapsulation and move on. Similarly, tying in the ciphertext just makes it easier to argue this is robust against chosen ciphertext attacks. We also tie in the kitchen sink, just to be safe.
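The key-combination step can be sketched in a few lines of Python. The shared secrets here are placeholder byte strings (real ones would come from X25519 and ML-KEM-768), and the HMAC chaining is just one of the several possible dual-PRF constructions:

```python
import hmac
import hashlib

def dual_prf_kdf(ss_ec: bytes, ss_pq: bytes, context: bytes) -> bytes:
    # Chain the two shared secrets through HMAC so the derived key stays
    # pseudorandom as long as at least one of them is secret; the transcript
    # (public keys and ciphertext) is bound in as the final input.
    inner = hmac.new(ss_ec, ss_pq, hashlib.sha256).digest()
    return hmac.new(inner, context, hashlib.sha256).digest()

# Placeholder values standing in for the real protocol artifacts.
ss_ec = b"\x01" * 32      # would be X25519(SK_eph, PK_r)
ss_pq = b"\x02" * 32      # would be the ML-KEM-768 encapsulated secret
transcript = b"PK_pq_r" + b"C_pq" + b"PK_ec_r" + b"PK_ec_eph"

K_E = dual_prf_kdf(ss_ec, ss_pq, transcript)
assert len(K_E) == 32  # a 256-bit key, as in the protocol
```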

### What you should do about this

Regarding the move to PQC, a lot of the important places where these changes need to happen likely don’t depend on you directly; they’re controlled by underlying libraries and infrastructure. However, you need to identify the places where you are relying on public-key cryptography and figure out if there are any long-term changes that you need to be preparing for. For example, in hybrid protocols, a classical and a PQ keypair are needed, and the PQ keys are one to two orders of magnitude larger than their classical counterparts.

So, does your system depend on very low-end devices running these algorithms? Do you have enough bandwidth to deal with the larger public key and signature/ciphertext sizes? Enough processing power to deal with the extra workload? These are the questions for the moment. PQ signatures, for example, are plagued with devious trade-offs: if you need fast signing and verification, you’ll be paying with large public keys and/or signatures, and the reverse is also true. SQIsign, a coveted contestant for digital signatures for boasting a 128-byte public key and a 335-byte signature, has a hefty 128-billion-cycle signing routine. Even with the new contest, it’s not looking like we’ll have a digital signature scheme that is a panacea. Luckily, most applications of digital signatures are not in a rush to get post-quantum secure, at least not as much as key exchange, which offers more acceptable trade-offs.

Below we compiled two tables that should help you get a feeling for the impact of the change. They’re compiled from the NIST submissions, with Curve25519 added for comparison. One thing to note: the signatures were measured on a Haswell processor, and key exchange on a Coffee Lake, and we found benchmarks for Curve25519 on those same microarchitectures. We plan to update this by running our own benchmarks on a single CPU, which would also allow us to measure memory footprint.

| Algorithm | NIST level | Secret key size (bytes) | Public key size (bytes) | Signature size (bytes) | Cycles (sign/verify) |
|---|---|---|---|---|---|
| ML-DSA-44 | 2 | 2528 | 1312 | 2420 | 333k/118k |
| ML-DSA-65 | 3 | 4000 | 1952 | 3293 | 529k/179k |
| ML-DSA-87 | 5 | 4864 | 2592 | 4595 | 642k/279k |
| Falcon-512 | 1 | 1281 | 897 | 666 | 1M/80k |
| Falcon-1024 | 5 | 2305 | 1793 | 1280 | 2M/160k |
| Ed25519 | 1 | 32 | 32 | 64 | 42k/130k |

| Algorithm | NIST level | Secret key size (bytes) | Public key size (bytes) | Ciphertext size (bytes) | Cycles (enc/dec) |
|---|---|---|---|---|---|
| ML-KEM-512 | 1 | 1632 | 800 | 768 | 45k/59k |
| ML-KEM-768 | 3 | 2400 | 1184 | 1088 | 68k/82k |
| ML-KEM-1024 | 5 | 3168 | 1568 | 1568 | 97k/115k |
| X25519 | 1 | 32 | 32 | - | 125k/125k |

### Real World Post Quantum Examples

Some applications are already charting this new ground:

**OpenSSH**: They have been using PQC hybrid protocols by default since version 9.0.

**Chrome**: You can go to `chrome://flags/` and enable `TLS 1.3 hybridized Kyber support`, which implements the TLS hybrid draft proposal.

**Firefox Nightly**: You can turn on `security.tls.enable_kyber` in `about:config`.

**BoringSSL**: They have implemented X25519+Kyber768, and have been playing
with adding Dilithium support
(with some difficulties).
Consequently, you have an nginx version compiled with BoringSSL that supports
it.

**Open Quantum Safe**: a set of forks of commonly used libraries with the aim
of supporting PQC algorithms.

**Zig**: If you’re an enthusiast for upcoming programming languages, here is an
interesting one that I’ve enjoyed a few weeks with. Their crypto lib has had
post-quantum support since version 0.11.

**Signal**: libsignal, and consequently the Signal app, has introduced an
update to their key agreement protocol that makes it post-quantum. X3DH is now
PQXDH.

**iMessage**: iMessage uses a post-quantum key agreement protocol called PQ3.
Very similar to PQXDH.

## PQC Right Answers

#### Encrypting Data

*Latacora, 2018:* KMS or XSalsa20+Poly1305.

*Latacora, 2024:* KMS or XSalsa20+Poly1305.

Grover’s algorithm doesn’t hurt 256-bit keys that badly. It’s hard to implement against 192- and 256-bit keys because it’s a long serial computation, and quantum computers don’t seem to like that at all, with all the error propagation. Even if you ignore all of those practical problems, you turn a 256-bit margin into a 128-bit one, which is fine.

In fact, if you’re worried about encryption, key length, symmetric “signatures”, hashing, random IDs, password handling, online backups, our opinion hasn’t changed that much from our previous post.

We’re not going too much into the details here, since the motivations haven’t changed since the last time. But briefly, if you have the opportunity to just use a KMS, do it. It’s going to save you time, and further headaches with key management. These services usually come already certified for HIPAA, FedRAMP and more, which helps to tick boxes faster if you’re a small company.

If you can’t use KMS, there are a lot of good choices. We choose XSalsa20+Poly1305 for many reasons. It’s the default for libsodium, so it’s easy to find bindings for it in whatever language you’re using. It has an extended nonce, which means you can use random nonces without fear.

Are there other good options? Yeah, but time spent worrying about how many encryptions you’re going to trigger under one key with AES-GCM is time better spent elsewhere.

*Avoid:* Non-authenticated encryption. Very few applications have valid reasons
to skip authenticated encryption, and those that do usually have something else
providing the authentication.

#### Symmetric key length

*Latacora, 2018:* 256-bit keys.

*Latacora, 2024:* Yep, 256-bit keys.

This is important for PQ. And for classical it’s such an easy choice to make: the performance gain from using smaller keys is not all that. Just using a 256-bit key means that you don’t have to worry about multitarget attacks or other odd scenarios you might find yourself in.

*Avoid:* Overly large keys, cipher “cascades”.

#### Hashing Algorithm

*Latacora, 2018:* SHA-2.

*Latacora, 2024:* Same.

I’m not gonna judge anyone using SHA-3 or Blake2/3 though. SHA does get a bump for being approved by NIST and for being available pretty much anywhere. We like Blake a whole lot, but you don’t want to trust random users on GitHub with porting it to every new hot language that comes out, while SHA is frequently part of the standard libraries.

Also, truncated versions have extra protection against length extension attacks, since the hash the adversary sees is not the whole state of the function, so it can’t be kickstarted. So, you can use SHA-512/256 and buy yourself resistance to developers hashing poorly serialized strings.
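As a sketch of why truncation helps: the attacker never sees the full 512-bit internal state, so they can’t resume the hash. (The plain truncation below is for illustration; the actual SHA-512/256 standard also uses distinct initial values, and many libraries expose it directly.)

```python
import hashlib

full = hashlib.sha512(b"key=value&user=alice").digest()
truncated = full[:32]  # keep 256 bits of a 512-bit digest

# A length extension attack needs the hash's full internal state to keep
# absorbing attacker-controlled data; the truncated digest doesn't reveal it.
assert len(full) == 64
assert len(truncated) == 32
```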

*Avoid:* SHA-1, MD5 and MD6.

#### Symmetric “Signatures”

*Latacora, 2018:* HMAC.

*Latacora, 2024:* Still HMAC.

HMAC is ubiquitous. And it’s important to stay boring here, because HMAC has several critical applications: securing APIs, cookies, tokens, and so on. It’s also important to compare the tag with a constant-time function; good libraries that provide HMAC usually also provide a secure compare function.
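A minimal sketch of both points, using Python’s standard library (the key and message values are made up for the example):

```python
import hmac
import hashlib

key = b"a-random-256-bit-server-side-key"  # in production: from a CSPRNG
msg = b"session=1234;role=user"

tag = hmac.new(key, msg, hashlib.sha256).digest()

def verify(key: bytes, msg: bytes, received_tag: bytes) -> bool:
    expected = hmac.new(key, msg, hashlib.sha256).digest()
    # compare_digest compares in constant time, avoiding timing side channels
    return hmac.compare_digest(expected, received_tag)

assert verify(key, msg, tag)
assert not verify(key, b"session=1234;role=admin", tag)
```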

*Avoid:* HMAC-MD5, HMAC-SHA1 and such. The underlying hash function has to be
safe.

#### Random IDs

*Latacora, 2018:* Use 256-bit random numbers.

*Latacora, 2024:* You should get 100 lava lamps, point a camera to them and use
the frames as seed for a PRNG.

You see, lava lamps never repeat their patterns, by having a bunch of them… I’m kidding.

Use 256-bit random numbers. Be careful with the source though, use /dev/urandom.
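In Python, for example, the `secrets` module draws from the OS CSPRNG (the same source as /dev/urandom):

```python
import secrets

token = secrets.token_hex(32)  # 32 bytes = 256 bits, hex-encoded
assert len(token) == 64
```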

*Avoid:* Using userspace random number generators like the OpenSSL RNG, haveged,
prngd, egd or /dev/random.

#### Password Handling

*Latacora, 2018:* In order of preference, use scrypt, argon2, bcrypt, and then
if nothing else is available PBKDF2.

*Latacora, 2024:* Small change. In order of preference, use argon2id, scrypt,
bcrypt, and PBKDF2 only if FIPS compliance is forcing you to.

A few things are motivating the changes. Scrypt cryptanalysis advanced, with
some side-channels found. Argon2 cryptanalysis also advanced a bit, with the
`i` and `d` variants having a reduced security margin, but `id` managed to stay
a safe choice. Now it’s not only a balance between GPU attacks and side-channel
resistance, but also about keeping a better security margin.
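argon2id needs a third-party binding in most stacks (in Python, for example, the `argon2-cffi` package); scrypt, the runner-up, ships in Python’s standard library, so here’s a sketch of the salt-and-verify flow using it:

```python
import hashlib
import hmac
import secrets

# Classic interactive parameters: n is the CPU/memory cost, r the block size,
# p the parallelism. Tune n upward as far as your latency budget allows.
SCRYPT_PARAMS = dict(n=2**14, r=8, p=1, dklen=32)

def hash_password(password: bytes):
    salt = secrets.token_bytes(16)  # unique salt per password
    return salt, hashlib.scrypt(password, salt=salt, **SCRYPT_PARAMS)

def check_password(password: bytes, salt: bytes, stored: bytes) -> bool:
    candidate = hashlib.scrypt(password, salt=salt, **SCRYPT_PARAMS)
    return hmac.compare_digest(candidate, stored)
```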

*Avoid:* Normal cryptographic hashes like any SHA, any Blake, any MD. For the
sake of the people, please.

#### ~~Diffie-Hellman~~ Key Exchange

*Latacora, 2018:* Probably nothing. Or use Curve25519.

*Latacora, 2024:* X25519+ML-KEM-768, or rather P256+ML-KEM-768 if you’re
worried about compliance.

I regret to inform you that the only post-quantum Diffie-Hellman contender we had was broken in 2022. And when I say broken, I mean an i5 laptop could break it in a day. There is CSIDH, which could probably step up at some point, but it’s been somewhat abandoned, likely due to difficulties finding parameters that achieve the desired security margins, so it really doesn’t fit. We ended up needing to rename the section.

Now, let’s not rush things unnecessarily: while you should move to X25519+ML-KEM-768 as soon as it becomes available in your stack, you should avoid going on GitHub looking for repositories to try and stitch things together yourself. If you can, wait for a major library to expose this.

Of course, if you have the option of running pre-shared keys through the Noise framework, you’d be essentially relying on symmetric crypto, and provided you use a 256-bit key you’ll be post-quantum secure. WireGuard has a pre-shared symmetric key feature, for example. And now that Signal, iMessage, and some browsers are already PQ secure, you have the means to ‘pre-share’ the key, which is a fair way to get around the whole issue if you’re just trying to set up a WireGuard service.
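As a sketch of what that looks like in practice, WireGuard’s optional `PresharedKey` gets mixed into the handshake alongside the Curve25519 exchange; a peer section looks roughly like this (key values elided):

```ini
# Generate a random 256-bit pre-shared key with: wg genpsk > peer.psk
[Peer]
PublicKey = <peer public key>
# Mixed into the handshake; even if the Curve25519 exchange is someday
# broken by a quantum computer, the derived session keys stay secret.
PresharedKey = <contents of peer.psk>
AllowedIPs = 10.0.0.2/32
```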

Now, if you are deeply in need of a non-interactive post-quantum scheme, people are trying to figure out non-interactive key exchange at the moment with protocols like Swoosh. These could help cover the hole that the lack of a post-quantum Diffie-Hellman is leaving. But it’s still hot out of the oven, and you could burn your mouth.

*Avoid:* Pure ML-KEM implementations, at least for a while longer. Also avoid
using Kyber implementations instead of ML-KEM; it’d be akin to using Rijndael
instead of AES, and it could get you in trouble in an audit.

#### Asymmetric Signatures

*Latacora, 2018:* Use Nacl or Ed25519.

*Latacora, 2024:* You’re probably fine with our 2018 advice for most
applications; most signature keys have short life cycles, meaning they’re not
vulnerable to “harvest now, decrypt later”.

If for some reason you have to deal with long-standing keys and signatures, we’d recommend Ed25519+ML-DSA-65, or P256+ML-DSA-65 if you depend on getting a thumbs-up from the government.

Also, some post-quantum Diffie-Hellman slack will be picked up by signatures. So if you’re looking for deniable authentication that was provided via static DH keys, you’ll have to set up some devious schemes like ring signatures or designated verifier signatures. But if you can stay boring, stay boring. Cryptography is not the area to get spicy, at least not without calling in the experts.

*Avoid:* Again, pure ML-DSA implementations, at least for a while longer. Also
avoid Dilithium implementations instead of ML-DSA.

#### Asymmetric encryption

*Latacora, 2018:* Use Nacl/libsodium (box / crypto_box).

*Latacora, 2024:* Ok, this is tricky, because it’s affected by quantum
computing. If you’re really dependent on encrypting data like this, there are
schemes we can implement, namely doing the hybrid key exchange as explained in
the first section, and using the key in an AEAD, basically a
“hybrid_pq_crypto_box” or a “PQ-HPKE”. However, if possible I’d wait for
established cryptographic libraries to implement them.

Libsodium maintainers have said that they’ll implement it as soon as everything is squared away, which should be soon given the ML-KEM standard just came out. Same with Go’s crypto library, which will probably implement PQ HPKE.

#### Website Security

*Latacora, 2018:* Use AWS ALB/ELB or OpenSSL with LetsEncrypt

*Latacora, 2024:* Same.

Honestly, when it comes to post-quantum, this is likely not up to you. Clients and TLS providers need to implement ciphersuites with hybrid key exchanges. For now, I’d keep it to our old advice.

There are a few places where you can start playing with it though, and figure out early how your systems will handle it.

There are also forks of BoringSSL and OpenSSL for testing purposes where the maintainers also modified clients to work with post-quantum hybrid schemes: clients like curl, nginx, haproxy, Chromium and more. If you use Chrome or Firefox Nightly, you can enable hybridized Kyber support too, as we’ve shown above; keep in mind these are experimental features with limited support, but most Google services support it.

If you’re in AWS, you can use their s2n-tls implementation, which has hybrid key exchange schemes. AWS Certificate Manager has also been playing with post-quantum schemes.

*Avoid:* Third party, offbeat TLS libraries.

#### Client-server application security

*Latacora, 2018:* Use AWS ALB/ELB or OpenSSL with LetsEncrypt.

*Latacora, 2024:* Again, same story.

Keep it to AWS ALB/ELB or OpenSSL, with LetsEncrypt. Everybody has their hands on the trigger to release a hybridized post-quantum scheme.

In 2018, we also discussed the wonders of TLS 1.2+, Curve25519 and ChaChaPoly, which are all still valid. And now in 2024 we can tell you that TLS 1.3 is pretty neat too, with the protocol by itself guaranteeing forward secrecy, it’s hard to complain.

*Avoid:* Designing your own encrypted transport, which is a genuinely hard
engineering problem; using TLS below 1.3 in a default configuration, like, with
“curl”; using “curl”, IPSEC.

#### Online backups

*Latacora, 2018:* Tarsnap.

*Latacora, 2024:* Tarsnap.

## Changelog

**August 14, 2024**: NIST standardized Kyber as ML-KEM, Dilithium as ML-DSA and SPHINCS+ as SLH-DSA. The post has been updated to reflect that.