What are the pros and cons of standardization?

July 1, 2020

The authors make a bold statement:

If we've learned 3 important things about cryptography design in the last 20 years, at least 2 of them are that negotiation and compatibility are evil. The flaws in cryptosystems tend to appear in the joinery, not the lumber, and expansive crypto compatibility increases the amount of joinery. Modern protocols like TLS 1.3 are jettisoning backwards compatibility with things like RSA, not adding it. New systems support just a single suite of primitives, and a simple version number. If one of those primitives fails, you bump the version and chuck the old protocol all at once.

I understand the criticism about the many algorithms that OpenPGP supports: the list may be too long, and implementing all of them smoothly and correctly may be too difficult. It is a frequent criticism. As a non-expert user, I myself feel uncomfortable when I have to make choices while creating a key (I use the default, knowing that it might not be the best option).
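To make that choice concrete, here is a minimal sketch of generating an OpenPGP key programmatically with the PGPy library; the user ID and the algorithm parameters below are purely illustrative assumptions, not a recommendation.

```python
# Minimal sketch using the PGPy library (pip install pgpy); the user ID and
# algorithm choices are illustrative, not recommendations.
import pgpy
from pgpy.constants import (CompressionAlgorithm, HashAlgorithm, KeyFlags,
                            PubKeyAlgorithm, SymmetricKeyAlgorithm)

# The choice a user faces: which public-key algorithm and size to use.
key = pgpy.PGPKey.new(PubKeyAlgorithm.RSAEncryptOrSign, 2048)

uid = pgpy.PGPUID.new("Alice", email="alice@example.org")
key.add_uid(
    uid,
    usage={KeyFlags.Sign, KeyFlags.EncryptCommunications},
    hashes=[HashAlgorithm.SHA256],
    ciphers=[SymmetricKeyAlgorithm.AES256],
    compression=[CompressionAlgorithm.ZLIB],
)

print(key.fingerprint)
```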

However, I must admit I don't see what the issue is with negotiation (and compatibility). A standard is the product of a negotiation, and if people use it, that is because it offers them something, probably a toolbox that you can adopt and adapt. ProtonMail decided to use ECC by default (which is supported by most implementations), while GnuPG uses RSA 2048 by default. But as long as she uses an up-to-date application, Alice can send emails to Bob without worrying about key types, thanks to standardization. With Signal, by contrast, your interlocutor needs to use Signal as well.
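To illustrate that point, here is a minimal sketch (again assuming the PGPy library; the key file and message are hypothetical) in which the sender never has to know whether the recipient's key is RSA or ECC.

```python
# Minimal sketch with PGPy; "bob_public.asc" is a hypothetical key file that
# could hold either an RSA key or an ECC key (assumed to carry the
# encryption usage flag, or an encryption-capable subkey).
import pgpy

bob_pub, _ = pgpy.PGPKey.from_file("bob_public.asc")

# The OpenPGP standard describes both RSA and ECC, so the library picks the
# right code path from the key material itself; Alice's code is identical
# either way.
message = pgpy.PGPMessage.new("Hello Bob, no manual algorithm negotiation needed.")
ciphertext = bob_pub.encrypt(message)

print(str(ciphertext))  # ASCII-armored output any conforming OpenPGP tool can decrypt
```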

By the way, the comparison with TLS 1.3 seems quite vague to me. It was negotiated over many months at the IETF (the same standards body behind OpenPGP), and many servers still do not support it, so we still need TLS 1.2, and even TLS 1.1 in some cases. It might not be the greatest comparison...
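As a side note on that compatibility, here is a minimal sketch (using Python's standard ssl module; "legacy.example.org" is a hypothetical host) of what this looks like on the client side: the client offers TLS 1.3 but keeps TLS 1.2 enabled rather than chucking the old protocol all at once.

```python
# Minimal sketch using Python's standard ssl module; "legacy.example.org" is
# a hypothetical server that may only speak TLS 1.2.
import socket
import ssl

context = ssl.create_default_context()
# Offer TLS 1.3 but keep TLS 1.2 enabled: dropping it would cut off the many
# servers that have not been upgraded yet.
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.maximum_version = ssl.TLSVersion.TLSv1_3

with socket.create_connection(("legacy.example.org", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="legacy.example.org") as tls:
        print(tls.version())  # e.g. "TLSv1.2" if the server cannot do 1.3
```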