Last week, I wrote an article titled Is it time to rethink SSL?, where I pointed out that SSL’s reliance on Certificate Authorities is inherently flawed, because there is no guarantee that a CA will never be compromised. One of the key problems with the CA model of certificate authentication is that once an application decides to trust a CA, there is no realistic way to roll back that decision if the CA eventually proves to be untrustworthy.
Sure, the CA certificate can be removed from the list of trusted certificates, but this is done at the risk of breaking SSL for thousands of websites. The internet, as it stands at the moment, is a CA addict. It is entirely dependent on a few hundred CA vendors in order to function properly. Without VeriSign, you are unable to buy products from Amazon. Without Thawte, your Google Mail is open for anyone to see. As I mentioned before, when Comodo was compromised, application vendors decided not to revoke its CA certificates, because that would have broken too many websites. Instead, there were only a few warnings to users in the know that some Comodo certificates were not really valid. Doesn’t that give you a nice warm and cozy feeling?
So how do we escape CA dependence? A few years ago, a group of researchers at Carnegie Mellon University started working on a project to provide better authentication for self-signed certificates. The aptly named Perspectives Project took the view that the main function of a Certificate Authority is to prevent a Man In The Middle (MITM) attack during the negotiation of a secure connection. By setting up a variety of ‘Notaries’ with different network routes to the same site, it becomes possible for a client application to query these Notary servers and ask whether they retrieve the same certificate from the website in question. If all of the Notaries agree that the certificates match, then you can be pretty sure that the certificate you received is the one actually provided by the website, and not one presented to you by an eavesdropper intercepting your communications. It’s a great idea, and it works pretty well, but there are a few problems with it.
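At its core, the Notary check reduces to comparing certificate fingerprints observed over different network paths. Here is a minimal sketch in Python, assuming unanimous agreement is required (real Perspectives clients can also apply looser quorum and duration policies); the function names are mine, not from the project:

```python
import hashlib

def fingerprint(cert_der: bytes) -> str:
    """Identify a certificate by a digest of its DER bytes."""
    return hashlib.sha1(cert_der).hexdigest()

def notaries_agree(site_cert: bytes, notary_certs: list) -> bool:
    """Trust the connection only if every Notary, over its own network
    route, retrieved the same certificate that we got from the site.
    An eavesdropper on our path would have to compromise every route."""
    site_fp = fingerprint(site_cert)
    return all(fingerprint(c) == site_fp for c in notary_certs)
```

If an attacker sits only between you and the site, the Notaries approaching from elsewhere will see the genuine certificate and the comparison fails, which is exactly the MITM detection the project is after.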
In order to limit the number of connections made in the chain of requests, the Perspectives project designed the solution so that Notaries keep a cache of the certificates for servers they have connected to. The problem here was that if the server you were connecting to changed its certificate before the Notaries had updated their caches, validation would fail, because the Notaries would return a different certificate from the one you had just received from the server.
At the 2011 Black Hat conference last month, Moxie Marlinspike presented his own implementation of Perspectives that addresses many of these problems. His project, called Convergence, builds directly on Perspectives with a number of cleverly thought-out enhancements. The first improvement deals with Notary lag: the client application sends the certificate it received from the website to the Notary, and if that certificate doesn’t match the one in the Notary’s cache, the Notary re-queries the website to update its cache. This ensures that certificate matching always takes place against the most current certificate, while retaining the advantage of caching in the large majority of cases. The second improvement is that Convergence addresses privacy concerns, taking a two-pronged approach to the problem.
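The lag fix amounts to the Notary refreshing its cache whenever the client reports a fingerprint it doesn’t recognise, before giving its verdict. A hypothetical sketch, with `fetch_current_fp` standing in for the Notary’s own outbound TLS connection to the site:

```python
def check_certificate(domain, client_fp, cache, fetch_current_fp):
    """Notary-side check with Convergence's lag fix: on a mismatch with
    the cached fingerprint, re-fetch the site's certificate before
    answering, so the comparison is always against the current cert."""
    if cache.get(domain) != client_fp:
        cache[domain] = fetch_current_fp(domain)  # refresh a stale entry
    return cache[domain] == client_fp
```

A site that has just rotated its certificate now validates on the first try, while a genuinely bogus certificate still fails even after the refresh.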
Firstly, Convergence keeps a local cache of certificates so that not every connection needs to be authenticated against a Notary. You only contact a Notary the first time you see a certificate, or when the certificate for a site changes. This not only improves performance, but also hides repeat visits to the same website from the Notaries. The other approach to the privacy problem was to introduce something known as ‘Notary Bouncing’: one of your Notaries acts as a proxy to all of your other configured Notaries. The proxy Notary is oblivious to the content of each request because every proxied connection is encrypted. As a result, the bounce Notary does not know which certificate you are authenticating, while the other Notaries do not know who is requesting the authentication. Notary servers can be configured in a variety of ways, so that particular Notaries might check validity against CAs, while others simply check that matching certificates have been retrieved.
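The local-cache behaviour can be sketched as a simple trust-on-cache-hit check — an illustrative outline under my own naming, with `ask_notaries` standing in for the (possibly bounced) Notary round trip:

```python
def validate(domain, cert_fp, local_cache, ask_notaries):
    """Client-side caching from Convergence: Notaries are consulted only
    on first sight of a certificate, or when it changes; repeat visits
    are answered locally, which is faster and hides them from Notaries."""
    if local_cache.get(domain) == cert_fp:
        return True                       # seen before: no network traffic
    if ask_notaries(domain, cert_fp):     # first sight or changed cert
        local_cache[domain] = cert_fp
        return True
    return False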
Convergence exists as an open source Firefox add-on that overrides Firefox’s default CA checking mechanism. After installing it, I found that it provided an almost completely seamless experience browsing the vast majority of SSL-enabled sites. I did run into a few problems with SSL on internal LAN-based services, since the Notaries I had chosen could not connect to those servers to validate their certificates. To get around this, I set up a couple of Notary servers within my own LAN. Unfortunately, this has the side effect of making my Notary checking more susceptible to MITM attack, so it is a trade-off that still needs addressing.
There has been some other critique levelled at Convergence, but many of these observations are poorly grounded. For instance, some critics have pointed out that while Convergence provides a very strong defence against MITM attacks, it does not actually validate that a certificate is being used for a legitimate purpose. There is nothing stopping somebody from creating their own certificate for a phishing website and presenting it to you. All of the Notaries would agree that the certificate received for the site is consistent, and therefore valid, even if the site itself is dodgy.
This is absolutely true, but I think it is a useless observation. Firstly, the majority of Certificate Authorities perform relatively little validation as it is: a CA will gladly issue a certificate to anybody without checking whether their intentions are nefarious. Secondly, Convergence still allows you to make use of CAs if you want to, but it doesn’t tie you to them forever. Another criticism is that Convergence relies on a healthy population of Notary servers, and critics ask who will bear the cost of running them. In fact, setting up a Notary server is incredibly easy, and I now have three of them running at different locations around the planet. I would expect nearly every ISP to have no problem running a Notary server if it helped ensure better security for end users.
More serious problems are addressed directly in Moxie Marlinspike’s presentation. One is called the ‘Citibank problem’, so named because Citibank takes a very uncommon approach to SSL, serving multiple different certificates for the same domain. This means that for the Citibank website, Convergence simply does not work, and probably never will until Citibank changes how it does SSL. Marlinspike waves this aside as such a rarity that it is not really a problem; however, until Convergence is tested against a wider range of sites, we will not know how pervasive the problem really is. The other problem is ‘captive portals’, often used for Wi-Fi hotspots: services that do not allow you to connect to anything else on the internet until you have paid for access. Usually these services redirect all of your traffic to an SSL-enabled website where you enter credit card details to gain access to the rest of the internet. If you’re using Convergence, you cannot reach any Notaries to authenticate the captive portal’s certificate until after you have been granted wider access. So, it is clear that Convergence still needs some work. But overall, it is a fantastic approach to getting around CA dependence. Now we just need to see whether other mainstream application developers will be willing to at least provide the option to use it.