Earlier this year Facebook announced a 95% reduction in certain kinds of spam.
Taken at face(book) value, that sounds like a tremendous breakthrough, but there’s less here than meets the eye, because the “certain kinds” are basically only those that are internal to Facebook, and the solutions are hard to generalise to the broader spam problem.
What Facebook has done is essentially allow users to provide feedback about which messages from Facebook applications are unwanted. By consolidating such feedback, Facebook can block further unwanted messages to most other users, and sometimes even completely block an antisocial application. If Facebook can learn like that, why can't your email reader?
The answer is that it could, if only email weren’t so darned complicated. In the Facebook situation, all the offending messages are being both generated and read from within Facebook. The good folks at Facebook have complete control of the entire lifespan of such messages. They know exactly who sent the message, how many such messages were sent, and so on. None of this is true for your email reader.
The idea of letting users vote about spam is a good one, and not a new one; researchers at IBM and elsewhere have demonstrated the value of letting users vote about which messages are spam, and using those votes to decide which similar messages to block in the future. But those experiments have also highlighted the difficulties.
The world of email is one of many independent actors, interacting according to well-specified standard protocols, all of which are often ignored or misunderstood. If your mail reader gives you a button to click on when you think a message is spam, what should happen when you do so? Obviously your mail reader needs to send your vote (which may itself be wrong or accidental) to some server that collects it, consolidates it, and feeds the result into your spam filter.
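As a sketch of that consolidation step (not any real system's protocol — the class, fingerprinting scheme, and threshold below are all made up for illustration), the collecting server might do something like this:

```python
from collections import defaultdict

# Hypothetical threshold: how many distinct voters must flag a message
# before similar messages are blocked. Real systems would tune this.
SPAM_VOTE_THRESHOLD = 5


class VoteCollector:
    """In-memory sketch of a server that consolidates spam votes.

    Messages are identified by a fingerprint (e.g. a hash of the
    normalised body), so near-identical copies sent to many users
    map to the same entry.
    """

    def __init__(self):
        # fingerprint -> set of voter ids; a set, so one user
        # clicking repeatedly (or accidentally) counts only once
        self.votes = defaultdict(set)

    def record_vote(self, fingerprint, voter_id):
        self.votes[fingerprint].add(voter_id)

    def is_spam(self, fingerprint):
        return len(self.votes[fingerprint]) >= SPAM_VOTE_THRESHOLD
```

The set per fingerprint is what absorbs wrong or accidental clicks from a single user: it takes several independent voters, not several clicks, to tip a message over the threshold.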
But all of the actors in this scenario are heterogeneous. Your organisation may have any number of mail reading interfaces, each of which needs to provide a button and behave similarly when it is pressed. You might be using any of a number of spam filters, which may or may not be prepared to accept voting data, for which there is no standard representation.
Worst of all, the server that collects the spam votes can't necessarily trust all the information it gets; your machine may, for example, be compromised by a virus that deliberately corrupts the antispam voting database by labelling good messages as spam or spam messages as good.
Facebook doesn’t have any of these problems when it deals with mail from Facebook applications to Facebook users. It can watch exactly what users do with messages, and map that back directly to the applications that send them. For similar reasons, spam wasn’t a big problem back in the day when email was often a closed garden, and AOL users could only send to other AOL users.
A single authority in charge of everything makes it easier to enforce rules and policies. But who wants a single authority in charge of the whole internet? The cure would be worse than the disease.
The lack of a central authority is one of the defining features of the internet, and reflects its origins in the effort to build a network that could survive nuclear war. The result is a net that is remarkably decentralised, democratic, and chaotic.
The only way to end the chaos would be to regiment the net to an unprecedented degree, essentially to guarantee strong authentication for everyone who sends an email or does anything else on the net. This would be nice for anyone who hates spam, but more importantly, a boon for any government that wants to crush dissent, or any corrupt organisation that wants to halt all leaks and criticism.
That’s a terrible tradeoff, but I’m not terribly worried about it ever happening. The net’s design favours the most powerful force in the universe: chaos. I wouldn’t bet against it.