I have a t-shirt created by Dinosaur Comics. It’s a nerdy shirt with way too much text on it, but its premise is pretty cool: what would happen if you travelled back in time and you had all of the information to create the technology of today way back in the past? Of course, take the credit!
Actually, when I bought this shirt, a friend and I had been discussing what we would be able to do during a technological blackout. I mean, sure, I can program in a few high-level languages, I know how to perform a wide range of systems administration tasks, and I can build a computer out of component parts… I know the basic principles of electricity and magnetism and how a light bulb works, but how many of these things would I actually be able to build up from raw materials? Probably not many. My t-shirt would help me a fair bit of the way. But as things become more digital, one wonders what would happen if the internet actually had to disappear overnight. I’m not alone in thinking about this. CNN recently ran an article along similar lines.
In David Eagleman’s CNN article, he suggests that there are at least four ways that the internet could ‘go down’. His first suggestion is that an overwhelming solar flare could cause a geomagnetic storm that would wipe out communications and computing infrastructure in an instant. Solar flares have captured my imagination ever since I read the first Simon Travaglia BOFH stories back in the mid-90s. The bastard operator frequently used ‘solar flares’ as an explanation for computing issues when offering support.
In actuality, solar activity may have serious consequences for all things digital, and we are certainly heading into a period of increased solar activity. Eagleman then proposes that cyber warfare may result in activity directed toward disabling internet access for large sections of the global population. While a little dramatic as a hypothesis, the concept is not untenable, although his third proposition is a lot more likely: political mandate.
Eagleman looks back to June 2010, when the Homeland Security committee in the US approved a bill that would have given the president the ability to wield an “internet kill-switch”. This would have effectively allowed the President of the United States to disconnect many of the major networks that power the internet within the US. That wouldn’t just affect American users, though, as much of our internet traffic gets routed through American-owned networks.
The provision for this has since been removed, but it is clear that there is a global trend for governments to gain power over internet access and the traffic that skips over borders in milliseconds. Eagleman finishes with the suggestion that undersea cables could get cut, destroying access to the internet for large portions of the world population. While there is some merit in this, it is unlikely that enough cables could be cut to cause a worldwide outage.
Eagleman concludes his article with the proposition that we need to create a backup of all of the information required to build an internet, and that this information needs to be put onto physical media so that it can survive a digital apocalypse. I like the idea, even if I am a little dubious about the real possibility of the full-scale destruction of all current technology. For many things, we have the information at hand. Probably the largest portion of information on how the internet works and how it has evolved has come from the IETF and is well documented in the RFC series.
For less obscure information, there is a plethora of sites dedicated to explaining how things work. And, of course, we have the uber-encyclopedia of everything: Wikipedia. The only problem with all of this information is that it is all in a digital format.
Our libraries are struggling in the face of digital media. It is said that Julius Caesar accidentally set fire to the Library of Alexandria, destroying much of the global knowledge pool in one fell swoop. I often wonder if we’re in the slow process of accidentally burning our own libraries as the digital shift switches up a gear.
I think that, at the risk of sounding anachronistic, we really need to start thinking about how we can invest in libraries as a genuine backup service for information available on the internet. Data moves fast online and a lot of it is generated every second. It is obviously not feasible to store everything on paper, but DVDs of information could certainly be catalogued and stored in libraries. The technology required to read that data could be protected and documented on paper so that we could bootstrap a disaster recovery scenario from scratch.
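To make the cataloguing idea a little more concrete, here is a minimal Python sketch of what a paper-friendly catalogue for a disc of archived files might look like. The function names and fixed-width layout are my own invention, not anything Eagleman proposes: the point is simply that each file gets a checksum, so a catalogue printed on paper could later confirm that the data on a disc has survived intact.

```python
# Hypothetical sketch: generate a human-readable manifest for files
# destined for archival media, so the catalogue itself can live on paper.
import hashlib
import os


def sha256_of(path, chunk_size=65536):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def paper_manifest(directory):
    """Yield one fixed-width catalogue line per file: name, size, checksum."""
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if os.path.isfile(path):
            yield f"{name:<40} {os.path.getsize(path):>12} {sha256_of(path)}"


if __name__ == "__main__":
    for line in paper_manifest("."):
        print(line)
```

Checksums are the cheap part; the harder problem, as the paragraph above suggests, is also preserving a paper description of the media format and the hash algorithm itself, so that the manifest remains verifiable after the original readers are gone.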
Now, we just need to come up with ways to fund a project like this. Any ideas?