The web is in trouble. Big trouble. It’s not the first time. It was also in trouble in the mid ’90s. But then Google saved it, ushering in a new era of search. People forget how hard good content was to find in those days. Critics routinely lambasted the web for the rubbish on it.
Those were the days of AltaVista, HotBot, Lycos and a host of search engines that would regularly return dodgy, random websites in their results. It’s not that there wasn’t good content on the web; it’s just that we couldn’t find it very easily.
Google, with its lauded PageRank algorithm, was able to mine the beautiful diamonds in the rough, mercifully elevating them to the front pages of search results. Suddenly, when we searched, we found the good stuff. The bad stuff didn’t disappear; it was relegated to its justified position in the content hierarchy: obscurity. It’s why Google suddenly emerged as the undisputed king of search and, all around it, rivals crumbled.
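PageRank’s core insight is that a page matters if pages that matter link to it. A minimal sketch of that idea as a power iteration is below — purely illustrative, nothing like Google’s production system, and the link graph and damping value are invented for the example:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> rank, summing to 1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        # Every page gets a baseline share from the "random surfer" jump.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # A page passes its rank, split evenly, to the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A tiny hypothetical link graph: "gems" is linked to by the other pages,
# while "blog" receives no inbound links at all.
graph = {
    "gems": ["spam"],
    "spam": ["gems"],
    "blog": ["gems", "spam"],
}
ranks = pagerank(graph)
```

In this made-up graph, the page with no inbound links ends up with the minimum possible rank, while the well-linked pages rise — the “diamonds” surfacing above the noise, at least in miniature.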
But now, it’s back to the future. Once again, the web is threatened. And there is doubt about whether, this time, a knight in shining armour like the Google of the ’90s will be able to fix it. It seems an impossible problem to solve.
Companies like Facebook and Apple would have you think the solution lies in their closed walled-garden platforms where quality is policed and options are limited. These are controlled, rigid and templated worlds – or “velvet prisons” as Newsweek columnist Jacob Weisberg eloquently puts it.
So what’s changed? Well, for one, the internet has exploded. There are more people using the internet than ever before. Close to two billion people are online, which, on a global scale, means about 30% penetration of a world population approaching seven billion.
Humans are now producing and distributing more online content than ever before. In the days of the early web this was given expression via bulletin boards, then forums, and then broadly the so-called “Web 2.0” movement. Now it’s called social media. It’s democratised content, given voice to the voiceless, and it’s smashing traditional power and information structures. This is a good thing. It’s good for diversity.
But we’ve also discovered that, in the words of former lead Digg architect Joe Stump at a recent Johannesburg tech conference, “humans like to produce a lot of crap”. There is a dystopian fact of life here: people not trained, versed or sensitised in the skill of content production are producing content at an alarming rate, with scant respect for, or understanding of, the craft.
The content game is wide-open. It’s no longer monopolised by professional writers and journalists, and yes, it should never be again. But with this, an uncomfortable, nagging truth has emerged: People who don’t really have content’s best interests at heart are producing content at a furious rate.
This doesn’t bother respected media commentator Clay Shirky, who puts this into the “let it happen” category. Much of the nonsense is innocuous, and some of it serves important sub-cultural functions beyond the actual form of that content. LOLcats is an important meme and phenomenon, regardless of what the elite may think of it.
So, let it happen then. Why be a control freak about it? Why take an elitist view? We are not censors. But there is a problem: content like this clogs search engines, suffocates search results and gets in the way of the good stuff.
Google appears to be losing the battle. For years its sharp engineers seemed to stay ahead of the spammers, the gamers and the so-called black-hat search engine optimisers who were bent on manipulating search engine rankings.
But Google can’t win because it is outnumbered: it’s the web versus Google. No matter how ingenious its engineers or how complex its PageRank algorithm, people seem to be cracking the search engine. And it’s not Google’s fault. It’s the result of a practice called Search Engine Optimisation (SEO).
SEO is killing the web. SEO is pushing search result quality down and artificially inflating junk. It’s breaking the web. In an age where every person and every company is effectively a media company, quality is being overrun by quantity. Many SEO practitioners seem to care less about the quality of their service or content than their Google rank.
Some companies tack search engine-friendly WordPress blogs onto their main sites for no reason other than to elevate their rankings in search results. These WordPress sites are then shovelled full of content with the primary aim of getting into search engines and driving the user to their site to make a purchase. Educating and informing the user is a secondary aim or, at worst, not a consideration at all.
SEO should be 20% of a site’s effort. The other 80% of search engine attraction should be driven by quality and relevance of content or a service.
And here is the crux of the problem: content is no longer an end in itself, but becomes a means to an end. It’s not content meant to give you — the user, viewer, consumer — independent advice produced with care, balance and objectivity. Its primary purpose is SEO.
There will be those who argue that there is no reason why they can’t do both. I argue they can’t. They are compromised. Quality content is always an end in itself. If it’s a means to an end, it has another agenda. It’s not content I want to read. In the end, these sites risk becoming nothing more than content farms.
Maybe this is why closed, quality search niches like Google News and Google Scholar were created? Maybe this is the real future of Google, the real future of search, the real future of the web? If this is the case, we’ve come full circle as a civilisation. We’ve said goodbye to the anarchic, chaotic experiment that was the web and hello to controlled, monitored platforms.