Social search, at what cost?
In 2011 “Social Search” became a hot topic, with Google offering “Realtime Search” via the Twitter “firehose” and Bing beginning to index Facebook. In July, Twitter didn’t renew its deal with Google and Realtime Search was no longer offered. In the past week Google has come under fire for including “your world” in its iconic search results, and analysts have begun speaking about antitrust investigations.
While analysts discuss the social search race in terms of adoption, partnerships and the invaluable data gleaned from the platforms, what they fail to talk about is the technological investment needed to process the data.
In 1997, Larry and Sergey’s PageRank algorithm was considered a highly advanced piece of code that matched pages to queries based on relevance, popularity and links to authoritative content. The datacentre infrastructure needed to service today’s query volume is a staggering and oft-forgotten feat of engineering that Google spends a lot of time researching, perfecting, and keeping secret. While I’m not revered for my mathematical skill, here is what we need to take into account when trying to predict datacentre costs:
Size of the Space: The number of racks of servers your datacentre is going to house; think about how much spacing you need between the racks and the walls, the spacing between the racks themselves, and the amount of aisle space you might need in order to replace servers effectively. You also need to think about space for HVAC (Heating, Ventilation and Air Conditioning). Google takes this stuff quite seriously and has placed its datacentres on sites where natural resources can power or cool parts of the facility. Being considered by some the best place in the world to work also means that Google has to cater for the on-site employees who ensure the site remains fully functional, and these dudes aren’t exactly hibernating in little geek-sized holes.
Power Issues: Calculate the maximum electrical draw per unit, multiply that by the number of units required, and add 20%-30% for safety’s sake. Then think about the peripheral power needed for the security systems, office equipment and cooling. Add the rack power to the peripheral power and that tells you the size of the UPS you need to run it all. You can safely add 80% more to that to ensure the standby batteries for the units remain fully charged (a rough sizing sketch follows this list).
Cooling: Think about the fans cooling your hard drive and motherboard; they are there to cool that machinery down so that it can continue processing data. Now multiply that by a million and you begin to understand the cooling issues associated with a massive datacentre site. Most datacentre engineers aim for one ton of air conditioning per 20 kVA of power and for two complete air changes per hour, which includes the correct air filtration and humidification.
Location: Probably one of the most important and most understated aspects of a datacentre’s existence. The distances to the telephone company, the fire and police departments, area amenities like restaurants and gyms, and the power grid are all important. If we look at the bigger picture, Google’s investments in solar energy start to make sense when you consider how these centres will be powered in the future.
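To make those power and cooling rules of thumb concrete, here is a minimal back-of-the-envelope sketch in Python. The function names, the rack count and the per-rack draw are purely illustrative assumptions; the safety margin, the battery headroom and the one-ton-per-20 kVA ratio are simply the figures quoted above, not a substitute for real datacentre engineering.

```python
# Back-of-the-envelope datacentre sizing, using the rules of thumb from the list above.
# All inputs are illustrative assumptions, not real figures for any specific facility.

def rack_power_kva(draw_per_unit_kva: float, units: int, safety_margin: float = 0.25) -> float:
    """Maximum draw per unit times the number of units, plus a 20-30% safety margin."""
    return draw_per_unit_kva * units * (1 + safety_margin)

def ups_size_kva(rack_kva: float, peripheral_kva: float, battery_headroom: float = 0.8) -> float:
    """Rack load plus peripheral load (security, office kit, cooling), with ~80% extra
    so the standby batteries can stay fully charged while the UPS carries the load."""
    return (rack_kva + peripheral_kva) * (1 + battery_headroom)

def cooling_tons(it_load_kva: float, kva_per_ton: float = 20.0) -> float:
    """One ton of air conditioning per 20 kVA of power."""
    return it_load_kva / kva_per_ton

if __name__ == "__main__":
    rack_kva = rack_power_kva(draw_per_unit_kva=5.0, units=200)  # e.g. 200 racks at 5 kVA each
    ups_kva = ups_size_kva(rack_kva=rack_kva, peripheral_kva=150.0)
    tons = cooling_tons(rack_kva)
    print(f"Rack load: {rack_kva:.0f} kVA, UPS: {ups_kva:.0f} kVA, cooling: {tons:.0f} tons")
```

Even with toy numbers, the point stands: every extra rack of social-search indexing capacity drags a chain of power, UPS and cooling costs behind it.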
With the above in mind for the current situation, and with social search requiring even more datacentre resources, the question has to be: how does Google pay for all of this? The answer can be found in the adage “If you are not paying for it, you’re not the customer, you’re the product”. Services like Google+ are free to the end user, but the data gleaned from them is used to inform Google’s advertisers on how to sell to the user effectively. Google Board Chairman and ex-CEO Eric Schmidt describes Google+ as an identity engine that holds your personal data (what you search, what you share and so on) and refines the search results it shows you based on that. What we can infer from this is that the advertising revenue accumulated from showing these enhanced search results (read: ads) is going to have to fund the computational resources needed to process the data.
Applications that promote social sharing are popping up all over the net, and one that catches the eye is Kapture, which lets merchants reward users for sharing content online. People at Google and Facebook are definitely taking it seriously: the advisors already on board include Facebook’s New York Engineering lead and the Principal of New Business Development at Google, Alexis Giles.
If we take into account that both companies want to integrate the data from these applications, the current datacentre conundrum has only just begun.