Most of your website’s traffic doesn’t come from humans
You work hard on your website, right? You enjoy every spike in traffic like it was the first. After all, it must mean people are enjoying the content on your site.
That’s true for the traffic you can see. It turns out, however, that the majority of visits to your site never show up because they don’t come from humans.
Incapsula, a provider of cloud-based security for websites, released a study today showing that 51% of website traffic comes from automated software programs, and that the majority of it is potentially damaging: automated exploits from hackers, spies, scrapers, and spammers.
The company says that typically only 49% of a website’s visitors are actual humans and that the non-human traffic is mostly invisible because it is not shown by analytics software.
This means sites carry a large hidden cost burden: wasted bandwidth, an increased risk of business disruption, and worse.
Here’s a breakdown of an average website’s traffic:
- 5% is hacking tools searching for an unpatched or new vulnerability in a website.
- 5% is scrapers.
- 2% is automated comment spammers.
- 19% is from “spies” collecting competitive intelligence.
- 20% is from search engines — which is non-human traffic but benign.
- 49% is from people browsing the internet.
The data was collected from a sample of 1,000 websites that are enrolled in the Incapsula service.
“Few people realize how much of their traffic is non-human, and that much of it is potentially harmful,” said Marc Gaffan, co-founder of Incapsula.
Incapsula offers a service aimed at securing small and medium-sized businesses. It has a global network of nine data centers that analyze all traffic to a customer’s site and block harmful exploits in real time, while also speeding up page load times by caching content closer to users.
“Because we have thousands of web sites as customers, we spot exploits way ahead of others and we can then block them for all our customers. That’s the benefit of scale. We also maintain a virtual patch service that prevents harmful exploits days and sometimes weeks before a patch is ready,” said Gaffan.
There is no software or hardware installation required of the customer; a small change to a site’s DNS records directs traffic through Incapsula’s data centers. Analytics and search engine rankings are unaffected by the change.
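To make the DNS step concrete, here is a minimal sketch, using only the Python standard library, of how you might check what your site’s hostname resolves to before and after such a change. The hostname below is hypothetical, and the exact records Incapsula asks customers to set are not described in the announcement.

```python
import socket

# Hypothetical hostname; replace with your own site.
hostname = "www.example.com"

# gethostbyname_ex() returns (canonical name, alias list, IP addresses).
# After a CNAME-style onboarding, the canonical name typically points at
# the security provider's network instead of your own origin server.
canonical, aliases, addresses = socket.gethostbyname_ex(hostname)
print("canonical name:", canonical)
print("aliases:       ", aliases)
print("resolves to:   ", addresses)
```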
Websites are significantly faster because the company caches content and keeps it close to where users are located.
An important aspect of the service is that it complies with the Payment Card Industry Data Security Standard (PCI DSS), which is essential for online merchants. They risk losing their ability to process credit card payments if they don’t meet strict PCI requirements.
The company offers a free service for sites with less than 25GB of monthly bandwidth, and premium plans start at US$49 a month.
I’m curious to try this service because, looking at my server logs, I get hit by about 28 “robots” daily, and while some are from legitimate sources such as Google, Yahoo, and Microsoft, the majority are unidentified, and together they use as much as one-third of my bandwidth.
This means that the human user experience suffers because my server is trying to deal with all the “non-human” traffic generated by software programs hitting the site.
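If you want a rough sense of this on your own site, here is a minimal sketch that tallies requests and bytes by user agent from a server access log. It assumes the common Apache/Nginx “combined” log format and a hypothetical access.log path, and it only catches bots that identify themselves; the unidentified visitors the study highlights would still be lumped in with the “human?” traffic.

```python
import re
from collections import Counter

# Rough sketch: tally requests and bytes by user agent from an access log.
# Assumes the "combined" log format; adjust the regex, the log path, and
# the bot keywords to match your own server setup.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} (?P<bytes>\d+|-) "[^"]*" "(?P<agent>[^"]*)"'
)
BOT_HINTS = ("bot", "crawler", "spider", "slurp", "scraper")

requests = Counter()
bandwidth = Counter()

with open("access.log") as log:
    for line in log:
        match = LOG_LINE.search(line)
        if not match:
            continue
        agent = match.group("agent") or "unknown"
        # Only self-identified crawlers are flagged; stealthy bots pass as "human?".
        label = "bot" if any(h in agent.lower() for h in BOT_HINTS) else "human?"
        size = match.group("bytes")
        requests[(label, agent)] += 1
        bandwidth[label] += 0 if size == "-" else int(size)

total = sum(bandwidth.values()) or 1
for label, used in bandwidth.items():
    print(f"{label}: {used} bytes ({100 * used / total:.1f}% of logged bandwidth)")
for (label, agent), count in requests.most_common(10):
    print(f"{count:6d}  [{label}] {agent}")
```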
Incapsula’s ability to block exploits before a patch is available is another attractive feature. I don’t have time to keep up with the many security patches being released, and installing and upgrading multiple programs is a chore I’d rather do without.