My recent post about Google’s battle with spammers seems to have confused some people in the Search Engine Optimization industry.
There’s a perception that I’m advising web sites not to improve the quality of their content. I’m not advising anything; I’m reporting that a little-known Google patent appears to be playing a major role in how a post-Panda Google now ranks web sites and web pages. It also explains some of the bizarre, random changes in rankings that have bedeviled SEO efforts by webmasters.
If Google detects that a web page has been changed between visits from its spider, it will check whether the changes are designed to improve the rank of that page in its search index. If it concludes they are, it flags the site as a potential spammer and triggers a reassessment of the site’s rank in Google’s index. Until that reassessment is complete, the site’s rankings will fluctuate randomly.
This creates a situation where, if a site owner tries to improve the quality of a page — say, by rewriting passages to make them clearer, or by adding information, links, or video — the changes could draw a spammer flag from Google and a period of randomly fluctuating rankings. Trying to improve the quality of your site could sink its rank.
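As a toy model, the behavior described above can be sketched in a few lines of code. This is purely illustrative: the function names, the length of the reassessment window, and the size of the random noise are all assumptions of mine, not anything from Google’s systems or the patent text.

```python
import random

TRANSITION_PERIOD = 70  # days of random fluctuation (assumed, not from the patent)

class Page:
    def __init__(self, base_rank):
        self.base_rank = base_rank  # rank before any change was detected
        self.flagged_on = None      # day the spam flag was raised, if any

def crawl(page, day, content_changed, looks_like_rank_tuning):
    # If the spider sees a change that appears aimed at improving
    # rankings, flag the page and start the reassessment window.
    if content_changed and looks_like_rank_tuning:
        page.flagged_on = day

def current_rank(page, day):
    # During the reassessment window, report the old rank plus random
    # noise; once the window closes, the rank settles again.
    if page.flagged_on is not None and day - page.flagged_on < TRANSITION_PERIOD:
        return page.base_rank + random.randint(-20, 20)
    return page.base_rank

page = Page(base_rank=5)
crawl(page, day=0, content_changed=True, looks_like_rank_tuning=True)
ranks = [current_rank(page, d) for d in range(10)]  # jumps around day to day
```

The point of the sketch is the middle branch: a site owner who edits a page in good faith can trip `looks_like_rank_tuning` just as easily as a spammer can, and from the outside the resulting random fluctuation looks identical.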
Yet Google is constantly telling web sites to improve the quality of their content to gain a better ranking.
That’s a seriously messed up situation.