Google moves to demote offensive search results
Google has launched an effort to flag and demote “upsetting-offensive” results on its search engine.
According to Search Engine Land, the company employs 10,000 human quality raters to help rank its results, and a new guideline now allows them to flag content that is blatantly untrue.
Senior engineer Paul Haahr told the publication that Google was avoiding the vague term “fake news,” and was instead looking for results that featured “demonstrably inaccurate information”.
Quality raters' work involves running real queries that Google has seen and rating the results against a 200-page set of guidelines.
The "upsetting-offensive" section is a recent addition to these guidelines, and covers content that features graphic violence, promotes hatred against a group of people, or offers how-to guides on human trafficking.
As an example of how a result can be "upsetting-offensive" rather than merely "upsetting", Google points to a search on the Holocaust: it flags a Holocaust denial page as offensive, whereas a page on the history of the Holocaust, while upsetting, is a "factually accurate source of historical information."
Google will now demote offensive and upsetting results when users search for topics like the Holocaust.
When quality raters deem a result offensive, the page isn't immediately removed from Google. Rather, the data is sent to Google's engineers, who use it to refine the algorithms so that similar results are demoted.
The guidelines also acknowledge that people may be searching specifically for “upsetting-offensive” results.
“When the user’s query seems to either ask for or tolerate potentially upsetting, offensive, or sensitive content, we will call the query a ‘Upsetting-Offensive tolerant query’,” the guidelines state. “Giving users access to resources that help them understand racism, hatred, and other sensitive topics is beneficial to society.”
Google does not see this as a perfect system, and acknowledges that the rollout won't be entirely smooth just yet.
“We will see how some of this works out. I’ll be honest. We’re learning as we go,” Haahr admitted to Search Engine Land.