'Project Owl': Google Launch New Quality Improvements – But Do They Go Far Enough?
by Lindsay Rowntree on 8th May 2017 in News
Google announced their latest set of quality improvements for search, nicknamed 'Project Owl', following recent criticism around the promotion of fake news and problematic Featured Snippets. Writing exclusively for ExchangeWire, Simon Schnieders, founder and CEO, Blue Array, asks whether the changes actually address the issue at hand.
Some particularly controversial snippets that denied the Holocaust ever happened and claimed that women are evil have been featured at the top of search results and in voice search queries.
Google’s 'autocomplete' function, which predicts what’s being typed in, has also received similar levels of criticism.
In response, the world’s largest search engine has pledged that results from reliable sources will appear above conspiracy theories and unsavoury racist material by making three key changes for users.
These include:
- Launching a new feedback drop-down menu and publishing policies explaining why suggestions might be removed
- Offering alternative answers for Featured Snippets
- Providing a new emphasis on 'authoritative' content to improve search quality
So, what’s the problem?
Put simply, Google’s latest attempts to prevent fake news from dominating their search engine simply don’t go far enough.
With millions of daily users worldwide, Google have a social responsibility, as the arbiters of truth for a new generation of searchers, to deliver effective algorithmic alternatives.
The issue is particularly apparent with Featured Snippets, where Google elevates one search result above all others because they believe it to be the best answer.
Currently, it only offers one answer – why not provide alternatives?
Featured Snippets are used with Google Assistant on Android phones and in Google Home. However, the answers they provide can pose a serious issue, particularly when Google were recently returning an answer for the query 'are women' – responding that all women have 'some degree of prostitute' and 'a little evil' in them.
“Inaccurate results are often featured in search results due to 'Google bombing' tactics employed by internet-savvy groups that force a website to be ranked highly”, Ben Gomes, vice-president of engineering for Google Search, said.
“These tactics include linking to the offending website from several other sources and hiding text on a page so that it is invisible to humans but can still be read by the search engine’s algorithms.
“In a world where tens of thousands of pages are coming online every minute of every day, there are new ways that people try to game the system. In order to have long-term and impactful changes, more structural changes [to Google’s search engine] are needed.”
“Our goal remains the same”, Gomes said in Google’s official blog post, “to provide people with access to relevant information from the most reliable sources available. And, while we may not always get it right, we’re making good progress in tackling the problem.”
Featured Snippets – not effective for voice search in their current form
With voice search exploding, Featured Snippets are being positioned by the search engine as the de facto answer.
Google Home, for instance, doesn’t ask if you’d like to hear more results, instead citing the source and answer as an audible response. Sometimes it doesn’t even do that. Surely this means that, for many, the answer Google provides is going to be taken as gospel, whether factually accurate or not.
As such, Google needs to consider the implications of their role as the arbiters of truth for a new generation of searchers, who may not realise that the truth isn’t always 'the truth' when it is algorithmically determined.