To fight the fake news problem, Google is asking users for feedback. Flagged results might contain “misleading information, unexpected offensive results, hoaxes and unsupported conspiracy theories.”
Google came under fire in recent months for prominently displaying search results that were offensive or highly inaccurate (or, in some cases, both).
The changes have been in the works for four months, but Google hadn’t publicly discussed most of them until now.
Google’s second tool, “Ranking Changes,” is for ranking pages based on their content. Google has also lost lawsuits in Japan and Germany over its search suggestions. The changes mean users must put in more effort to find information that could be considered fraudulent or hate speech, which Google has already made harder to find anyway. Users will also be able to report offensive autocomplete suggestions – the suggested phrases that appear when you begin typing a query into the search engine. Google has indicated that it will keep changing its systems and continually making enhancements to reduce the amount of fake news while improving the user experience. The problem has shifted in shape somewhat, but Google remains committed to its long-term effort to improve the information displayed in users’ search results. Google has also committed to not censoring search results.
Considering Google receives around 5.5 billion search queries per day, that’s roughly 13.8 million searches returning false information. “While our search results will never be flawless, we’re as committed as always to preserving your trust and to ensuring our products continue to be useful for everyone,” the company said. As part of that process, Google has evaluators – real people who assess the quality of its search results – give feedback on its experiments. But it won’t use these assessments to determine individual page rankings.
While Project Owl is one of several steps Google has taken to address the problem of fake news and offensive content, it’s too early to tell whether it has succeeded or will succeed. If Google, Facebook, or other companies trying to block false information err in their judgment calls, they risk being accused of censorship or playing favorites.
Google has repeatedly downplayed the prevalence of fake news.
In a bid to curb the circulation of fake news and inappropriate content online, Google has enhanced its evaluation methods and made algorithmic updates to surface more authentic content, using the following two methods.