A French court ruled earlier this month that Google and its CEO Eric Schmidt were liable for the results of an algorithm.
A man who received a three-year suspended jail sentence for corruption of a minor claimed that when someone Googled his name, the terms "rapist" and "satanist" came up in Google's Suggest feature.
That doesn't bode well for his future job prospects. He claims he tried to contact Google to have the terms disassociated from his name, to no avail. Perhaps he could take Eric Schmidt's advice that young adults change their identities to avoid being associated with what they did as children.
'Suggest' is a feature that offers additional terms for further searching as a query is entered. Those words are drawn from terms that appear grouped together on the web and from Google's PageRank algorithm.
The results in question were likely a manifestation of news reports of the man's crimes and of related searches built on those terms. Google maintains that the results aren't its responsibility; they are just its computers reporting what's out there on the web.
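To see why such suggestions can surface without any human editorial judgment, here is a minimal sketch of a popularity-driven suggestion mechanism. Everything in it is hypothetical and illustrative; it is not Google's actual system, just the general idea of ranking logged queries by frequency for a given prefix.

```python
from collections import Counter

# Hypothetical query log: in a real system this would be the aggregate
# search activity of many users. Names and queries are made up.
query_log = [
    "j. doe trial",
    "j. doe verdict",
    "j. doe trial",
    "weather paris",
]

def suggest(prefix, log, k=3):
    """Return up to k logged queries starting with `prefix`,
    ordered by how often they were searched."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

print(suggest("j. doe", query_log))  # most-searched completions first
```

The point of the sketch is that nothing in the ranking step inspects what the words mean; if enough people search a name together with a damaging term, that pairing rises to the top automatically.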
The French court concluded that the search engine's linking of his name to such words was defamatory. Google CEO Eric Schmidt and Google were ordered to pay a symbolic €1 in damages plus €5,000 for the man's court costs. Google was also ordered to take down the results of its algorithm, on pain of a daily fine until it did so.
<!-- more -->
Orders Eric S., in his capacity as publisher of the website accessible at www.google.fr, to take any measure to remove the suggestions appearing in the "Google Suggest" feature, or the proposals made under the heading "related searches", upon entry into the Google search engine by users of the letters "X..." or "MX...", of the following terms:
"X... rape"
"X... condemned"
"X... satanist"
"X... prison"
"X... rapist"
and this within one month of notification of this decision, under a fine of €500 per breach per day upon the expiration of a period of one month after service of this decision,
Interestingly, the court found that Google France wasn't liable, but that Eric Schmidt, in his editorial capacity at Google's US headquarters, somehow was.
An Italian court ruled earlier this year that US Google/YouTube executives were liable for assisting in bullying after Italian kids uploaded a video of themselves taunting a student with special needs.
Because Google isn't simply pointing to search results but its robots are making "editorial decisions," it may find itself in trouble in other jurisdictions around the globe, at least until the courts understand what is happening behind the scenes.