Google's algorithm and its auto-complete function have landed the search-engine giant back in court. This time, a man from Japan alleges that he lost his job because auto-complete suggests criminal acts when his name is typed into the search box. He demands that the defamatory words be removed and that he be compensated for the embarrassment he has had to go through.
This is clearly a case of reputation damage, and though the court has ruled that the offending words be removed, Google has toed the 'not subject to Japanese law' line. The present case, however, is about compensation, and once again Google's defence points out that such cases are rare and that the algorithm bases its suggestions on what is already available online.
A man from Italy won a similar case against Google, as did a Frenchman who had to deal with words like 'rapist' being suggested alongside his name. Google already screens for pornography, profanity and the like, so it could easily oblige if a court finds it has caused reputation damage to a plaintiff.
However, with Google reporting more and more government requests to remove content, all this could well eat into the brand, dulling its sheen of algorithmic impartiality — an algorithm that gets tweaked every so often to weed out spam and low-value information, among other things. So, will Google pay up?
If there are things women trust, it's that the sun will rise in the east, that taxes must be paid, and that birth-control pills will keep unwanted worries away.
Pfizer, the major birth-control pill maker, seems to have shaken that trust. The company had to recall a million packs of these pills, which were wrongly packaged, and it announced the flaw before any woman could become pregnant as a result.
Siri's apparent unwillingness to help women find abortion clinics has drawn ire. The controversy could not have come at a worse time, with the company facing tough competition from its Android-powered counterparts. Angered by Siri's failure to return even one result, people have signed mass petitions to Apple. Blogs and comments are keeping the controversy alive, putting Apple in a delicate position. It remains to be seen how the company tackles this serious issue.
As always, the problem with Apple is a failure to acknowledge that to offer such a feature effectively, Siri has to learn about the user; it should customize itself to know the details of an institution or an individual. Guess what? Siri transforms your speech into a search query and hands it to a third-party search engine to obtain results. If the search engine fails to return the information, Apple gets the blame for not having programmed it that way.
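The hand-off described above — speech in, third-party search out, with the assistant taking the blame for any gap — can be sketched roughly as follows. This is a minimal illustration under assumed names; none of the functions reflect Apple's actual implementation, and the transcription and search backends are stubs.

```python
# Hypothetical sketch of a voice-assistant pipeline: speech is
# transcribed to text, normalised into a search query, and handed off
# to a third-party search backend. All names are illustrative.

def transcribe(audio: bytes) -> str:
    """Stand-in for a speech-to-text engine; returns the spoken phrase."""
    return "abortion clinic near me"  # pretend transcription

def build_query(utterance: str) -> str:
    """Normalise the utterance into a plain search query."""
    return utterance.strip().lower()

def search_backend(query: str) -> list[str]:
    """Stand-in for a third-party search engine. If it has no data
    for the query, it simply returns no results."""
    index = {"weather today": ["Sunny, 22C"]}
    return index.get(query, [])

def assistant(audio: bytes) -> str:
    """Run the full pipeline: transcribe, query, search, answer."""
    results = search_backend(build_query(transcribe(audio)))
    if not results:
        # The assistant, not the backend, ends up taking the blame
        # when the search returns nothing.
        return "Sorry, I couldn't find anything."
    return results[0]
```

The point of the sketch is the last branch: when the backend returns nothing, the user only ever sees the assistant's apology, so the assistant's maker absorbs the criticism regardless of where the gap actually is.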