Construction worker arrested after Facebook mistranslates greeting into a threat
Source: Hillary Grigonis


Facebook’s auto-translate feature allows users to connect across language barriers, but one incorrect translation of a simple “good morning” proved disastrous for one Palestinian man. When the construction worker posted a picture of his work site, Facebook’s translation feature reportedly rendered the Arabic “good morning” as “attack them” in Hebrew and “hurt them” in English, resulting in the man’s arrest.

According to the Israeli newspaper Haaretz, the man was arrested last week after Israeli police spotted the auto-translated message, which accompanied a photo of him leaning against a bulldozer, apparently enjoying a morning coffee and cigarette at a West Bank construction site near Jerusalem. The confusion reportedly arose because the system mistook the Arabic phrase for a similar word meaning “to hurt.”

The incorrect translation caused the post to be flagged, notifying local authorities, who also use algorithms to identify potential threats. The police responded because of both the translation and the image: according to police, bulldozers have been used as hit-and-run vehicles in previous terrorist attacks.

After questioning, and after an Arabic-speaking officer read the original post, the police realized the error and released the man a few hours later.

In a statement, Facebook apologized for the error. The company says that while its auto-translate algorithms improve every day, misinterpretations occasionally happen, and it is working to prevent such errors from recurring.

Artificial intelligence powers Facebook’s translation feature; when the company switched entirely to its own system last year, the software was handling around 2 billion translations a day across 40 languages. Users can also report bad translations and rate translated text.

The world’s most popular social network uses a number of algorithms to monitor posts, but a string of recent incidents has the platform promising more human oversight of the process. Ad-targeting categories that reached hate groups and fake news promoted in the Trending section are just two examples of incidents that prompted changes to the platform’s monitoring systems. Earlier in 2017, the company said it would add 3,000 moderators to review posts flagged by users for violating its Community Standards.

