Facebook has been touting its artificial-intelligence services as a way to quickly identify those promoting terrorist content on the platform, but not all of these services are as foolproof as the company would have us believe.
Last week, Israeli police arrested a Palestinian construction worker when Facebook’s translation tool mistakenly translated the Arabic for “good morning” to “attack them” in Hebrew, and “hurt them” in English, according to Gizmodo.
Police were notified of the post and grew suspicious of the accompanying image, which showed the worker leaning on a bulldozer — a vehicle that has been used as a weapon in previous terrorist attacks. No Arabic-speaking police officer saw the post before the man was arrested and held for several hours.
According to Israeli newspaper Haaretz, Facebook’s service transliterated the innocent Arabic caption into a word that doesn’t exist, but does slightly resemble the verb “to hurt”.
Facebook has been using its own translation service since ditching Microsoft’s AI in 2016. Ironically, it moved away from Microsoft because that system was geared towards translating written website text rather than more informal human communication.
Facebook told Gizmodo in a statement that it was taking “steps to address this particular issue”. The company apologised to the man and his family.
Featured image: Elvert Barnes via Flickr (CC 2.0, resized)