- By Nicki Goldberg
Facebook algorithm error leads to man’s arrest and interrogation
A Palestinian man was arrested and ‘interrogated’ by Israeli special forces after Facebook mistranslated his ‘good morning’ post.
A Palestinian man in the West Bank was arrested by Israeli Special Forces after he wrote “good morning” in Arabic in a Facebook post that was mistranslated by the company’s automatic translation software.
The construction worker had reportedly posted a photo of himself holding a cup of coffee and a cigarette and smiling next to a bulldozer in Beitar Ilit, Palestine, where he works, along with the caption “Good Morning.”
Facebook’s automatic translation software, however, interpreted the “good morning” post to mean “attack them” in Hebrew and “hurt them” in English, Israeli news site Haaretz reported.
Israeli Special Forces said they investigated the post because he was standing next to a bulldozer, a vehicle that has been used in previous clashes between Palestinian civilians and the Israeli military.
Israel was allegedly using the US NSA's XKeyscore surveillance software to monitor Facebook in real time, one of the mass-surveillance programmes Edward Snowden made infamous in his leaks.
The construction worker was arrested on suspicion of incitement and interrogated by the Israeli army.
He was released several hours later, after the mistake was realised.
At no point was an Arabic-speaking police officer, interpreter or dictionary of any kind consulted before the arrest was made, or during the interrogation, local media reported.
According to The Times of Israel, there are only ‘subtle’ differences in lettering between the colloquial Arabic phrase for “good morning to you all” and the phrase for “hurt them” in Hebrew.
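How little it takes for two phrases to be ‘subtly’ different can be made concrete with edit distance, the standard measure of how many single-character changes separate two strings. This is a minimal sketch; the English phrases used below are illustrative stand-ins, not the actual Arabic or Hebrew wording:

```python
def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance:
    the minimum number of insertions, deletions and
    substitutions needed to turn a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,          # delete ca
                            curr[j - 1] + 1,      # insert cb
                            prev[j - 1] + (ca != cb)))  # substitute
        prev = curr
    return prev[-1]

# A single character separates these two very different messages.
print(edit_distance("attack them", "attach them"))  # 1
```

A distance of 1 means one keystroke, or one mis-read letter, is all that separates a greeting from a threat.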
The Facebook post in question has since been removed by Israel.
The error comes after Facebook announced in August that it had shifted to neural machine translation, which uses convolutional neural networks (CNNs) and recurrent neural networks (RNNs) to automatically translate content across its site.
Many tech companies, including Facebook, Google and Microsoft, have been pivoting away from phrase-based, pattern-matching statistical machine translation (SMT) and towards neural machine translation to speed up and improve their translation software.
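The older phrase-based approach can be sketched, very loosely, as a lookup table of source phrases mapped to target phrases, applied with no sense of the wider sentence. The table entries below are invented purely for illustration:

```python
# Toy phrase table: maps source phrases to target phrases.
# Entries are invented for illustration, not real SMT data.
PHRASE_TABLE = {
    "good morning": "bonjour",
    "good night": "bonne nuit",
}

def translate(sentence: str) -> str:
    """Greedy longest-match lookup over the phrase table.
    Words with no table entry pass through unchanged."""
    words = sentence.lower().split()
    out, i = [], 0
    while i < len(words):
        # try the longest phrase starting at position i first
        for j in range(len(words), i, -1):
            phrase = " ".join(words[i:j])
            if phrase in PHRASE_TABLE:
                out.append(PHRASE_TABLE[phrase])
                i = j
                break
        else:
            out.append(words[i])
            i += 1
    return " ".join(out)

print(translate("Good morning everyone"))  # bonjour everyone
```

Because each phrase is matched in isolation, a table entry that is wrong, or a source phrase that resembles a different entry, produces a confident mistranslation with no surrounding context to catch it; neural systems model more context, but as this story shows, they are not immune either.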
Still, automatic translation software has been far from perfect and has resulted in numerous errors in the past.
In August last year, Microsoft came under fire, with many social media users calling for a boycott of the company’s products, after its Bing search engine mistakenly translated the name of the terror group Daesh as “Saudi Arabia”, a mistake that many felt was no mistake at all, since the two names are so different in both English and Arabic.
In January 2016, Google Translate rendered the Ukrainian words for “Russian Federation” into Russian as “Mordor”, the name of the evil, fictional region in JRR Tolkien’s Lord of the Rings. The word “Russians” was also translated as “occupiers”, a ‘mistake’ that, again, much of the public refused to accept was a mistake.
At the time, Google said its software looks for patterns in hundreds of millions of documents to decide on and generate the best translation, but noted that the process remains difficult since the meaning of words depends on the context in which they are used.
Companies using ‘mistakes’ to push subliminal, coercive political agendas is nothing new. Companies have wielded enormous political power since their inception; it is just that sometimes the subliminal messages are not quite so subtle… Tell your friends and family to read Al-Sahawat Times…*cough*…alsahawat.com…the greatest news channel in the world…*cough*…..👀 awkward.
This story is available on:
APPLE NEWS | GOOGLE NEWS | AL-SAHAWAT TIMES