By Pepe Escobar and cross-posted from Counterpunch.org
We all live in the Age of the Algorithm.
Here’s a story that not only encapsulates the age, but dwells on how the algorithm obsession can go horribly wrong.
It all started when Facebook censored the iconic photo of «napalm girl» Kim Phúc, which became a symbol of the Vietnam War recognized all over the world. The photo was featured in a Facebook post by Norwegian writer Tom Egeland, who wanted to start a debate on «seven photographs that changed the history of war».
Not only was his post erased; Egeland was also suspended from Facebook.
Aftenposten, the number one Norwegian daily, owned by Scandinavian media group Schibsted, duly relayed the news, alongside the photo.
Facebook then asked the paper to erase the photo – or to render it unrecognizable in its online edition. Yet even before the paper responded, Facebook censored the article as well as the photo on Aftenposten’s Facebook page.
Norwegian Prime Minister Erna Solberg protested it all on her Facebook page. She was also censored.
Aftenposten then slapped the whole story on its front page, alongside an open letter to Facebook founder Mark Zuckerberg signed by the newspaper director, Espen Egil Hansen, accusing Facebook of abuse of power.
It took a long 24 hours for the Palo Alto colossus to back off and «unblock» the publishing.
An opinion wrapped up in code
Facebook was forced into heavy after-the-fact damage control. That does not change the fact that the «napalm girl» imbroglio is a classic algorithm drama: artificial intelligence applied to evaluate content.
Facebook, just like other Data Economy giants, outsources its filtering to an army of moderators working at companies from the Middle East to South Asia, as Facebook’s Monika Bickert has confirmed.
These moderators have a hand in deciding what should be expunged from the social network, based on what users flag. But the information is then run through an algorithm, which makes the final decision.
It doesn’t take a PhD to see that these moderators may not exactly excel in cultural competence, nor are they necessarily capable of analyzing context. Not to mention algorithms – which are incapable of «understanding» cultural context and certainly not programmed to interpret irony, sarcasm or cultural metaphors.
Algorithms are literal. In a nutshell: an algorithm is an opinion wrapped up in code.
And yet we are now reaching a stage where a machine decides what is news. Facebook for instance now relies solely on an algorithm to establish which stories to position in its Trending Topics section.
There may be an upside to this trend – as when Facebook, Google and YouTube use such systems to quickly block Daesh videos and similar jihadi propaganda. Soon eGLYPH will be in effect – a system similar to YouTube’s Content ID, which flags videos that violate copyright using «hashing»: video and audio signaled as «extremist» are assigned a unique digital fingerprint, allowing automatic removal of any new version and the blocking of any re-upload.
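The matching logic described above can be sketched, very roughly, in a few lines. This is a hypothetical illustration only: real systems such as Content ID and eGLYPH rely on robust «perceptual» hashes that survive re-encoding and cropping, whereas the plain cryptographic hash used here catches only byte-identical copies.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest serving as the content's unique fingerprint.

    Illustrative stand-in: SHA-256 matches only exact copies; production
    systems use perceptual hashing of the audio/video signal instead.
    """
    return hashlib.sha256(data).hexdigest()

class UploadFilter:
    """A minimal blocklist keyed by content fingerprints (hypothetical)."""

    def __init__(self) -> None:
        self.blocked: set[str] = set()  # fingerprints of flagged content

    def flag(self, data: bytes) -> None:
        """Content signaled as «extremist» is fingerprinted and stored."""
        self.blocked.add(fingerprint(data))

    def allows(self, data: bytes) -> bool:
        """Any new upload whose fingerprint matches is rejected."""
        return fingerprint(data) not in self.blocked
```

Once a clip is flagged, every later upload of the same file is refused automatically – no human ever reviews it again, which is precisely the efficiency, and the danger, the article describes.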
And that will bring us to even murkier territory: the very concept of «extremism» itself. And the effects on all of us of self-censorship systems based on algorithmic logic…