As Ethiopia’s civil war descends into ethnic cleansing, Facebook again enables genocide

Facebook’s now-familiar model of generating social chaos for profit is on full display in Ethiopia, as a new report from Vice shows: Just as in Myanmar, the nation’s military leaders have exploited the spread of disinformation on Facebook to encourage hostility toward a regional ethnic minority and to organize lethal violence against it. And just as it has done everywhere, the social media giant says it is taking steps to curb the abuse of its platform, but with ineffective PR gestures that have done next to nothing to slow the looming genocide.

Ethiopia has been embroiled in civil war since late 2020, after the federal administration of Prime Minister Abiy Ahmed refused to recognize Tigray’s newly elected regional government, leading Tigrayan forces to attack a federal military base, to which Abiy responded by launching a military offensive in the region last November. Abiy’s forces, representing the larger Amhara region, have continued to wreak havoc in Tigray, whose ethnic population makes up about 7% of Ethiopia’s total. When Eritrean soldiers aligned with Abiy invaded Tigray, a wave of reports of mass killings and gang rapes by soldiers followed, describing shallow graves surrounding villages and mutilated bodies floating down rivers. Eritrea withdrew its forces in June.

The tide of the war shifted last month, when a Tigrayan counteroffensive, mounted in alliance with forces from Ethiopia’s other ethnic minorities, advanced to near Addis Ababa. Abiy declared a state of emergency on November 2, calling on citizens to take up arms.

“There are sacrifices to be made, but those sacrifices will save Ethiopia,” Abiy said on Twitter Saturday. On Facebook, he urged Ethiopians to “bury” the rebels; that post was removed. In Addis Ababa, the city administration called on citizens to use guns to defend their neighborhoods. House-to-house searches were conducted for Tigray sympathizers.

Many of these newly armed partisans have turned to Facebook to organize ethnic attacks, as well as to threaten and intimidate minorities. The shape of this online behavior is by now familiar: Vice’s Nick Robins-Early describes how journalist Lucy Kassa was targeted with a flood of online harassment after she reported on a teenager’s burns from an apparent incendiary weapon attack. A pro-government account posted her photo and address, calling for her arrest. Death threats and sexual harassment followed; yet the post remains active on Facebook.

The ethnic cleansing campaign organized on Facebook has spread quickly and widely, with the company’s litany of inaction speaking for itself:

- Last month, a video went viral on Facebook showing a man telling a large crowd that anyone associated with certain ethnic minorities is “the enemy.” It was reposted multiple times before the platform removed it.
- The same account that called for Kassa’s arrest also appeared to celebrate Fano, a notorious Amhara militia, for carrying out an extrajudicial killing. That post remains online.
- Another account, with over 28,000 followers, posted an instructional video on how to use an AK-47, with a caption urging all Amhara to watch it. The post has been up since April and has nearly 300,000 views.
- In September, a local media outlet posted unproven allegations on Facebook that members of the Qimant ethnic minority were responsible for a shooting. That same day, a government-aligned militia and a mob attacked a Qimant village, shooting and burning houses. That post, too, remains on Facebook.

Facebook continues to insist that it is taking serious steps to crack down on posts that violate its terms of service in Ethiopia, claiming it has hired substantial moderation staff there to remove threatening material. “Over the past two years, we have focused on and actively invested in Ethiopia, adding more staff with local expertise, operational resources and additional review capacity to expand the number of local languages we support to include Amharic, Oromo, Somali and Tigrinya,” the company told Vice. “We have worked to improve our proactive detection so that we can remove the most harmful content at scale.”

But the researchers told Vice that Facebook’s big talk is hollow: moderation and fact-checking in Ethiopia, they say, is actually carried out by “a group of volunteers who submit spreadsheets to Facebook to investigate, and who often have to explain to staff why the content on their platform is dangerous.”

“They completely lack context,” researcher Berhan Taye told Vice. “Whenever we talk to them, they ask for context. This has been a big problem: they don’t understand what’s going on in the country.”

The company also regularly ignores researchers when they report violent or hateful content, telling them that the posts don’t violate Facebook’s policies.

“The reporting system is not working. The proactive technology, the artificial intelligence, doesn’t work,” Taye said.

If this sounds familiar, it should. When the Myanmar army used fake Facebook accounts to organize ethnic cleansing violence against the Rohingya, Facebook allowed the posts to stay online until The New York Times published an account of the platform’s culpability in the genocidal violence. An independent commission of inquiry from the UN Human Rights Council found that both the specific violence and the ethos that fueled it were readily disseminated on Facebook: “Myanmar authorities encouraged those who preach hatred and silenced those who support tolerance and human rights,” the report notes. “By creating an environment in which extremist discourse can flourish, human rights violations are legitimized, and incitement to discrimination and violence facilitated.”

Facebook responded by deleting the accounts of several Myanmar military leaders, including Senior General Min Aung Hlaing, commander-in-chief of the Myanmar armed forces. It also shut down numerous pages, groups and other networks focused on inciting anti-Rohingya violence, removing 484 pages, 157 accounts and 17 groups in 2018 alone. However, these removals were made not for their hateful content, but rather for “coordinated inauthentic behavior.”

This logic is already familiar to Facebook users in the US: when the company announced in 2020 that it was removing a large number of QAnon conspiracy pages and groups, it likewise did so on the grounds of “inauthentic behavior” rather than because of their extremist content and misinformation. As a result, its crackdown on the far-right cult – whose sin, in Facebook’s eyes, wasn’t promoting hatred and false information, but gaming Facebook – was a mere drop in the bucket.

Likewise, Facebook said it was eager to fix what it could in Myanmar, but when the Gambian government filed a case in international court against Myanmar over the Rohingya genocide and sought access to the data Facebook had gathered in its investigation into the matter, the social media giant balked, arguing that the request was “extraordinarily broad,” as well as “unduly intrusive or burdensome.”

A federal judge in Washington ruled last month that Facebook must release the data. In response, the company complained that the judge’s order “creates grave human rights concerns, leaving Internet users’ private content unprotected and thereby susceptible to disclosure – at the whim of a provider – to private parties, foreign governments, law enforcement or anyone else.”

But this rationale is false. As Matthew Smith of Harvard’s Carr Center for Human Rights Policy observed in Time:

Facebook might say it’s worried about setting a dangerous precedent, but sharing information about genocidal intent through a US federal court would appear to be exactly the “precedent” the company should want to set – deterring state actors from using its platform for criminal purposes. Not to mention that voluntarily complying with the Gambia’s request would create no legal precedent at all, only an internal one for the company.

Facebook’s model of generating social chaos for profit has already had its effects in the United States, most notably coming home to roost at the Capitol on January 6; internal company reports acknowledge that much of the extremism (especially misinformation about the 2020 election) and violence leading up to that day, including the siege of Congress, was disseminated and organized on Facebook.

Now, with defenders of the insurrection talking openly of civil war, and targeted violence against liberals (“When do we get to use the guns?”) escalating on social media and in real life, it is becoming clear that what happened in Myanmar can happen anywhere. Ethiopia is just the latest nation to suffer from Facebook’s lethal revenue-generating model.
