Facebook Used to Incite Violence and Promote Fascist Disinformation in Myanmar... Again.

More than 40% of Myanmar's population used Facebook in January 2020, according to the social media analytics service NapoleonCat. Despite promising to clamp down on misuse of its platform, Facebook has continued to amplify content inciting violence against Myanmar's protesters and spreading violent junta misinformation, an investigation by the human rights group Global Witness has found. The investigation showed that Facebook's recommendation algorithm continues to invite users to view content that breaches the company's own policies: after the group liked a Myanmar military fan page that had no recent posts violating Facebook's policies, Facebook suggested several pro-military pages containing abusive content.

“What happens on Facebook matters everywhere, but in Myanmar that is doubly true,” the report says. Like in many countries outside the Western Hemisphere, mobile phones in Myanmar often come preloaded with Facebook and many businesses do not have websites, only Facebook pages. Facebook is effectively the internet for many people in the country.

Global Witness said it set up a Facebook account just before the peak of the military violence against civilians, with no history of liking or following specific topics, and searched for "Tatmadaw", the Burmese name for the armed forces. Filtering the search results to show pages, it selected the top result - a military fan page whose name translates as "a gathering of military lovers."

The page had earlier posts showing sympathy for Myanmar's soldiers and even one promoting military service for youths - but none of the more recent ones since the coup violated Facebook's terms. Nonetheless, when Global Witness's account "liked" the page, Facebook began recommending related pages with material inciting violence, false claims of election interference, and support of violence against civilians.

On March 1, for example, one of the recommended pages carried a death threat against protesters who vandalized surveillance cameras: “Those who threaten female police officers from the traffic control office and violently destroy the glass and destroy CCTV, those who cut the cables, those who vandalize with color sprays, (we) have been given an order to shoot to kill them on the spot.”

Another post featured on one of the pages showed an image of a “wanted” poster offering a $10m bounty for the capture, “dead or alive,” of a young woman. It claimed she was among protesters who had burned down a factory during a military crackdown. The woman's face and a screenshot of her Facebook profile were posted along with the caption: “This girl is the one who committed arson in Hlaing Tharyar. Her account has been deactivated. But she cannot run.”

Facebook admitted in 2018 that its platform had been used to “foment division and incite offline violence” against Rohingya Muslims during the genocide. The same year, a UN fact-finding mission concluded that Facebook had been “a useful instrument for those seeking to spread hate” and that the company's response had been “slow and ineffective”.

In February 2021, Facebook said its staff had been working around the clock to keep its platform safe, citing a vastly increased likelihood of “online threats that could lead to offline harm” as a result of the coup.

The Global Witness report recommends that Facebook also look into other types of content, such as the circulation of forced-confession videos of political prisoners, military advertisements, and posts that amplify military propaganda. Facebook says that by removing the Tatmadaw from its services and taking other measures, it has “made it harder for people to misuse our services to spread harm. This is a highly adversarial issue and we continue to take action on content that violates our policies to help keep people safe.”

According to Naomi Hirst, head of Global Witness's digital threats campaign, Facebook fails to adhere to even the “very basics” of its own guidelines. “The platform operates too much like a walled garden, its algorithms are designed, trained, and tweaked without adequate oversight or regulation,” she said. “This secrecy has to end, Facebook must be made accountable.”
