Meta, Formerly Facebook, Sued For $150 Billion For Enabling Genocide

Myanmar, UK, USA -

Rohingya refugees from Myanmar are seeking $150 billion from Meta Platforms Inc (FB.O), formerly known as Facebook, for failing to act against hate speech that contributed to violence and genocide against the Rohingya in Myanmar.

The lawsuit, filed by the law firms Edelson PC and Fields PLLC, alleges that the company failed to police content that contributed to violence against the Rohingya community. In a coordinated action, British lawyers submitted a letter of notice to Facebook's London office.

In 2018, U.N. human rights investigators reported that Facebook had played a key role in spreading hate speech that fueled violence and genocide against the Rohingya. The U.N. fact-finding mission concluded that Facebook had been "a useful instrument for those seeking to spread hate" and that the company's response had been "slow and ineffective." Facebook acknowledged in 2018 that its platform had been used to push genocide and to "foment division and incite offline violence" against Rohingya Muslims. In a statement, Facebook admitted that it was "too slow" to prevent misinformation and hate in Myanmar but says it has since taken steps to crack down on platform abuse in the region, including banning the military from Facebook and Instagram following the coup d'état of February 1, 2021.

A Burmese military campaign of genocide involving mass murder, torture, and rape caused more than 730,000 Rohingya Muslims to flee Myanmar's Rakhine state beginning in August 2017. Human rights groups documented killings of civilians and the burning of villages. Myanmar's military authorities, with the country now run by an illegal junta, deny carrying out systematic atrocities. The International Criminal Court has opened an investigation into accusations of crimes in the region. In September 2021, a U.S. federal judge ordered Facebook to release records of accounts linked to anti-Rohingya violence in Myanmar that the social media giant had shut down.

According to NapoleonCat, 40% of Myanmar's population used Facebook in January 2020. Despite its promise to clamp down on misuse of its platform, researchers have found that Facebook continues to promote content inciting violence against Myanmar protesters and to amplify junta misinformation. An investigation by the human rights group Global Witness found that Facebook's recommendation algorithm encourages users to view content that violates the company's own policies: after the group liked a Myanmar military fan page that contained no recent policy-violating posts, Facebook suggested several pro-military pages with abusive content.

The company says Section 230 of U.S. internet law shields it from liability for content posted by users; the law states that online platforms are not liable for content posted by third parties. The complaint argues that if Section 230 is invoked as a defense, Myanmar law should apply instead, though there is little precedent for a U.S. court accepting foreign law in such a case. In the class-action lawsuit, plaintiffs claim that Facebook fails to police abusive content in countries where such speech is most likely to harm society, an assertion echoed by whistleblower Frances Haugen, who leaked internal company files in 2021.

In 2013, philosopher Heather Marsh and the Anonymous collective launched Operation Rohingya to raise global awareness about the ongoing Rohingya genocide and the growing refugee crisis in the region. Human rights groups such as Partners in Relief and Fortify Rights, along with other advocates, have mobilized to defend the Rohingya population. Their goals are to raise awareness, provide humanitarian aid, and seek accountability from the perpetrators and enablers of the genocide.
