French Muslim group sues Facebook and YouTube over broadcast of New Zealand attack footage
Group representing French Muslims sues Facebook and YouTube over video of the March 15 shooting at two mosques in New Zealand that killed 50 people.
The French Council of the Muslim Faith (CFCM) said on Monday that it had filed a formal complaint with the Paris prosecutor’s office against the French offices of the two tech giants for “spreading a message with violent content encouraging terrorism, or of a nature likely to seriously infringe upon human dignity and likely to be seen by a minor,” according to a copy of the complaint obtained by AFP.
CFCM chairman Ahmet Ogras told CNN the organization is accusing Facebook of not removing the video quickly enough. The offenses cited in the complaint are punishable in France by up to three years in prison and a fine of 75,000 euros ($85,000).
“This [is] inadmissible. Facebook must assume its share of the responsibility and must do everything to anticipate these livestreams, as much as [they do with] messages of hatred and Islamophobia on their networks,” he told CNN.
Facebook is “reviewing the complaint” and “cooperating with authorities and our teams remain fully engaged,” the company told HuffPost. YouTube did not directly respond to the CFCM’s complaint, but provided an earlier statement about the attack, saying it had seen an “unprecedented” volume of attempts to post footage of the shooting and had deleted tens of thousands of related videos.
Facebook previously said it “promptly” deleted the original livestream showing the massacre at the two mosques after New Zealand police alerted the social media network to the video, which was filmed in the style of a first-person shooter video game. But according to AFP, the CFCM said it took Facebook 29 minutes after the broadcast began to remove the video.
By then, the 17-minute video had been replicated and shared over and over again across the Internet, including on Facebook. The platform said it deleted over 1.5 million videos of the massacre in the first 24 hours after the live broadcast. Company spokeswoman Mia Garlick later said Facebook blocked 1.2 million of them at the point of upload, meaning roughly 300,000 copies of the video were available on the platform before being removed.
Tech giants like Facebook and YouTube have said they are working to curb the spread of violent and inappropriate content through their artificial intelligence systems and human moderators. Twitter suspended the alleged shooter’s account, on which he apparently shared a 74-page white supremacist manifesto. But political leaders say that is obviously not enough.
New Zealand Prime Minister Jacinda Ardern said days after the shooting that tech companies had “a lot of work” to do to curb the rapid spread of content that shows or encourages violence. British Labour Party deputy leader Tom Watson said YouTube should suspend all new uploads if the platform cannot stop the broadcasting of such videos.
Last week, the U.S. House of Representatives Committee on Homeland Security called on the top executives of the tech giants to explain how their platforms work to stop violent and terrorist content from spreading.
This article has been updated with comments from Facebook.