French Muslim group sues Facebook and YouTube over Christchurch video
A group representing Muslims in France is suing Facebook and YouTube over the spread of a video showing the mass shootings at two mosques in New Zealand on March 15.

The French Council for Muslim Worship (CFCM) has filed a legal complaint against the two tech companies over their response to a video of the terrorist attack in the city of Christchurch, in which 50 people were killed.

CFCM chairman Ahmet Ogras told CNN the organization is taking legal action against Facebook for not removing the video quickly enough.

“This is not acceptable. Facebook must take its share of responsibility in this and must do everything to anticipate these livestreams, as much as [they do with] hate and Islamophobia messages on their networks,” Ogras told CNN.

Abdallah Zekri, president of the Observation Center Against Islamophobia, which is part of the CFCM, confirmed that the lawsuit targeted the French offices of Facebook and YouTube.

“We can’t have these videos online like shootout movies… YouTube and Facebook need to take action to prevent this in the future,” Zekri told CNN.

The council lodged a complaint with the Paris prosecutor’s office and said it was suing Facebook and YouTube for “disseminating a message with violent content encouraging terrorism, or of a nature likely to seriously undermine human dignity and liable to be seen by a minor,” according to the AFP news agency, which received a copy of the complaint.

Under French law, this offense is punishable by up to three years’ imprisonment and a fine of €75,000 ($85,000).

A group of students sing in front of a floral tribute to the victims of the Christchurch mosque shootings. (Anthony Wallace/AFP/Getty Images)

Facebook did not immediately respond to CNN’s request for comment on Monday. In a statement following the attack, Mia Garlick, Facebook’s policy director for Australia and New Zealand, said the company had removed the alleged shooter’s Facebook and Instagram accounts as well as the video.

A YouTube spokesperson declined to comment on the complaint and referred CNN to its previous statements. Following the attack, a Google spokesperson told CNN that YouTube removes “shocking, violent and graphic content” as soon as it is notified. YouTube declined to comment at the time on how long it took to remove the video.

A CFCM spokesperson told CNN that if the tech companies were fined as a result of the complaint, the council would like the money to be shared among the families of the victims of the Christchurch attack.

The images were shared widely on social media, and tech companies were criticized for their handling of the video.

In a statement posted on its website, Facebook said it removed 1.5 million videos of the attack within 24 hours of the shooting. It blocked 1.2 million of them at upload, meaning they would not have been seen by users. Facebook did not say how many people watched the remaining 300,000 videos.

On March 18, New Zealand Prime Minister Jacinda Ardern said tech companies had “a lot of work” to do to curb the proliferation of content that incites hatred and violence.
