- A New York state judge refused to dismiss a lawsuit over the Buffalo mass shooting against Facebook, YouTube, Reddit, and other social media companies.
- 40 plaintiffs have accused these social media giants of hosting and promoting extremist content on their platforms.
- YouTube offered condolences to the victims' families but plans to appeal the decision.
On Monday, a New York state judge ruled that a lawsuit accusing social media platforms such as Reddit, YouTube, Facebook, and Instagram of enabling the Buffalo mass shooting can proceed to trial.
The ruling was issued by Justice Paula Feroleto of the Erie County Supreme Court and concerns the 2022 mass shooting in Buffalo, New York, in which 18-year-old Payton S. Gendron killed 10 people and wounded 3 more at a Tops Friendly Markets supermarket. All 10 people killed were Black.
Part of the attack was also livestreamed on Twitch, though the stream was taken down within 2 minutes.
Gendron pleaded guilty to all state charges and received 11 consecutive life sentences with no chance of parole. Federal prosecutors are still pursuing their own case and seeking the death penalty.
Following that trial, 40 plaintiffs, including relatives of those killed in the attack and store staff who witnessed the shooting, filed a lawsuit against several major social media platforms, accusing them of hosting content that can radicalize people.
The attack was a hate crime: Gendron is said to have written a manifesto in which he described himself as an ethno-nationalist and a believer in white supremacy.
The plaintiffs believe that such ideas have been planted in him through social media.
As addictive as these platforms are, they are also home to a host of conspiracy theories and extremist political content. Gendron, for example, subscribed to the "Great Replacement" conspiracy theory, an offshoot of the white genocide theory. The lawsuit claims that content on these platforms indoctrinated him with such ideas.
The defendants jointly moved to dismiss the case. In their defense, they argued that they merely host content from other users and have no hand in what gets posted.
They also argued that under Section 230 of the Communications Decency Act and the First Amendment of the U.S. Constitution, they cannot be held legally liable for content posted by users. After this ruling, however, the platforms are set to face a trial.
The judge found that although the companies are not directly violating any law in terms of content moderation, the plaintiffs may still try to prove that the platforms' negligence contributed to the attack.
"While we disagree with today's decision and will be appealing, we will continue to work with law enforcement, other platforms, and civil society to share intelligence and best practices." — José Castañeda, YouTube spokesperson
Reddit shared similar sentiments, saying its content policy already prohibits violent and hateful content and that it works to remove any content that directly or indirectly encourages violence or physical harm against an individual or group.
The company said it was deeply saddened by the tragedy and that its team is developing better detection tools to ensure the speedy removal of such content.
We reached out to Meta for comment but have yet to receive a response.