Published: February 22, 2023

Supreme Court Hears Arguments Concerning Tech Giants’ Liability for Social Media Content

By The Editor

WASHINGTON, D.C. – The Internet could soon face a significant transformation, courtesy of the Supreme Court.
Justices heard arguments in a case questioning whether tech companies should be liable for content on their platforms. Currently, a decades-old law protects big tech. However, that could change with a landmark decision.

Eight years ago, 23-year-old American college student Nohemi Gonzalez was killed in a series of coordinated ISIS terror attacks in Paris.

In seeking justice, her family isn’t going after the terrorists. Instead, they’re targeting YouTube, a platform owned by Google.  
“Hopefully by this, it’ll change the laws and it’ll be for the good,” said Nohemi’s father, Jose Hernandez. 
Nohemi’s family claims YouTube’s recommendation algorithm promoted ISIS-produced materials, radicalizing the extremists who killed their daughter.

In Gonzalez v. Google, the family is suing the tech giant in a case that’s made it all the way to the Supreme Court.

Tuesday, justices listened to arguments centering on Section 230 of the Communications Decency Act.

The 1996 law shields Internet companies from being sued over content uploaded by third-party users.

The Heritage Foundation’s Jake Denton told CBN News, “So Section 230 has really been invoked across all sorts of cases because the courts have given them, tech companies, broad immunity with Section 230. They’ve interpreted the original text of the law to be far more encompassing than it was originally intended to be, and I think it’s best seen through the algorithm issue.”

During Tuesday’s hearing, Eric Schnapper, the family’s attorney, argued YouTube is not simply hosting third-party content, because the company uses an algorithm that makes unsolicited video recommendations to its users.

According to Schnapper, that activity is not protected by Section 230. “What we’re saying is that, insofar as they were encouraging people to go look at things, that’s what’s outside the protection of that statute, not that the stuff was there.”
  
Google, meanwhile, argued those algorithmic recommendations do fall under the statute’s protections, because the company is acting as a publisher when related videos are listed.

Google attorney Lisa Blatt said, “‘Publication’ means communicating information. So when websites communicate third-party information and the plaintiff’s harm flows from that information, (c)(1) bars the claim.”
 
During their questioning, the justices agreed tech companies should not have blanket protection, although they struggled to find the line where companies would become liable. The justices frequently wondered aloud if this should be an issue for the courts or for Congress. 

“We’re a court. We really don’t know about these things. These are not, like, the nine greatest experts on the Internet,” Justice Elena Kagan said.

Schnapper believes the court should simply apply the statute the way it was written, while also agreeing Congress has a role to play.

“It will inevitably happen and has happened that companies devise practices which are maybe highly laudable but they don’t fit within the walls of the statute. That will continue to happen no matter what you choose to do, and the answer to that when someone devises some new practices – the industry has to go back to Congress and say we need you to broaden the statute because you wrote this to protect chat rooms in 1996, and we want to do something that doesn’t fit within the statute,” Schnapper explained.

On Wednesday, the justices hear another important Internet regulation case, this one with Twitter as the defendant. It also deals with Section 230 protections, questioning whether the tech giant can be held liable for failing to keep ISIS off its platform.

This article is available in its entirety at CBN.

