Supreme Court to hear oral arguments in Google case with potential to turn the Internet upside down | CNN Business
The Supreme Court will hear oral arguments on Tuesday in the first of two cases this week with the potential to reshape how online platforms manage speech and content moderation.
Tuesday’s oral arguments are for a case known as Gonzalez v. Google, which focuses on whether the technology giant can be sued because of the algorithmic promotion of terrorist videos from its subsidiary YouTube on its platform.
According to the plaintiffs in the case, the family of Nohemi Gonzalez, who was killed in a 2015 ISIS attack in Paris, YouTube’s targeted recommendations violated a US anti-terrorism law by helping to radicalize viewers and promote ISIS’s worldview.
The complaint seeks to carve out content recommendations from the protections of Section 230, a federal law that for decades has largely shielded websites from lawsuits over user-generated content. If successful, it could expose tech platforms to a host of new lawsuits and could reshape the way social media companies run their services.
“I don’t want my daughter’s life to be erased like this. I want something to be done,” said Beatriz Gonzalez, Nohemi’s mother, in an interview with CNN. “We are looking for justice. Someone has to be responsible for what happened. Not just to me, but to many other families who have lost loved ones.”
Nitsana Leitner, the attorney for the Gonzalez family, told CNN that Google should be held accountable because by allowing ISIS videos to circulate on the platform, the company profited from the terrorist group’s activities.
“If you use the content for your own profit, you have to pay for your misconduct,” Leitner said.
Google and other tech companies have said that exempting targeted recommendations from Section 230 immunity would increase the legal risks associated with the ranking, classification and curation of online content, a core feature of the modern Internet. Google has stated that in such a scenario, websites would try to play it safe by either removing far more content than necessary or forgoing content moderation and allowing even more harmful material onto their platforms.
Friend-of-the-court filings from Craigslist, Microsoft, Yelp and others have suggested that the stakes aren’t limited to algorithms and that a ruling could end up affecting virtually anything on the web that could be construed as a recommendation. This could mean that even average Internet users who volunteer as moderators on various sites could face legal risks, according to a filing by Reddit and several Reddit volunteer moderators.
Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, the original co-authors of Section 230, argued before the Court that Congress’ intent in passing the law was to give websites broad discretion to moderate content as they see fit.
The Biden administration has also weighed in on the case. In a brief filed in December, it argued that Section 230 protects Google and YouTube from lawsuits “for failing to remove third-party content, including content it has recommended.” But, according to the government’s brief, those protections don’t extend to Google’s algorithms because they represent the company’s own speech, not that of others.
On Wednesday, the Court will hear arguments in a second case, Twitter v. Taamneh. It will decide whether social media companies can be sued for aiding and abetting a specific act of international terrorism when the platforms have hosted user content expressing general support for the group behind the violence without referring to the specific terrorist act in question.
Decisions in both cases are expected by the end of June.
– CNN’s Jessica Schneider contributed to this report.