Two Supreme Court cases this week could upend the entire Internet | CNN Business
The Supreme Court will hear back-to-back oral arguments this week in two cases that could significantly reshape online speech and content moderation.
The outcome of oral arguments, scheduled for Tuesday and Wednesday, could determine whether technology platforms and social media companies can be sued for recommending content to their users or for supporting acts of international terrorism by hosting terrorist content. It marks the Court’s first review of a federal law that largely protects websites from lawsuits over user-generated content.
The closely watched cases, known as Gonzalez v. Google and Twitter v. Taamneh, carry significant stakes for the Internet at large. An expansion of the legal risk for apps and websites to host or promote content could lead to major changes for sites like Facebook, Wikipedia, and YouTube, to name a few.
The litigation has produced some of the most intense rhetoric in recent years in the technology sector about the potential impact on the future of the Internet. US lawmakers, civil society groups and more than two dozen states have also joined the debate with submissions to the Court.
At the heart of the legal battle is Section 230 of the Communications Decency Act, a nearly 30-year-old federal law that courts have repeatedly said provides broad protection to technology platforms, but which has come under scrutiny amid growing criticism of Big Tech's content moderation decisions.
The law has critics on both sides of the aisle. Many Republican officials allege that Section 230 gives social media platforms a license to censor conservative viewpoints. Prominent Democrats, including President Joe Biden, have argued that Section 230 prevents tech giants from being held liable for spreading misinformation and hate speech.
In recent years, some in Congress have called for changes to Section 230 that could expose tech platforms to more liability, along with proposals to change US antitrust rules and other bills aimed at reining in dominant tech platforms. But those efforts have largely stalled, leaving the Supreme Court as the most likely source of change in the coming months in how the United States regulates digital services.
Decisions in the cases are expected by the end of June.
The case involving Google centers on whether it can be sued over its subsidiary YouTube’s algorithmic promotion of terrorist videos on its platform.
According to the plaintiffs in the case, the family of Nohemi Gonzalez, who was killed in a 2015 ISIS attack in Paris, YouTube's targeted recommendations violated a US anti-terrorism law by helping to radicalize viewers and promote ISIS's worldview.
The suit seeks to carve out targeted recommendations from Section 230's protections, potentially exposing technology platforms to more liability for how they run their services.
Google and other tech companies have said that this interpretation of Section 230 would increase the legal risks associated with the sorting, ranking and curation of online content, a core feature of the modern Internet. Google has argued that in such a scenario, websites would try to play it safe by either removing far more content than necessary or forgoing content moderation altogether and allowing even more harmful material onto their platforms.
Friend-of-the-court filings from Craigslist, Microsoft, Yelp and others have suggested that the stakes aren't limited to algorithms, and that a ruling could end up affecting virtually anything on the web that could be construed as a recommendation. That could mean even average Internet users who volunteer as moderators on various sites might face legal risks, according to a filing by Reddit and several volunteer Reddit moderators. Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, the original co-authors of Section 230, argued before the Court that Congress' intent in passing the law was to give websites broad discretion to moderate content as they see fit.
The Biden administration has also weighed in on the case. In a brief filed in December, it argued that Section 230 protects Google and YouTube from lawsuits "for failing to remove third-party content, including content it has recommended." But, according to the government's brief, those protections don't extend to Google's algorithms because they represent the company's own speech, not that of others.
The second case, Twitter v. Taamneh, will decide whether social media companies can be sued for aiding and abetting a specific act of international terrorism when the platforms have hosted user content expressing general support for the group behind the violence without referring to the specific terrorist act in question.
The plaintiffs in the case – the family of Nawras Alassaf, who was killed in an ISIS attack in Istanbul in 2017 – have alleged that social media companies, including Twitter, knowingly aided ISIS in violation of the US anti-terrorism law by allowing some of the group's content to persist on their platforms despite policies intended to limit that type of content.
Twitter has said that ISIS's use of the company's platform to promote itself does not constitute Twitter's "knowing" assistance to the terrorist group, and that in any event the company cannot be held liable under the anti-terrorism law because the content at issue in the case was not specific to the attack that killed Alassaf. The Biden administration, in its brief, has agreed with this view.
Twitter had also previously argued that it was immune from suit thanks to Section 230.
Other tech platforms such as Meta and Google have argued in the case that if the Court finds tech companies cannot be sued under the US anti-terrorism law, at least under these circumstances, that would sidestep the Section 230 debate in both cases, because the claims at issue would fail on their own.
In recent years, however, several Supreme Court justices have shown an active interest in Section 230 and have appeared to invite opportunities to hear cases related to the law. Last year, Supreme Court Justices Samuel Alito, Clarence Thomas and Neil Gorsuch wrote that new state laws, like the one in Texas, that would force social media platforms to host content they would prefer to remove raise questions of "great importance" about "the power of dominant social media corporations to shape public discussion of the important issues of the day."
A number of petitions are currently pending that ask the Court to review the Texas law and a similar law passed by Florida. Last month, the Court delayed a decision on whether to hear those cases, instead asking the Biden administration to weigh in with its views.