Temporary security fences line the road leading to the plaza at the US Supreme Court Building in Washington, DC, April 19, 2023.
Silicon Valley scored a major victory Thursday as the Supreme Court sided with online platforms in two lawsuits that legal experts warned could have turned the internet upside down.
Both decisions preserve social media companies' ability to avoid lawsuits over terrorist content -- and represent a defeat for tech industry critics who view the platforms as irresponsible.
In doing so, the court sided with the tech industry and digital rights groups, who argued that greater accountability for tech platforms could undermine the fundamental functionality of many websites and could even create legal risks for individual internet users.
In the first of the two cases, Twitter v. Taamneh, the Supreme Court ruled that Twitter could not be held liable for aiding and abetting terrorism merely because it hosted tweets by the terrorist group ISIS.
In Gonzalez v. Google, another closely watched case about social media content moderation, the court sidestepped a request to narrow websites' primary liability shield, Section 230 of the Communications Decency Act. Thursday's ruling left intact a lower court decision that shielded social media platforms from a sweeping content moderation lawsuit.
The decision in the Twitter case was unanimous and was written by Justice Clarence Thomas, who said that social media platforms are little different from other digital technologies.
"It could be that bad actors like ISIS can use platforms like those of the defendants for illegal — and sometimes horrific — purposes," Thomas wrote. "But the same goes for mobile phones, email or the internet."
Thomas' opinion reflects the difficulty the court faced, during oral arguments, in determining which types of speech hosted on social media should give rise to liability and which merit protection.
"I think the courts will recognize the importance of these platforms for billions of people to communicate and stop meddling with them," said Samir Jain, vice president for policy at the Center for Democracy and Technology, which issued a brief in support of the tech industry submitted .
For months, many legal experts viewed the Twitter and Google cases as a sign that the court might seek sweeping changes to Section 230, a law that has drawn bipartisan criticism over tech companies' content moderation decisions. Thomas, in particular, had expressed interest in hearing a Section 230 case.
The expectation of highly consequential outcomes in both cases prompted what Kate Klonick, a law professor at St. John's University, described as a flood of amicus briefs.
As oral arguments unfolded, however, and the justices visibly grappled with the complexities of internet speech, the chances of sweeping changes appeared to diminish.
"I think it's slowly starting to move into the realm of possibility ... maybe the courts don't know what the hell is going on with these cases and maybe they're singled out as activists, but they're not ready to be those activists." .” said KlonickTweets.
Daphne Keller, director of the Platform Governance Program at Stanford University, agreed.
"I think it's proof that we're all saying, 'The Supreme Court is trying the wrong cases that don't raise the issues that they really want,'" Keller told CNN.
The justices may soon have another opportunity to weigh in on social media. The court is still deciding whether to hear cases involving the constitutionality of state laws passed by Texas and Florida that restrict online platforms' ability to moderate content. But the way the court handled the Twitter and Google cases suggests it is likely to approach new cases with caution.
"The fact that the judges are playing it safe is in itself a good sign that they have a more sophisticated understanding of these issues than many fear," said Evelyn Douek, an assistant professor at Stanford Law School.
In Thursday's ruling in the Twitter case, the court held that hosting terrorist speech does not generally create indirect legal liability for specific acts of terrorism, effectively raising the bar for such claims in the future.
"We conclude," Thomas wrote, "that the allegations made by the plaintiffs do not provide sufficient evidence that these defendants aided and abetted ISIS in conducting the relevant attacks."
He stressed that the plaintiffs "have not alleged that the defendants knowingly provided substantial assistance in the attacks at issue," nor that they gave ISIS such pervasive and systematic support as to make them responsible for "every single ISIS attack."
The Twitter v. Taamneh case focused on whether social media companies can be sued under US anti-terrorism law for hosting terrorist content that has only a remote connection to a specific terrorist attack.
The plaintiffs in the case, the family of Nawras Alassaf, who was killed in an ISIS attack in Istanbul in 2017, alleged that social media companies, including Twitter, knowingly aided ISIS in violation of federal anti-terrorism law by allowing some ISIS content to persist on their platforms despite policies intended to restrict it.
"The myriad of companies, academics, content creators and civil society organizations that have joined us in this cause will be pleased with this outcome," said Halimah DeLaine Prado, Google's general counsel, in a statement. Rest assured.” “We will continue our efforts to protect freedom of expression online, fight harmful content and support the businesses and creators who benefit from the internet.”
Twitter did not immediately respond to a request for comment.
Google lawsuit dismissed, Section 230 unchanged
In a brief, unsigned opinion, the court sidestepped the lawsuit against Google, leaving intact a lower court ruling that found Google immune from allegations that its subsidiary YouTube aided and abetted terrorism.
The result could come as a relief not only to Google but also to the many websites and social media companies that urged the Supreme Court not to cut back legal protections for the internet.
In the unsigned opinion, the court said: "We therefore decline to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief. Instead, we vacate the judgment below and remand the case for the Ninth Circuit to consider plaintiffs' complaint in light of our decision in Twitter."
The Google case turned on whether its subsidiary YouTube can be sued for algorithmically promoting terrorist videos on its platform.
The family of Nohemi Gonzalez, who was killed in the 2015 ISIS attacks in Paris, alleged that YouTube's targeted recommendations violated US anti-terrorism law because they helped radicalize viewers and promote ISIS's worldview.
The suit sought to carve out content recommendations from the protections of Section 230, which could have exposed technology platforms to greater liability for how they operate their services.
Google and other tech companies argued that this interpretation of Section 230 would increase the legal risks associated with ranking, sorting and curating online content, a basic feature of the modern internet. In that scenario, Google claimed, websites would try to play it safe either by removing far more content than necessary or by giving up content moderation altogether, allowing more harmful material onto their platforms.
Amicus curiae filings from Craigslist, Microsoft, Yelp and others argued that the risks are not limited to algorithms and could ultimately affect almost anything on the web that might be construed as a recommendation. That could mean that even ordinary internet users who volunteer as moderators on various websites could face legal risks, according to a filing from Reddit and several of its volunteer moderators.
Oregon Democratic Sen. Ron Wyden and former Rep. Chris Cox, a California Republican, the original co-authors of Section 230, argued in a brief to the court that Congress passed the law to give websites broad discretion to moderate content as they see fit.
The Biden administration also weighed in on the case. In a brief filed in December, it argued that Section 230 protects Google and YouTube from lawsuits alleging a "failure to remove third-party content, including content they recommend." But the government's brief argued that those protections do not extend to Google's own algorithms, because they reflect the company's judgments rather than the speech of others.
This story is developing and will be updated.