Technology

Big Tech is about to have an epic week in the Supreme Court

The stakes are high as the Supreme Court takes its first look at a law Republicans and Democrats have both criticized for giving too much protection to the tech industry.

Tech companies are bracing for the U.S. Supreme Court to hear one of the most consequential cases the trillion-dollar industry has faced to date, one whose outcome could make them liable for recommending harmful content on their platforms.

The case, Gonzalez v. Google, seeks to hold Google’s YouTube liable for the death of a woman killed in a 2015 terrorist attack. Her family is suing the company for recommending ISIS videos used to recruit potential terrorists for attacks across the world, contending that a federal liability shield for tech platforms doesn’t protect YouTube’s use of algorithms to recommend content.

The case marks the court’s first look at Section 230 of the Communications Decency Act — a law passed in 1996 that gives internet providers and similar companies legal protection for hosting most posts from users, as well as for moderating and removing such content.

The case, which goes to oral arguments before the court on Tuesday, specifically tests whether social media platforms’ use of algorithms to recommend content to users is protected under Section 230. The court’s ruling could reshape the entire online ecosystem, including social media, e-commerce and job portals — all of which use algorithms to promote content to users.

Platforms say that if the liability shield doesn’t cover their use of targeted algorithms to recommend and promote content, some companies would remove users’ speech more aggressively or bar discussion of controversial topics for fear of being sued.

In recent years, as social media platforms have come under increasing fire for the harms caused by content they host, Section 230 has become a target for politicians on both the left and the right who see it as granting the industry special protections not enjoyed by traditional publishers. (Both President Joe Biden and former President Donald Trump have called for removing the shield. Biden has yet to back any specific proposals.) Its supporters argue it’s crucial to a free and open internet where users can exchange ideas without the threat of liability shutting platforms down.

To date, Congress has largely failed to act, outside of passing a 2018 carveout to the law related to sex trafficking. The disagreement stems from Democrats wanting platforms to remove more content related to extremism and hate speech, while Republicans want more content — particularly conservative speech — left up.

Here are four things to watch going into Tuesday’s oral arguments:

Can Clarence Thomas form a winning coalition?

Thomas, a frequent critic of Section 230, has twice written separately urging his colleagues to take up a case reviewing what he sees as the lower courts’ overly broad interpretation of the law in favor of tech companies.

A key question Tuesday is whether Thomas can persuade four other justices to join him for a majority. Two potential allies could be Justices Samuel Alito and Neil Gorsuch, who joined Thomas in a dissent last May in a separate tech industry case before the court, NetChoice v. Paxton, arguing for upholding a Texas law requiring social media platforms to host all users’ political viewpoints.

“Alito and Gorsuch are his most likely allies in this case, and the question I think then is whether he can grab a couple others, and it’s not clear to me whether he can,” said Anupam Chander, a professor of law and technology at Georgetown Law.

And the bipartisan nature of the pressure to change Section 230 protections has experts watching to see if that is reflected in any decision from the justices. “There’s a kind of strange bedfellows aspect to tech regulation currently with everyone mad at tech companies for the opposite problems — the left accusing it of allowing too much speech, and the right accusing it of censoring too much speech,” Chander said.

The importance of algorithms

Legal scholars and lawmakers said that among those most affected by a ruling against Google could be smaller internet companies and individual website users, such as volunteer moderators for Reddit.

Large platforms such as YouTube could likely absorb the liability risk of continuing to use recommendation algorithms if the justices rule against Google. But some lawmakers fear such a decision would be financially crippling for small businesses and startups.

“If you harm the little guys and you harm moderation, you’re going to reduce innovation, competition and opportunities, and give the big guys — like Facebook and Google — even more of the online market,” said Sen. Ron Wyden (D-Ore.), one of the original authors of Section 230.

Without algorithms that rely on user preferences to push recommendations, websites would likely present content in reverse chronological order, said Jeff Kosseff, a cybersecurity law professor at the U.S. Naval Academy who wrote a book on the history of Section 230.

“I don’t know if the American public is ready for not having personalized algorithms anymore,” Kosseff said. “How does TikTok operate without personalized algorithms? You just get any random video that’s ever been posted?”

But some legal scholars said tech companies should be liable when their products and services break the law, just like any other business.

Mary McCord, the executive director of Georgetown Law’s Institute for Constitutional Advocacy and Protection, doesn’t believe tech companies’ “sky is falling” hyperbole that internet platforms will shut down if they can’t use recommendation algorithms. “They’ve just had this free pass since their inception — not even having to worry about the kinds of risks that every other company has had to face,” said McCord, who filed an amicus brief in the case on behalf of former national security officials.

McCord, who was an acting assistant attorney general for national security in the Obama administration, said that in 90 percent of terrorist incidents, social media factored significantly into the radicalization of individuals committing the attacks.

Republican Party split

In amicus briefs filed with the court, Republican lawmakers are split on how the justices should rule. That division may make it harder to predict how the conservative justices will land on the case as well — either siding with arguments that tech’s legal shield is too broad or that it’s necessary to protect free speech.

Sen. Josh Hawley (R-Mo.) called for the court to narrow its reading of Section 230 to align it more strictly with the statute’s text, saying lower courts have interpreted the law too broadly in tech’s favor. Similarly, Sen. Ted Cruz (R-Texas), along with 16 other Republican members of Congress, argued that the court needs to narrow the law’s scope, saying it gives large tech companies too much power over which speech is allowed — or “censored” — on their sites.

In contrast, former Pennsylvania Republican Sen. Rick Santorum’s amicus brief said narrowing the law’s interpretation would suppress speech, adding that Section 230 specifically allows companies to “filter,” “choose,” and “organize” content.

The split in the GOP between traditionally business-friendly conservatives and a more populist anti-tech contingent makes for a difficult balancing act. “Historically, conservatives have sought to reduce litigation risk for corporations,” Georgetown’s Chander said. “Section 230 very much does that.”

But he added that, today, conservatives are taking “an anti-big business stance — and a new populism stance in doing so — that coincides with a kind of irritation with what they see as anti-conservative bias by technology companies.”

Gonzalez ruling may influence upcoming tech cases

How the Supreme Court rules in Gonzalez could affect its decision in a tech case scheduled for arguments the following day — Twitter v. Taamneh. That case asks whether Twitter, Google and Facebook can be held liable under the Justice Against Sponsors of Terrorism Act for allegedly aiding and abetting terrorists by sharing ISIS recruitment content.

The 9th U.S. Circuit Court of Appeals ruled in an opinion consolidating the two cases that the plaintiffs’ Anti-Terrorism Act claims in Gonzalez were barred under Section 230. In Taamneh, it found the platforms could be held liable for aiding and abetting an act of international terrorism by permitting ISIS to post content on their sites.

The Biden administration filed a brief recommending the Gonzalez case be sent back to the 9th Circuit, arguing that Section 230 does not immunize YouTube when its algorithm recommends ISIS content.

Legal scholars said the justices will likely rule on the cases in tandem. Chander predicts the court will find that Section 230 doesn’t provide immunity for YouTube’s targeted algorithms in Gonzalez, but will rule in favor of Twitter, Google and Facebook in Taamneh, finding they can’t be held liable on the underlying claims that they aided and abetted terrorist acts by hosting ISIS content.

It could also tee up the justices for a potential ruling in two other cases, likely punted to next term, involving Republican laws from Texas and Florida that bar platforms from removing users’ viewpoints and deplatforming candidates. The companies said the laws violate their free speech rights.

But Daphne Keller, a director at Stanford’s Cyber Policy Center, said a ruling in Gonzalez that finds Google’s recommendation algorithms aren’t protected under Section 230 may backfire if the court later upholds the Texas and Florida laws that ban platforms from removing content.

“If Texas and Florida win their cases, then people can sue because platforms took their content down, even though the whole reason the platforms took the content down was to avoid the liability that Gonzalez created,” Keller said.

“It’s so circular, and I’m not sure the court realizes that.”