The Supreme Court actually understands the Internet
For the first time, the Supreme Court is weighing in on Section 230, the short but powerful "26 words that created the internet."
In 1996, Section 230 of the Communications Decency Act went into effect, protecting online platforms from liability for what third parties post on their sites, a protection that allowed the internet to flourish by encouraging experimentation and interactivity in its early years. More recently, Section 230 has come under scrutiny as bipartisan critics argue that it gives powerful tech companies too much cover and too little accountability.
The Supreme Court’s view on the issue was a mystery until this week, when the justices heard oral arguments in two cases involving 230. On Tuesday, the court was asked to consider whether Google is responsible for YouTube’s recommendation algorithms surfacing Islamic State videos to users. Wednesday’s case was similar but addressed Twitter’s alleged responsibility for ISIS members using its platform to recruit and fundraise. Whatever the justices decide, it will be an important moment in web history. Upholding 230 would put more pressure on Congress or regulatory agencies to come up with their own ideas for modernizing the internet’s legal underpinnings, while overhauling it would force tech companies of all sizes to adapt to avoid liability.
The direction and tone of the questioning suggest the justices are leaning toward the former, though the court’s opinions likely won’t be released for at least several months. “There doesn’t seem to be any appetite on the part of the Supreme Court to deliberately open the floodgates to lawsuits against tech companies,” James Grimmelmann, a professor of digital and information law at Cornell Law School, told me. That’s notable in part because the court hasn’t said much about platforms before, he noted. “We haven’t known anything for years, and we’ve finally found out something about where their minds are.” It looks like they may be inclined to leave the internet alone.
The court briefly discussed whether algorithms could lose Section 230 immunity if they are intentionally discriminatory; the example the justices entertained was a dating app algorithm written to block interracial matches. They also probed the role of intent: did it matter if YouTube wrote an algorithm that favored ISIS or other extremist material over more benign content, or would any algorithm still be protected by 230? These issues were not resolved. The justices hinted that they would like to see Congress tweak Section 230 if it needs tweaking, and at times they were self-deprecating about their ability to understand the issues. “We really don’t know much about these things,” Justice Elena Kagan joked on Tuesday. “You know, these are not, like, the nine greatest experts on the internet.”
For the most part, though, they got by on a pretty solid understanding of the internet. During oral arguments in the Google case, Eric Schnapper, representing the family of ISIS victim Nohemi Gonzalez, spoke at length about YouTube’s choice to display video suggestions through thumbnails, arguing that doing so creates new content from the platform. “Is there any other way they could be organized without using thumbnails?” Justice Samuel Alito asked, clearly rhetorically. (He then joked that he assumed the site might otherwise come up with “ISIS video one, ISIS video two,” and so on.) Justice Clarence Thomas asked Schnapper whether YouTube’s recommendation algorithm works any differently for, say, videos about rice pilaf than it does for ISIS videos. Schnapper said he didn’t think so, and Justice Kagan interjected: “I think what was lurking underneath Justice Thomas’s question was the suggestion that algorithms are endemic to the internet, that every time anybody looks at anything on the internet, there is an algorithm involved.” She wondered whether this line of argument would send the court down a road “such that 230 really can’t mean anything at all.”
None of the justices seemed satisfied with Schnapper’s reasoning. Justice Brett Kavanaugh summed it up as paradoxical, noting that an “interactive computer service” under Section 230 is defined as a service that “filters, screens, picks, chooses, organizes content.” If algorithms are not subject to Section 230 immunity, that would “mean that the very thing that makes the website an interactive computer service also means that it loses the protection of 230. And just as a textual and structural matter, we don’t usually read a statute to, in essence, defeat itself.”
On the second day of arguments, the court barely addressed Section 230, instead focusing almost entirely on the merits of the case against Twitter under the Justice Against Sponsors of Terrorism Act. This led to a lengthy discussion of what may or may not count as “aiding and abetting.” Would a platform be liable, for example, if it failed to enforce its own policy prohibiting terrorists from using its services? Edwin Kneedler, arguing on behalf of the Justice Department, sided with Twitter in the case, saying the law “requires more than allegations that a terrorist organization used interactive computer services that are remote from a terrorist act, were widely and routinely available to hundreds of millions, if not billions, of people through the automated features of those services, and did not single out ISIS for favorable treatment.”
The court then explored a number of hypotheticals involving pager sales, arms sales, the idea of Osama bin Laden using personal banking services, and an imagined scenario of J. Edgar Hoover telling Bell Telephone that Dutch Schultz was a gangster who was using his phone to carry out mob activity. “The discussion this morning has taken on a very academic tone indeed,” observed Chief Justice John Roberts.
In fact, both mornings were heavy with abstract arguments. The court will have to settle larger questions before anyone can figure out whether the 1,348 ISIS videos on YouTube that drew a total of 163,391 views, an average of about 121 views each, according to the case documents, amount to algorithmic amplification of terrorist content. A few weeks ago, I argued that the Supreme Court’s decision in these two cases could change the internet as we know it, especially if it rules that algorithms of all kinds are not subject to Section 230 immunity. That would render search engines inoperable and trigger a flood of lawsuits against any company that organizes content through any kind of automated process.
In taking these cases, the court was clearly interested in whether distinguishing algorithmic recommendations might be a good opportunity to reinterpret, and thereby modernize, Section 230. “I can see why it was attractive,” Grimmelmann said. “But what happened when the cases actually got to oral argument is that the justices saw how complicated it really is, and why that line is not an easy one to draw.”