Free Speech Summer: The U.S. Supreme Court’s Recent Opinions Regarding Online Content Moderation (Part I)
This summer, the U.S. Supreme Court issued a series of landmark decisions, including Moody v. NetChoice and NetChoice v. Paxton, which illustrate the contentious issue of state regulation of social media platforms and those platforms’ First Amendment rights. These cases, alongside Murthy v. Missouri and the Court’s stance on Section 230 in John Doe v. Snap, Inc., underscore the complexity of balancing state regulatory efforts, free speech, and the autonomy of digital platforms. This blog post explores these important decisions and their potential impact on the future of online speech regulation.
The Supreme Court’s decision in Moody v. NetChoice marks an important moment in the ongoing debate over state regulation of social media platforms and those platforms’ First Amendment rights. This case, together with NetChoice v. Paxton, addressed whether Florida and Texas laws restricting how social media companies moderate content violate the First Amendment. The Court’s ruling on July 1, 2024, vacated the decisions of the Eleventh and Fifth Circuits and remanded the cases for further proceedings, highlighting significant concerns about the way those courts had approached the constitutional issues.
The laws in question were enacted by Florida and Texas in 2021. Florida’s SB 7072 and Texas’s HB 20 aimed to prevent social media platforms from deplatforming political candidates and required transparency in content moderation decisions. These laws were part of broader political efforts to address perceived biases against conservative viewpoints on major social media platforms. NetChoice, representing companies like Google and Facebook, challenged these laws on the grounds that they infringed on the platforms’ First Amendment right to exercise editorial judgment. The Eleventh Circuit, addressing the Florida law, found that the law’s provisions likely violated the First Amendment by compelling speech and infringing on the editorial discretion of private companies. Conversely, the Fifth Circuit upheld the Texas law, reasoning that the platforms’ content moderation did not constitute protected speech. This disagreement between the circuits set the stage for Supreme Court intervention to resolve the constitutional questions.
The Supreme Court’s opinion, written by Justice Elena Kagan, stressed that the lower courts had failed to perform a comprehensive facial analysis of the First Amendment issues. The majority opinion pointed out that the Eleventh and Fifth Circuits had only considered the laws’ impact on specific features of social media platforms rather than their broader constitutional implications. Justice Kagan emphasized the necessity of evaluating the laws’ effects on all aspects of the platforms’ operations to determine their constitutionality. Chief Justice Roberts and Justices Sotomayor, Kavanaugh, Barrett, and Jackson joined Justice Kagan’s opinion. The Court maintained the injunctions against the enforcement of both state laws, underscoring the need for a more detailed examination of the First Amendment concerns raised by these regulations. The majority opinion highlighted that the laws potentially compelled platforms to host speech against their policies, thus infringing on their editorial discretion and violating established First Amendment principles.
Justice Barrett’s concurrence underscored the importance of focusing constitutional challenges on specific provisions of the laws rather than the statutes in their entirety. She cautioned that overly broad challenges could complicate judicial review and potentially overlook important nuances in the laws’ applications. Justices Jackson and Thomas, in their concurrences, expressed concerns about the scope of the Court’s guidance to the lower courts. They echoed the sentiment that the federal judiciary should avoid ruling on the constitutionality of entire statutes and should instead address specific contentious provisions. Justice Alito, joined by Justices Thomas and Gorsuch, further emphasized the need for caution when applying constitutional standards to new technologies like social media. Alito warned against simplistic comparisons between social media platforms and traditional media, highlighting the unique challenges posed by digital communication technologies.
The Supreme Court’s decision could have meaningful implications for the future of social media regulation and First Amendment jurisprudence. By vacating the lower courts’ rulings and remanding the cases, the Court has signaled the need for a thorough and nuanced analysis of the constitutional issues at stake. This decision will likely influence how courts and lawmakers approach the regulation of online speech and the rights of digital platforms.
In addition to the majority opinion and concurring opinions, various amicus briefs played a significant role in shaping the Court’s understanding of the issues. Organizations such as the Knight First Amendment Institute and the Reporters Committee for Freedom of the Press argued that social media platforms’ content moderation decisions are protected by the First Amendment as exercises of editorial judgment. However, they also contended that this protection is not absolute and that certain regulatory measures could be justified under specific circumstances.
The ruling, however, left unresolved the broader question of whether the Florida and Texas laws are constitutional overall. Federal district courts in Texas and Florida had previously issued temporary orders barring enforcement of these laws, which are likely to remain in place during ongoing appeals. In this matter, Justice Amy Coney Barrett joined Justice Elena Kagan’s opinion but wrote separately to emphasize her belief that these cases demonstrate the perils of challenging a law in its entirety. She suggested that the internet trade groups would be better served by challenging the constitutionality of the laws as applied to their specific services rather than on their face. Justice Ketanji Brown Jackson expressed that she would have preferred not to address the merits of the Texas law at this stage, arguing that the Court should avoid deciding more than necessary when dealing with complex constitutional issues in new contexts on undeveloped records. Justice Clarence Thomas echoed this sentiment, noting that the Court’s broader discussion was unnecessary for its holding. He argued that federal courts should limit their rulings to the specific cases before them and lack the authority to declare a statute wholly unconstitutional. This perspective reflects a judicial philosophy that emphasizes restraint and specificity in constitutional adjudication.
This nuanced decision reflects a cautious approach by the Supreme Court, underscoring the importance of judicial restraint and the complexities involved in adjudicating new constitutional issues presented by modern legislative actions. The ongoing appeals and the district courts’ orders indicate that the legal battles over these state laws are far from resolved, and future rulings will continue to shape the landscape of internet regulation and constitutional law. The decision underscores the complex relationship between state regulatory interests and the constitutional protections afforded to private companies in the digital age. As these cases return to the lower courts, the detailed scrutiny mandated by the Supreme Court will shape the ongoing debate on the extent to which states can regulate the content moderation practices of social media platforms while respecting First Amendment rights.
Another summer decision (case No. 23-961, titled John Doe, Through Next Friend Jane Roe, v. Snap, Inc.) encapsulates a similar legal issue surrounding the liability of social media platforms for third-party content and the boundaries of Section 230 of the Communications Decency Act (CDA). This case arose from allegations that Snap, Inc.’s Snapchat application was used by a teacher to groom and sexually exploit a student, John Doe. The petitioner, represented through a next friend (an individual who appears in court on behalf of another who is not competent to do so, usually because they are a minor), sought to hold Snap, Inc. accountable, arguing that the platform’s design and functionality played a contributory role in the unlawful activities.
The legal journey began in the United States Court of Appeals for the Fifth Circuit, which ruled in favor of Snap, Inc., invoking Section 230 of the CDA. (This section, about which I have written extensively on the blog previously, effectively shields online platforms from being treated as publishers or speakers of third-party content, thereby granting them immunity from certain types of liability.) The petition for a writ of certiorari (an order by which a higher court agrees to review a lower court’s decision) to the Supreme Court challenged this broad interpretation of Section 230, urging a reassessment in light of modern technological developments and their associated risks.
The Supreme Court’s refusal to grant certiorari, thereby declining to hear the case, leaves the Fifth Circuit’s decision in place. This refusal is significant because it reaffirms the robust protections afforded to social media companies under Section 230, despite increasing calls for the re-evaluation of the statute. Justice Clarence Thomas, joined by Justice Neil Gorsuch, dissented from the denial, underscoring the necessity for the Court to address the expanding implications of Section 230. Justice Thomas articulated concerns that the statute, enacted in 1996, might not adequately address the complexities introduced by contemporary digital interactions and the potential harms they engender. His dissent reflects a growing judicial recognition of the need to balance technological innovation with accountability. Thomas emphasized that the legal framework established by Section 230 may no longer be sufficient to address the intricacies of modern technology, particularly when platforms potentially facilitate harmful activities. He suggested that the Court should consider revisiting and potentially recalibrating the scope of Section 230 to ensure it remains relevant and effective in the current digital landscape.
The case also shed light on the broader discourse concerning the responsibility of social media platforms in safeguarding users, especially minors, from exploitation and abuse. The petitioner argued that Snap, Inc.’s design choices, such as its anonymity features and the ephemeral nature of Snapchat messages, could foster an environment that facilitates predatory behavior. This argument sought to establish a direct link between the platform’s functionalities and the harm experienced by the petitioner, thereby challenging the traditional boundaries of platform liability under Section 230. The Supreme Court’s decision to deny certiorari underscores the enduring tension between technological innovation and legal accountability. While the denial leaves the broad protections of Section 230 intact, the dissenting opinion and the arguments presented in the amicus briefs signal a growing awareness of the need for updated legal frameworks that better address the challenges posed by modern technology.
In the second part, we turn to the Supreme Court’s decision in Murthy v. Missouri and what these three cases could mean for the future of online speech regulation.
János Tamás Papp JD, PhD is an assistant professor at Pázmány Péter Catholic University, Hungary, and a legal expert at the Department of Online Platforms of the National Media and Infocommunications Authority of Hungary. He has taught civil and constitutional law since 2015 and became a founding member of the Media Law Research Group of the Department of Private Law. He earned his JD and PhD in Law at the Faculty of Law and Political Sciences of the Pázmány Péter Catholic University. His main research fields are freedom of speech, media law, and issues related to freedom of expression on online platforms. He has a number of publications regarding social media and the law, including a book titled “Regulation of Social Media Platforms in Protection of Democratic Discourses.”