Supreme Court Showdown: Social Media Giants vs. State Laws on Content Moderation, Ruling Could Redefine Online Landscape.
Is this the beginning of the end of the internet as we know it?


On February 26, 2024, the U.S. Supreme Court will hear oral arguments on knotty free-speech questions as it weighs the constitutionality of landmark Florida and Texas laws that restrict online social media platforms’ ability to moderate user content. The U.S. Courts of Appeals opinions in these cases present two different views of the First Amendment rights at issue. The Eleventh Circuit concluded that parts of the Florida law were likely unconstitutional because they unduly restrict the editorial judgment of covered platforms. This decision was consistent with the way a number of trial courts have characterized the First Amendment rights of websites that host user speech. In contrast, the Fifth Circuit upheld the Texas law as constitutional, saying the covered platforms were engaged in "censorship," not protected speech. A Supreme Court ruling in these cases, expected by June 2024, could severely curtail the platforms' ability to curate their newsfeeds to make them more engaging for users, drastically altering social media sites, and could have significant implications for the U.S. Congress as it considers whether and how to regulate online platforms. Whichever way the Court rules, the outcomes of the two cases could set the precedent for how content moderation is handled in the United States for decades to come. This article discusses the relevant First Amendment principles at stake, then explains the background of the two cases and the parties’ arguments at the U.S. Supreme Court.


I. Free Speech Protections for Speech Hosts.

The First Amendment prevents the government from unduly infringing speech, including the speech of private companies. The U.S. Supreme Court has long recognized that companies may be engaged in protected speech both when they create their own speech — which can include activities like designing a website for customers — and when they provide a forum for others’ speech. When a private entity hosts speech, it may "exercise editorial discretion over the speech and speakers in the forum." For instance, in Miami Herald Publishing Co. v. Tornillo, 418 U.S. 241 (1974), the Supreme Court struck down a state law requiring newspapers to publish certain pieces from political candidates. The Court reasoned that newspapers "exercise editorial control and judgment" over what material to print and how to present the content and ruled that the First Amendment prevented the government from regulating "this crucial process."

The Court has recognized these protections for editorial judgment outside the context of traditional media. To take one example, in Hurley v. Irish American Gay, Lesbian & Bisexual Group of Boston, 515 U.S. 557, 574 (1995), the Court held that a parade organizer’s decisions about who could march were expressive even though the parade as a whole did not communicate one coherent, particularized message. The Court said the organizer’s decision to include a parade unit suggested the organizer would be celebrating that group’s message, and accordingly, the First Amendment protected the decision to exclude a certain group. Federal trial courts have applied these principles to online speech, citing the First Amendment to dismiss private lawsuits that have challenged websites’ editorial decisions about what content to publish.

In some cases, however, the Supreme Court has concluded the government can force a private entity to host others’ speech. The distinction between these two lines of cases has not always been clear, but factors that may be relevant are the type of media, whether the business is providing a forum for speech, whether the host in fact exercises discretion over the speech it hosts, and whether there is a risk the third party’s speech will be attributed to the host. For instance, in United States Telecom Association v. Federal Communications Commission, 825 F.3d 674 (D.C. Cir. 2016), the U.S. Court of Appeals for the District of Columbia Circuit (D.C. Circuit) rejected a First Amendment challenge to a 2015 net neutrality order that classified broadband internet access service providers as common carriers and prevented them from blocking lawful content. Critically, the court concluded that those providers did not actually make editorial decisions in picking and choosing which speech to transmit. Then-Judge Brett M. Kavanaugh, however, disagreed with this "use it or lose it" analysis, arguing that the providers were entitled to First Amendment protection because they transmit internet content, regardless of whether they actually choose to exercise their editorial discretion.

If a court concludes that a host exercises protected editorial discretion, it may then ask what government infringements on that speech activity may be permissible. Following Tornillo and Hurley, some federal trial courts have seemed to take an absolute approach, concluding that no infringement on editorial discretion is permissible. In other contexts, however, courts have applied different levels of constitutional scrutiny that allow the government to justify infringing protected activity. Broadly, if a law compels specific messages or targets speech based on its content, courts will usually apply a demanding standard known as "strict scrutiny." Under strict scrutiny, a law is presumptively unconstitutional and must be narrowly tailored to serve compelling state interests. If a law is content-neutral, or if it primarily targets nonexpressive conduct and only incidentally regulates speech, a court may apply "intermediate scrutiny." This standard requires the restriction on speech to be no greater than essential to further an important or substantial government interest. For example, in Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622 (1994), the Court applied intermediate scrutiny to evaluate a law requiring cable operators to carry certain local broadcast stations after concluding the law’s application did not depend on the content of the operators’ programming and was instead based on special characteristics of the medium. The Court also evaluates commercial speech regulations using intermediate scrutiny; however, it has reviewed certain commercial disclosure requirements under an even more lenient standard prescribed in Zauderer v. Office of Disciplinary Counsel, 471 U.S. 626 (1985). Zauderer upheld a state law requiring attorneys to include certain statements in their advertisements after ruling that the disclosures were "reasonably related" to the state’s interest in preventing consumer deception.




II. State Laws and Procedural History.

A brief overview of Florida’s and Texas’s landmark content moderation laws and the litigation in the lower federal courts follows.

A. Florida: NetChoice v. Moody.

In May 2021, Republican Governor Ron DeSantis signed a Florida law, enacted by a Republican-controlled legislature, limiting computer services’ ability to restrict user content. The law prohibits social media platforms from "willfully deplatforming a candidate," a response to both X (at the time named Twitter) and Meta locking U.S. President Donald J. Trump out of his accounts on their platforms after his supporters stormed the U.S. Capitol on January 6, 2021. The law applies to any service that meets certain size thresholds and "provides or enables computer access by multiple users to a computer server." It thus covers not only social media sites but also, for instance, internet service providers and offline entities that provide computer access. (For ease of reference, this article uses the term "platforms" to refer to the entities covered by both states’ laws.) Florida’s law requires platforms to apply their moderation standards consistently and limits how often platforms can change their moderation rules. The law also requires platforms to publish the criteria they use for banning users and content, and to give notice and an explanation before restricting users’ content. Further, the law completely prohibits platforms from removing or restricting the content of political candidates or "journalistic enterprises." The law also contains other disclosure provisions, such as requiring platforms to share their terms of service and provide data about how many people viewed a user’s posts. Floridians would be able to sue social media platforms for violating the new law.

Two trade groups, NetChoice and the Computer & Communications Industry Association, known as CCIA, sued to enjoin this law, claiming it violated the First Amendment rights of their members. In May 2022, the U.S. Court of Appeals for the Eleventh Circuit (Eleventh Circuit) partially affirmed a preliminary injunction preventing the state from enforcing this law. The court held that the provisions limiting platforms’ ability to engage in content moderation were likely unconstitutional but rejected the constitutional challenges to most of the disclosure requirements.

The Eleventh Circuit first concluded that the law triggered First Amendment scrutiny by restricting the platforms’ exercise of editorial judgment and imposing disclosure requirements. More specifically, the court ruled that the regulated platforms "exercise editorial judgment that is inherently expressive:" They express a message of disagreement or disapproval when they "choose to remove users or posts, deprioritize content in viewers’ feeds or search results, or sanction breaches of their community standards." Accordingly, the law’s content moderation provisions — those that would prohibit restricting content or control how platforms apply or change their moderation standards — triggered either strict or intermediate First Amendment scrutiny. The court said it did not need to decide exactly what standard applied to each of these provisions because they could not withstand even intermediate scrutiny. The court rejected a hypothetical state interest in preventing private "censorship" or promoting a variety of views by citing U.S. Supreme Court precedent establishing "there’s no legitimate — let alone substantial — governmental interest in leveling the expressive playing field."

The court applied Zauderer to review the disclosure provisions, both the notice-and-explanation requirement and the other provisions. However, while the court ruled that most of the disclosure provisions satisfied Zauderer’s lenient constitutional standard, it held that the notice-and-explanation requirement did not. The court said this provision was unduly burdensome; it was “practically impossible to satisfy” and so would be likely to chill the platforms’ exercise of editorial judgment.


B. Texas: NetChoice v. Paxton.

Throughout our Nation's history, the First Amendment's freedoms of speech and press have protected private entities' rights to choose whether and how to publish and disseminate speech generated by others. See, e.g., Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1930 (2019); Hurley v. Irish-Am. Gay, Lesbian & Bisexual Grp. of Boston, 515 U.S. 557, 570, 575 (1995); Miami Herald Publ'g Co. v. Tornillo, 418 U.S. 241, 258 (1974). Over two decades ago, the U.S. Supreme Court held there is "no basis for qualifying the level of First Amendment scrutiny that should be applied to" speech disseminated on "the Internet." Reno v. ACLU, 521 U.S. 844, 870 (1997). Today, many internet websites publish and disseminate curated collections of expression generated by themselves and others.

Nevertheless, the State of Texas, much like Florida before it, has enacted a viewpoint-, content-, and speaker-based law (House Bill 20 or HB20) targeting certain disfavored "social media" websites. HB20 Section 7 prohibits these websites from making editorial choices based on "viewpoint." And HB20 Section 2 imposes burdensome operational and disclosure requirements on these websites, chilling their editorial choices. The Supreme Court has already once prevented Texas from enforcing this law against the trade groups' members. NetChoice, LLC v. Paxton, 142 S. Ct. 1715, 1715-16 (2022).

Texas’s law, enacted by a Republican-controlled legislature and signed by Republican Governor Greg Abbott in September 2021, also imposes content moderation and disclosure requirements on platforms but has a slightly different scope than Florida’s law. Texas’s law applies more narrowly to large public sites that allow users to create accounts and communicate with others "for the primary purpose" of sharing information. The law also excludes certain services such as internet service providers, email, and certain news sites. Texas’s law prohibits covered platforms from "censor[ing]" based on a user’s viewpoint, the viewpoint represented by the user, or the user’s location in Texas, although it contains exceptions allowing providers to remove certain types of unlawful or otherwise harmful content. The law requires platforms to provide users with notice and an opportunity to appeal when their content is removed. Finally, the law also imposes other disclosure requirements relating to the platforms’ content moderation standards.

Texans would also be able to sue social media companies for violating the new law. For example, to participate in the internet’s leading Star Trek forum, Reddit users must abide by a simple rule: "Be nice." So, when a user called one of the franchise’s characters a "soy boy" — a pejorative term insulting a person’s masculinity — in 2022, the discussion board’s volunteer moderators kicked him out. But the user shot back, filing a lawsuit against Reddit, Inc. under the Texas law prohibiting social media companies from removing posts or accounts based on a viewpoint — an unprecedented regulation subverting how the internet has operated for decades.

The trade groups NetChoice and CCIA also challenged the Texas law under the First Amendment. The U.S. Court of Appeals for the Fifth Circuit (Fifth Circuit), however, reversed a preliminary injunction against the law, concluding the trade groups were unlikely to succeed on their constitutional claims. (One judge dissented in part.) The Fifth Circuit recognized that "Texas was not the first State to enact a law regulating censorship by large social media platforms" and that its opinion disagreed with the Eleventh Circuit’s reasoning, including how to interpret U.S. Supreme Court precedent discussing editorial discretion. See NetChoice, LLC v. Att’y Gen. of Fla., 34 F.4th 1196 (11th Cir. 2022).

Throughout its opinion, the Fifth Circuit rejected the First Amendment arguments by characterizing the plaintiffs as asserting a "right to censor," not a protected speech right. The court said the platforms regulated by Texas’s law were "nothing like" the newspapers in Tornillo. The court concluded that the platforms "exercise virtually no editorial control or judgment," describing them as using algorithms to screen out obscenity and spam but posting "virtually everything else." The Fifth Circuit believed cases like Hurley apply only when a host is "intimately connected" with the third-party speech and said the platforms are not so connected, in part because they do not curate an overall message. In addition to declaring Tornillo and Hurley inapposite, the Fifth Circuit also stated that the U.S. Supreme Court has not recognized editorial discretion as a type of protected speech. Instead, the court held that the platforms could be treated as "common carriers subject to nondiscrimination regulation." (While the D.C. Circuit’s U.S. Telecom Association case involved common carrier classification under the Communications Act, the Fifth Circuit looked to a historical, common-law definition of common carriers as a special class of "communication and transportation providers" that must serve all comers.) In the alternative, the court ruled that even if the law did implicate the platforms’ First Amendment rights, it would trigger, at most, intermediate scrutiny, which the state could satisfy. In contrast to the Eleventh Circuit, the Fifth Circuit said protecting the free exchange of a variety of ideas is an important government interest.

Like the Eleventh Circuit, the Fifth Circuit concluded that the disclosure provisions were subject to review under Zauderer. Unlike the Eleventh Circuit, the Fifth Circuit held that all the disclosure provisions, including the notice and explanation requirements, were not overly burdensome and satisfied this level of constitutional review.


III. Party Arguments at the U.S. Supreme Court.

The U.S. Supreme Court has recently shied away from broad rulings on the free speech rights of online platforms, including in decisions last year in two cases where it sided with Google and Twitter after the companies were sued for damages over terror attacks abroad. In the two cases at issue here — Moody v. NetChoice, LLC (22-277) and NetChoice, LLC v. Paxton (22-555) — Texas and Florida have pushed for a broad ruling, arguing that social media companies should be regarded as common carriers, like telephone companies or delivery services, with fewer free speech rights and open to more regulation.

Both cases arrive at the U.S. Supreme Court for oral arguments on February 26, 2024, garbed in politics, as they concern laws in Florida and Texas aimed at protecting conservative speech by forbidding leading social media sites from removing posts based on the views they express. But the outsize question the cases present transcends ideology: It is whether tech platforms have free speech rights to make editorial judgments. Picking the apt analogy from the Court’s precedents could decide the matter, but none of the available ones is a perfect fit. If the platforms are like newspapers, they may publish what they want without government interference. If they are like private shopping centers open to the public, they may be required to let visitors say what they like. And if they are like phone companies, they must transmit everyone’s speech. What is clear is that the Court’s decision, expected by June 2024, could transform the internet.

Both Florida and the trade groups appealed the Eleventh Circuit’s ruling to the Supreme Court, and the trade groups appealed the Fifth Circuit’s ruling. The U.S. Supreme Court agreed to hear Moody v. NetChoice (Florida’s appeal) and NetChoice v. Paxton (the Texas case), limited to two questions presented by Elizabeth Prelogar, the 48th Solicitor General of the United States and the fourth-ranking official at the U.S. Department of Justice, in her brief for the United States as amicus curiae: whether the laws’ "content-moderation restrictions" and "individualized-explanation requirements" comply with the First Amendment. Thus, the Court did not agree to consider all the laws’ disclosure provisions, only the provisions requiring platforms to explain their moderation decisions.

In their briefs, the trade groups claim the content moderation restrictions and explanation requirements in both laws violate the First Amendment by forcing private parties to host speech with which they disagree. The trade groups cite Tornillo and Hurley as recognizing constitutional protections for private parties’ editorial judgments. The groups contend these principles extend online, including to platforms’ post-publication review of user content. Responding to the Fifth Circuit opinion, they point out that Hurley found the parade organizer to be "intimately connected" to the speech it compiled even though the parade failed to convey "an exact message." The covered platforms are not "common carriers," they claim, because the platforms "constantly engage in editorial filtering ... pursuant to policies they publish and enforce." In their view, the fact that platforms make "individualized determinations about which speech to disseminate and how" — unlike a common carrier — was the very reason Florida and Texas enacted laws limiting this discretion. The United States filed a brief in support of the trade groups, defending the Eleventh Circuit’s approach to editorial discretion — although that brief also suggested the trade groups’ view of speech hosts’ rights sweeps too broadly at times.

The trade groups argue that the Florida and Texas laws should be subject to strict scrutiny because the laws compel speech and contain other content-based distinctions. The Florida law’s focus on journalistic enterprises and speech by or about political candidates arguably singles out specific subject matters for different treatment. The Texas law contains exceptions that allow sites to censor specific types of content and completely excludes news, sports, and entertainment sites, apparently based on the content of the speech they carry. Finally, more specifically addressing the notice-and-explanation requirements, the trade groups observe that the U.S. Supreme Court has never applied Zauderer to uphold a disclosure requirement outside the context of correcting misleading advertising and argue that Zauderer’s lenient review should not apply to requirements that "have nothing to do with advertising."

Both Florida and Texas portray their laws as permissible nondiscrimination regulations, arguing that the laws regulate nonexpressive conduct rather than speech. Florida, for instance, claims that the platforms’ hosting of third-party speech is "inherently nonexpressive conduct" because the platforms "are generally open to all users and content" and most often do not make individualized decisions about whether to allow specific content. Florida contrasts this to the "deliberate selection and expression" at issue in cases like Hurley. Further, Florida asserts that the platforms’ decisions about how to arrange content are not made with the intent to convey a message or promote certain content. It further claims that the platforms do not have an expressive interest in censoring journalistic enterprises or political candidates.

Texas’s arguments in favor of its law are somewhat different. Texas argues that its law regulates conduct because the platforms’ "‘dominant market shares’ allow them to exercise ‘unprecedented’ and ‘concentrated control’ over the world’s speech" — including by "favoring certain viewpoints." Texas focuses on historical regulation of certain media of communication, asserting that the covered platforms "are today’s descendants" of common carriers. A group of almost 20 states filed an amicus brief in support of Florida and Texas that raised somewhat similar arguments, focusing largely on the historical precedent for regulating platforms for mass communications. Like the Fifth Circuit, Texas claims there is no First Amendment right of "editorial discretion." Instead, Texas argues that, to avoid being treated as common carriers, platforms would (among other things) have to provide services under individualized contracts that vary by customer rather than on general terms to all customers.

In the alternative, both Florida and Texas argue that even if their content moderation provisions implicate speech, they are content-neutral laws that should survive intermediate scrutiny. Both states also claim their notice-and-explanation provisions should be upheld under Zauderer, saying that the Supreme Court has never expressly limited that case to the advertising context and arguing that any compliance burdens on platforms will be minimal.


IV. Considerations for Congress.

These cases at the U.S. Supreme Court could shake up a broad swath of the online landscape and define how lawmakers regulate social media. They pit Texas and Florida laws against the First Amendment rights of large internet companies as the U.S. Congress has remained largely at a standstill on internet policy. Both involve the constitutionality of state laws, but some Members of the 118th Congress have expressed interest in regulating online platforms and, more specifically, in regulating online content moderation. For instance, one bill would make it unlawful for a social media service to "de-platform" citizens based on their "social, political, or religious status." Some bills would limit the scope of a federal immunity provision known as Section 230 with the goal of disincentivizing sites from restricting user content — for example, allowing liability if certain sites engage in content moderation activity that promotes "a discernible viewpoint." A number of other bills would impose various transparency requirements on platforms, such as requiring platforms to disclose their terms of service, including their content moderation practices. Some of these bills would specifically require an explanation of platforms’ decisions to restrict content and require platforms to provide a complaint process to appeal the decision. If the Supreme Court clarifies the scope of constitutional protections for editorial judgment and, more specifically, weighs in on the constitutionality of Florida’s and Texas’s content moderation and disclosure provisions, its opinion could affect Congress’s ability to enact similar proposals.


Hat tip to Valerie C. Brannon, Legislative Attorney at the Congressional Research Service, for contributing content, supplying hyperlinks, and sharing quality research. Founded in 1914, the Congressional Research Service is a public policy research institute of the United States Congress. Operating within the Library of Congress, it works primarily and directly for members of Congress and their committees and staff on a confidential, non-partisan basis.



