Florida and Texas laws are moving inexorably towards the Supreme Court
The Supreme Court will almost certainly decide how social media platforms can moderate content. (Tingey Injury Law Firm/Unsplash)

Everything has aligned to make it all but inevitable that the Supreme Court will weigh in on how social media platforms moderate content. Earlier this year, Florida and Texas passed laws that would prohibit social media companies from making viewpoint-based content moderation decisions. Big Tech challenged the laws in court, and two federal appeals courts have now issued conflicting rulings. The U.S. Court of Appeals for the 11th Circuit struck down most of Florida’s law, while the U.S. Court of Appeals for the 5th Circuit upheld the Texas law. The Florida attorney general has asked the Supreme Court to break the tie. The Supreme Court could decline to hear the case, of course, but with lower courts and now two circuit courts in disagreement, that would be very surprising. Although the 5th Circuit’s decision would have reinstated the Texas law immediately, the Supreme Court issued an emergency ruling keeping the law on hold until either the full 5th Circuit rehears the case or the Supreme Court takes it up.

Daphne Keller, who directs the Program on Platform Regulation at Stanford's Cyber Policy Center, has pointed out that viewpoint-neutral moderation is not as innocuous as it sounds and carries real consequences. For example, if a platform removes pro-Nazi speech, must it also take down anti-Nazi speech? If it removes posts that encourage anorexia and other eating disorders, must it also take down posts that promote healthy attitudes toward food and nutrition? Most people involved in online Trust and Safety (or, really, anyone who has ever used social media) expect these laws to bring a flood of hate speech, misinformation, and violent content.

The Florida and Texas laws are not obvious contenders for Supreme Court consideration, since there is a well-established First Amendment principle that the government cannot tell newspapers, broadcasters, cable companies, and even parades what their content must include. There is a long history of cases upholding editorial discretion for all kinds of publishers. In fact, NetChoice, one of the organizations representing Big Tech, has made precisely that argument: the platforms’ moderation decisions are akin to the editorial decisions of newspapers and TV stations, and are therefore protected speech under the First Amendment. Accepting that argument sets the legal bar very high for the government to intervene in companies’ moderation decisions.

The 5th Circuit sees it differently. In the court’s view, online platforms do not have First Amendment protections for their moderation decisions: apart from removing illegal content, it holds, any content moderation amounts to censorship. “Today we reject the idea that corporations have a freewheeling First Amendment right to censor what people say,” wrote Judge Andrew Oldham in the court’s decision. Where social media companies argue that lawful-but-awful speech can ruin online discourse and have real-life consequences, Judge Oldham sees an “obsession with terrorists and Nazis” full of “fanciful” and “hypothetical” ideas. Of course, anyone connected to Trust and Safety knows that the risks posed by Nazi and terrorist content are central to much of what we do.

The Texas law imposes transparency obligations, a must-carry provision, and a requirement that platforms continue to serve Texas residents (in other words, they can’t simply block Texas users to avoid the law). The transparency obligations are largely uncontested; the must-carry and geographic provisions are at the heart of the matter. For these laws to be constitutional despite the obvious First Amendment protections, courts must treat social media platforms as common carriers, much like telephone companies and internet backbone providers, with no say in what happens on their networks. There is at least some sympathy for this idea on the Supreme Court. In a separate opinion in a case that didn’t even touch on the matter, Justice Thomas went out of his way to express his view that platforms should be treated as common carriers, and he has said that platform companies should not be moderating content at all.

The consequences of a Supreme Court decision could reach well beyond what Texans and Floridians see online. According to a report shared with The Washington Post by the Computer & Communications Industry Association, another group representing tech companies in the courts, more than 100 bills aiming to regulate social media content moderation are pending in state legislatures across the country. Those legislators will be watching closely as the litigation over the Florida and Texas laws plays out.

It’s a lot to keep track of. Subscribe to our newsletter to stay updated as the courts try to untangle this mess of regulations.

If you would like more information, or are exploring options for AI-enhanced moderation for your platform, contact us at [email protected]. Alternatively, you can visit our website, www.checkstep.com.
