Legal Notes by David Urban

Will these five landmark social media rulings impact your city?

David Urban is senior counsel with Liebert Cassidy Whitmore, representing public agencies in all aspects of labor and employment law. He can be reached at durban@lcwlegal.com.


The U.S. Supreme Court in its 2023-2024 term confronted the issue of First Amendment free speech rights and social media directly. The court issued an unprecedented number of decisions that clarify the rights and obligations of the public, government, and social media companies in this still-developing domain of expression. Here is what cities should know about these five landmark cases.

New test for city officials’ social media accounts

Can a city manager, council member, or other member of a government agency be liable under the First Amendment when maintaining their own personal Facebook, X/Twitter, Instagram, or other social media page on the grounds that they wrongfully censored the public? In two cases this term, Lindke v. Freed and O’Connor-Ratcliff v. Garnier, the Supreme Court confirmed that such officials can be liable, but only if they used their social media pages in particular ways.

O’Connor-Ratcliff involved two members of the Poway Unified School District Board of Trustees near San Diego. In 2014, those individuals created public Facebook and Twitter pages to promote their campaigns for office and continued to use those pages to discuss district-related issues. The second case, Lindke v. Freed, centered on the Facebook page of James Freed, the city manager of Port Huron, Michigan, who posted both personal content (such as family events and picnics) and content related to his job.

Both Freed and the trustees suppressed posts by commentators who provided unwelcome content, and those commentators, Kevin Lindke and the Garniers, sued for violation of their First Amendment rights.

In Lindke, the Supreme Court’s opinion set forth a test for determining when “state action” exists under these circumstances, i.e., when the official can be held liable for unlawful censorship. The official must have both (1) possessed actual authority to speak on the state’s behalf on a particular matter, and (2) purported to exercise that authority when speaking in the relevant social media posts. 

Justice Barrett, writing for the majority, noted that “officials too have the right to speak about public affairs in their personal capacities,” and “[l]est any official lose that right, it is crucial for the plaintiff to show that the official is purporting to exercise state authority in specific posts.” The Supreme Court sent the cases back to the lower courts to apply this test.

The cases highlight the need for city officials to exercise caution in how they use their social media accounts. City officials may want to separate work and personal accounts. It would also be prudent to comply with constitutional free speech standards when moderating content for social media pages related to a city official’s work (in case state action did exist for any of the officials’ posts). This involves, among other things, providing sufficiently specific standards for what commentator posts can be removed and avoiding viewpoint discrimination.

Content moderation cases raise red flags over creative First Amendment arguments 

Can a city issue directives that control how social media companies themselves engage in content moderation? It is far-fetched to believe that a city would attempt such an encroachment on social media companies. But two states had concerns about companies’ apparent suppression of viewpoints and tried to do so — giving rise to Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton. In those cases, Florida and Texas respectively passed laws that purported to restrict how Facebook, X, Instagram, and other companies moderate content.

In their legal challenges, the social media companies turned the tables by arguing that they held the true First Amendment rights through their moderation decisions, which they analogized to editorial control by a newspaper or magazine. Such periodicals have the right to edit articles submitted to them and to pick which articles to publish.

The Supreme Court vacated the decisions of both lower appellate courts on the basis that the record had not been sufficiently developed, among other things. It sent the cases back to the lower courts to hold further proceedings and evaluate the cases based on a more in-depth presentation of evidence and issues.

The court’s majority opinion, written by Justice Kagan, predicted that Facebook and YouTube in particular would win their First Amendment arguments for important parts of their content moderation, i.e., that the moderation itself did constitute First Amendment-protected speech. Several justices disagreed and did not join in the part of the majority opinion making the prediction.

The significance of these cases for cities is secondary since local governments have not entered the fight on content moderation on the internet. But the cases raise red flags about government regulation of enterprises that can invoke free expression as part of their business model. This is true even though it may be counterintuitive that something like editorial control can itself constitute free expression. More broadly, Moody and Paxton encourage creative First Amendment arguments to oppose government restrictions in any form.             

Content moderation ruling could open the door to liability

When does government speech directed at private social media companies in an effort to force them to make certain content moderation decisions amount to state action in violation of the First Amendment? This was the issue in Murthy v. Missouri, in which the State of Missouri and several individual plaintiffs argued that the Biden Administration’s efforts, starting in January 2021, to stop the spread of COVID-19 disinformation on social media violated the First Amendment rights of Missouri citizens to post or receive such information.

The First Amendment generally restricts government actors and not private companies — including social media companies. But what happens when the government supposedly participates in, and even coerces, “private” censorship? Are citizens entitled to an injunction to stop this conduct? For now, the court declined to decide the question and returned the Murthy case for further proceedings on the basis that the state and individual plaintiffs lacked “standing.”

To sue in federal court, a plaintiff must show a sufficiently concrete “case or controversy.” Justice Barrett, writing for a 6-3 majority, explained that the future harm Missouri and the other plaintiffs alleged to justify an injunction was simply too speculative to make this showing. However, the court left open the theoretical possibility that a local government agency could face some liability under the First Amendment for coercing a social media platform to remove content.

Conclusion

Even if the Supreme Court had definitively ruled on each of the issues in these five cases, this term would not have seen the end of the debate over how the First Amendment applies to social media. The strength of the interests involved and the complexity of the doctrines guarantee that further legal challenges will make their way to the Supreme Court.

Free speech is a foundational right in our nation, and cities have a front-row seat to its fight for continued vitality when confronted with this developing 21st-century technology.