Here’s What Happened:

The States of Florida and Texas passed laws limiting a private internet platform’s ability to engage in content moderation, that is, disfavoring or removing posts the platform considers unacceptable, such as posts advocating terrorism, glorifying gender-based violence, or encouraging suicide or self-injury. The apparent intent behind these laws was to protect users’ posted viewpoints. Their practical effect was that the platforms could not disfavor posts regardless of their content.

Trade associations brought separate lawsuits in each state challenging the laws, primarily on First Amendment grounds. In each case, the district court granted the trade associations’ motion for a preliminary injunction. Each State appealed: the Eleventh Circuit largely upheld the injunction against Florida’s law, while the Fifth Circuit reversed the injunction against Texas’s law. The losing parties then petitioned the US Supreme Court for review, and the Court granted the petitions and consolidated the cases.

The US Supreme Court, Justice Kagan writing for the majority, vacated the appellate judgments because neither court of appeals had properly analyzed the facial First Amendment challenges. In doing so, the majority explained why the laws’ restrictions on content moderation are unlikely to survive First Amendment scrutiny.

The First Amendment does many things. One of those things is to ensure that the country has a well-functioning sphere of expression, which gives citizens access to information from many sources. But the government is barred from forcing a private speaker to present views the speaker would prefer to exclude.

The States argued that content moderation skewed the presentation of users’ posts and should therefore be prohibited. Justice Kagan rejected this argument. The First Amendment applies even when internet platforms exclude, label, and demote large amounts of user content in their feeds. In other words, the platforms have a First Amendment right to exercise editorial discretion.

Justice Kagan could have let the opinion end there. But she went on to explain that the States’ laws interfere with a private actor’s speech in order to advance the States’ own vision of ideological balance. Laws of that kind are directly prohibited by the First Amendment.

The cases were remanded to the lower courts for further analysis of how these laws would actually work.

WHY YOU SHOULD KNOW THIS: This case tells us that an unhappy user has little recourse against an internet platform’s content moderation. A private internet platform has a First Amendment right to moderate the content that it displays.

Cited Authority: Moody v. NetChoice, LLC, 144 S. Ct. 2383, 2388 (2024)
