Social Media Legal Battles: What Every Business Needs to Know (Updated 2026)
An updated reference guide to the Supreme Court cases, federal legislation, and ongoing legal battles shaping social media regulation in 2024-2026, with specific implications for business owners.
Between 2023 and 2026, social media regulation moved from theoretical debate to active enforcement. The Supreme Court issued decisions in Moody v. NetChoice, Murthy v. Missouri, and Lindke v. Freed that redrew the lines of government power over online speech. TikTok survived a forced divestiture deadline and continued operating under ongoing legal uncertainty. The Kids Online Safety Act passed into law in 2024. Section 230 remains intact but faces the narrowest legal environment it has seen since its 1996 passage. Every business that uses social media for marketing needs to understand where these lines are drawn.
[!note] This post was originally published October 11, 2024, and has been updated in 2026 to reflect current case status, new legislation, and developments through Q1 2026. Legal status changes rapidly in this area. Consult a qualified attorney for advice specific to your business.
Social media law moved from background noise to front-page business risk between 2023 and 2026. Six Supreme Court cases. One federal law affecting minor-facing platforms. Ongoing state-level legislation in 26 states. A forced divestiture order against the most-downloaded app in US history.
If you run a business and use social media for marketing, some of this affects you directly.
What Is Social Media Law for Businesses?
Social media law for businesses covers three intersecting areas. First, it covers what platforms can and cannot do: content moderation, algorithm design, data collection, and advertising targeting. Second, it covers what governments can and cannot require platforms to do. Third, it covers what liability attaches to businesses and individuals who use platforms.
Each of these areas changed materially between 2024 and 2026. Here is the current state of each major development.
Why This Matters in 2026
Three conditions converged to make social media regulation urgent in 2025-2026:
- The Supreme Court’s 2023-24 term produced more decisions touching online speech than any prior term. Many of those decisions did not resolve the underlying questions and instead sent cases back to lower courts, creating ongoing uncertainty.
- TikTok’s legal situation demonstrated that a platform can be legally banned or forced into sale by federal legislation, changing the risk profile for any business that builds its marketing around a single platform.
- The state-level legislative landscape became fragmented. As of 2026, at least 26 states have enacted or are actively considering laws regulating minors’ access to social platforms, creating a patchwork compliance environment for any business running national campaigns.
Supreme Court Decisions: What They Actually Said
| Case | Citation | Year Decided | Core Question | Ruling | What It Means for Your Business |
|---|---|---|---|---|---|
| Moody v. NetChoice | 603 U.S. ___ (2024) | 2024 | Can states bar platforms from content moderation? | Vacated and remanded; platforms have First Amendment editorial rights | State anti-censorship laws remain unsettled; platforms retain moderation authority for now |
| NetChoice v. Paxton | Consolidated with Moody | 2024 | Can Texas require platforms to carry all speech? | Same ruling as Moody v. NetChoice | Same practical effect |
| Murthy v. Missouri | 603 U.S. ___ (2024) | 2024 | Did the Biden administration violate the First Amendment by coordinating with platforms? | Dismissed for lack of standing | Government-platform coordination remains legally unsettled |
| Lindke v. Freed | 601 U.S. 187 (2024) | 2024 | Can a government official block constituents on personal social accounts? | Depends on whether the account functions as an official government channel | Government entities and officials need specific policies on personal vs. official account use |
| O’Connor-Ratcliff v. Garnier | 601 U.S. 205 (2024) | 2024 | Same blocking question for school board members | Same fact-specific test as Lindke | Same implication for public-sector organizations |
| TikTok v. Garland | 604 U.S. ___ (2025) | 2025 | Is the forced divestiture law constitutional? | Upheld 9-0; law is constitutional | TikTok can be shut down or sold by federal order; platform-dependency risk is real |
Moody v. NetChoice and NetChoice v. Paxton
These companion cases reached the Supreme Court after Texas and Florida passed laws designed to prevent large social media platforms from removing, demoting, or algorithmically suppressing content based on political viewpoint.
The Court’s 2024 opinion, authored by Justice Kagan, held that platforms have First Amendment rights as editorial entities. Curating and moderating content is expressive activity that the government cannot simply override. However, the majority declined to rule on whether the specific laws were facially unconstitutional, instead sending the cases back to the lower courts to analyze the laws’ full range of applications under the proper facial-challenge standard.
Practical implication: For businesses, platform content moderation is legally protected and likely to remain so. Your content can still be removed or suppressed by a platform without triggering a legal remedy. This is unlikely to change even as the remanded cases work through the lower courts.
Murthy v. Missouri
The plaintiffs alleged that the Biden administration unconstitutionally coerced Facebook, Twitter, and other platforms into removing content about COVID-19, election integrity, and the Hunter Biden laptop story. They claimed their speech was effectively censored by a government-directed private enforcement mechanism.
The Supreme Court’s 6-3 majority held that the plaintiffs could not demonstrate that any specific content removal was directly caused by government pressure rather than by the platforms’ independent moderation decisions. The standing question was dispositive. The underlying First Amendment coercion question was not resolved.
Practical implication: Future litigation on government-platform coordination is nearly certain. The legal line between permissible government communication with platforms and unconstitutional coercion remains unclear. Businesses should not assume platforms are operating free from government influence on content decisions.
Lindke v. Freed and O’Connor-Ratcliff v. Garnier
Both cases asked whether a public official who blocks a constituent on a personal social media account engages in unconstitutional viewpoint discrimination (a First Amendment violation) because the account functions as a public forum.
The Court’s answer: it depends. The test is whether the official possessed actual authority to speak on the government’s behalf on the matter at issue, and whether they purported to exercise that authority in the relevant posts. If both are true, the activity is state action and the First Amendment applies. If not, it is private speech and the official can block anyone they choose.
Practical implication: For businesses that are public-sector adjacent, such as a nonprofit receiving government contracts, a school, or a public authority, your organization’s social accounts need a documented policy on comment moderation and blocking that distinguishes official from personal use. For private businesses, this ruling has limited direct impact.
TikTok v. Garland
In April 2024, Congress passed the Protecting Americans from Foreign Adversary Controlled Applications Act, requiring ByteDance to divest TikTok’s US operations within 270 days or face a nationwide ban. TikTok challenged the law on First Amendment grounds.
The Supreme Court upheld the law 9-0 in January 2025, finding that the national security interest outweighed the First Amendment concerns and that the law was content-neutral. TikTok went dark briefly on January 19, 2025, before the incoming Trump administration ordered a 75-day pause in enforcement to allow sale negotiations.
As of April 2026, TikTok continues operating in the US. ByteDance has not completed a sale. The law remains in force. The situation is unresolved.
Practical implication: If TikTok represents a significant portion of your content distribution or advertising, the platform could be inaccessible to US users with limited warning. Build parallel audience assets on platforms with stable ownership. This is not speculative. The Supreme Court already confirmed the government has the authority to shut it down.
The Kids Online Safety Act
KOSA passed Congress in July 2024 and was signed into law. It requires platforms with a substantial number of minor users to:
- Default to the most privacy-protective settings for users under 17
- Give parents the ability to monitor and control their minor child’s accounts
- Restrict, for minor users, algorithmic amplification of content that promotes self-harm, eating disorders, or violence
- Provide an annual public report on the measures taken to reduce minor exposure to harmful content
For businesses running social media advertising campaigns, the KOSA implications are primarily in audience targeting. Platforms are restricting ad targeting by interests and behaviors for under-17 users in response to KOSA requirements. This means campaigns targeting demographics that skew young may see reduced targetable audience sizes on Instagram, TikTok, and YouTube.
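If you want a quick programmatic check of your own campaigns, here is a minimal sketch that flags any campaign whose age targeting reaches below 17. It assumes you can export targeting settings as simple records; the campaign names, platforms, and the age_min/age_max field names are illustrative, not any platform's actual API.

```python
# Minimal sketch: flag campaigns whose age targeting includes users under 17.
# All campaign records and field names below are hypothetical examples --
# adapt to however your ad platform exports targeting settings.
KOSA_AGE_CUTOFF = 17  # per this guide's reading of KOSA: protections apply under 17

campaigns = [
    {"name": "spring-promo", "platform": "instagram", "age_min": 13, "age_max": 24},
    {"name": "b2b-awareness", "platform": "facebook", "age_min": 25, "age_max": 54},
]

def targets_minors(campaign, cutoff=KOSA_AGE_CUTOFF):
    """True if the campaign's age range reaches below the cutoff."""
    return campaign["age_min"] < cutoff

for c in campaigns:
    if targets_minors(c):
        print(f"Review {c['name']} on {c['platform']}: targeting starts at age {c['age_min']}")
```

A flagged campaign is not necessarily noncompliant; it simply means the platform's under-17 restrictions apply and the targeting options you relied on may have shrunk.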
Social Media Addiction Lawsuits: What Is at Stake
Forty-one state attorneys general filed a consolidated lawsuit against Meta in October 2023, alleging that Meta knowingly designed Instagram and Facebook with addiction mechanics targeting minors and concealed internal research documenting the harms.
Similar lawsuits are ongoing against TikTok, Snap, and YouTube.
None of these cases have produced final liability verdicts as of April 2026. The defendants are arguing that Section 230 provides immunity. Plaintiffs are arguing that product design claims fall outside Section 230’s scope.
If plaintiffs prevail and Section 230 is found not to protect algorithm design decisions, the legal landscape for every social platform changes significantly. The implication for business is that advertising products built on those algorithms could be redesigned or restricted.
Section 230: What It Is and Where It Stands
Section 230(c)(1) of the Communications Decency Act (47 U.S.C. § 230) states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
This single sentence has protected every major social platform from liability for user-generated content for nearly 30 years. It allows platforms to moderate content (under Section 230(c)(2)) without becoming liable for what they do not remove.
Section 230 has not been amended as of April 2026. But it faces the narrowest legal environment it has seen since passage. The addiction lawsuits are testing whether product design claims fall outside its scope. Congressional proposals to narrow or repeal it were introduced in both the 118th and 119th Congresses.
What to watch: If the addiction lawsuits produce rulings that platforms are not protected for algorithm design decisions, platforms will redesign recommendation systems. That redesign will affect organic content reach, paid ad delivery, and the entire distribution ecosystem that social media marketing depends on.
What Your Business Should Do Right Now
Four actions are worth taking in the current legal environment:
1. Audit your TikTok dependency. If TikTok represents more than 20% of your content reach or ad spend, build an equivalent presence on a second platform. The legal authority to shut TikTok down is confirmed; only the execution timing is uncertain. A minimal audit sketch follows this list.
2. Review your minor-targeting practices. If you run ads on any major social platform and your targeting includes demographic ranges that cover users under 17, review current platform-level restrictions. KOSA compliance changes are already in effect at the platform level.
3. Document your social media practices. If you face a future inquiry about your advertising targeting, content practices, or data handling, documentation of your current practices is your primary defense. This is more relevant for businesses with large social presences, but applies at any scale.
4. Get a social media policy for official accounts. If your organization includes any government-adjacent function, clarify which accounts are official versus personal and document comment moderation procedures following Lindke v. Freed.
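For item 1, the arithmetic is simple enough to script. The sketch below computes each platform's share of total ad spend and flags anything over the 20% threshold suggested above. The spend figures and platform names are made up for illustration, and the threshold is a rule of thumb, not a legal standard; the same math works for reach or follower counts.

```python
# Minimal platform-dependency audit (see item 1 above). Spend figures and
# platform names are illustrative placeholders, not real campaign data.
monthly_ad_spend = {
    "tiktok": 4200.00,
    "instagram": 3100.00,
    "facebook": 1900.00,
    "youtube": 800.00,
}

def dependency_report(spend_by_platform, threshold=0.20):
    """Return platforms whose share of total spend exceeds the threshold."""
    total = sum(spend_by_platform.values())
    return {
        platform: spend / total
        for platform, spend in spend_by_platform.items()
        if spend / total > threshold
    }

for platform, share in dependency_report(monthly_ad_spend).items():
    print(f"{platform}: {share:.0%} of spend -- consider building a parallel audience")
```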
Starfish Ad Age monitors the legal environment affecting social media marketing as part of our Social Media Marketing service. We update our clients in East Texas and Shreveport-Bossier, LA, on material changes that affect their campaigns. This is one of those updates.
Questions worth answering
What did the Supreme Court decide in Moody v. NetChoice?
In Moody v. NetChoice, LLC (2024), the Supreme Court addressed Texas and Florida laws that attempted to prohibit large social media platforms from moderating content based on viewpoint. The Court vacated and remanded lower court decisions, with a majority opinion authored by Justice Kagan. The Court held that social media platforms have First Amendment editorial discretion rights, but sent the cases back to lower courts to apply proper facial challenge analysis. The practical result was that both states' anti-censorship laws remain in legal limbo as of 2026.
What did Murthy v. Missouri rule?
In Murthy v. Missouri (2024), the Supreme Court ruled 6-3 that the plaintiffs (Missouri, Louisiana, and individual social media users) lacked standing to sue the Biden administration for allegedly pressuring social media platforms to remove content. The Court held that the plaintiffs could not demonstrate a sufficient direct causal link between government communications and specific content removal decisions. The ruling did not resolve the underlying First Amendment question of when government coordination with private platforms crosses into unconstitutional coercion.
What is Section 230 and why does it matter for businesses?
Section 230 of the Communications Decency Act (1996) provides platforms with legal immunity for third-party content hosted on their services. It means that Facebook, Google, Instagram, and YouTube cannot be sued for content their users post. For businesses, Section 230 protects you when you run review systems, user-generated content campaigns, or community forums, because the hosting platform is not liable and, in many interpretations, neither are you as a moderator of a business page. Any narrowing of Section 230 would create liability exposure for anyone running a platform feature that involves user content.
What is the Kids Online Safety Act and how does it affect my business?
The Kids Online Safety Act (KOSA) was signed into law in 2024. It requires platforms to implement safety protections for minors, including default privacy settings, parental controls, and restrictions on algorithmic amplification for users under 17. For businesses, KOSA directly affects how you can advertise to audiences that may include minors on covered platforms. If your Instagram or TikTok campaigns target audiences that include under-17 users, you must review platform-level restrictions on ad targeting for that demographic. COPPA (Children's Online Privacy Protection Act) remains separately in force for under-13 users.
What happened with the TikTok ban in the United States?
Congress passed and President Biden signed legislation in April 2024 requiring ByteDance to divest TikTok's US operations within 270 days or face a ban. TikTok challenged the law. The Supreme Court upheld the law unanimously in January 2025. TikTok briefly went dark on January 19, 2025, before returning under a Trump administration order pausing enforcement for 75 days to allow sale negotiations. As of April 2026, TikTok continues to operate in the US under unresolved ownership status. The underlying law remains in force.
What are the Lindke v. Freed and O'Connor-Ratcliff v. Garnier decisions?
The Supreme Court decided both cases on the same day in March 2024. Both addressed whether government officials violate the First Amendment when they block constituents on personal social media accounts. In Lindke v. Freed (No. 22-611) and O'Connor-Ratcliff v. Garnier (No. 22-324), the Court held that an official's social media activity is state action only if the official possessed actual authority to speak on the government's behalf and purported to exercise that authority in the relevant posts. The standard is fact-specific, and the Court sent both cases back to the lower courts to apply it.
Can social media platforms be held liable for addiction and mental health harms?
As of 2026, major multi-state litigation against Meta, TikTok, and other platforms over mental health harms to minors is ongoing. The lawsuits allege that platforms knowingly designed addictive features targeting minors and concealed the harms. Meta faces a consolidated complaint involving attorneys general from 41 states. These cases are proceeding in federal court. No final liability verdict has been issued. Section 230 immunity is being challenged in this context. The outcomes will significantly affect how platforms are allowed to design recommendation algorithms.
What should a business do today to reduce social media legal risk?
Four actions reduce your exposure. First, document your social media advertising practices, particularly any targeting that includes age-sensitive demographics. Second, review your terms of service and privacy policy for any user-generated content features on your owned digital properties. Third, if you are a government-adjacent organization (a public school district, municipal authority, or public utility), get specific legal guidance on blocking users or moderating comments on official accounts following Lindke v. Freed. Fourth, if TikTok is a material part of your marketing strategy, maintain parallel audience-building on platforms with clearer long-term operating status.
Mindy Lewellen · CEO, Partner
Mindy leads strategy, client relationships, and creative direction at Starfish Ad Age. Based in Longview, Texas. Joined the agency in 2019.