Tech Groups Seek Ruling Against Florida Social Media Law

Sep 15, 2025


Two technology industry associations have asked a federal judge to strike down major provisions of Florida’s 2021 social media law, arguing the statute conflicts with constitutional protections for how online platforms manage user content.

In a motion for partial summary judgment filed in the Northern District of Florida, NetChoice and the Computer & Communications Industry Association (CCIA) said the law, Senate Bill 7072, regulates editorial decisions that the U.S. Supreme Court has recognized as protected speech. The groups, which represent companies including Meta, YouTube, Reddit, Pinterest and X, contend the statute violates the First Amendment, is vague in key provisions, and should be permanently enjoined.

“The Supreme Court has clearly affirmed that the First Amendment protects the editorial decisions that websites make in displaying content. This Florida law deliberately interferes with those choices and it is time for the law to be invalidated and blocked permanently,” said Stephanie Joyce, Senior Vice President and Chief of Staff of CCIA’s Litigation Center.

The law, signed by Gov. Ron DeSantis in 2021, restricts what it terms “censorship,” “deplatforming,” “post-prioritization” and “shadow banning.”

Among its requirements, platforms must apply moderation standards consistently, cannot deplatform political candidates, cannot limit the reach of content posted by or about candidates, and may not restrict entities deemed “journalistic enterprises.”

The law also limits platforms' rule changes to once every 30 days, requires that users be allowed to opt out of algorithmic feeds in favor of chronological displays, and obligates platforms to provide detailed explanations for content removals and other moderation actions.

NetChoice and CCIA argue in the motion that these mandates intrude directly on decisions about what content to carry and how to present it. They contend the consistency requirement effectively forces platforms to disseminate speech whenever state officials or juries deem similar content was previously allowed.

The candidate-related provisions, they argue, would require platforms to host or give prominence to harmful material whenever it comes from or concerns a political candidate. The journalistic-enterprise protection, they said, could apply so broadly that platforms would be compelled to carry qualifying outlets regardless of their content.

The motion also challenges the rule-change restriction as limiting companies’ ability to respond quickly to emerging threats, such as new forms of misinformation or harmful content, and argues the algorithm opt-out mandate improperly dictates how platforms arrange user feeds.

The filing follows the Supreme Court's 2024 decision in Moody v. NetChoice, which held that platforms engage in protected expression when they curate and organize user content in products such as Facebook's News Feed and YouTube's homepage.