February 4, 2026 | Sandy Dornsife
Over the past five years, state legislators have increasingly sought to regulate social media. We highlighted this trend as far back as 2021 and covered efforts to implement online safety mechanisms for minors in 2023. In October, we discussed state attempts to limit minors' access to social media platforms and the constitutional challenges these laws face, which have led to the permanent or temporary blocking of many of them, including laws in Arkansas, Ohio, California, Florida, and Georgia. Recently, Arkansas's newest legislation imposing liability on social media platforms for certain mental health consequences for minors was again blocked as a violation of the First Amendment's protection of free speech.
As a result of the legal obstacles faced by social media legislation, many groups are turning to tort law as a means of holding social media companies accountable for harms allegedly caused by their platforms.
Tort cases against social media platforms focus primarily on how the applications function rather than on the content they host. Plaintiffs argue that these products are intentionally designed to be addictive and that their algorithms encourage compulsive behaviors inherently harmful to minors' mental health. These arguments are reminiscent of claims that proved successful against the tobacco industry in the 1990s. In response, platforms tend to rely heavily on Section 230 of the Communications Decency Act, asserting that the law grants them broad immunity from such actions. The original architects of Section 230 designed the law to encourage free speech by protecting online platforms from liability for user posts. While this defense has succeeded in many cases involving controversial user content, lower state courts have almost universally ruled that Section 230 does not apply to social media addiction claims. Recent developments in several cases shed light on where this litigation may be headed.
On December 5, the Massachusetts Supreme Judicial Court heard oral arguments in Commonwealth v. Meta Platforms, in which the state alleges that the defendant created a public nuisance by implementing an addictive algorithm specifically aimed at increasing minors' use of its applications. The state also accuses the defendant of deliberately misleading the public about the safety of its platforms. Meta counters that its design features are protected by the free speech guarantees of the U.S. and state constitutions, as well as by Section 230 of the Communications Decency Act. The Superior Court held that Section 230 does not shield a platform from liability for its own speech, which includes any of its "misrepresentations of the safety of its platform, its efforts to protect the well-being of young users, and its age-verification efforts." The court further held that Section 230 did not apply because the plaintiffs sought to hold the defendant liable for its own design features, not for third-party content. During oral arguments on appeal, several justices of the high court appeared sympathetic to the state's arguments and expressed skepticism that the defendant's free speech was being restricted, since the challenge targets not the platform's content but its manner of delivery. If the Massachusetts high court permits this landmark case to proceed, it would be a major victory for social media regulatory advocates and could serve as a model for other states.
There were also new developments in one of the largest cases against social media platforms, California, et al. v. Meta, Inc. et al., in which eighteen state attorneys general, twelve Democrats and six Republicans, allege that the defendants deliberately employ addictive algorithms to prolong users' time on their platforms, disproportionately harming the mental health of minor users. The attorneys general in the case, joined by attorneys general from eleven other supporting states (six Republicans and five Democrats), argued that the matter would be best resolved in a single joint trial. The states contend that the defendants' proposal of 19 individual trials is an attempt to procedurally delay the case and would be unnecessarily repetitive.
The number of cases against social media platforms has skyrocketed over the last several years, with thousands of claims now filed across nearly every state in the nation. In the multidistrict litigation (MDL) In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, pending in the Northern District of California, almost 2,200 individual actions allege that the four major social media platforms deliberately designed their products to be addictive and to encourage problematic behavior that can lead to serious mental health outcomes for children. Plaintiffs include individual parents, various representative organizations, school districts, and state and local governments. Trials in six bellwether cases from the MDL will begin in 2026. The outcome of these cases, along with the final disposition of challenges to laws like the one in Arkansas imposing liability on social media platforms for knowingly causing harm to young people, will provide the foundation for future state legislation. Legislatures are unlikely to wait for a final word, however, before acting. In September 2025, U.S. Senator Dick Durbin introduced legislation that would expand the definition of "product" to include most social media platforms, opening developers to claims for defective design, failure to warn, breach of express warranty, and strict liability. With interest at the national level, state proposals will not be far behind.
Federal and state legal activity can have significant policy and regulatory implications for businesses and organizations. If your organization would like to further track federal and state legal activity, please contact us.