Technology & Privacy
State Children's Online Safety Laws Expand Beyond Social Media in 2026
April 30, 2026 | Max Rieper
Lawmakers introduced nearly 300 state children's online safety laws this year, targeting how companies collect data from minors and expose them to harmful content, predatory behavior, and addictive features. Recent jury verdicts against Meta and Google have added momentum to this legislative push.

Twenty-six states now require age verification to restrict minors' access to adult websites, with West Virginia becoming the latest to pass such legislation. These laws typically apply to sites where one-third or more of content is deemed "materially harmful" to minors.

Social media parental consent laws take two approaches: some set hard age limits or require parental approval for minors to join platforms, while others mandate design changes such as restricting auto-play, infinite scroll, and push notifications during school hours.

AI companion chatbot regulation has emerged as the newest frontier, with Idaho, Oregon, and Washington enacting laws that require operators to prevent chatbots from claiming sentience or initiating sexual conversations with minors. Several other states are close to passing similar measures.

State child protection legislation faces constitutional challenges under the First Amendment, particularly around age verification systems that may burden adults' access to lawful content, though the Supreme Court recently applied a lower scrutiny standard to a Texas law restricting access to adult websites.