
Key Takeaways:

  • The last few years have seen a rise in concern among lawmakers regarding children’s safety online stemming from issues such as the increased amount of time spent online, algorithms with addictive design features, social media’s relationship to mental health, online bullying, and foreign actors having influence over content viewed by minors.
  • At the federal level, these concerns have led to several bill introductions, but no legislation has been enacted.
  • State age verification laws require social media platforms to verify the ages of individuals accessing their website. The laws being challenged require age verification of new and existing social media accounts. Additionally, some of these laws also require parental consent for minors to have a social media account and require social media platforms to give parents access to their children's accounts.
  • Courts in four states have blocked new laws aiming to protect children online before they went into effect. These laws have all been challenged by the technology industry.


The last few years have seen a rise in concern among lawmakers regarding children’s safety online. These concerns stem from a number of issues such as the increased amount of time children spent online during the Covid-19 pandemic, algorithms with addictive design features, social media negatively impacting children’s mental health, online bullying, and foreign actors having influence over content viewed by minors. 

At the federal level, these concerns have led to several bill introductions and some contentious hearings; however, no legislation has been enacted to address them. In the absence of federal action, state lawmakers have begun to enact their own legislation seeking to protect children online by mandating measures such as age verification and requiring accounts held by minors to have stronger privacy settings. However, some state legislation has been blocked by the courts before it could go into effect. 

Courts in four states have blocked new laws aiming to protect children online before they went into effect. These laws have all been challenged by technology industry representatives. In each case, courts have cited the challengers' likelihood of success on their First Amendment claims in deciding to block the laws while the cases are argued in court. 

Age verification laws require social media platforms to verify the ages of individuals accessing their products. The laws being challenged require age verification of new and existing social media accounts. Additionally, some of these laws also require parental consent for minors to have a social media account and require social media platforms to give parents access to their children's accounts. Proponents of these measures argue that parents should have more awareness of and control over their children’s online activities and that age verification laws allow parents to do exactly that. Opponents, however, have raised privacy and First Amendment concerns, saying that age verification needlessly requires users to supply personally identifying information to private corporations and infringes on the First Amendment by restricting minors' access to information and discussions of interest to them. 


Age Verification Lawsuits

Utah

Utah enacted two online safety bills in 2023, both set to go into effect on March 1, 2024. The first bill (UT HB 311) prohibits social media platforms from using algorithms, practices, designs, or features that could cause minors to become addicted to the platform. The second bill (UT SB 152) requires social media platforms to verify the ages of new and existing account holders in Utah and, if an account holder is a minor, to obtain parental consent before the minor can access the account. Additionally, the bill requires social media platforms to block minor account holders from accessing their accounts between 10:30 pm and 6:30 am unless a parent or guardian allows access during those hours. 

In December 2023, a lawsuit was filed arguing that both laws are First Amendment violations that “attempt to regulate both minors’ and adults’ access to — and ability to engage in — protected expression.” The lawsuit set off a flurry of activity in the Utah legislature aimed at repealing and replacing both bills in an attempt to address the concerns raised in the case. 

The first bill Governor Spencer Cox (R) signed into law in Utah this year (UT SB 89) pushed back the effective dates of UT SB 152 and UT HB 311 from March 1, 2024, to October 1, 2024. The legislature then enacted UT SB 194 and UT HB 464 in March 2024. UT SB 194 still requires social media platforms to verify the ages of account holders, but it removes the requirement that a parent or guardian approve the account and removes the time restrictions on when minors can access their accounts. Instead, UT SB 194 requires that the maximum privacy setting be enabled by default for minors’ accounts and requires social media platforms to offer minor account holders supervisory tools that can be made available to parents or guardians. UT HB 464 removes UT HB 311’s provisions regarding algorithms and design features and establishes a private right of action for minors who suffer an adverse mental health outcome, such as depression or thoughts of self-harm, that is at least partially attributable to their social media use. 

While the hope may have been that these more narrowly tailored bills would resolve the legal challenge, the lawsuit appears poised to continue, and neither party has filed a motion to dismiss the case despite the newly signed legislation. 


Arkansas

In April 2023, the Arkansas legislature enacted a law (AR SB 396) that requires parental consent for minors to have a social media account and requires social media companies to contract with third-party vendors to perform age verification services. A lawsuit was quickly filed arguing that the law violates the First Amendment and puts users' private information at risk. A U.S. District Court judge blocked the law on August 23, 2023, one day before it was scheduled to go into effect, stating that the lawsuit would likely succeed on its claims that the law is void for vagueness and violates the First Amendment. 


Ohio

As part of the Ohio budget, the legislature enacted the Social Media Parental Control Notification Act, which requires website operators whose websites are “reasonably anticipated to be accessed by children” to obtain parental consent for children under 16 to have an account on a social media platform and to provide parents with content moderation features. Similar to other states, shortly after the act was signed into law, opponents filed a lawsuit on First Amendment grounds. As in those states, a U.S. District Court judge agreed with the challengers' arguments and blocked enforcement of the law in early February 2024. 


California's Age Appropriate Design Lawsuit

Separate but related to the age verification lawsuits is a lawsuit challenging California’s Age Appropriate Design Code Act (CAADCA), which was signed into law in September 2022. The law is based on similar laws that recently went into effect in the United Kingdom and Ireland. It requires certain businesses providing online services likely to be accessed by children to enable the maximum privacy settings by default and to complete a Data Protection Impact Assessment for any online service, product, or feature likely to be accessed by children. 

A lawsuit was filed in December 2022 challenging the law on First Amendment grounds, arguing that it imposes content-based regulation of speech. In addition to the First Amendment claims, the lawsuit also argues that the CAADCA is preempted by the federal Children’s Online Privacy Protection Act (COPPA), which regulates online services provided to children under 13, and by Section 230 of the Communications Decency Act, which shields online service providers from liability for hosting and moderating user-generated content. 

The law was scheduled to go into effect on July 1, 2024, but in September 2023, a judge in the U.S. District Court for the Northern District of California issued a preliminary injunction against the CAADCA, noting that opponents had shown the lawsuit was likely to succeed on the merits of its arguments. In October 2023, California Attorney General Rob Bonta appealed to the Ninth Circuit Court of Appeals to overturn the injunction and allow the law to go into effect as scheduled while the case continues. The Ninth Circuit has yet to rule on whether the preliminary injunction will be upheld. 


2024 Legislation 

Lawsuits challenging online safety bills have not slowed momentum at the state level. In addition to the bills signed into law in Utah, eight other states have enacted age verification laws, and lawmakers in 32 states have introduced age verification legislation. 

Many of the age verification bills are more narrowly tailored than those that have been challenged in court and require verification to take place when a “substantial portion,” typically ⅓, of the material on their websites is harmful to minors. Additionally, likely in an attempt to survive a court challenge, many of the bills make use of the Miller test in defining material that is harmful to minors. The Miller test was established by the U.S. Supreme Court in 1973 in Miller v. California and is used to determine whether certain material is obscene by considering: (1) whether the average person applying contemporary community standards would find the work as a whole appeals to the prurient interest; (2) whether the work depicts or describes sexual conduct or excretory functions in a patently offensive way; and (3) whether the work as a whole lacks serious literary, artistic, political, or scientific value. 

For example, Florida Governor Ron DeSantis (R) recently signed legislation (FL HB 3/FL SB 1792) that prohibits individuals under 14 from having a social media account, requires individuals aged 14-15 to have parental consent to have a social media account, and requires commercial entities to verify the ages of their users if ⅓ of the information they publish or distribute is harmful to minors. No lawsuit has been filed against the new law yet. Indiana and Idaho recently enacted similar legislation (IN SB 17 and ID HB 498) that requires website operators that display material harmful to children to verify the ages of their users.

Lawsuits have not yet been filed against these more narrowly tailored age verification bills. However, whether lawsuits are filed or not, age verification and other legislation aimed at protecting children online enjoy strong bipartisan support and show no signs of slowing down. 


Track State Tech Policy

Tech policy impacts nearly every company, and state policymakers are becoming increasingly active in this space. MultiState’s team understands the issues, knows the key players and organizations, and harnesses that expertise to help our clients effectively navigate and engage on their policy priorities. We offer customized strategic solutions to help you develop and execute a proactive multistate agenda focused on your company’s goals. Learn more about our Tech Policy Practice.