Nebraska advances bills to protect kids from AI, social media, and addictive apps
(Photo by Andrey K on Unsplash.)
Nebraska’s LB 383, the Parental Rights in Social Media Act, advanced to the second round of debate in Lincoln on a strong 38-2 vote late last week. The bill, sponsored by Sen. Tanya Storer (R-Whitman), would require social media companies to verify parental consent prior to allowing a minor to create a social media account.
LB 383 is one of the bills introduced in January by Nebraska Gov. Jim Pillen (R), who partnered with lawmakers to craft legislation to combat online abuse, as well as apps and online services with addictive elements.
The proposals have been working their way through the Nebraska Legislature and are getting closer to final floor votes.
If successful, the bills would prohibit AI-generated child pornography, empower parents to limit their children’s access to apps and online services, and require platforms to obtain parental consent before minors are allowed to create social media accounts.
In Nebraska, home of the nation’s only unicameral legislature, bills must get out of committee and be read on the floor at least three times before a final vote can be taken. Nebraska legislators are scheduled for a 90-day session this year, with adjournment tentatively set for June 9.
Where the Three Bills Stand Now
LB 383: The Parental Rights in Social Media Act
Sen. Storer’s bill, which would require social media companies to verify parental consent prior to allowing a minor to create a social media account, was amended to fold in LB 172 (Prohibit Conduct Involving Computer-Generated Child Pornography) after that proposal stalled in committee.
The bill now also addresses AI-generated and other child pornography created by minors, making the offense a Class III felony. It would add “computer-generated child pornography” to the statutes that prohibit child pornography. Adults who violate the act would be guilty of a Class ID felony.
A federal judge recently struck down an Arkansas law requiring parental consent for minors to establish social media accounts. According to the Nebraska Examiner, Sen. Storer defended her bill during last week’s floor debate as being more closely aligned with existing laws in Tennessee and Florida, which have not been struck down.
“I’m not waiting. I’m not going to sit here and wait while we lose more kids to suicide, depression and anxiety,” Storer said.
Following last week’s 38-2 vote, the bill is ready for its second of three floor readings, which could happen any time.
LB 140: Relating to Use of Electronic Communication Devices by Students
This bill, sponsored by Sen. Rita Sanders (R-Bellevue), would require each public school board to adopt policies to restrict the use of phones in schools before the start of the next academic year.
The bill includes some exceptions to this policy, such as medical needs, emergency situations, educational purposes, or if the child has an individualized education plan.
The proposal hasn’t had any major amendments and is now ready for a final floor reading and vote, which could happen any time.
LB 504: The Age-Appropriate Online Design Code Act
This bill, sponsored by Sen. Carolyn Bosn (R-Lincoln), seeks to protect children from the harms of social media and other online services. The proposal is intended to target social media companies at the design stage.
The bill requires social media and other online services to include design features that protect against compulsive use of the product, severe psychological harm (such as anxiety and depression), severe emotional harm, identity theft, and privacy violations.
The Age-Appropriate Online Design Code Act also requires these services to give parents the ability to manage their child’s privacy and account settings and to restrict the hours their child can use the services. Notably, the bill restricts push alerts during hours when children are in school or sleeping.
It includes a penalty of up to $50,000 for each violation. Other provisions would require platform providers to: assume all users are minors unless and until they learn otherwise; provide security settings at the highest level unless the user changes the settings themselves; and avoid using “dark patterns” or any other deceptive practices that would subvert or impair user autonomy.
The proposal has received its second reading and is ready to move on to a third reading and vote, which could happen at any time.