Always read the bill.
It’s a pretty good maxim for those seeking to understand the difference between what lawmakers say and what they mean, but also a pretty big ask for people not versed in the peculiarities of legislative drafting.
So perhaps better advice is this: don’t take press releases at face value.
On Tuesday, state Sen. Sue Rezin, R-Morris, issued a release recapping a Senate Judiciary Committee hearing focused on her Senate Bill 1126, which Rezin described as “regulations to protect minors from harmful aspects of social media platforms.”
The hearing included testimony from several experts. Camille Carlton, senior policy manager for the Center for Humane Technology, said the goal was a proper balance of innovation and guardrails concerning things like algorithms and the collection of personal data. Robert Weil, from the American Federation of Teachers, called the proposed controls "digital seatbelts" that might help parents protect children. Matthew Bergman, from the Social Media Victims Law Center, cited research showing how social media harms mental health, increasing depression, anxiety and loneliness, and noted that minors are especially susceptible to those effects.
“If you look at what has occurred since 2012, you see a spike in mental health challenges among our young people and what is very significant about that is that this spike coincides with the advent of social media,” Bergman said. “You could solve 80 percent of the problems in two weeks by simply turning down the algorithms making them less addictive and providing some verification so that a person really is who they claim to be.”
As a parent, and as an adult who has spent far too much time on social media since well before 2012, I find the upside of regulation evident, especially the argument about algorithms built specifically to prey on emotions and keep users engaged, regardless of real-world consequences.
On the other hand, the legislative language itself has a few red flags, including one near the top of page two, defining an online service as “likely to be accessed by children” to mean, in part: “the online service, product, or feature is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children.”
That definition puts online dictionaries and encyclopedias in the same barrel as YouTube and Instagram. My seventh-grader shouldn’t be on X, formerly known as Twitter, but I love the hours he devotes to baseball-reference.com. I doubt even Tuesday’s experts disagree on which is a net negative for humanity.
The point isn’t to oppose SB 1126 in its entirety, nor to suggest Rezin was being disingenuous or would resist efforts to fine-tune the bill. It’s simply an example of why it’s important to go beyond initial assessments and seek a deeper understanding of intent and consequence.