Washington state lawmakers are moving ahead on legislation that aims to protect kids from the addictive impacts of social media apps.
Legislative committees in both the House and Senate on Friday passed their respective bills, teeing them up for votes by the full chambers.
Supporters point to evidence that social media use by teens is linked to depression, anxiety, harmful body images and suicidal thoughts.
“We have a lot of kids who are really struggling and we need to do something,” Stephan Blanford, executive director of the nonprofit Children’s Alliance, told GeekWire. He noted Washington state ranked 48th last year for its rates of youth mental illness and access to mental health services.
While it’s difficult to prove that social media is the sole cause of the youth mental health crisis, many researchers argue for a correlation. One-third of teens say they use YouTube, TikTok, Snapchat, Instagram or Facebook “almost constantly,” according to a survey last year by Pew Research Center. At the same time, 40% of high school students report persistent feelings of sadness or hopelessness, while 20% seriously considered suicide, based on federal data from 2023.
Opponents acknowledge these concerns, but say that tech companies are working to make their platforms safer and that the proposed regulations violate free speech and other constitutional rights.
House Bill 1834 and Senate Bill 5708 were requested by Washington state Attorney General Nick Brown and are supported by Gov. Bob Ferguson. The bills are largely backed by Democratic lawmakers and include Republican sponsors as well.
Key provisions of the measures include:
Stronger requirements for social media tech companies’ verification that a user is a minor, and additional privacy protections for minors.
Restrictions on a platform’s ability to use algorithms that deliver addictive content to a minor’s feed, enticing them to remain on a site.
Prohibiting social media sites from providing their services to minors between midnight and 6 a.m., and between 8 a.m. and 3 p.m. on weekdays from September through May. The prohibition also applies to sending notifications during those windows. Parental consent can override the restrictions.
Limitations on the use of “dark patterns” on minors — interface designs that can trick users into making choices they otherwise wouldn’t.
Requiring platforms to allow all users, of any age, to set time limits on their use of the app; block the sharing of “likes” and other feedback; and restrict the use of algorithms that generate more addictive streams.
Other national and state-level legislation has tackled these concerns, but this is the first time Washington lawmakers have pursued such laws.
The U.S. Senate last summer approved two measures addressing digital safety for kids: the Children and Teens’ Online Protection Act and the Kids Online Safety Act, but neither was approved by the U.S. House.
California passed bills in 2022 and 2024 regulating social media impacts on kids, and numerous provisions included in those laws are incorporated into Washington’s measures. The laws have, however, faced legal action.
NetChoice, a trade association for online companies, sued to block the California laws. Subsequent court rulings have limited the enforcement of both laws, allowing some provisions to be enacted. NetChoice’s appeals are ongoing.
Opponents to Washington’s bills cautioned during a House committee hearing in Olympia last week that similar legal challenges could cost taxpayers millions of dollars — while also noting that the state is facing a $15 billion budget shortfall over the next two years.
“We share the sponsor’s goal to protect minors online,” said Amy Bos, NetChoice’s director of state and federal affairs. “However, respectfully, we must oppose this legislation as it raises serious policy and constitutional concerns.”
In their testimony, Bos and Rose Feliciano of TechNet referenced a Florida law that tried to restrict content and was ultimately struck down by the U.S. Supreme Court.
But Washington and two school districts in the state have already stepped into the legal ring, suing social media platforms for their alleged harm to kids.
More than two years ago, Seattle Public Schools and the Kent School District sued major social media companies including Meta, TikTok, YouTube and others, claiming that they’re driving a youth mental health crisis.
In October 2023, Ferguson, who was then the state’s attorney general, joined 42 other AGs in a suit against Meta alleging that the parent company of Instagram and Facebook was knowingly targeting youth in the pursuit of profits. Ferguson filed a similar suit against TikTok last October.
In a recent hearing for HB 1834, supporters argued a pay-me-now-or-pay-me-later case when it came to the impact of social media on kids.
“The Attorney General’s Office is well aware of the economic situation and the budget situation of the state,” said Adam Eitmann, the AG’s legislative director.
“We believe that this is a small investment … that will help improve the mental health outcomes for our kids,” he added. “As long as eyeballs equal money, the status quo will continue. And we think that that is unacceptable.”
RELATED: Are social media companies to blame for youth mental health crisis? It’s complicated