New York seeks to limit addictive social media feeds in bid to stop deteriorating youth mental health

Gov. Kathy Hochul and other top New York officials are cracking down on Facebook, Instagram, TikTok and other social media platforms in a bid to get children to break the often dangerous habit of scrolling their feeds for hours on end.

A package of bills — introduced Wednesday afternoon by Hochul, Attorney General Letitia James and state lawmakers — would let kids and parents limit or opt out of the algorithmic social media feeds that drive engagement and glue children to their screens.

Long Island parent Kathleen Spence, at a press conference Wednesday in downtown Manhattan, said her daughter struggled with mental health issues the family blames on social media content.

“They don’t care if this engagement comes from addicting children to their product,” Spence said. “And pushing harmful material, from eating disorder to suicide content to instructions — instructions — on how to self-harm, as long as the kid’s eyes are on the screen.”

The proposed regulations, sponsored by state Sen. Andrew Gounardes (D-Brooklyn) and Assemblywoman Nily Rozic (D-Queens), would also prohibit platforms from sending notifications to minors between midnight and 6 a.m. without parental consent, preserving some much-needed rest for children.

“Young New Yorkers are struggling with record levels of anxiety and depression,” said James, “and social media companies that use addictive features to keep minors on their platforms longer are largely to blame.”

“This legislation will help tackle the risks of social media affecting our children and protect their privacy,” James said.

Social media has been linked to dangerous mental health consequences for young people.

“It follows you,” Hochul said of the platforms’ algorithms. “It preys on you. You don’t ask for this content. It finds its way to you by very sophisticated ways that the social media companies have created to continue bombarding you and penetrating your mind with images and thoughts.”

“Knowing how dangerous these algorithms are, I will not accept that we are powerless to do anything about it,” she said.

Kids who spend more than three hours per day on social media double their risk of poor mental health symptoms such as depression or anxiety, research shows. The platforms can also contribute to a lack of sleep, exposure to harassment and other problems.


Last spring, U.S. Surgeon General Dr. Vivek Murthy issued an advisory about the “significant” public health challenge social media poses to young people, especially those between the ages of 10 and 19, “a highly sensitive period of brain development.”

And children in the Big Apple have not been immune to the crisis fueling growing concern about the well-being of young people.

Close to 4 in 10 of the city’s high school students reported in 2021 feeling so sad or hopeless for at least two weeks that they stopped doing their usual activities, according to city data.

Over the last decade, the share of students who reported considering suicide jumped 4 percentage points to nearly 16% of local high school students. Close to 1 in 10 of them said they attempted suicide that year.

Under the proposed Stop Addictive Feeds Exploitation (SAFE) for Kids Act, families who opt out of the algorithms would turn back the clock on how most social media platforms, such as Facebook and Instagram, function: feeds would return to chronological streams of posts from people users know or popular content, along with search capabilities.

Parents who allow their kids to keep using algorithmic feeds could still limit that use, for example by suspending access overnight or setting daily time limits.

“Unlike other mainstream products like smoking, alcohol and vehicles, which are subject to rigorous government regulation to protect children, social media currently lacks any such meaningful safeguards,” said Gounardes. “Algorithms are the new tobacco. Simple as that.”

A second bill, the New York Child Data Protection Act, would prohibit websites from collecting or selling kids’ personal data for targeted advertisements. It was based on a bill proposed by Gounardes last fall.

It was not immediately clear whether two carve-outs, which would allow data collection by websites and applications that receive “informed consent” or when “doing so is strictly necessary for the purpose of the website,” could blunt the impact of the proposed law. Kids under 13 would need parental consent.

Both bills would authorize the state attorney general to seek penalties of up to $5,000 per violation and would allow parents to sue for that amount per incident.

On Wednesday afternoon, state officials acknowledged they are anticipating legal challenges to the bills.

“We know there will be some headwinds,” said Hochul. “There’ll be naysayers. In fact, there will probably be people running to the courthouse within the next hour.”

Concerns about the legislation from technology executives emerged soon after the announcement.

“We want young people to have safe, positive experiences across the internet,” said Antigone Davis, head of global safety at Meta, the parent company of Facebook and Instagram. “That’s why we’ve built safety and privacy directly into teen experiences.”

Davis pointed to dozens of family tools embedded in Instagram, such as parental supervision features that let parents see how long their children spend on the platform, set time limits, and see who their children follow and who follows them. The application already lets users toggle to a feed of only the accounts they follow, though that is not the default setting.

Algorithms, meanwhile, can help young people find posts, accounts, organizations and causes that match their interests, the tech giant said. And some kinds of content, including posts that discuss self-harm or eating disorders or depict violence, are not recommended to users under the company’s policies.

TikTok also has controls in place for children, including a 60-minute daily time limit and, for the youngest teens, no push notifications at night, according to the company. The application also allows “family pairing,” which lets parents manage settings ranging from making their child’s account private to restricting who can send the user messages.

“Protecting children’s safety online is a goal that Tech:NYC and each of our member companies share,” said Julie Samuels, executive director of the local industry membership organization. “In pursuing this goal, however, we must not sacrifice user privacy or First Amendment rights, which these proposals would inadvertently risk doing.”

Hochul and other officials said the bills address specific features of the platforms, without bans or restrictions on free speech.

“Simply put, these two bills target the most dangerous aspects of social media. And they do so in a way that we believe will stand up to scrutiny and to challenges,” said James.
