Social-Media Health Hazard? Attorneys General Call for Surgeon General’s Warning| National Catholic Register


Bipartisan group cites the potential psychological harm that such sites can have on children and teenagers.

A bipartisan group of 42 state attorneys general recently called for the U.S. surgeon general to add a health warning to algorithm-driven social-media sites, citing the potential psychological harm that such sites can have on children and teenagers. 

“As state attorneys general, we sometimes disagree about important issues, but all of us share an abiding concern for the safety of the kids in our jurisdictions — and algorithm-driven social media platforms threaten that safety,” the coalition of attorneys general wrote in a Sept. 9 letter to congressional leaders. 

The attorneys general cited growing bodies of research that link young people’s use of these platforms to psychological harm, including depression, anxiety and even suicidal thoughts in kids and teens. They also noted how platforms feature “enticing algorithmic recommendations, infinite scrolling, and a constant stream of notifications, which are designed to keep kids relentlessly engaged on the platforms, even at the expense of taking breaks, engaging in other activities, or sleeping.”

State attorneys general have taken action in recent years to hold the largest social-media platforms accountable. In 2023, the attorneys general of 45 states and the District of Columbia filed a series of lawsuits against Meta, the parent company of Facebook and Instagram, alleging that the company “deployed harmful and manipulative product features designed to push young users’ engagement with the Instagram platform to dangerous levels, all while representing to the public that its products are safe.”

In addition, some states, including Arkansas, Indiana, Iowa, Kansas, Nebraska, New Hampshire and Utah, have commenced litigation against TikTok for violating their states’ consumer-protection laws, they noted.

“[A] surgeon general’s warning on social media platforms, though not sufficient to address the full scope of the problem, would be one consequential step toward mitigating the risk of harm to youth. A warning would not only highlight the inherent risks that social media platforms presently pose for young people but also complement other efforts to spur attention, research, and investment into the oversight of social media platforms,” the attorneys general wrote. 

“This problem will not solve itself and the social media platforms have demonstrated an unwillingness to fix the problem on their own. Therefore, we urge Congress to act by requiring warnings on algorithm-driven social media platforms, as recommended by the surgeon general.”

U.S. Surgeon General Vivek Murthy has on several occasions signaled a willingness to highlight the health risks posed by social media, issuing a 25-page advisory in 2023 on social-media usage and the evidence of its negative effects.

“Children are exposed to harmful content on social media, ranging from violent and sexual content to bullying and harassment. And for too many children, social media use is compromising their sleep and valuable in-person time with family and friends,” Murthy wrote in that advisory.

“We are in the middle of a national youth mental health crisis, and I am concerned that social media is an important driver of that crisis — one that we must urgently address.”

In July, the U.S. Senate voted overwhelmingly to advance extensive regulations that its supporters say will protect the safety and privacy of children on the internet. Under the bill, the government would impose a “duty of care” on social-media platforms, meaning social-media companies could be held legally liable if they are negligent in their efforts to prevent children from accessing harmful material.

Bullying and harassment, as well as sexual and violent material, are listed as harmful material covered by the legislation, known as the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). The legislation would also require platforms to work to prevent children from accessing material that could contribute to anxiety, depression, eating disorders, and various other harms.
