Lawmakers confront TikTok, Snapchat, and YouTube about eating disorder content

Representatives from TikTok, Snapchat and YouTube testified before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security today to discuss how to protect kids online. The hearing follows Facebook whistleblower Frances Haugen's document leaks to the Wall Street Journal, which, among many other things, exposed Facebook's knowledge that Instagram is toxic for teenage girls. According to Facebook's own research, 32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.



But as the Senate tries to hold Facebook accountable for its influence on teen girls, lawmakers understand that this problem doesn't begin and end with Mark Zuckerberg. Though the companies that testified today each have policies prohibiting content that promotes eating disorders, senators cited evidence from constituents about teenagers on these platforms who have still suffered from illnesses like anorexia and bulimia.

Blumenthal's staff also found troubling content on TikTok. The Wall Street Journal conducted a similar investigation into the platform, creating 31 bot accounts registered as users between the ages of 13 and 15. The publication reported that while content glorifying eating disorders is banned on TikTok, the accounts in its investigation were still served several such videos.

Senator Amy Klobuchar (D-MN) confronted Michael Beckerman, TikTok’s head of Public Policy for the Americas, asking if TikTok has stopped promoting content that glorifies eating disorders, drugs and violence to teens.

Beckerman noted that he doesn't agree with the Wall Street Journal's methodology for that experiment, since the users were bots programmed to search for and linger on certain content, but affirmed that TikTok has improved the ways users can control the algorithm and see age-appropriate content on the platform.

Beckerman said that content related to drugs violates community guidelines and that 97% of content violating policies about minor safety is removed proactively. These numbers track with a recently released transparency report outlining how content was removed from the platform between April and June 2021. Per the report, 97.6% of content violating minor safety policies was removed proactively before being reported by users, and 93.9% of those videos were removed at zero views. In the category of "suicide, self-harm and dangerous acts," which includes content glorifying eating disorders, 94.2% of violating videos were removed proactively, and 81.8% were removed at zero views.

As the hearing closed, Senator Blumenthal observed that he wouldn't be taking the day's testimony at face value. "The time for platitudes and bromides is over," he said.


