YouTube announced Thursday that it is implementing new safeguards to reduce teens’ exposure to videos that idealize specific fitness standards or physical appearances. The initiative, which launched in the U.S. last year, is now being extended to teenagers globally.
The move follows years of criticism of YouTube over its potential to harm teens by exposing them to content that may contribute to eating disorders.
The content being targeted includes videos that compare physical features or idealize specific fitness levels, body types, and weights. Content showcasing “social aggression,” such as non-contact altercations and intimidation, will also see restricted access for young viewers.
YouTube acknowledges that while an individual video may not seem harmful, repeated exposure to such themes can be problematic for impressionable teens. To address this, the platform will limit repeated recommendations of videos that fall under these categories.
Because YouTube’s recommendation algorithm relies heavily on user engagement and viewing habits, these protections aim to keep teens from being repeatedly served content that, although compliant with platform guidelines, may foster unrealistic ideals.
Dr. Garth Graham, YouTube’s global head of health, explained in a blog post, “As teens mold their identities and develop personal standards, frequent exposure to content that promotes idealized norms can lead to the internalization of negative self-perceptions.”
This announcement follows YouTube’s introduction of a new feature that allows parents to connect their accounts to their teens’, gaining insights into the teens’ activities on the platform. Once linked, parents can receive notifications about their teen’s channel activity, including the number of uploads and subscriptions.
This builds upon YouTube’s existing parental control features, which let parents create supervised accounts for children under 13, the minimum age for a standard account in the U.S. Similar supervised account options are available on other social media platforms, including TikTok, Snapchat, Instagram, and Facebook, giving parents oversight of their young users’ online activity.