Major social media platforms, including Meta, TikTok, and Snap, have committed to participating in a new external ratings initiative designed to assess their effectiveness in safeguarding adolescent mental health. This initiative, part of the Mental Health Coalition’s Safe Online Standards (SOS) program, outlines approximately two dozen criteria that encompass platform policies, governance, transparency, and content management.
Led by Dr. Dan Reidenberg, Managing Director of the National Council for Suicide Prevention, the SOS initiative aims to provide a transparent, user-focused evaluation of how digital platforms design their products to protect users aged 13 to 19 from harmful content related to suicide and self-harm. The participating companies will voluntarily submit their policies, tools, and features for review by an independent panel of global experts.
Platforms will receive one of three safety ratings based on their compliance. The highest designation, “use carefully,” comes with a blue badge indicating adherence to baseline safety standards. Even at this top tier, the bar is relatively modest: the requirements mainly cover user-friendly reporting tools and easy-to-find parental controls for privacy and safety settings. The other ratings are “partial protection,” which denotes safety features that exist but may be difficult to locate, and “does not meet standards,” which indicates inadequate content moderation capabilities.
The Mental Health Coalition, established in 2020, has counted Meta among its partners since its inception. Past collaborations have focused on destigmatizing mental health and improving access to resources via social media platforms. Recent efforts have included campaigns encouraging parents to discuss healthy social media use with their children, as well as programs through which tech companies share data on content that violates their guidelines.
Meta’s participation comes amid ongoing legal challenges tied to scrutiny of its internal findings on its products’ negative mental health effects, including a trial in California addressing allegations of child harm linked to the platform’s addictive features. Other participants in the ratings program include Roblox, which has faced criticism over children’s safety, and Discord, which has implemented stricter age-verification measures amid concerns over child endangerment.
Key Points:
– Meta, TikTok, and Snap to participate in Mental Health Coalition’s SOS initiative.
– SOS includes 24 criteria focusing on policies, governance, and content oversight.
– Platforms rated as “use carefully,” “partial protection,” or “does not meet standards.”
– Emphasis on user-friendly reporting tools and parental privacy controls.
– Ongoing scrutiny of Meta regarding its products’ mental health impacts.
