Sarah Forland
Policy Analyst, Open Technology Institute
In recent years, there has been considerable attention paid to youth online safety and well-being. Legislators have proposed different technical interventions to improve the internet’s safety for young people, including age-appropriate design codes, increased parental controls, and required age verification. Some legislators have suggested banning young people from social media altogether. As the Open Technology Institute (OTI) has underscored in our report on age verification, youth online safety requires a holistic approach, as it’s unclear if any of the commonly proposed technical interventions can fully or directly address the challenges that young people face online.
Many of the proposed technical interventions present feasibility issues as well as constitutional and privacy concerns for users. For example, OTI, along with other civil society organizations and legislators, has raised concerns about the censorship risks of the Kids Online Safety Act (KOSA), especially for youth in marginalized communities. In August 2024, a federal appeals court upheld a partial block of California’s age-appropriate design code over similar fears of censorship and speech regulation. And the age verification mandates appearing across state legislatures raise data privacy and security concerns and can result in over-censoring access to content for all users. OTI recently joined other civil society organizations in filing an amicus brief in support of a challenge to the constitutionality of Texas’s age verification law, which will be heard by the Supreme Court.
Online spaces should be safer for youth. However, quick tech fixes that can cause more harm than good are inappropriate solutions. Instead, youth deserve a thoughtful, holistic approach to improving their online experiences and overall well-being. Recent reports from public health and research bodies discuss both the benefits and risks of social media use and highlight the need for more research to fully understand its impact on children and youth.
Improving young people’s mental health and overall well-being requires a nuanced approach that acknowledges and addresses the complex socioeconomic, community, and tech-based contributing factors. Addressing youths’ well-being also requires recognizing the vast breadth of social and developmental needs of the millions of children, teens, young adults, and families across the United States. While many stakeholders seek to better understand technology’s impact on youth mental health and development, common-sense policies that center user privacy, safety, and rights can help improve youth experiences and mitigate known online risks.
Policy Recommendations for Policymakers, Industry, and Civil Society
Growing concerns over social media’s impact on youth have prompted policymakers, industry, and civil society to try to limit the technology’s potential negative outcomes for young people. One recent report concluded that while social media benefits youth, it can also pose a meaningful risk to their mental health. Legislators have often sought to tackle these challenges through technical intervention. However, some of the most commonly advanced technical interventions have concerning implications for users and may cause more harm than good.
Inspired by the U.K. Children’s Code, there has been a push to pass age-appropriate design codes (AADCs) at the state level. The code proposes 15 standards for company practices based on developmental stages, including privacy by design and by default and required data protection impact assessments.
Despite the laws’ promising components, they face criticism for being overly broad and having larger implications for user access. As written, the AADCs charge online operators with preventing youth from accessing “potentially harmful” material. However, without clear definitions of what that means, online operators may feel obligated to over-censor content, limiting access to protected speech and causing a chilling effect for all users. This may disproportionately impact marginalized communities and access to politicized content.
These concerns, in part, led to the blocking of California’s AADC in 2022 and were echoed in a recent federal appellate ruling. Similar criticisms have surrounded the Kids Online Safety Act (KOSA), which proposes a “duty of care” requiring platforms to implement design features to “prevent and mitigate” mental health disorders such as anxiety, depression, eating disorders, and substance use disorders. Such a broad mandate could have significant consequences for free speech online. Further, there is no clear way to determine what may influence the development of such disorders for every individual.
Parental controls allow parents to filter content, set restrictions, monitor activity, enable permissions, and link their account with their child’s. However, these tools place a high burden on parents, who may not have the capacity, willingness, or digital skills to use them effectively. In fact, despite their availability, parents simply do not use them very often.
Recent data shared by Discord and Snapchat shows that only a small share of minors have parents who use monitoring tools. This aligns with a survey of parents of 13–17-year-olds that found parents are “relatively less likely to use technology-based tools to monitor, block or track their teen” than other measures.
In addition, these tools raise concerns for youth privacy and safety. Parental controls, when overly invasive and restrictive, can be particularly dangerous for youth who are already vulnerable, such as LGBTQ youth, those seeking access to or information about reproductive health care, or those experiencing child abuse and neglect at home.
Current age verification practices require users to provide a government-issued ID, credit card, or biometric data to verify their age. Such requirements can significantly challenge all users’ right to access content, as many people, particularly those in marginalized communities and under the age of 16, do not own a valid government-issued photo ID or hold a credit card.
Even when users have appropriate ID, without proper safeguards, the process of verifying user ages can endanger their data privacy and security. Previous efforts to implement similar age verification requirements have been ruled unconstitutional. In July 2024, however, the Supreme Court agreed to hear an appeal concerning Texas’s new law requiring age verification to access adult content. This ruling could impact future determinations of if, and in which cases, age verification is constitutional. OTI’s report on age verification details the risks of such legislation.
Online spaces should be safer for youth. The challenges online environments and activity can pose to youth safety and well-being are serious. Pressing concerns such as child sexual abuse material (CSAM), cyberbullying, and access to age-inappropriate content require further attention and action to mitigate. And policymakers and public health officials’ current focus on social media’s impact on mental health and development deserves further exploration to deepen our understanding of exactly how social media and broader technology use impact youth across different stages.
At the same time, it is necessary to recognize that technological solutions can create additional challenges, such as potentially censoring access to content or putting user data at risk. And, on their own, technological solutions cannot adequately or fully address the challenges that youth face, many of which are rooted in offline issues. For example, today’s generation of young people faces numerous stressors impacting their development and their outlook on life, only some of which may be amplified by online activity. A range of societal challenges and growing pressures in their communities contribute to youth mental health outcomes and development. Simultaneously, the COVID-19 pandemic disrupted key events and life experiences while changing people’s relationship with technology. Society is still trying to understand the pandemic’s total impact on our collective physical and mental health, particularly for young people whose formative years it disrupted.
While technological design can help mitigate some harms young people face online, it’s important to be realistic about the limitations of technology’s ability to comprehensively address complex online safety issues. Mental health and well-being are shaped by various complex socioeconomic, community, and tech-based factors. While online spaces can amplify challenges youth face offline, strategies to improve online spaces for youth cannot be pursued in isolation; rather, they must holistically consider all contributing risks and protective factors. Policies that promote common-sense protections can improve young people’s experiences online by advancing privacy, security, transparency, and user agency. At the same time, these policies can reduce the potential for bad practices that amplify real-world challenges online and encourage excessive or problematic internet use.
Social media and other online spaces can offer important benefits to youth, fostering a sense of community and belonging as well as creating avenues for connection and self-expression. Rather than supporting policies that can exclude young people from online spaces, it’s important to recognize the value of these spaces and work toward improving young people’s experiences within them. There is a wide range of approaches, including investments in digital literacy and specific interventions to counter CSAM, that can improve online safety for youth. The following recommendations are not a comprehensive list of solutions to online safety challenges. Instead, they are foundational ways to center user privacy, safety, and rights that address some of the underlying concerns driving the technological interventions discussed above.
These recommendations collectively offer a clear starting point to improve experiences for all users online, not just young people. To more closely address discrete challenges, greater collaboration between stakeholders is needed. Tackling technology’s role in youth well-being and negative outcomes requires greater exploration to differentiate which challenges can be addressed through technical design and features and which must be addressed through larger social initiatives. There is no easy or quick solution. Improving youth online safety requires a thoughtful, nuanced, and holistic approach. Rather than attempting to implement one-size-fits-all technical interventions that do not address root causes, policymakers, industry, and civil society should tackle known issues by advancing legislation and solutions that promote user agency and platform accountability in a rights- and privacy-respecting manner.