
In Depth

Youth Deserve a Thoughtful, Holistic Approach to Online Safety


Overview

In recent years, considerable attention has been paid to youth online safety and well-being. Legislators have proposed various technical interventions to improve the internet's safety for young people, including age-appropriate design codes, expanded parental controls, and age verification requirements. Some legislators have even suggested barring young people from social media altogether. As the Open Technology Institute (OTI) has underscored in our report on age verification, youth online safety requires a holistic approach, as it's unclear if any of the commonly proposed technical interventions can fully or directly address the challenges that young people face online.

Many of the proposed technical interventions present feasibility issues as well as constitutional and privacy concerns for users. For example, OTI, along with other civil society organizations and legislators, has raised concerns about the censorship risks of the Kids Online Safety Act (KOSA), especially for youth in marginalized communities. In August 2024, a federal appeals court upheld a partial block of California's age-appropriate design code due to similar fears over censorship and speech regulation. And the age verification mandates appearing across state legislatures raise data privacy and security concerns and can result in over-censoring access to content for all users. OTI recently joined other civil society organizations in filing an amicus brief supporting a challenge to the constitutionality of Texas's age verification law, which will be heard by the Supreme Court.

Online spaces should be safer for youth. However, quick tech fixes that can cause more harm than good are inappropriate solutions. Instead, youth deserve a thoughtful, holistic approach to improving their online experiences and overall well-being. Recent reports from the U.S. Surgeon General and the White House Task Force on Kids Online Health and Safety discuss both the benefits and risks of social media use and highlight the need for more research to fully understand its impact on children and youth.

Improving young people's mental health and overall well-being requires a nuanced approach that acknowledges and addresses the complex socioeconomic, community, and tech-based contributing factors. Addressing youths' well-being also requires recognizing the vast breadth of social and developmental needs and experiences of the millions of children, teens, young adults, and families across the United States. While many stakeholders seek to better understand technology's impact on youth mental health and development, common-sense policies that center user privacy, safety, and rights can help improve youth experiences and mitigate known online risks.

Policy Recommendations for Policymakers, Industry, and Civil Society

  1. Invest in multidisciplinary research to better understand the empirical effects of social media on youth well-being and investigate evidence-based solutions to improving youth experiences online.
  2. Pass comprehensive federal data privacy legislation to standardize basic online privacy protections for users across all states.
  3. Advance privacy-, security-, and safety-by-design practices to minimize risks to user data privacy and security.
  4. Implement features that allow greater user control and agency over individual online experiences.
  5. Require greater algorithmic transparency and accountability.

There's No Quick (Tech) Fix for Online Safety

Growing concerns over social media's impact on youth have prompted a range of stakeholders to try to limit the technology's potential negative outcomes for young people. A report from the U.S. Surgeon General concluded that while social media benefits youth, it can also pose a meaningful risk to their mental health. Legislators have often sought to tackle these challenges through technical intervention. However, some of the most commonly advanced technical interventions have concerning implications for users and may cause more harm than good.

Age-Appropriate Design Codes and Features

Inspired by the U.K. Children's Code, there has been a push to pass age-appropriate design codes (AADCs) at the state level. The code sets out 15 standards for company practices based on developmental stages, including privacy-by-design and by-default and required data protection impact assessments.

Despite the laws' promising components, they face criticism for being overly broad and having larger implications for user access. As written, the AADCs charge online operators with preventing youth from accessing "potentially harmful" material. However, without clear definitions of what that means, online operators may feel obligated to over-censor content, limiting access to protected speech and causing a chilling effect for all users. This may disproportionately impact marginalized communities and access to politicized content.

These concerns, in part, led to the court-ordered blocking of California's AADC and were echoed in a recent federal appellate ruling. Similar concerns have surrounded KOSA, which proposes a "duty of care" to implement design features to "prevent and mitigate" mental health disorders such as anxiety, depression, eating disorders, and substance use disorders. Such a broad mandate could have significant consequences for free speech online. Further, there is no clear way to determine what may influence the development of such disorders for every individual.

Parental Controls

Parental controls allow parents to filter content, set restrictions, monitor activity, enable permissions, and link their account with their child's. However, these tools place a high burden on parents, who may not have the capacity, willingness, or digital skills to use them effectively. In fact, despite their availability, many parents simply do not use them.

Recent data shared by Discord and Snapchat shows that only a small share of minors have parents who use monitoring tools. This aligns with a survey of parents of 13- to 17-year-olds that found parents are "relatively less likely to use technology-based tools to monitor, block or track their teen" than other measures.

In addition, these tools raise concerns for youth privacy and safety. Parental controls, when overly invasive and restrictive, can be particularly dangerous for youth who are already vulnerable, such as LGBTQ youth, those seeking access to or information about reproductive health care, or those experiencing child abuse and neglect at home.

Age Verification

Current age verification practices require users to provide a government-issued ID, credit card, or biometric data to verify their age. Such requirements can significantly challenge all users' right to access content, as many people, particularly those in marginalized communities and under the age of 16, do not own a valid government-issued photo ID or hold a credit card.

Even when users have appropriate ID, without proper safeguards, the process of verifying user ages can endanger their data privacy and security. Previous efforts to implement similar age verification requirements have been ruled unconstitutional. In July 2024, however, the Supreme Court agreed to hear an appeal concerning Texas's new law requiring age verification to access adult content. This case could impact future determinations of if and in which cases age verification is constitutional; OTI's report on age verification details the risks of such legislation.

The Root of the Problem: It's More Than Tech

Online spaces should be safer for youth. The challenges online environments and activity can pose to youth safety and well-being are serious. Pressing concerns such as child sexual abuse material (CSAM), cyberbullying, and access to age-inappropriate content require further attention and action to mitigate. And policymakers and public health officials' current focus on social media's impact on mental health and development deserves further exploration to deepen our understanding of exactly how social media, and technology use broadly, impact youth across different stages.

At the same time, it is necessary to recognize that technological solutions can create additional challenges, such as potentially censoring access to content or putting user data at risk. And, on their own, technological solutions cannot adequately or fully address the challenges that youth face, many of which are rooted in offline issues. For example, today's generation of young people faces numerous stressors impacting their development and their outlook on life, only some of which may be amplified by online activity. A range of societal challenges contribute to youth mental health outcomes and development. Simultaneously, the COVID-19 pandemic disrupted key events and life experiences while changing people's relationship with technology. Society is still trying to understand the pandemic's total impact on our collective physical and mental health, particularly for young people in formative stages of development.

While technological design can help mitigate some harms young people face online, it's important to be realistic about the limitations of technology's ability to address complex online safety issues comprehensively. Mental health and well-being are shaped by various complex socioeconomic, community, and tech-based factors. While online spaces can amplify challenges youth face offline, strategies to improve online spaces for youth cannot be addressed in isolation; rather, they must holistically consider all contributing risks and protective factors. Policies that promote common-sense protections can improve young people's experiences online by advancing privacy, security, transparency, and user agency. At the same time, these policies can reduce the potential for bad practices that amplify real-world challenges online and encourage excessive or problematic internet use.

Recommendations: How to Better Protect Youth, and All Users, Online

Social media and other forms of online spaces can offer important benefits to youth, fostering connection and community as well as creating avenues for self-expression and support. Rather than supporting policies that can exclude young people from online spaces, it's important to recognize the value of these spaces and work toward improving young people's experiences within them. There is a wide range of approaches, including investments in digital literacy and specific interventions to counter CSAM, that can improve online safety for youth. The following recommendations are not a comprehensive list of solutions to online safety challenges. Instead, they are foundational ways to center user privacy, safety, and rights that address some of the underlying concerns driving the technological interventions discussed above.

  1. Invest in multidisciplinary research to better understand the empirical effects of social media on youth mental health and investigate evidence-based solutions to improving youth experiences online. As underscored in recent reports from the U.S. Surgeon General and the White House Task Force on Kids Online Health and Safety, more research is needed to understand the full scope of technology's impact on young people's mental health. Such research should determine what type of technology use and exposure results in negative outcomes and how exactly it impacts youth across all developmental stages. Furthering our understanding, in the United States and globally, of technology's effect on youth can better inform evidence-based solutions and recommendations for best online practices, design features, and usage. In addition, this research may highlight the need for policy interventions and initiatives to address social and community-based challenges, beyond the online environment, which contribute to concerning youth mental health trends.
  2. Pass comprehensive federal data privacy legislation to standardize basic online privacy protections for users across all states. Comprehensive federal data privacy measures, such as those proposed by earlier versions of the American Privacy Rights Act (APRA) and the American Data Privacy Protection Act of 2022 (ADPPA), can establish a baseline of protections for users across the United States. Such measures can create data minimization requirements, establish online civil rights protections, create universal opt-outs, and establish other privacy rights for users, such as their ability to view, export, or delete their data and stop its sale or transfer. Creating a federal baseline for user privacy protections will reduce practices that are data extractive and exploitative while offering young people and their families more control over how their data is collected and used.
  3. Advance privacy-, security-, and safety-by-design practices to minimize the risks to users' data privacy and security. Implementing by-design practices requires centering user privacy, security, and safety at all stages of technology development and implementation to mitigate risks before they occur. These design practices require greater responsibility, transparency, and accountability from developers on protecting users and their data online. Such practices avoid placing the onus solely on individuals, particularly young people, to navigate associated risks.
  4. Implement features that allow greater user control and agency over individual online experiences. While privacy-, security-, and safety-by-design practices offer improved baseline protections for users, customizable technology features can also provide users greater control over their online experiences. Depending on personal preferences, certain design features can help users mitigate unwanted or problematic online experiences, such as choosing who can see and interact with their accounts; who can message them online; and what content, suggested or not, they see online. Together, young people and their families can alter settings to best suit their needs and preferences at various ages.
  5. Require greater algorithmic transparency and accountability. Algorithmic transparency measures, such as those proposed in the Algorithmic Accountability Act of 2023, can require platforms to provide meaningful transparency about the algorithms impacting a user's online experience. Users could gain greater insights into promoted and suggested advertisements, content, and connections. In addition, requiring impact assessments about the bias and other effects of algorithms can mitigate risks to users and help users make informed choices about their online activity.

These recommendations collectively offer a clear starting point to improve experiences for all users online, not just young people. To more closely address discrete challenges, greater collaboration between stakeholders is needed. Tackling technology's role in youth well-being and negative outcomes requires greater exploration to differentiate which challenges can be addressed through technical design and features and which must be addressed through larger social initiatives. There is no easy or quick solution. Improving youth online safety requires a thoughtful, nuanced, and holistic approach. Rather than attempting to implement one-size-fits-all technical interventions that do not address root causes, policymakers, industry, and civil society should tackle known issues by advancing legislation and solutions that promote user agency and platform accountability in a rights- and privacy-respecting manner.

More About the Authors

Sarah Forland

Policy Analyst, Open Technology Institute, New America
