Table of Contents
- Executive Summary
- Introduction
- Age Assurance and Age Verification
- Pursuing Kids Safety through Online Age Verification Legislation
- Challenges with Age Verification
- Social Media Platforms and Age-Appropriate Practices
- The Path Forward: Minimizing Potential Ramifications of Online Age Verification
- Appendix
Social Media Platforms and Age-Appropriate Practices
In response to concerns from parents and teens, schools, legislators, and regulators about children's experiences online, state and federal age verification legislation is beginning to focus on social media platforms.1 Growing evidence shows that while not inherently bad for youth, social media can facilitate and exacerbate challenges to children's mental health and safety online.2 While more efforts are needed to ensure children can safely and securely access online spaces, age verification mandates present various challenges and may not actually address the root concerns surrounding social media use.
Already, websites and social media platforms implement a variety of age assurance practices to enforce previously established legal age restrictions, such as the Children's Online Privacy Protection Rule (COPPA) or age restrictions on online gambling and alcohol and tobacco sales, and to uphold their own account age requirements. For example, when age restrictions are mandated by law, online operators may use hard identifiers such as a photo ID or credit card to confirm a user's age, similar to age assurance practices in the physical world.
For age limits not set by law, such as platforms' own account holder age requirements, many platforms and websites rely on self-declaration. This is usually done by asking users to enter their date of birth when creating an account, link to an existing account that includes date of birth information, or simply check a box confirming that they meet the required age.
However, these methods are not foolproof: users can simply declare they are of the required age when they are not. Age verification legislation intends to close these loopholes but leaves online platforms grappling with how to respond to concerns about children's access to social media and age-inappropriate material while minimizing the potential risks of age verification.
As a result, platforms have employed a variety of strategies to create safer online spaces for children and teens, such as requiring age verification only when an account holder is suspected of being underage, introducing age-specific features for users, and creating parental controls. These strategies carry their own trade-offs for user rights, data privacy, and security, but they may offer insight into more direct and effective ways to promote kids' safety online than age verification mandates.3
Detecting and Verifying Under-Age Accounts
To identify users who do not self-declare their age accurately, some social media companies are incorporating measures to flag when a user may be under the required age. For example, TikTok scans public videos of users to help determine account holders' ages.4 Meta uses artificial intelligence to detect underage account holders based on account activity and linked profiles.5 Additionally, both Meta's Instagram and Facebook platforms allow users to report accounts suspected of being held by an underage user.6 If a user tries to change their self-reported age or has been identified as being underage, several platforms require users to verify their age with a government-issued ID, credit card, or a live photo.7 When users try to edit their account age from under 18 to over 18 years old, Meta's Instagram requires them to verify their age by submitting a government-issued ID, recording a video selfie to be analyzed by age-estimation AI, or asking mutual friends to vouch for their age.8 Requesting verification only from account holders suspected of being under the required age may reduce the personal or sensitive data that users need to share with a platform. However, the methods used to detect underage account holders may subject users to intrusive surveillance and monitoring of their online activity and may incorrectly flag account holders as underage.
Age-Specific Design Features
Some platforms employ age-specific features to protect youth from potentially harmful content and interactions online. For example, Roblox is working to incorporate an age verification feature that will allow users 13 years of age and older to submit a government-issued photo ID and a selfie to verify their age to "access innovative social capabilities and age-appropriate content."9 Enabling account restrictions on Roblox will lock an account's contact settings to block messages and chats from other users and limit play to experiences recommended for all ages.10
Google offers a suite of digital well-being tools that allows all users to set daily limits and timers on apps, customize or turn off notifications, and set bedtime reminders, some of which are turned on by default for users who are 13 to 17 years old on YouTube.11 In addition, Google has specific ad policies for teens that restrict personalized ads or ads containing sensitive content.12
Snapchat implements specific default settings for teens, including limiting contacts to friends and existing phone contacts, restricting location sharing, and sending in-app reminders about privacy and safety settings.13 Similarly, TikTok has a variety of age-specific features, such as prohibiting users under 13 years old from posting videos or comments, setting accounts held by 13- to 15-year-olds to private by default, and restricting live streaming and direct messaging for users under the age of 16.14 In March 2023, TikTok introduced new age-specific features, including an automatic 60-minute screen time limit for users under 18 and a screen time dashboard and controls for all users.15
In January 2024, Meta released new policies for teens, hiding age-inappropriate content, limiting content recommendations, and defaulting content recommendations to the most restrictive settings.16 While these features help create a safer and healthier online experience for young people, they are not activated unless an account is created with the user's correct age.
Parental Controls
Companies are also creating more opportunities for parents to play a greater role in supervising their child's online activity. Many platforms already offer options for parents to set restrictions on, monitor, manage permissions for, and link to their children's accounts.
TikTok's Family Pairing allows parents to link their TikTok account to their child's account to manage settings for various features, including account discoverability, searches, direct messaging, and screen time.17 Similarly, Google's Family Link allows parents to manage parental controls such as SafeSearch and edit settings on YouTube Kids and YouTube accounts.18
Apple's Family Sharing allows parents to create Apple IDs for their children, set parental controls, and receive warnings about sensitive content sent or received by a child's account.19 In 2022, Snapchat introduced its Family Center tool, which allows parents to view their teen's privacy and safety settings, manage parental controls, and restrict sensitive content.20 Likewise, Discord's Family Center allows parents to see who their child is talking to on the platform, which forums they are part of, and who their newly added friends are.21 In 2023, Meta began launching new parental supervision features on Facebook and Instagram that allow parents to see whom their child is friends with or messaging on both apps.22
While parental controls offer parents greater insight into and supervision of their child's online life, these controls may infringe on a young person's privacy and enable unnecessary surveillance of their online activity.
Citations
- Monica Anderson and Michelle Faverio, "81% of U.S. adults – versus 46% of teens – favor parental consent for minors to use social media," Pew Research Center, October 31, 2023; Donna St. George, "Schools sue social media companies over youth mental health crisis," Washington Post, March 19, 2023; Cristiano Lima-Strong and Naomi Nix, "41 states sue Meta, claiming Instagram, Facebook are addictive, harm kids," Washington Post, October 24, 2023.
- Health Advisory on Social Media Use in Adolescence (Washington, DC: American Psychological Association, May 2023); Georgia Wells, Jeff Horwitz, and Deepa Seetharaman, "Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show," Wall Street Journal, September 14, 2021; Social Media and Youth Mental Health: U.S. Surgeon General Advisory (Washington, DC: U.S. Department of Health and Human Services, 2023).
- The social media strategies shared in this report are not a comprehensive view of all the strategies employed by social media platforms, but instead are intended to provide a snapshot of what popular social media platforms are doing to improve youth experiences online.
- Sarah Perez, "TikTok CEO says company scans public videos to determine users' ages," TechCrunch, March 23, 2023.
- Pavni Diwanji, "How Do We Know Someone Is Old Enough to Use Our Apps?" Meta, July 27, 2021; Erica Finkle, Sheng Lou, Christine Agarwal, and Dave Fryer, "How Meta uses AI to better understand people's ages on our platforms," Meta, June 22, 2022.
- "Report a child under 13 on Instagram," Instagram Help Center; "Report an Underage Child," Facebook Help Center.
- Regarding age restriction rules, Google's YouTube Official Blog states, "If our systems are unable to establish that a viewer is above the age of 18, we will request that they provide a valid ID or credit card to verify their age. We've built our age-verification process in keeping with Google's Privacy and Security Principles." However, Google does not detail what methods their systems use to establish a viewer's age. See: The YouTube Team, "Using technology to more consistently apply age restrictions," YouTube Official Blog (blog), Google, September 22, 2020.
- "Introducing New Ways to Verify Age on Instagram," Meta, June 23, 2023.
- "Age ID Verification," Roblox.
- "Account Restrictions," Roblox.
- "Tools to help you achieve your own personal sense of digital wellbeing," Google; James Beser, "New safety and digital wellbeing options for younger people on YouTube and YouTube Kids," Canada Blog (blog), Google, August 10, 2021.
- "Ad-serving protections for teens," Google Advertising Policies.
- "Safeguards for Teens," Snapchat.
- "Parents' Ultimate Guide to TikTok," Common Sense Media, December 14, 2022.
- Cormac Kennan, "New features for teens and families on TikTok," TikTok, March 1, 2023.
- "New Protections to Give Teens More Age-Appropriate Experiences on Our Apps," Meta, January 9, 2024.
- "User safety," TikTok.
- "Understand YouTube & YouTube Kids options for your child," YouTube for Families Help; "Filter or blur explicit results with SafeSearch," Google Search Help.
- "Family Sharing. Share your favorite things with your favorite people," Apple; "Set up parental controls with Family Sharing on iPhone," iPhone User Guide.
- "Tools and Resources for Parents," Snapchat.
- "Stay Connected With Your Teen Using Discord's Family Center," Discord Blog (blog), July 11, 2023.
- "Giving Teens and Parents More Ways to Manage Their Time on Our Apps," Meta, June 27, 2023.