In response to increasing global pressure for better safeguards on social media, Instagram has launched a new account type designed specifically for users under 18 years old. The introduction of “teen accounts” aims to limit the kind of content and interactions underage users are exposed to, with automatic restrictions applied in Australia, the UK, the US, and Canada.
These protections are part of Instagram’s effort to make the platform safer for teens. Teen accounts will have default settings that limit who can contact these users and what content they can view, and parental permission will be required to loosen any of these protections for those under 16. New users will have the settings applied immediately, while existing accounts will be transitioned to the new structure over the next 60 days.
Although the release coincides with the Australian Federal Government's push for tighter regulation of social media use among minors, Instagram has maintained that the rollout of teen accounts was unrelated to the proposed legislation. According to Instagram’s head, Adam Mosseri, the company has been developing these features for over a year and is not acting in response to government pressure. “It’s going out now because it’s ready now,” Mosseri explained, emphasizing that the move was an internal decision aimed at improving the platform’s safety for young users.
However, some Australian government officials remain skeptical. Federal Minister for Communications Michelle Rowland welcomed Instagram’s changes but indicated that government action might have influenced the company's decision. Rowland acknowledged Meta's statement that the feature had been in development for some time but highlighted that decisive government measures often prompt companies to act.
Youth Minister Anne Aly expressed further concerns, noting that Instagram’s new safeguards, while a positive step, do not address the full scope of the issue. Aly reiterated the importance of government regulation to protect children online, especially through measures such as age verification and social media restrictions for minors. Similarly, Coalition frontbencher James Paterson criticized the platform for the negative impact it continues to have on young people's mental health. Paterson stressed the need for government intervention, arguing that platforms like Instagram have long been aware of the harm caused by unrestricted content exposure to minors.
Core Features of Teen Accounts
The new teen accounts feature several important changes designed to enhance safety and privacy for younger users. These include:
Restricted Messaging: Teen accounts will only receive messages from users they follow or are already connected with.
Sensitive Content Controls: Instagram will apply content restrictions that limit exposure to potentially harmful material in areas such as the Explore and Reels sections. Content related to violence or cosmetic procedures will be reduced for teen accounts.
Tagging and Mentions: Only users a teen account follows will be able to tag or mention that account in posts.
Anti-Bullying Measures: Instagram’s hidden words feature will automatically filter offensive words and phrases from both comments and direct message requests for teen accounts.
Usage Reminders: Teen users will receive reminders to close the app after spending more than 60 minutes on it in a single day.
Sleep Mode: Notifications for teen accounts will automatically turn off between 10 p.m. and 7 a.m., with auto-replies sent to any direct messages received during that time.
Concerns About Enforcement and Effectiveness
While these new features have been largely welcomed by experts, concerns remain about the effectiveness of the measures in place and whether the burden of responsibility is being shifted onto parents. RMIT University professor Lisa Given pointed out that while these controls are a step in the right direction, they still place much of the responsibility on parents to monitor their children’s activity on social media.
According to Professor Given, Instagram’s content filtering only applies to material that violates its recommendation guidelines, which are separate from the platform’s community guidelines. As a result, teens may still be exposed to harmful content from accounts they follow, even with the Sensitive Content Control settings in place.
There are also doubts about the enforcement of these new measures, especially regarding age verification. Instagram plans to use artificial intelligence tools to identify users who lie about their age, employing biometric data such as facial recognition and behavioral data analysis. However, these methods are still in development, and experts like Professor Given have raised concerns about their limitations, particularly regarding the accuracy of facial recognition for non-Caucasian users and potential errors in age estimation.
While Instagram intends to roll out these AI tools in a trial for US users by 2025, the platform acknowledges that no perfect solution exists. Adam Mosseri emphasized the sensitivity of collecting biometric data and government IDs, expressing reluctance to gather such personal information.
The Path Forward
Though the introduction of teen accounts reflects Instagram’s efforts to improve safety for younger users, critics maintain that the changes may not go far enough in protecting children from the potential risks of social media. As governments, particularly in Australia, continue to push for more comprehensive regulations, platforms like Instagram are likely to face increased scrutiny over their responsibility to safeguard underage users.