Instagram Unveils New Safety Measures for Teen Users
Instagram has announced a significant update aimed at enhancing safety for teenage users and providing parents with more control over their children’s online experiences. The move comes amid growing concerns about the exposure of young people to harmful content on social media platforms.
The new features, set to reach users worldwide by January, include making teen accounts private by default, restricting who can send teens direct messages, and applying the strictest content settings automatically. These changes are designed to address issues such as cyberbullying and exposure to content related to eating disorders and suicidal thoughts.
Under the new system, teen accounts will be made private automatically, meaning teens must accept or reject anyone who requests to follow them. This limits who can see the posts, videos, and tags associated with an account. Additionally, teens will only be able to receive direct messages from their followers, although they can still initiate conversations with other accounts.
Instagram is also introducing screen time notifications, prompting users to take a break after one hour of continuous use. A “sleep mode” will be activated between 10 p.m. and 7 a.m., although teens can still scroll and respond to messages during this time.
While these limits will be automatically applied to all teen accounts, users aged 16 and 17 will have the option to turn them off. Those under 16 will require parental permission to modify these settings.
To keep teens from sidestepping these settings by misstating their age, Instagram is implementing age verification measures, including ID uploads and video selfies. The platform is also testing AI technology to detect whether a user is a teen.
Parents will gain new supervisory controls, allowing them to see whom their teen has messaged over the past seven days, set daily usage limits, and review the content topics their teen is viewing. However, both teens and parents must opt into the supervision feature, and it can be turned off at any time.
Critics argue that these changes still place significant responsibility on parents and children to manage their online experiences. The measures' effectiveness may also depend on parents' familiarity with the platform, since a parent needs an Instagram account to use the new controls.
The rollout will begin with automatic enrollment for new users under 18 in the U.S., U.K., Canada, and Australia. Existing teen accounts will be moved to the new settings by mid-November, with teens in the European Union following later this year. Meta plans to extend these safety measures to its other services, including Facebook, next year.
As social media platforms face increasing scrutiny over their impact on young users, Instagram's latest update marks a notable step toward addressing these concerns. However, the long-term effectiveness of these measures remains to be seen.