Meta is further tightening its grip on who messages young users on Facebook and Instagram.
The social media company on Thursday announced new tools and features that will limit teens’ ability to view potentially sensitive content on the two platforms. Chief among them is a new restriction on who can send teens direct messages.
Effective now, users under 16 (or under 18 in certain countries) will no longer be able to receive messages on either platform from people they are not friends with, even if the sender is another teenager. The only exception is people in their phone’s Contacts list.
“We want teens to have safe, age-appropriate experiences on our apps,” Meta wrote in a blog post. “We’re taking additional steps to help protect teens from unwanted contact by turning off their ability to receive messages from people they don’t follow or aren’t connected to by default. Before a teen can change certain Instagram settings, they’ll now need parental approval through Instagram’s parental control tools.”
The changes do not appear to apply to Meta’s other apps, including Threads and WhatsApp. By comparison, Snapchat, which is not owned by Meta, already lets users restrict contact to Snapchat friends and people in their phone’s Contacts list.
Earlier this month, Meta announced that it would automatically place teens in the most restrictive content moderation category, hiding posts about self-harm, eating disorders and related topics, even if those posts were shared by accounts they follow.
Meta said a feature “designed to help prevent teens from seeing unwanted and potentially inappropriate images in messages from people they’re already connected to and to stop them from sending such images themselves” will arrive later.
The increased restrictions are part of an ongoing series of updates aimed at younger users. Late last year, Meta also eliminated cross-app communication between Messenger and Instagram, preventing Facebook and Instagram users from messaging each other across the apps. Last June, it introduced parental control tools for Messenger and Instagram DMs that let parents see how their children use the services, as well as any changes to their contact lists.
The removal of cross-app chats came as Meta faced pressure from the European Commission, which has weighed designating Messenger a “core platform service” under the Digital Markets Act. That designation would force Meta to make Messenger interoperable with other messaging services.
The new restrictions also follow a Wall Street Journal investigation last June alleging that pedophiles were using Instagram and its messaging system to buy and sell sexual content featuring minors.
Social media companies broadly have been expanding parental controls since last January, when U.S. Surgeon General Vivek Murthy said 13 was too young to join social media, warning that the effects on mental health could be significant. For Meta, long-running allegations that its products are being used to harm young people have only added to the pressure.