More restrictive privacy settings will also be added by default for teen users.
Instagram will remove the messaging button from teens’ accounts if they are viewed by a “suspicious adult,” as part of a slew of privacy changes being introduced by parent company Meta.
The updates, which also include more restrictive settings for younger users on Facebook, come after a landmark ruling concluded that the social-media content viewed by British teen Molly Russell had contributed to her death by suicide.
Meta’s latest update builds on the limits it introduced last year to stop teens from interacting with adults they don’t know. These included restricting adults from messaging younger users they aren’t connected to or from seeing teens in their “People You May Know” recommendations. Now, it is testing removing the direct-messaging button altogether from teen users’ accounts if they’ve been viewed by a suspicious adult. Meta describes a “suspicious” account as one that may have recently been blocked or reported by a young person on its platforms.
Facebook is implementing stricter privacy settings by default for teens under 16 (or under 18 in some countries). It is also encouraging teens to enable limits on who can view their friends’ lists, the people and pages they follow, the posts they are tagged in, and who is allowed to comment on their posts, as well as urging them to review posts they are tagged in before those posts appear on their profile. The settings mirror updates previously introduced on Instagram. In addition, in August, the photo-sharing app updated some safety controls for teenagers to make them less likely to encounter potentially sensitive content on the site.
As part of its ongoing privacy push, Meta is proactively encouraging younger users to report suspicious activity. A new notification will prompt teens to report accounts to Meta after they block someone, and Meta will then send them safety notices with information on how to handle inappropriate messages from adults.
Meta says more than 100 million people saw its safety notices on Messenger in the span of one month in 2021. It added that reports submitted by minors on Messenger and Instagram DMs rose 70 per cent in the first quarter of this year compared with the previous quarter.
Finally, Meta is working with the National Center for Missing and Exploited Children (NCMEC) on a global platform to help teens prevent intimate images they have created from being shared publicly online without their consent.
“We’ve been working closely with NCMEC, experts, academics, parents, and victim advocates globally to help develop the platform and ensure that it responds to the needs of teens so that they can regain control of their content in these horrific situations. We’ll have more to share on this new resource in the coming weeks,” said Meta vice president and global head of safety Antigone Davis.