Meta Introduces New Measures to Combat Teen Sextortion
News Desk
Islamabad: Meta, the parent company of Facebook and Instagram, announced new initiatives on Thursday aimed at combating sextortion, a form of online blackmail where criminals coerce victims—often teenagers—into sending sexually explicit images.
The new measures include stricter controls on who can follow or message teen accounts and safety alerts in Instagram direct messages and Facebook Messenger regarding suspicious cross-border conversations.
These updates enhance the “Teen Accounts” feature introduced last month, which is designed to better protect underage users from the potential dangers of the photo-sharing platform.
Additionally, Meta is introducing restrictions that limit scammers’ access to teens’ follower lists and interactions, and that block screenshots in private messages.
The company is also rolling out nudity protection features globally, which will blur potentially nude images and prompt teens before they send such content in Instagram direct messages.
In select countries, including the United States and the United Kingdom, Instagram will feature a video in teens’ feeds that educates them on identifying sextortion scams.
This initiative aims to help young users recognize warning signs, such as individuals who are overly aggressive, request photo exchanges, or try to shift conversations to other apps.
“The dramatic rise in sextortion scams is taking a heavy toll on children and teens, with reports of online enticement increasing by over 300 percent from 2021 to 2023,” stated John Shehan of the US National Center for Missing & Exploited Children.
He emphasized that campaigns like this provide essential education to help families recognize these threats early.
The FBI has highlighted sextortion as an escalating concern, particularly for teenage boys, with many offenders located outside the United States. Between October 2021 and March 2023, U.S. federal officials identified at least 12,600 victims, 20 of whom died by suicide.
Meta’s commitment to child protection comes amid mounting pressure on the social media giant and its competitors.
Last October, around forty US states filed a complaint against Meta, alleging that its platforms harm the “mental and physical health of young people” due to issues like addiction, cyberbullying, and eating disorders.
Currently, Meta does not verify the age of its users, citing privacy concerns, but is advocating for legislation that would mandate ID checks at the level of mobile operating systems, such as Google’s Android or Apple’s iOS.