Meta has announced a new tool for Instagram direct messages designed to protect kids and teens from predators who try to elicit nudes and “sextort” them. Aimed at preventing intimate image abuse before it happens, the tool will automatically blur nude photos sent or received by users under 18, prompt senders with the option to unsend their own intimate pictures, and let recipients decide whether or not to view a nude photo sent to them. It is part of Meta’s ongoing effort to protect minors from sexual predators, following damning reports about rampant child trafficking on its platforms.
The urgency of such a tool is underscored by the FBI, which has reported an “alarming number” of suicides among teenage boys aged 14 to 17 who were targeted by financially motivated sextortion: online scammers solicit nude photos, then threaten to release them unless the victim pays.
Meta has already taken steps to address the issue, including reporting sextortion after it occurs and removing perpetrators’ accounts. The new tool, by contrast, is proactive: the company will deploy technology to identify accounts that may be engaged in sextortion, and will use machine learning to detect and automatically blur nude photos sent or received by teens under 18.
The company has launched several other initiatives this year aimed at increasing reporting by minors and curbing the circulation of online child exploitation material.
The new tool is a welcome development in the fight against sextortion. Companies have a responsibility to protect the minors who use their platforms, and Meta’s decision to build device-side safety measures that work within its encrypted messaging environment is encouraging.
The tool is also an important step in addressing teens’ vulnerability to the harms associated with social media, including over-sexualization, bullying, and sextortion, an area that has drawn intense scrutiny from regulators in recent years.
In short, by blurring suspected nudes, nudging senders to unsend intimate pictures, and letting recipients choose whether to view them, Meta’s new Instagram tool shifts the fight against sextortion from after-the-fact cleanup toward prevention, and it should go a long way toward protecting the kids and teens that predators target.