
Meta announced last Thursday that it is building an AI-driven “nudity protection” tool to shield teenage users of its Instagram network from “financial sextortion” scams, which frequently target children and teens.

The scammers threaten to publish victims’ nude photographs online, on public websites or in feeds where their friends will see them, unless the victims hand over money or gift cards. Politicians in the US have argued that such schemes harm children’s mental health.

Instagram said it would introduce a number of new features in the coming weeks, initially limited to a small group of users. These include the ability to blur nude photos shared in direct messages and to warn users when they have interacted with someone who has engaged in financial sextortion. The tools will then roll out to all users globally.

“It is a horrific crime that preys on making people feel alone and ashamed,” Antigone Davis, Meta’s head of global safety, told CNN. “We want to get ahead of the game and make sure people are aware of what we’re doing as we consistently expand our tools, as it has been widely documented that crime is on the rise.”

For years, people have shared nude photos without consent, usually as retaliation against victims they know personally. However, the FBI has reported a rise in financially motivated extortion by strangers, often initiated by scammers operating abroad. Sextortion has, in some cases, led to suicide.

Meta’s latest tools build on its existing teen-safety features, which include the ability to report direct messages that request or threaten to share private images, strict settings that block messages between unconnected accounts, and expanded safety notices.

According to CNN, Meta plans to test the nudity protection technology in Instagram direct messages first. When an explicit picture is sent, the platform will blur it and warn the recipient that it contains nudity. The notice will also tell recipients that they are not required to reply and prompt them to decide whether to block the sender. The platform will likewise warn users, urging them to think twice, when it detects that they are about to send a nude photo of themselves.

The tool will be on by default for teens under 18, while adults will receive a notification encouraging them to turn it on. Meta’s technology uses on-device machine learning to determine whether a photo contains nudity, the company told CNN. Nudity is already banned in news feeds and other public parts of Meta’s platforms.
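Meta has not published implementation details, but the flow described above maps onto a standard on-device classification pipeline. The Python sketch below illustrates the general idea: run a local image classifier over an incoming photo and blur it client-side if the nudity score crosses a threshold. The model file `nudity_classifier.onnx`, its input size, and the threshold are all assumptions for illustration, not Meta’s actual system.

```python
# Minimal sketch of on-device nudity screening. NOT Meta's actual system;
# assumes a hypothetical ONNX classifier that outputs P(nudity) for an image.
import numpy as np
import onnxruntime as ort
from PIL import Image, ImageFilter

MODEL_PATH = "nudity_classifier.onnx"  # hypothetical model file
INPUT_SIZE = (224, 224)                # assumed input resolution
THRESHOLD = 0.8                        # assumed decision threshold

# Inference runs locally on the device; the photo never leaves it for scoring.
session = ort.InferenceSession(MODEL_PATH)

def nudity_score(image: Image.Image) -> float:
    """Return the classifier's estimated probability that the image contains nudity."""
    resized = image.convert("RGB").resize(INPUT_SIZE)
    # Normalize pixels to [0, 1] and add a batch dimension: (1, H, W, 3).
    tensor = np.asarray(resized, dtype=np.float32)[None] / 255.0
    (probs,) = session.run(None, {session.get_inputs()[0].name: tensor})
    return float(probs[0])

def screen_incoming_photo(image: Image.Image) -> tuple[Image.Image, bool]:
    """Blur the photo client-side if it likely contains nudity.

    Returns the (possibly blurred) image and a flag telling the UI to show
    the warning that the recipient need not reply and may block the sender.
    """
    if nudity_score(image) >= THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(radius=30)), True
    return image, False
```

Running the classifier on-device, rather than on Meta’s servers, means the screening can work even in end-to-end encrypted chats, since the platform never needs to inspect the photo itself.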

Source: CNN

