Meta’s new tool allows teens to remove ‘nude’ photos, videos from internet

Meta has introduced a new tool that lets teenagers have ‘nude’, ‘partially nude’, or ‘sexually explicit’ images of themselves removed from Instagram and Facebook, including images that were uploaded in the past.

Meta’s new tool is called “Take It Down” and is operated by the National Center for Missing and Exploited Children (NCMEC).

Take It Down is a free service that lets teens or their parents anonymously submit photos or videos they fear might be uploaded to the internet or that have already been distributed online.

The photos or videos are submitted through a web-based tool that converts them into digital fingerprints known as hashes, which are then sent to NCMEC and shared with the participating platforms. The social media sites use hash-matching technology to find and block any attempts to upload the original images.
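The article does not spell out the hashing step in technical terms, but the general idea can be sketched in a few lines of Python. The SHA-256 digest and the function names below are illustrative assumptions, not Take It Down’s actual implementation; the point is simply that only a fixed-length fingerprint of the file, never the file itself, is passed along.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image or video to a fixed-length digest (illustrative only)."""
    return hashlib.sha256(image_bytes).hexdigest()

def submit_to_registry(file_path: str) -> str:
    """Hash a photo or video locally and return the digest that would be shared.

    Only this short string of characters is transmitted; the underlying file
    stays on the child's or parent's device.
    """
    with open(file_path, "rb") as f:
        return fingerprint(f.read())
```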

Apart from Facebook and Instagram, the other participating platforms are Yubo, OnlyFans, and Pornhub, which is owned by MindGeek.

Meta’s new tool has been designed to combat the rising problem of ‘sextortion,’ where children are coerced or deceived into sharing intimate images with another person online, then threatened or blackmailed with the prospect of having those images published on the internet.

Some offenders are motivated to extract even more explicit images from the child, while others are after money.

Take It Down hashes the images in the browser, so they never leave the child’s or parent’s device. If the extortionist tries to upload the original images, the platform’s hash-matching technology will detect a match and send the newly uploaded image to a content moderator for review.
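On the platform side, the matching step might look roughly like the sketch below, in which a newly uploaded file is hashed the same way, compared against the shared hash list, and held for human review if it matches. The data structures and names here are assumptions for illustration only.

```python
import hashlib

# Hypothetical set of digests a platform receives via NCMEC.
shared_hashes: set[str] = set()

# Hypothetical queue of uploads held back for a human moderator.
moderation_queue: list[bytes] = []

def handle_upload(image_bytes: bytes) -> str:
    """Illustrative upload handler: matched files go to review, not straight online."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in shared_hashes:
        moderation_queue.append(image_bytes)  # hold for a content moderator
        return "held_for_review"
    return "published"
```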

Meta said it will ingest new hashes multiple times a day, so it can start blocking newly reported images quickly.

Now, the caveats. If the image is hosted on a non-participating site, or is sent over an encrypted platform such as WhatsApp, it will not be taken down.

In addition, if someone alters the original image, for instance by cropping it, adding an emoji, or turning it into a meme, it becomes a new image and thus needs a new hash. Images that are visually similar, such as the same photo with and without an Instagram filter, will have similar hashes, differing in just one character.
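The article does not name the hashing technology the platforms use, but the behaviour it describes is characteristic of perceptual hashing. Purely as an illustration, the sketch below uses the open-source imagehash library’s generic average hash to show why a lightly filtered copy can still match while a cropped or edited copy may not; the file name is a placeholder.

```python
# pip install pillow imagehash  (illustrative; not necessarily what Take It Down uses)
from PIL import Image, ImageEnhance
import imagehash

original = Image.open("photo.jpg")  # hypothetical file
filtered = ImageEnhance.Brightness(original).enhance(1.2)  # e.g. a light filter
cropped = original.crop((50, 50, original.width - 50, original.height - 50))

h_original = imagehash.average_hash(original)
h_filtered = imagehash.average_hash(filtered)
h_cropped = imagehash.average_hash(cropped)

# Subtracting two imagehash values gives the Hamming distance (number of differing bits).
print("filter distance:", h_original - h_filtered)  # typically very small: still a match
print("crop distance:", h_original - h_cropped)     # often larger: edits create a "new" image
```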

Meta, back when it was still Facebook, attempted a similar tool for adults in 2017. It didn’t go over well, because the site asked people to send their (encrypted) nudes to Facebook, which was not the most trusted company even then. The company tested the service briefly in Australia but didn’t expand it to other countries.

In 2021, it helped launch StopNCII, a tool for adults targeting nonconsensual intimate images, also known as “revenge porn.” That site is run by a UK nonprofit, the Revenge Porn Helpline, but anyone around the globe can use it.
