“Once you send that picture, you can’t take it back,” goes the warning to teenagers, often ignoring the fact that many teens who send explicit images of themselves do so under duress, or without understanding the consequences.
A new online tool aims to give some control back to teens, or people who were once teens, to remove explicit photos and videos of themselves from the internet.
Called Take It Down, the tool is operated by the National Center for Missing and Exploited Children and funded in part by Meta Platforms, the owner of Facebook and Instagram.
The website lets anyone anonymously – and without uploading any actual images – create what is essentially a digital fingerprint of the image. This fingerprint (a unique string of numbers called a “hash”) then goes into a database, and the tech companies that have agreed to participate in the project remove the matching images from their services.
Now, the caveats. The participating platforms are, as of Monday, Meta’s Facebook and Instagram, Yubo, OnlyFans and Pornhub, owned by Mindgeek. If the image is on another site, or if it is sent on an encrypted platform such as WhatsApp, it will not be taken down.
In addition, if someone alters the original image – for example, by cropping it, adding an emoji or turning it into a meme – it becomes a new image and thus needs a new hash. Images that are visually similar – such as the same photo with and without an Instagram filter – will have similar hashes, differing in just one character.
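The caveat above can be illustrated with a short sketch. This is a minimal illustration, not Take It Down’s actual algorithm (the service’s exact hashing method is not described here); it assumes a plain cryptographic hash such as SHA-256 computed over the raw file bytes, under which even a one-byte edit produces a completely different fingerprint:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's 'digital fingerprint'."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical raw image bytes (stand-in for a real photo file).
original = b"\x89PNG...raw image bytes..."
# Simulate any edit: a crop, an added emoji, meme text, etc.
edited = original + b"\x00"

h1 = image_hash(original)
h2 = image_hash(edited)
print(h1 == h2)  # False: a tiny edit yields an entirely different hash
```

Because the database stores only these fingerprints, no actual photo ever has to be uploaded; but by the same token, an altered copy of the image will not match the stored hash and must be fingerprinted separately.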
Take It Down “is targeted specifically at people who have an image that they have reason to believe is already out on the web somewhere, or that it could be,” said Gavin Portnoy, a spokesman for NCMEC. “You’re a teen and you’re dating someone and you share the image. Or somebody extorted you and said, ‘If you don’t give me an image, or another image of you, I will do X, Y, Z.’”
Portnoy said teens may feel more comfortable going to the site than involving law enforcement, which wouldn’t be anonymous, for one.
“For a teen who doesn’t want that level of involvement, who just wants to know that it’s been taken down, this is a big deal for them,” he said. NCMEC is seeing an uptick in reports of online exploitation of children. The nonprofit’s CyberTipline received 29.3 million reports in 2021, up 35% from 2020.
Meta, back when it was still Facebook, attempted to create a similar tool, although for adults, back in 2017. It didn’t go over well, because the site asked people to, in effect, send their (encrypted) nudes to Facebook – not the most trusted company even in 2017. The company tested the service in Australia for a brief period but did not expand it to other countries.
But in the years since, online sextortion and harassment have only gotten worse, for children, teens and adults alike. Many tech companies already use this hash system to share, take down and report to law enforcement images of child sexual abuse. Portnoy said the goal is to get more companies to sign up.
“We’ve never had anyone say no,” he said.
Twitter and TikTok have not yet committed to the project. Neither company immediately responded to messages for comment on Sunday.
Antigone Davis, Meta’s head of global safety, said Take It Down is one of many tools the company uses to address child abuse and exploitation on its platforms.
“In addition to supporting the development of this tool and having reporting and blocking systems on our platform, we also do a number of things to try to prevent situations like this from happening in the first place. So, for example, we don’t allow unconnected adults to message minors,” she said.
The site works with real as well as artificial intelligence-generated images and “deepfakes,” Davis said. Deepfakes are created to look like real people saying or doing things they didn’t actually do.