Meta’s new platform aims to prevent the sextortion of teenagers on Facebook and Instagram | CNN Business
CNN
—
Meta is taking steps to crack down on the spread of intimate images of teenagers on Facebook and Instagram.
A new tool, called Take It Down, takes aim at a practice commonly known as “revenge porn,” in which someone posts an explicit image of a person without their consent in order to publicly embarrass them or cause them distress. The practice has skyrocketed in recent years on social media, especially among young men.
Take It Down, which is managed and operated by the National Center for Missing and Exploited Children (NCMEC), will for the first time allow minors to anonymously create a hash (or digital fingerprint) of intimate images or videos directly from their own devices, without needing to upload them to the new platform. To create a hash of an explicit image, a teenager can visit the TakeItDown.NCMEC.org website to install software on their device. The hash, not the image itself, is then stored in a database shared with Meta, so if the photo is ever posted to Facebook or Instagram, it can be matched against that fingerprint, reviewed and potentially removed.
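The privacy-preserving detail here is that the hashing happens locally: only the fingerprint ever leaves the teenager’s device. As a rough sketch of that idea only, and not a description of Take It Down’s actual implementation, the Python snippet below computes a plain SHA-256 hash of a local file and hands it to a made-up submit_fingerprint helper. The real service very likely relies on a perceptual image hash rather than a cryptographic one, and its submission mechanism is not shown here; every file name and function name in the snippet is hypothetical.

```python
import hashlib
from pathlib import Path


def fingerprint_image(path: str) -> str:
    """Compute a hash of an image file on the local device, so only the
    fingerprint (never the image itself) needs to leave the phone."""
    data = Path(path).read_bytes()
    return hashlib.sha256(data).hexdigest()


def submit_fingerprint(image_hash: str) -> None:
    # Hypothetical placeholder: in the real Take It Down flow, only the
    # hash would be sent to NCMEC's database, which participating
    # platforms such as Facebook and Instagram check uploads against.
    print(f"Submitting hash only, never the image: {image_hash}")


if __name__ == "__main__":
    h = fingerprint_image("photo.jpg")  # hypothetical file on the teen's device
    submit_fingerprint(h)
```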
“This issue has been incredibly important to Meta for a long, long time because the harm caused is quite severe in the context of teenagers or adults,” said Antigone Davis, Meta’s global head of safety. “It can damage their reputation and their family relationships, and it puts them in a very vulnerable position. It’s important that we find tools like this to help them regain control of what can be a very difficult and devastating situation.”
The tool works for any image shared on Facebook or Instagram, including in Messenger and in direct messages, as long as the images are unencrypted.
Children under the age of 18 can use Take It Down, and parents or trusted adults can also use the platform on behalf of a young person. The effort is fully funded by Meta and builds on StopNCII, a similar platform the company launched in 2021 alongside more than 70 NGOs to combat revenge porn targeting adults.
Since 2016, NCMEC’s CyberTipline has received more than 250,000 reports of online enticement, including sextortion, and the number of such reports more than doubled between 2019 and 2021. Last year, 79% of perpetrators sought money in exchange for keeping the photos offline, according to the nonprofit. Many of these cases occurred on social media.
Meta’s efforts come nearly a year and a half after Davis was grilled by senators over the impact the company’s apps have on younger users, following an explosive report indicating the company was aware that Instagram, which it owns, could have a “toxic” effect on teenage girls. While the company has released a handful of new tools and protections since then, some experts say it has taken too long and that more needs to be done.
Meanwhile, in his latest State of the Union address, President Biden called for more transparency about tech companies’ algorithms and how they impact the mental health of young users.
In response, Davis told CNN that Meta “welcomes efforts to introduce standards for the industry on how to ensure that children can safely browse and enjoy all that online services have to offer.”
In the meantime, she said the company continues to redouble its efforts to help protect its young users, especially when it comes to keeping explicit photos off its platforms.
“Sextortion is one of the biggest crimes we see at the National Center for Missing and Exploited Children,” said Gavin Portnoy, vice president of communications and branding for NCMEC. “We call it the hidden pandemic, and no one really talks about it.”
Portnoy said there has also been an increase in young people dying by suicide as a result of sextortion. “This is the driving force behind the creation of Take It Down, together with our partners,” he said. “It really gives survivors a chance to say, look, I’m not going to let you do this to me. I have power over my images and my videos.”
In addition to Meta’s platforms, OnlyFans and Pornhub’s parent company, MindGeek, are also adding this technology to their services.
But limitations exist. To evade the hashing technology, people can alter the original images, for example by cropping them, adding emojis or otherwise editing them. Some changes, such as applying a filter that makes the photo sepia or black and white, will still be flagged by the system. Meta recommends that teens who have multiple copies of an image, or edited versions of it, hash each one.
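This split, where cropping or pasting an emoji can defeat a match while a sepia or black-and-white filter cannot, is characteristic of perceptual hashing, which fingerprints how an image looks rather than its exact bytes. That framing is an assumption on my part, not a description of Take It Down’s actual algorithm; the toy sketch below uses a simple 8x8 average hash (via Pillow) and hypothetical file names purely to illustrate why small visual edits keep hashes close while structural edits push them apart.

```python
from PIL import Image  # requires Pillow (pip install Pillow)


def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to an 8x8 grayscale image, then set one
    bit per pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count how many bits differ between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical files: a black-and-white filter barely moves the hash,
# while cropping or pasting an emoji changes many bits.
original = average_hash("original.jpg")
filtered = average_hash("original_bw.jpg")
cropped = average_hash("original_cropped.jpg")
print(hamming_distance(original, filtered))  # small distance: likely still matched
print(hamming_distance(original, cropped))   # large distance: likely missed
```

A matching system built on this kind of fingerprint would treat a small Hamming distance as the same image and a large one as a different image, which is roughly why filtered copies can still be caught while cropped or heavily edited copies may slip through.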
“There is no panacea for the issue of sextortion or the issue of non-consensual sharing of intimate images,” Davis said. “It really requires a holistic approach.”
The company has rolled out a series of updates to help teens have an age-appropriate experience on its platforms, including new parental control tools, age verification technology, and defaulting teenagers into the more private settings on Facebook and Instagram.
This isn’t the first time a major tech company has poured resources into cracking down on explicit images of minors. In 2022, Apple abandoned plans to launch a controversial tool that would check iPhones, iPads and iCloud photos for child sexual abuse material, after backlash from critics who denounced the feature’s potential privacy implications.
“Children can be protected without companies reviewing personal data, and we will continue to work with governments, child advocates and other companies to help protect young people, preserve their right to privacy and make the internet a safer place for children and for us all,” the company said in a statement provided to Wired at the time.
Davis would not comment on whether she expects criticism of Meta’s approach, but noted that “there were significant differences between the tool that Apple released and the tool that NCMEC is releasing today.” She stressed that Meta will not check images on users’ phones.
“I welcome anyone in the industry who is trying to invest in efforts to prevent these kinds of terrible crimes from happening on their apps,” she added.