Meta joins OnlyFans, Pornhub in supporting tool to fight “sextortion”


An unscrupulous stranger convinces a teenager to send explicit photos of themselves, then threatens to publish them unless they are paid. A teenager breaks off a young romance, only to find their intimate images posted on porn sites.

These are nightmare scenarios for any parent, and ones that tech companies have historically been ill-equipped to address. But internet platforms are increasingly supporting new tools that allow users to have these images removed.

Facebook parent company Meta has funded a new platform designed to address these concerns, allowing young people to proactively scan a select group of websites for their images and have them removed. Run by the National Center for Missing and Exploited Children, Take It Down assigns a “hash value,” or fingerprint, to images or videos, which technology companies then use to identify copies of the media on the web and remove them. Participants include tech platforms such as Instagram and Facebook, as well as porn websites, including OnlyFans and Pornhub.
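In broad strokes, hash-based matching means a file is reduced to a compact fingerprint that can be compared against a list of fingerprints submitted for removal, without the image itself ever being shared. The sketch below is a minimal illustration of that idea; the article does not specify which hashing algorithm Take It Down uses, so the SHA-256 exact-match approach and the example hash list here are purely hypothetical stand-ins.

```python
import hashlib
import sys

def fingerprint(path: str) -> str:
    """Return a hex digest identifying the file's exact bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical set of hashes a user has submitted for removal.
# A real service would maintain this list server-side and share only
# the hashes (not the images) with participating platforms.
submitted_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

if __name__ == "__main__":
    for path in sys.argv[1:]:
        if fingerprint(path) in submitted_hashes:
            print(f"{path}: matches a submitted hash; flag for removal")
        else:
            print(f"{path}: no match")
```

Note that an exact byte-level hash like this only catches identical copies; production systems typically rely on perceptual fingerprints that also match re-encoded or resized versions, a detail the article does not go into.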

“Having a personal intimate image shared with others can be frightening and overwhelming, especially for young people,” Antigone Davis, Meta’s global head of safety, said in a statement announcing the effort. “It can be even worse when someone tries to use those images as a threat to obtain additional images, sexual contact or money – a crime known as sextortion.”

The new tool comes as internet platforms struggle to find and remove sexually explicit images that appear on their websites without the subject’s consent. Experts say the problem appeared to worsen during the pandemic as the use of digital tools increased.

A 2021 report by the Revenge Porn Helpline found that reports of intimate-image abuse had increased significantly over the previous five years, with a 40 percent rise in cases reported between 2020 and 2021.

“A lot of times a child doesn’t know there’s an adult on the other end of that conversation,” National Center for Missing and Exploited Children spokesperson Gavin Portnoy said in an interview. “So they start demanding more pictures or more videos, often with the threat of disclosing what they already have to the community, to that child’s family [and] friends.”

Tech companies that find sexually explicit images of young people are required by law to report the user who posted the material, but no such standard exists for adults. Dozens of states have adopted statutes designed to combat non-consensual pornographic images, but they are difficult to enforce because Section 230 of the Communications Decency Act gives tech companies legal immunity over user-generated content posted on their websites, said Megan Iorio, senior counsel at the Electronic Privacy Information Center.

Those interpretations of Section 230 “enable companies not only to ignore requests to remove harmful content, including defamatory information and revenge pornography, but also to ignore injunctions requiring them to remove such information,” Iorio said.

Although Take It Down is open only to people under 18 or their guardians, it follows a similar 2021 effort by Meta to help adults find and remove non-consensual explicit images of themselves. Meta funded and built the technology for a platform called Stop Non-Consensual Intimate Image Abuse, run by the Revenge Porn Helpline. Users submit a case to the helpline, which is operated by SWGfL, a UK-based non-profit organization, and participating sites, including Facebook, Instagram, TikTok and Bumble, then remove the content.

Meta tried a similar approach in 2017, in which users could flag intimate images of themselves to prompt the company to search its networks for them and prevent them from being shared again. But the move drew criticism from advocates who said the program could compromise users’ privacy.
