Getting the creeps —

Reddit cracked down on revenge porn, creepshots with twofold spike in permabans

Reddit also launched a transparency center to help users assess platform safety.

A year after Reddit updated its policy on non-consensual intimate image (NCII) sharing—a category that includes everything from revenge porn to voyeurism and accidental nip slips—the social media platform has announced that it has gotten much better at detecting and removing this kind of content. Reddit has also launched a transparency center where users can more easily assess Reddit's ongoing efforts to make the platform safer.

According to Reddit’s 2022 Transparency Report—which tracks various “ongoing efforts to keep Reddit safe, healthy, and real”—last year Reddit removed much more NCII than it did in 2021. The latest report shows that Reddit removed 473 percent more subreddits and permanently suspended 244 percent more user accounts found to be violating community guidelines by sharing non-consensual intimate media. Previously, Reddit labeled NCII as "involuntary pornography," and the 2022 report still uses that label, reporting a total of 187,258 posts removed. That figure includes non-consensual AI-generated deepfakes, also known as “lookalike” pornography.

“It’s likely this increase is primarily reflective of our updated policies and increased effectiveness in detecting and removing non-consensual intimate media from Reddit,” the transparency report said.

Despite banning revenge porn and other forms of NCII, Reddit came under fire in recent years for NCII found on its platform. One prominent example of revenge porn that escaped moderation involved posts by a Maryland mayor, Andrew Bradshaw, in 2021. Bradshaw was accused of using multiple Reddit accounts to post photos of an ex-girlfriend over a two-month period and was ultimately charged with 50 counts of distributing revenge porn, AP reported.

By March 2022, Reddit had updated its policy, announcing to users that the platform had made “edits and additions to the policy detail page to provide examples and clarify the boundaries when sharing intimate or sexually explicit imagery on Reddit.”

It seemed like a step in the right direction for Reddit, but a BBC investigation then revealed that more work seemingly remained. In August 2022, BBC found "thousands" of images and videos, some of which were non-consensual intimate media, reporting that women were “facing threats and blackmail from a mob of anonymous strangers.” Reddit users told the BBC at the time that Reddit sometimes removed images promptly, but at other times it took months before action was taken.

Reddit responded to BBC’s report by citing 88,000 non-consensual sexual images removed in 2021. A Reddit spokesperson noted that BBC's report provided "no indication of which images are violating" and "only found 150 sexually explicit images."

As scrutiny heightened, Reddit continued working on blocking more violating content this year. Earlier this month, Reddit joined StopNCII.org, an image-hashing database that automatically blocks reported NCII from spreading. The database is operated by the Revenge Porn Helpline, which is run by the international charity SWGfL. Meta helped launch the initiative in 2021, and by joining this year, Reddit signals a seemingly more aggressive effort to match other major platforms in facilitating reporting and preventing the spread of NCII.
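The core idea behind a hash-sharing database like StopNCII.org is that only a digital fingerprint of an image—never the image itself—leaves the reporting person's device, and participating platforms compare incoming uploads against the shared fingerprint list. The sketch below is an illustrative simplification, not StopNCII's actual implementation: the function names (`report_image`, `allow_upload`) are hypothetical, and it uses an exact-match cryptographic hash, whereas production systems typically use perceptual hashes that also catch resized or re-encoded copies.

```python
import hashlib

# Hypothetical shared blocklist of hashes submitted by reporting users.
# Only hashes are stored; the images themselves never leave the user's device.
blocked_hashes: set[str] = set()

def hash_image(image_bytes: bytes) -> str:
    """Return a hex digest fingerprinting the image content.

    Real systems use perceptual hashing, which tolerates small edits;
    SHA-256 here only demonstrates the exact-match principle.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    """A user reports an intimate image; only its hash is shared."""
    blocked_hashes.add(hash_image(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """A platform checks an incoming upload against the shared hash list."""
    return hash_image(image_bytes) not in blocked_hashes

report_image(b"private-photo-bytes")
print(allow_upload(b"private-photo-bytes"))  # False: upload blocked
print(allow_upload(b"unrelated-photo"))      # True: upload allowed
```

Because only hashes are exchanged, a platform that joins the database can block a reported image it has never seen before, which is why participation by each additional platform matters.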

“We have already seen promising results from this tool and believe it will help us remove this content even more quickly and accurately,” a Reddit spokesperson told Ars.

SWGfL did not immediately respond to Ars’ request for comment on Reddit’s recent partnership with the organization or its 2022 transparency report.

In addition to removing more NCII, Reddit reported that in the second half of 2022, the platform also removed and reported much more child sexual abuse material (CSAM) than it had previously. Compared to 2021, Reddit removed 874 percent more CSAM between July and December 2022.

Reddit attributes this spike in content removal to hiring “more staff dedicated to combating child sexual exploitation,” investing in tools and “proactive detection methods,” and moderating more content after adding “new ways to share media (such as images in chat and comments) on the platform.”

“We also take the steps required by US law to report the relevant users/content to the National Center for Missing and Exploited Children (NCMEC), and preserve relevant account data,” a Reddit spokesperson told Ars.

Similar to StopNCII.org, NCMEC partnered with Meta last year to launch TakeItDown, an image-hashing database meant to prevent teen sextortion. Unlike Pornhub, OnlyFans, TikTok, Facebook, and Instagram, Reddit has not yet partnered with that initiative. Reddit's spokesperson told Ars they would provide an update if that changes in the future. A Reddit thread upvoted by more than 17,000 users suggests that some of Reddit’s community members are keen to spread awareness that TakeItDown can help minors block revenge porn on other platforms.

Some Reddit users have questioned whether it’s safe or effective to report non-consensual intimate images to StopNCII.org. To help users navigate this sensitive issue, Reddit partners with crisis counselors and provides additional resources. Reddit also provides resources for survivors reporting CSAM on the platform.

Reddit is partially owned by Advance Publications, which also owns Ars Technica parent Condé Nast.