Yes, really.
According to NBC News, Facebook has partnered with a UK-based nonprofit called the Revenge Porn Helpline, which has built a tool to prevent naked photographs from being shared on social media.
Pjmedia.com reports: Part of that process involves users who are worried that disgruntled former partners will share intimate photos on social media submitting those photos themselves to a global website called StopNCII.org, which stands for Stop Non-Consensual Intimate Images.
The tool, which launched on Thursday, builds on a pilot program Facebook started in 2017 in Australia.
“It’s a massive step forward,” said Sophie Mortimer, the manager of the Revenge Porn Helpline. “The key for me is about putting this control over content back into the hands of people directly affected by this issue so they are not just left at the whims of a perpetrator threatening to share it.”
Karuna Nain, director of global safety policy for Meta, said the company decided to work with the Revenge Porn Helpline rather than handle the collection of private images itself in order to reduce the burden on users, giving them one place to submit their intimate images rather than doing so on “each and every platform.”
“Having one system open to all parts of industry is critical,” she said. “We know this material doesn’t just get shared on one platform and it needs a much more joined-up approach.”
This summer, Apple announced that it would begin scanning its customers’ iCloud photo libraries for known images of child sexual abuse, a plan that raised privacy concerns.
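For readers curious how “known image” detection generally works, here is a minimal sketch: the service keeps only perceptual hashes (fingerprints) of flagged photos, and a platform compares the hash of each new upload against that list. The library (imagehash with Pillow), filenames, and distance threshold below are illustrative assumptions, not details of StopNCII.org’s or Apple’s actual systems.

```python
from PIL import Image
import imagehash

# Hashes of previously flagged photos. In a real deployment only these
# fingerprints would be stored and shared, never the images themselves.
# "submitted_photo.jpg" is a hypothetical example file.
known_hashes = [imagehash.phash(Image.open("submitted_photo.jpg"))]

def matches_known_image(upload_path, max_distance=8):
    """Return True if the uploaded image is a near-duplicate of a known hash."""
    upload_hash = imagehash.phash(Image.open(upload_path))
    # Perceptual hashes are compared by Hamming distance, so resized or
    # re-compressed copies of the same photo can still match.
    return any((upload_hash - known) <= max_distance for known in known_hashes)
```

Sharing only a hash list like this is what would let one central database serve “all parts of industry” without the intimate images themselves ever circulating between companies.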
If Facebook can detect when you’re posting a meme that allegedly spreads COVID “misinformation,” I’m sure they can detect when a nude image is shared on the platform, and simply ban the sharing of all such images.