Facebook urges consumers to send nude photos for anti-revenge porn test

Back in April, Facebook took steps aimed at combating revenge porn in Canada and the USA, allowing users to flag an image they suspect was posted without consent.

Australians who are anxious that a photo or video of them may be shared without their consent can contact the eSafety Commissioner.

Facebook is trialling an interesting new approach to combating revenge porn, the practice of uploading intimate photos of a former partner to the internet without their consent.

Facebook, in association with Australia, Canada, the U.S. and the United Kingdom, is running a pilot that will help prevent intimate images of users from being posted and shared across Facebook, Messenger, Facebook Groups and Instagram.

Facebook did not immediately respond to a request for comment. While the scheme is starting out in Australia, it will also be trialled in the UK, Canada and the USA. Australia's eSafety Commissioner, Julie Inman Grant, gave an example of the kind of abuse the pilot targets: "Let's say someone created a fake Facebook account using my name, and they put my naked photographs up there".

Facebook will store a "fingerprint" of the images to prevent any copies of them being shared by disgruntled ex-lovers.

The reporting process involves uploading an explicit image of yourself to Facebook Messenger (you can do so by starting a conversation with yourself).

"The deployment of this technology would not prevent someone from sharing images outside of the Facebook ecosystem, so we should be encourage all online platforms to participate in this program, as we do with PhotoDNA", he said.

The process begins with contacting the eSafety Commissioner or regional equivalent (the eSafety Commissioner is an Australian position, and the test is being carried out in Australia first), after which you will be advised to send the photo to yourself via Messenger.

Other social media sites, as well as tech companies such as Google and Microsoft, have also taken significant steps to tackle the issue of revenge porn.

The proposed system works by "hashing" abuse images that victims send to themselves via Facebook Messenger, relying on the same technology that social media companies use to identify terrorism-related or child abuse images. Inman Grant defended the experiment, arguing that "they're not storing the image".
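Facebook has not published the details of its matching technology, but the general idea of image hashing can be sketched with a simple "average hash": the platform keeps only a compact fingerprint of the reported image and compares the fingerprints of new uploads against it. The Python sketch below is a generic illustration of that idea, not Facebook's PhotoDNA implementation; the file names and match threshold are assumptions for demonstration.

```python
# Illustrative sketch only: a simple "average hash" fingerprint, not Facebook's
# proprietary PhotoDNA. It shows how an image can be reduced to a compact hash
# that survives resizing and re-encoding, so copies can be matched without
# storing the image itself.
from PIL import Image

def average_hash(path, hash_size=8):
    """Shrink the image to an 8x8 grayscale grid and record which pixels
    are brighter than the mean, yielding a 64-bit fingerprint."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance indicates a near-duplicate image."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: store only the hash of the reported image, then compare
# the hash of each new upload against it.
# reported = average_hash("reported_image.jpg")
# candidate = average_hash("new_upload.jpg")
# if hamming_distance(reported, candidate) <= 5:  # threshold is illustrative
#     print("Likely a copy of the reported image; block the upload.")
```

Because only the fingerprint is retained, a match can be flagged later without the platform keeping the original photo, which is the point Inman Grant was making.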
