Though it may sound audacious to some, Facebook believes that sharing intimate pictures you fear might go viral with one of its trained employees can help the company stop their spread and protect your privacy. The social networking giant on Tuesday said it was testing a reporting tool that lets people who worry someone might harm them by sharing an intimate image proactively upload it, enabling Facebook to block anyone else from sharing it on Facebook, Instagram, or Messenger.
Facebook said it has partnered with safety organisations on a way for people to securely submit photos they fear will be shared without their consent -- images also referred to as "revenge porn" or "non-consensual pornography".
"This pilot programme, starting in Australia, Canada, the UK and US, expands on existing tools for people to report this content to us if it's already been shared," Antigone Davis, Facebook's Global Head of Safety, wrote in a Facebook post.
From anxiety and depression to the loss of a personal relationship or a job, the consequences of having one's most intimate moments shared without permission can be devastating.
And while these images harm people of all genders, ages and sexual orientations, women are nearly twice as likely as men to be targeted, Davis said.
"This week, Facebook is testing a proactive reporting tool in partnership with an international working group of safety organisations, survivors and victim advocates, including the Australian Office of the eSafety Commissioner, the Cyber Civil Rights Initiative and The National Network to End Domestic Violence in the US, the UK Revenge Porn Helpline and YWCA Canada," Davis added.
As part of this initiative, anyone who fears an intimate image of themselves will be shared can contact one of Facebook's partner organisations to submit a form.
After submitting the form, the victim receives an email containing a secure, one-time upload link, which they can use to upload the images they fear will be shared.
Thereafter, one of a handful of specially trained members of Facebook's Community Operations Safety Team will review the report and create a unique fingerprint, or hash, that allows the social network to identify future uploads of the images without keeping copies of them on its servers.
Facebook said once it creates these hashes, it will notify the victim via email and delete the images from its servers within seven days.
"We store the hashes so any time someone tries to upload an image with the same fingerprint, we can block it from appearing on Facebook, Instagram or Messenger," Davis added.
(With IANS Inputs)