Facebook has stressed that its experiment to fight revenge porn, in which users proactively provide their intimate images to the social network so that a "specially trained" professional can review and hash them, is voluntary.
"To be clear, people can already report if their intimate images have been shared on our platform without their consent and we will remove and hash them to help prevent further sharing on our platform," said Antigone Davis, Facebook's Global Head of Safety, in a blog post late on Thursday.
"With this new small pilot, we want to test an emergency option for people to provide a photo proactively to Facebook, so it never gets shared in the first place," Davis added.
This programme is completely voluntary.
"It's a protective measure that can help prevent a much worse scenario where an image is shared more widely. We look forward to getting feedback and learning," Davis said.
Facebook launched the experiment in Australia this week to help prevent non-consensual intimate images from being posted and shared on its platforms, in partnership with the Australian eSafety Commissioner's Office and an international working group of survivors, victim advocates and other experts.
With this, Australians who fear their intimate image may be shared without their consent can work with the eSafety Commissioner to provide that image to Facebook in a safe and secure way, so that Facebook can help prevent it from being shared anywhere on Facebook, Messenger and Instagram.
How does the mechanism work?
Australians can complete an online form on the eSafety Commissioner's official website. To establish which image is of concern, people will be asked to send the image to themselves on Messenger.
The eSafety Commissioner's office notifies Facebook of the submission via its form; however, the office itself never has access to the actual image.
Once Facebook receives this notification, a specially trained representative from its Community Operations team reviews and hashes the image, creating a human-unreadable numerical fingerprint of it.
Facebook stores the photo hash -- not the photo -- to prevent someone from uploading the photo in the future.
"If someone tries to upload the image to our platform, like all photos on Facebook, it is run through a database of these hashes and if it matches we do not allow it to be posted or shared," Davis said.
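The store-a-hash-and-check-uploads flow described above can be sketched in a few lines. This is a simplified illustration, not Facebook's implementation: a cryptographic digest stands in for the hash (Facebook's system would more plausibly use a perceptual, PhotoDNA-style hash that survives resizing and re-encoding), and the database is a plain in-memory set.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a human-unreadable numerical fingerprint of the image.

    Simplified: SHA-256 of the raw bytes. A production system would
    likely use a perceptual hash so near-duplicates also match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_blocked(image_bytes: bytes, hash_db: set) -> bool:
    """Check an attempted upload against the stored hashes."""
    return fingerprint(image_bytes) in hash_db

# Hypothetical flow: hash the reported image, store only the hash.
hash_db = set()
reported = b"...bytes of the image the user reported..."
hash_db.add(fingerprint(reported))  # the photo itself is never stored

print(is_blocked(reported, hash_db))             # exact match: refused
print(is_blocked(b"some other photo", hash_db))  # no match: allowed
```

Note that with an exact cryptographic hash, as in this sketch, even a one-pixel change defeats the match; that is precisely why systems of this kind favour perceptual hashing.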
"Once we hash the photo, we notify the person who submitted the report via the secure email they provided to the eSafety Commissioner's office and ask them to delete the photo from the Messenger thread on their device. Once they delete the image from the thread, we will delete the image from our servers," Davis noted.
--IANS