Facebook is taking on revenge porn.
The social network has added new detection technology and an online resource hub to help stop the spread of non-consensual intimate images, also known as revenge porn.
The artificial intelligence-enabled tech can detect “near nude images or videos that are shared without permission on Facebook and Instagram,” the company said Friday, adding that it can spot this content before users report the posts.
The tech builds on a pilot program that gives users an emergency option to submit a photo to Facebook for review. The social network fingerprints the image and blocks it from being shared on the platform. Facebook also said it's expanding the pilot program over the next few months.
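Facebook hasn't published the details of its fingerprinting system, but image matching of this kind typically relies on perceptual hashes: compact signatures that survive resizing and re-encoding, so a re-uploaded copy still matches the original. The sketch below is a minimal average-hash matcher in Python and is illustrative only; the file names, hash size, and match threshold are assumptions, not Facebook's implementation.

```python
# Illustrative sketch of perceptual-hash fingerprinting (not Facebook's
# actual, unpublished algorithm). An "average hash" thresholds a small
# grayscale thumbnail against its mean brightness; two images are treated
# as a match when their bit patterns differ in only a few positions.
from PIL import Image  # pip install Pillow

HASH_SIZE = 8          # 8x8 thumbnail -> 64-bit fingerprint (assumed size)
MATCH_THRESHOLD = 10   # max differing bits to call a match (assumed value)

def fingerprint(path: str) -> int:
    """Compute a 64-bit average-hash fingerprint of an image file."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        # Each pixel contributes one bit: brighter than average or not.
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def is_match(candidate: int, blocked: int) -> bool:
    """Hamming distance between fingerprints decides whether images match."""
    return bin(candidate ^ blocked).count("1") <= MATCH_THRESHOLD

# Usage: fingerprint a reported image once, then screen each new upload
# against the stored fingerprints without keeping the image itself.
blocked_hashes = {fingerprint("reported_image.jpg")}
upload_hash = fingerprint("new_upload.jpg")
if any(is_match(upload_hash, b) for b in blocked_hashes):
    print("Upload blocked: matches a reported image fingerprint.")
```

One upshot of this design is that the platform can store only the fingerprints, not the submitted photos, and still recognize altered copies on upload.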
“We are thrilled to see the pilot expand to incorporate more women’s safety organizations around the world, as many of the requests that we receive are from victims who reside outside of the US,” said Holly Jacobs, founder of the Cyber Civil Rights Initiative, in a Facebook blog post Friday.
Facebook has been using AI to help monitor its platform for some time. In October, the social network said it was using machine learning and AI to proactively detect child nudity and exploitative content as it's uploaded. The company has also layered AI on top of photo matching for years to prevent the sharing of known child exploitation images.
Credit: CNET