Facebook announced new measures to identify and remove non-consensual intimate images (also known as revenge porn) shared via the social media platform, as well as to support victims of such abuse. The company will use new detection technology, powered by machine learning and artificial intelligence (AI), to ‘proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram’. Once the AI tool identifies such content, a member of Facebook’s Community Operations Team reviews it and decides whether to remove the image or video. In most cases, removal will also be accompanied by disabling the account from which the content was shared. Facebook has also launched Not Without My Consent, a victim-support hub where victims of revenge porn can find organisations and resources to support them.