Facebook recently revealed that it has created a new artificial intelligence-based tool to help detect revenge porn before it is reported.
The effort is aimed at sparing victims of these intimate posts the time spent trying to get them taken down.
The technology uses machine learning and AI techniques to proactively detect near-nude videos and images that are shared across Instagram and Facebook without permission.
The AI tool marks Facebook's latest endeavor to rid its platforms of abusive content.
“Finding these images goes beyond detecting nudity on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram,” Facebook’s Global Head of Safety Antigone Davis wrote in a blog post.
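Facebook has not disclosed details of its detection model, but the description suggests a familiar pattern: an image classifier scores each upload, and posts above a confidence threshold are flagged for human review. The Python sketch below illustrates only that general pattern; the architecture, class layout, and threshold are assumptions for illustration, not Facebook's actual system.

```python
# Illustrative only: an untrained ResNet with a two-class head stands in for
# Facebook's production model, whose architecture is not public.
import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet50(weights=None)                 # hypothetical stand-in model
model.fc = torch.nn.Linear(model.fc.in_features, 2)   # class 1 = "flag for review"
model.eval()

def flag_for_review(path: str, threshold: float = 0.9) -> bool:
    """Return True if the classifier's confidence exceeds the review threshold."""
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        score = torch.softmax(model(image), dim=1)[0, 1].item()
    return score >= threshold
```

In production the score would come from a model trained on reviewed examples; the point of the threshold is that flagged posts go to human reviewers rather than being removed automatically.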
The announcement follows Facebook's earlier photo-matching pilot, in which people could submit their intimate videos and photos directly to Facebook.
That pilot, run in collaboration with victim-advocacy organizations, generates a digital fingerprint of each submitted photo so that Facebook can prevent it from ever being posted across its platforms.
This is similar to the way companies today prevent images of child abuse from being uploaded to their websites.
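Facebook has not said which fingerprinting scheme the pilot uses, but the standard technique for this kind of matching is perceptual hashing: compute a compact hash that stays stable under resizing and re-encoding, then compare new uploads against the hashes of reported images. Here is a minimal sketch using the open-source Python `imagehash` library as a stand-in for Facebook's proprietary system; the paths and distance threshold are illustrative.

```python
# Minimal sketch of digital fingerprinting via perceptual hashing; the
# `imagehash` library is an open-source stand-in for Facebook's own scheme.
import imagehash
from PIL import Image

def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash that survives resizing and re-encoding."""
    return imagehash.phash(Image.open(path))

# Fingerprints of images submitted through the pilot (paths are illustrative).
reported = {fingerprint("submitted_photo.jpg")}

def matches_reported(path: str, max_distance: int = 5) -> bool:
    """Block an upload whose hash is within a few bits of a reported image."""
    h = fingerprint(path)
    return any(h - known <= max_distance for known in reported)
```

Unlike a cryptographic hash, a perceptual hash changes only slightly when the image is slightly altered, which is why near-duplicates can be caught with a small Hamming-distance threshold.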
The new AI tool for detecting revenge porn, however, does not require any involvement by the victim.
According to Facebook, this matters because victims are sometimes afraid of retribution, which discourages them from reporting the content themselves.
Sometimes they are unaware that their videos or photos are being circulated online.
Once the system flags a video or image, a trained member of Facebook's Community Operations team will review it and remove it if it violates Facebook's Community Standards.
In many cases, Facebook will disable the particular account that uploaded the video or image.
An appeals process is available for those who believe Facebook has made an error.
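Taken together, the steps form a simple pipeline: automated flagging, human review, removal on a confirmed violation, account disabling in many cases, and an appeal path for mistakes. The sketch below models that pipeline; every name in it is hypothetical, since Facebook's internal tooling is not public.

```python
# Hypothetical model of the moderation flow described above; all names are
# illustrative, since Facebook's internal tooling is not public.
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    KEEP = auto()
    REMOVE = auto()
    REMOVE_AND_DISABLE_ACCOUNT = auto()

@dataclass
class FlaggedPost:
    post_id: str
    uploader_id: str
    violates_standards: bool  # set by a trained human reviewer, not the classifier

def review(post: FlaggedPost, disable_account: bool = True) -> Decision:
    """A flagged post is removed only after human review confirms a violation."""
    if not post.violates_standards:
        return Decision.KEEP
    # The uploading account is disabled "in many cases"; the exact criteria
    # are not public, so this sketch exposes it as a flag.
    if disable_account:
        return Decision.REMOVE_AND_DISABLE_ACCOUNT
    return Decision.REMOVE

# A wrongly penalized uploader can appeal; a successful appeal would reverse
# the decision (not modeled here).
```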
Beyond the existing pilot program and the new technology, Facebook says it also evaluated how its procedures for handling revenge-porn reports could be improved.
The company found, for example, that victims wanted prompt responses after reporting.
According to Facebook, addressing revenge porn is essential because it can lead to mental health problems such as suicidal thoughts, depression, anxiety and, at times, PTSD.
It can also cause professional harm, such as damaged relationships with colleagues and loss of employment.
Facebook acknowledged that its past responses to these reports failed to “acknowledge the trauma that the victims endure.”
It says it’s now re-evaluating the reporting tools and process to make sure they’re more “straightforward, clear and empathetic.”
The company is also rolling out “Not Without My Consent,” a hub in the Facebook Safety Center created in collaboration with professionals.
The hub will give victims access to resources and organizations that can support them and will outline the steps for reporting the content.