Facebook Launches New Technology To Detect “Revenge Porn”

In recent times, Facebook and Instagram have been among the fastest channels for spreading “revenge porn”. In many cases, this non-consensual content is taken down, but not before a good number of people have already seen or saved it.

A few hours ago, the company announced a new detection technology to curb the dissemination of non-consensual nude media on Facebook and Instagram. Its statement reads in part:

“By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram…

“This new detection technology is in addition to our pilot program jointly run with victim advocate organizations. This program gives people an emergency option to securely and proactively submit a photo to Facebook.”
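Facebook has not published the details of either system, but it has previously described the pilot program as converting a submitted photo into a digital fingerprint (hash) so that matching uploads can be blocked. As a rough illustration only, and not Facebook’s actual implementation, the sketch below shows hash-based image matching using the open-source Pillow and imagehash libraries; the file paths and match threshold are hypothetical.

# Minimal sketch of hash-based image matching (illustrative only; not
# Facebook's actual system). Uses the open-source Pillow and imagehash
# libraries; file paths and the threshold are hypothetical.
from PIL import Image
import imagehash

# Hash of a photo a person has proactively submitted (hypothetical path).
submitted_hash = imagehash.phash(Image.open("submitted_photo.jpg"))

def is_blocked(upload_path: str, threshold: int = 8) -> bool:
    """Return True if a newly uploaded image is a near-duplicate of the
    submitted photo, based on perceptual-hash (Hamming) distance."""
    upload_hash = imagehash.phash(Image.open(upload_path))
    # Subtracting two ImageHash objects returns their Hamming distance.
    return (submitted_hash - upload_hash) <= threshold

print(is_blocked("new_upload.jpg"))  # hypothetical upload to check

Because the comparison is done on hashes rather than on the images themselves, near-duplicates (resized or lightly edited copies) can be caught without storing or redistributing the original photo.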

Furthermore, Facebook will launch a support hub called “Not Without My Consent” within its Safety Center. Google, for its part, has long said that such near-nude photos seriously infringe on personal privacy and cause real harm, particularly to women, and it will remove nude or pornographic photos from its search results at the victim’s request.

Andre

Andre is a network engineer with a solid technical background and a proven record in building and troubleshooting computer systems, networking, website design, and blogging, along with broad knowledge of call center operations and administration. Above all, he has a great desire to share his knowledge and views across technology, social issues, and politics.
