- Bumble has released its lewd image detection framework on GitHub.
- The Private Detector is able to detect lewd images with more than 98 percent accuracy.
- The framework can be deployed as is or tweaked with additional images.
Navigating dating platforms such as Bumble can be risky business given the propensity some folks have to share lewd images.
However, Bumble prefers to keep things PG-13, and as such it employs artificial intelligence (AI) to police its halls.
Now the company has released its appropriately named Private Detector to the wider tech community. The tech detects and blurs lewd images while warning users that they may be viewing unwanted content. As a user, you can then choose to view the image, or block or report the user who sent it.
“Even though the number of users sending lewd images on our apps is luckily a negligible minority – just 0.1% – our scale allows us to collect a best-in-the-industry dataset of both lewd and non-lewd images, tailored to achieve the best possible performances on the task. Our Private Detector is trained using very high volume data sets, with the negative samples (the ones not containing any lewd content) carefully selected in order to better reflect edge cases and other parts of the human body (eg. legs, arms) in order not to flag them as abusive,” Bumble explained in a blog post.
While Bumble downplays the number of lewd images sent as just 0.1 percent across all of its products, that is still a fair chunk of people. Moreover, a 2018 YouGov survey in the UK found that 46 percent of female millennials have received a lewd image from a man. Of those, 89 percent said the pictures were unsolicited. By contrast, only 5 percent of millennial men admitted to sending an unsolicited lewd image.
The open-source version of Private Detector is now available on GitHub under an Apache License. The release comes with a ready-to-use SavedModel that can be deployed as is using TensorFlow Serving. Bumble has also included a checkpoint which can be used to train the model further with additional images.
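For the curious, deploying a SavedModel with TensorFlow Serving typically looks something like the sketch below. The repository URL matches Bumble's announcement, but the model directory name and the request payload are assumptions for illustration; check the repo's README for the actual layout and input signature.

```shell
# Fetch Bumble's release (URL per the announcement)
git clone https://github.com/bumble-tech/private-detector.git

# Serve the bundled SavedModel with the official TensorFlow Serving image.
# TF Serving expects a numbered version subdirectory (here "1"); the
# "saved_model" folder name is hypothetical -- adjust to the repo layout.
docker run -p 8501:8501 \
  --mount type=bind,source="$(pwd)/private-detector/saved_model",target=/models/private_detector/1 \
  -e MODEL_NAME=private_detector \
  tensorflow/serving

# Query the REST endpoint; the instances payload must match the model's
# input signature (a preprocessed image tensor), elided here.
curl -d '{"instances": [...]}' \
  http://localhost:8501/v1/models/private_detector:predict
```

Fine-tuning from the included checkpoint follows the training instructions in the repository rather than this serving path.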
This is a great tool, and it's fantastic that it's open source and can be used in other applications and platforms.
Of course, the ideal scenario would be this not happening at all but men just seemingly can’t be trusted to behave themselves.