As part of its broader commitment to combat “cyberflashing”, dating app Bumble is open-sourcing the artificial intelligence tool it uses to detect unsolicited lewd images. First released in 2019, Private Detector (let’s take a moment to let that name sink in) blurs nude images sent through the Bumble app, giving the user on the receiving end the choice of whether or not to open them.
“Even though the number of users who upload obscene images to our apps is fortunately a negligible minority – just 0.1% – our scale allows us to collect an industry-leading data set of obscene and non-obscene images, tailored to achieve the best possible performance on the task,” the company wrote in a press release.
Now available on GitHub, a polished version of the AI can be used commercially, distributed, and modified. While a model that detects nude images isn’t exactly state-of-the-art technology, it’s something smaller companies probably don’t have the time to develop on their own. So other dating apps (or any other product where people could send dick pics, AKA the entire internet?) could eventually integrate the technology into their own products, helping to protect users from unwanted obscene content.
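For products that do want to adopt it, the integration pattern is straightforward: run each incoming image through the classifier and blur anything that scores above a threshold before it is shown. The sketch below illustrates that flow with TensorFlow; the model path, the 480-pixel input size, the callable signature, and the output indexing are all illustrative assumptions for this example, not Private Detector’s documented interface.

```python
# Minimal sketch: gate incoming images behind an NSFW-probability check,
# assuming a TensorFlow SavedModel exported from a binary lewd/not-lewd
# classifier. MODEL_PATH, the input size, and the output layout are
# hypothetical values chosen for illustration.
import tensorflow as tf

MODEL_PATH = "saved_model/"  # hypothetical path to the exported model
THRESHOLD = 0.5              # hypothetical decision threshold

model = tf.saved_model.load(MODEL_PATH)

def load_image(path: str, size: int = 480) -> tf.Tensor:
    """Decode and resize an image into a batch of one, scaled to [0, 1]."""
    raw = tf.io.read_file(path)
    img = tf.image.decode_jpeg(raw, channels=3)
    img = tf.image.resize(img, (size, size)) / 255.0
    return tf.expand_dims(img, axis=0)

def is_lewd(path: str) -> bool:
    """Return True if the model's lewd-class score exceeds the threshold."""
    probs = model(load_image(path))  # assumed: the loaded model is directly callable
    return float(probs[0][0]) > THRESHOLD  # assumed: index 0 is the lewd class

# A client could blur flagged images and let the recipient decide,
# mirroring the in-app behavior described above:
if is_lewd("incoming.jpg"):
    print("Blur the image and ask the recipient before revealing it.")
```

The design choice worth noting is that the model only flags content; the blur-and-ask step stays in the client, which is what keeps the final decision with the person receiving the image.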
Since the release of Private Detector, Bumble has also worked with US lawmakers to establish legal consequences for sending unsolicited nudes.
“There is a need to address this issue beyond Bumble’s product ecosystem and start a broader conversation about how to address the problem of unsolicited obscene photos – also known as cyberflashing – to make the internet a safer and friendlier place for everyone,” Bumble added.
When Bumble first introduced this AI, the company claimed it had 98% accuracy.