
The AI Whac-a-Mole: Why CV AI Is Critical to Keeping Up with Moderation

By Natalie Fletcher

(Cover image from Jessica Lucia)

When it comes to moderating the user-generated content (UGC) posted to your website, AI is virtually unbeatable. Using our technology, for example, Photobucket went from moderating 1% of their content to 100%. While there are still cases where human moderation is the better option, computer vision generally blows us out of the water, which is excellent when you consider the harrowing content we can encounter online.

For content moderators, moderation can be like a game of Whac-a-Mole, though far less fun. Just as they get a handle on moderating one type of NSFW content, another kind pops up. Perhaps it's more like a hydra: despite their most herculean efforts, the NSFW hydra always seems to grow new heads.


Like Hercules, though, moderators can complete the laborious task. They just need the right help and the right tool.

Pre-built moderation and NSFW models, or an end-to-end solution, can work very well when you are monitoring for more common violations like drug paraphernalia or nudity. For other companies, though, the nature of their content requires a custom solution. In addition to the categories mentioned above, one online marketplace needed to moderate for caged animals (a violation of their terms of service) and image quality. By using our computer vision, they saw their operational efficiency increase by 20x.

Wattpad needed to filter out not just explicit content but, more specifically, explicit illustrations, since drawings make up the majority of the UGC its platform receives.


Our in-house team of model-building experts was able to build them a model that could do this in just a few weeks.

The hydra-like problem of ever-changing content is another reason custom computer vision models are a worthwhile investment. For instance, a symbol that was previously harmless may become problematic if users start associating it with something harmful or unsettling. Should one of the businesses mentioned above see an uptick in a new variety of images they deem unwanted, they only need to provide a few examples of such pictures to retrain their model, and it learns to filter those out too. Custom computer vision models allow these companies to nip the issue in the bud before the uptick escalates.
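The feedback loop described above — moderators flag a handful of examples of a new unwanted category, and the custom model is updated to filter that category going forward — can be sketched in simplified form. Everything below (the class name, method names, and the example threshold) is hypothetical and purely illustrative; a real custom model would be retrained on the labeled images themselves rather than tracking a counter.

```python
from dataclasses import dataclass, field

@dataclass
class CustomModerationModel:
    """Illustrative stand-in for a retrainable moderation model."""
    # Categories the model already knows how to filter.
    categories: set = field(default_factory=lambda: {"nudity", "drug_paraphernalia"})
    # Moderator-flagged examples awaiting retraining, keyed by new category.
    pending: dict = field(default_factory=dict)
    # A few examples are enough to teach the model a new category.
    retrain_threshold: int = 3

    def flag_example(self, category: str, image_id: str) -> None:
        """Record a moderator-flagged example of a newly unwanted category."""
        self.pending.setdefault(category, []).append(image_id)
        # Once a handful of examples exist, "retrain" so the model
        # filters this category automatically from now on.
        if len(self.pending[category]) >= self.retrain_threshold:
            self.categories.add(category)

    def is_filtered(self, category: str) -> bool:
        """Does the model currently filter this category?"""
        return category in self.categories
```

For example, after a marketplace's moderators flag three caged-animal listings, `is_filtered("caged_animal")` flips to `True` and the model handles that category without further human review — which is the "nip it in the bud" dynamic the paragraph above describes.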


As companies grow and change, so do societies and cultures. What was once considered a harmless image may become unsettling, while what was previously seen as appropriate may become unacceptable. Using computer vision, platforms that rely on UGC can keep up with the times and their dynamic consumers.

