Customer Stories


Photobucket uses Clarifai’s NSFW model to automatically flag and moderate unwanted nudity from user-generated content.


About: Photobucket is a dedicated photo and video sharing service. With over 100 million registered members, Photobucket users upload over four million images and videos per day from the web, smartphones, and connected digital cameras.

Use Case: Content Management


improvement in unwanted content detection


80% of Photobucket’s human moderation team transitioned to full-time customer support

2 Million

all images uploaded to Photobucket every day pass through Clarifai’s NSFW filter


Photobucket is one of the world’s most popular online image and video hosting communities. The platform hosts over 15 billion images, with two million more uploads every day. While user-generated content (UGC) is Photobucket’s bread and butter, it also poses a Trust and Safety risk from users who upload illegal or unwanted content. With a firehose of content continually flowing in, it’s impossible for a team of human moderators to catch every image that violates Photobucket’s terms of service.


Photobucket needed a highly scalable system for moderating user-generated content, one that would improve both the hit rate of detecting offensive content and the productivity of its human moderation team.



Photobucket developer Mike Knowles was looking for a quick and easy way to implement machine learning-based image recognition technology in his tech stack. After ruling out building machine learning in-house as too costly and inefficient in the long run, Mike decided that using a computer vision API would be the best way to validate his idea and go to market quickly. He tested half a dozen computer vision APIs, including Google Cloud Vision and Amazon Rekognition, before deciding that Clarifai offered the best possible solution for his business.


Mike selected Clarifai based on the superior accuracy and ease of use of the technology, the transparency of the online demo, the completeness of the documentation, and the enthusiasm and professionalism of Clarifai’s team. He was also excited about the wide range of computer vision models Clarifai has to offer, including the General model that recognizes over 11,000 concepts and the Moderation model that currently recognizes different levels of nudity (e.g. explicit and suggestive) along with gore and drugs, with future plans to recognize symbols of hate and violence.


With a product team of four, Mike launched Photobucket’s new Clarifai-powered content moderation workflow in 12 weeks, from concept to internal rollout. With the new workflow increasing productivity, 80% of Photobucket’s human moderation team was able to transition to full-time customer support.




Before turning to Clarifai for computer vision-powered moderation, Photobucket used a team of five human moderators to monitor user-generated content for illegal and offensive material. These moderators would manually review a queue of randomly selected images drawn from just 1% of the two million image uploads each day. Not only was Photobucket potentially missing 99% of unwanted content uploaded to its site, but its team of human moderators also suffered through an unrewarding workflow that resulted in low productivity.


To catch more unwanted UGC, Photobucket chose Clarifai’s Not Safe for Work (NSFW) nudity recognition model to automatically moderate offensive content as it is uploaded to their site. Now, 100% of images uploaded to Photobucket every day pass through Clarifai’s NSFW filter in real-time. The images flagged ‘NSFW’ then route to the moderation queue, where only one human moderator is required to review the content. Where’s the rest of the human moderation team? They’re now doing customer support and making the Photobucket user experience even better.
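The routing described above can be sketched as a thin post-processing step on the model’s prediction. This is an illustrative sketch, not Photobucket’s actual implementation: the response shape follows Clarifai’s documented v2 Predict API (a list of concepts, each with a name and a 0–1 confidence value), and the threshold value is an assumption chosen for the example.

```python
# Illustrative routing step for flagged uploads. Assumes Clarifai's v2
# Predict response shape: outputs -> data -> concepts, each concept a
# dict with "name" and a 0-1 "value". The 0.85 cutoff is an assumed
# example threshold, not a documented Photobucket setting.
NSFW_THRESHOLD = 0.85


def route_upload(api_response: dict) -> str:
    """Return 'moderation_queue' if the NSFW score crosses the
    threshold, otherwise 'publish'."""
    concepts = api_response["outputs"][0]["data"]["concepts"]
    scores = {c["name"]: c["value"] for c in concepts}
    if scores.get("nsfw", 0.0) >= NSFW_THRESHOLD:
        return "moderation_queue"
    return "publish"


# Example response fragment in the documented shape:
sample = {
    "outputs": [{"data": {"concepts": [
        {"name": "sfw", "value": 0.03},
        {"name": "nsfw", "value": 0.97},
    ]}}]
}
print(route_upload(sample))  # -> moderation_queue
```

In a real pipeline this check would run on every upload, with only the `moderation_queue` branch reaching a human reviewer.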