Today marks the tenth anniversary of the founding of Clarifai, something I'm very proud of. In a year when generative AI has disrupted and surprised the industry, I've become even more confident in my original purpose for starting Clarifai: to get the best of AI into everyone's hands. Generative AI has opened eyes to extracting value from structured and unstructured data in ways that promise to drastically reduce the time and cost of many customer and employee use cases. Clarifai's value is enabling developers of all skill levels to use this amazing technology.
My journey has been both personal and professional. At a time when the pace of AI technology, innovation, and leadership has never been faster, it's fun to reflect on the parts of Clarifai's journey that make it so memorable for me and for those who've been here.
In my early days on the Google Brain team in 2012 and 2013 under Jeff Dean, I learned how to do proper software development collaboratively and how to scale AI massively. I saw that my PhD work was better than Google's at the time - and my quest to bring AI to every developer in the world began! I purchased a gamer rig with GTX 580s, the fastest gaming cards available, and we built our first neural network library in Python, leading to winning ImageNet and putting us on the map as leaders in computer vision three weeks after our incorporation on November 20, 2013.
Many people have asked me how I came up with the Clarifai name. I can assure you that there were MANY terrible ones before Clarifai, most with "deep" in them, from deep learning. Then I decided to think more broadly about all of AI and all the words that contain 'AI.' I stumbled upon 'bonsai,' which sounds like it ends in 'y,' and that led me to words ending in 'y' that would be relevant to AI... and 'clarify' popped into my head - as in how AI helps you understand, or clarify, your data. Thus, the name Clarifai was born!
My New York City apartment was soon home to server-grade systems, bought on my credit card, that we could rack for more compute power for the novel AI research we'd started. We hand-wrote our own CUDA kernels and wrapped them in a straightforward Python interface, years before TensorFlow and PyTorch were available. We rolled our own stack, which was very easy to use and flexible enough to extend down to the CUDA kernels right from the Python code. If only we had open-sourced it back then!
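To give a feel for the pattern described above, here is a minimal, hypothetical sketch of exposing a low-level kernel behind a plain Python interface. The names are invented, and NumPy stands in for the hand-written CUDA device code so the example is self-contained.

```python
import numpy as np

def relu_kernel(x: np.ndarray) -> np.ndarray:
    """Element-wise ReLU 'kernel'.

    In a stack like the one described, this call would dispatch to a
    hand-written CUDA kernel; NumPy is a CPU stand-in here.
    """
    return np.maximum(x, 0.0)

class Layer:
    """A layer exposes the kernel through a straightforward Python API."""
    def forward(self, x: np.ndarray) -> np.ndarray:
        return relu_kernel(x)

layer = Layer()
out = layer.forward(np.array([-1.0, 0.5, 2.0]))
```

The appeal of this design is that researchers work entirely in Python while still being able to drop down and edit the kernels when a new operation is needed.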
Well ahead of its time, our platform had the basis for the world's first serverless AI inference engine, with configurable node types, GPU memory, and tons of optimizations for low-latency requests, which made it unique. Always keeping simplicity for developers foremost, we built a platform offering serverless inference: you never had to think about machines, sizing them, or replicating them; there was no deploy button; and pricing was per request, per image - things the market had never seen then - all so developers could focus on building an AI application and not on the infrastructure that powers it.
In 2014, our deep learning API was a pioneer in the field. With one model that recognized 10,000 different concepts (10x larger than ImageNet), we had what could be considered the first foundation model in today's vernacular. The model was wrapped in a Python Django API layer, offered very low latency, and was accessible through a demo page so you could try it out. Soon after, we launched the world's first video recognition API, which decodes video, runs AI across the frames, and tells you what's in them. Then came industry-specific Model Galleries and a mobile SDK that caught the attention of Apple and Google engineers by running inference with models 100% on edge devices, before Core ML, Metal, and other on-device optimizations.
The full-stack platform came to life in 2016, when we saw that the tools we'd built internally to produce production-quality AI were exactly the tools any developer who wanted to build AI themselves would need. This became the driving force for our work. We pioneered the first vector database product - complete with indexing - so we could store the embedding vectors produced when training custom models. With a query image, you could easily find things that look alike, with all the complexity of embeddings and the vector DB hidden from the builder. We introduced custom training based on transfer learning because we had strong embedding models that index your data on upload, enabling you to label a few examples and train in seconds. This method remains the fastest way to build a custom model.
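The two ideas above can be sketched in a few lines. This is an illustration of the general technique, not Clarifai's actual implementation: embedding vectors are indexed for visual similarity search, and "training" a custom classifier on frozen embeddings reduces to fitting a tiny head, which is why it finishes in seconds. The toy 2-D embeddings and labels are invented.

```python
import numpy as np

# Pretend a strong pretrained model already embedded six uploaded images
# into 2-D vectors (real embeddings would be much higher-dimensional).
embeddings = np.array([
    [1.0, 0.1], [0.9, 0.0], [1.1, -0.1],   # "cat"-like images
    [0.0, 1.0], [0.1, 0.9], [-0.1, 1.1],   # "dog"-like images
])
labels = ["cat", "cat", "cat", "dog", "dog", "dog"]

def cosine_search(query: np.ndarray, index: np.ndarray, k: int = 2):
    """Return indices of the k stored vectors most similar to the query."""
    normed = index / np.linalg.norm(index, axis=1, keepdims=True)
    q = query / np.linalg.norm(query)
    return np.argsort(normed @ q)[::-1][:k]

# "Training" the head is just averaging a few labeled embeddings per class,
# one reason a custom model can be ready in seconds on frozen embeddings.
centroids = {c: embeddings[[i for i, l in enumerate(labels) if l == c]].mean(axis=0)
             for c in set(labels)}

def classify(query: np.ndarray) -> str:
    return max(centroids, key=lambda c: float(np.dot(centroids[c], query)))

query = np.array([1.0, 0.05])              # a new image's embedding
hits = cosine_search(query, embeddings)    # nearest stored images
label = classify(query)
```

All the builder sees is "upload, label a few, search or train"; the embedding math stays hidden behind the API.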
My biggest regret came when, as a young leader, we had the idea for the Clarifai Hub, which eventually became our Clarifai Community. The concept of a model gallery, applications to organize your AI components into reproducible projects (a precursor to today's first-of-its-kind Clarifai AI Lake), and all the tools for users to create AI on their own in our full-stack AI platform gelled into the need for a community where people could publicly share the models and datasets they create. It wasn't until many years later that we brought this to fruition, because I let internal debates fester over whether we were a social media company or an AI company. I should have made the call and aligned everyone, which would have extended our market leadership much faster. Today, the Community welcomes about 5,000 new users a month.
More innovations and firsts followed. My favorite contribution to the open-source community was the first Kubernetes GPU plugin, well before Nvidia paid attention to Kubernetes. Today, we're the only remaining vendor of the four invited by the US Department of Defense to Project Maven, the first large-scale AI program for the government. We introduced workflows to combine models into more complex processing configurations - a capability known today as automations, chains, flows, and agents, all simply a graph of computation over AI models and other functional logic. Edge deployments followed for the battlefield, with on-premise and air-gapped deployments after that. In 2019, we added automated data labeling, delivering 100x efficiency improvements over human labeling tools. We built saved searches, allowing you to save and share a search with others, use the dynamic search query as a dataset or labeling task, and deep-train with one click. We extended beyond visual understanding to text and audio, opening up new use cases and setting the stage for large language models several years later. We ended 2019 with Forrester publishing the first-ever Computer Vision Platform Wave report, in which Clarifai was a Visionary Leader alongside the huge trillion-dollar hyperscalers - but the only one that doesn't lock you into a specific cloud.
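The "graph of computation" framing of workflows can be made concrete with a small sketch. This is a hedged illustration, not Clarifai's workflow engine: the node names and stand-in model functions are invented, and each node is simplified to take a single upstream input.

```python
from graphlib import TopologicalSorter

def detect(image):      return ["region_a", "region_b"]       # stand-in detector
def classify(regions):  return {r: "dog" for r in regions}    # stand-in classifier

# A workflow is just a DAG: node -> (function, upstream node names).
workflow = {
    "detect":   (detect,   []),
    "classify": (classify, ["detect"]),
}

def run(workflow, initial_input):
    """Execute nodes in dependency order, feeding outputs downstream."""
    deps = {node: set(ups) for node, (_, ups) in workflow.items()}
    results = {}
    for node in TopologicalSorter(deps).static_order():
        fn, ups = workflow[node]
        # Simplification: single-input nodes only; root nodes get the
        # workflow's initial input.
        arg = initial_input if not ups else results[ups[0]]
        results[node] = fn(arg)
    return results

out = run(workflow, "image.jpg")
```

Whether the industry calls these chains, flows, or agents, the underlying structure is the same topologically ordered graph of model calls.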
Today, millions of AI models built on Clarifai are instantly available to serve predictions, with no deploy button and no infrastructure configuration needed, and with personal access tokens providing easier collaboration across applications and with teammates. We introduced the most sophisticated permissioning of any AI platform on the market: fine-grained access controls that give users complete control over access to projects, keys, tokens, and collaborators. This paved the way for our teams and organizations structure, giving administrators in large organizations control over user permissions and over the sharing of AI assets from a centralized AI Lake. We added SSO and several products for speech recognition and text-to-speech, completing our journey to supporting multiple unstructured data types - image, video, text, and audio - in 2021, and enabling our customers to have one cohesive platform for many use cases. In early 2022, IDC's MarketScape for Worldwide Computer Vision AI Software Platforms named Clarifai a leader, again alongside the huge hyperscalers, quite a testament to our world-class AI technology leadership.
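As a minimal illustration of what fine-grained, token-scoped access control means in practice - with token names and scope strings invented for this sketch, not taken from Clarifai's API - every operation checks the caller's token against the scopes it was granted:

```python
# Hypothetical scope grants per personal access token.
TOKEN_SCOPES = {
    "tok_admin":  {"projects:read", "projects:write", "models:deploy"},
    "tok_viewer": {"projects:read"},
}

def authorize(token: str, scope: str) -> bool:
    """Allow an operation only if the token was granted that scope."""
    return scope in TOKEN_SCOPES.get(token, set())

can_deploy = authorize("tok_admin", "models:deploy")
can_write  = authorize("tok_viewer", "projects:write")
```

The same check, applied per project, key, and collaborator, is what lets an administrator centrally govern who can see and share which AI assets.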
As AI expanded in workplaces and governments, so did the need to make AI trustworthy, explainable, and reproducible. We launched Datasets to package inputs and annotations into reusable building blocks for training and evaluation, with each trained model keeping track of the dataset it was trained on. We hardened the AI Lake into one central system of record for data, annotations, models, workflows, modules, configurations, prompts, and more, so that large organizations pursuing multiple AI projects could have visibility into all of them and foster the collaboration needed to realize their AI strategy. In 2022, we finally launched our Clarifai Community, with a wholly rebuilt UI, to encourage the rapidly growing AI community to share what they create all in one platform.
Each innovation - and the many more not mentioned from our first nine years - set the foundation for today's launch of the Clarifai full-stack AI platform, empowering the generative AI wave. In 2023, we quickly enabled access to the best third-party LLMs from OpenAI, Anthropic, Google, Cohere, AI21, and others, in addition to importing the best open-source LLMs, like Falcon, Llama 2, Mistral, Zephyr, and StarCoder, on an ongoing basis. This allows developers to build generative AI into their applications with one simple, scalable API that will always stay up to speed with the state of the art! We enabled transfer learning on LLMs and LLM fine-tuning. We were the first to let you perform automated data labeling with GPT-3.5/4 and LLM wrappers. Partnering with Streamlit, we built UI modules to perform computation around the API, extend Clarifai's UI, or build custom solutions with quick and easy web apps. And more. (Check out 10 innovations in our 10th year!)
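The "one API across many LLMs" idea is, at its core, an adapter pattern. The sketch below is a hypothetical illustration with stub providers and made-up responses - it is not Clarifai's client code - showing how a single entry point can front both hosted third-party models and imported open-source ones:

```python
class Provider:
    """Common interface every model backend implements."""
    def complete(self, prompt: str) -> str:
        raise NotImplementedError

class StubHostedLLM(Provider):
    def complete(self, prompt):   # a real adapter would call a vendor's API
        return f"[hosted] {prompt}"

class StubOpenSourceLLM(Provider):
    def complete(self, prompt):   # e.g. an imported open-source checkpoint
        return f"[open-source] {prompt}"

# One registry, many backends; new models slot in without client changes.
REGISTRY = {"hosted-llm": StubHostedLLM(), "oss-llm": StubOpenSourceLLM()}

def predict(model_id: str, prompt: str) -> str:
    """Single entry point regardless of which backend serves the model."""
    return REGISTRY[model_id].complete(prompt)

a = predict("hosted-llm", "hello")
b = predict("oss-llm", "hello")
```

Because applications only depend on `predict`, swapping in a newer state-of-the-art model is a registry change, not an application rewrite.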
I wrote this mostly from memory while on a plane back to DC from the West Coast. I've personally lived through the innovation and, in many cases, the development of these exciting capabilities with the team. For the last ten years, we've been pioneering the full-stack AI platform from the ground up. Many of the lower layers of the stack have been commoditized, while the distance between AI infrastructure and AI applications remains large. Across this complicated, extensive set of layers, Clarifai consolidates the state-of-the-art approaches that accelerate our users' path to AI in production. Recalling these ten years of pioneering the full AI stack was exciting. The platform has already allowed over 270,000 users to produce millions of AI models, and our decade of innovation instills the trust that leads the largest customers to adopt Clarifai to stay at the cutting edge.
If reading this excites you, check out our jobs at clarifai.com/company/careers to join us for the next 10 years. If you want to learn more about something, don't hesitate to contact us at email@example.com; we’re happy to help!