Generative AI has a rich history, starting with early chatbots and culminating in the development of generative adversarial networks (GANs) and Transformers. These advancements led to the creation of large language models (LLMs) like GPT, which power applications like ChatGPT and GitHub Copilot. Meanwhile, latent diffusion models, which run the image-generating diffusion process in a compressed latent space, produce impressive high-resolution images. The availability of open source models has fueled innovation and experimentation in the field of generative AI.
Four trends in this year’s list
Generative AI Infrastructure: OpenAI made a big splash last year with the launch of ChatGPT and again this year with the launch of GPT-4, but their big bet on scale and a technique called Reinforcement Learning from Human Feedback (RLHF) is only one of many directions LLMs are taking. Anthropic and their chatbot Claude use a different approach called Constitutional AI (CAI), refined with reinforcement learning (RL-CAI). The constitution encodes a set of human-friendly principles designed to limit abuse and hallucination in the model's outputs. Meanwhile Inflection, a secretive startup founded by DeepMind's Mustafa Suleyman and Greylock's Reid Hoffman, is focusing on consumer applications.
And these are just the high-profile entrants on the closed source side. In the world of open source, Hugging Face has become the go-to platform for developers who want to train their own models or fine-tune existing ones. Along with Stability's open source offerings, Hugging Face hosts recent state-of-the-art models like Meta's LLaMA and Stanford's Alpaca.
Predictive Infrastructure: During the Gold Rush many individual prospectors went bust, but the people selling the picks and shovels made out just fine. This is why investors often focus on novel infrastructure companies during technology shifts. AI in its many forms is about prediction, so let’s call this new category predictive infrastructure.
The largest of these infrastructure companies host the massive amounts of data needed for enterprise AI applications in a format that facilitates all sorts of data pipelines. Databricks has distinguished itself from Snowflake, a notable incumbent in the space, by being specifically designed for the needs of AI/ML data teams.
Because data labeling, cleaning and other processes are so critical to model training, there are now four companies in this bucket on this year’s list: Coactive, Scale, Snorkel and Surge, up from just one last year (Scale). Two other new entrants to the AI 50, MosaicML and Weights & Biases, specifically help AI practitioners train and fine-tune models. Arize and Hugging Face also make it easy to deploy models at scale.
Generative AI Applications: Midjourney and Stable Diffusion benefited from their virality on social media, planting generative AI in the center of popular culture. Then ChatGPT captured the world’s attention and became the fastest product to attain 100 million users. While Google played catch up with its Bard chatbot, Neeva became the first generative AI-native search engine.
Since LLMs were primarily designed to generate text, generative writing apps are a fast-growing category. Two of these applications are on this year's list: Jasper, which uses GPT-4 to help marketing copywriters, and Writer, which has trained its own proprietary model and is focused on enterprise use cases. As language models have become more capable, they can handle more complex applications, like legal text. Harvey is using GPT-4 to do associate-level legal work at law and other professional services firms, while Ironclad has automated many contract processes for in-house legal teams.
Generative AI is inherently creative, so it's natural to see a lot of innovation in other creative fields. Runway generates, edits and applies effects to video; its tools met the quality bar for the Oscar-winning team behind Everything Everywhere All at Once. Descript focuses on both podcast and video workflows, using generative AI to make the editing process less laborious. ChatGPT, Bing and Bard are general purpose chatbots, but crafting bespoke chatbots is a nascent creative space powered by Character.AI, founded by one of the authors of the original Transformer paper, Noam Shazeer.
Making PowerPoint decks is as close as many people get to being creative at work, but new generative AI apps like Tome make it easy to design beautiful presentations that bring your ideas to life with only text prompts. Another take on work productivity comes from Adept, which has built an action model, ACT-1, that's trained on how people interact with their computers. Its goal is to eventually automate some of the searching, clicking and scrolling you have to do now to get tasks done.
Predictive AI Applications: Another way to harness AI's predictive power is to detect anomalies and then find ways to mitigate them. Abnormal Security, for instance, analyzes a company's cloud email environment to identify phishing attempts and other threats and remove malicious emails. On the medical front, Viz.ai quickly surfaces patient imaging that should be reviewed by a specialist, and coordinates the care team to improve outcomes for patients with stroke and other time-sensitive conditions.
On AI’s horizon
By the time next year’s list rolls out, we believe generative AI and LLMs will still be dominant. But the landscape is changing quickly and there are big opportunities for companies that can move with it. Here are three things to look out for in the coming year:
- The infrastructure layer is very "fat," with the biggest companies currently in the space offering models and cloud services. This will change as companies building applications learn how to capture value.
- LLM usage will mature, with some companies strongly favoring buying AI models from cloud APIs and others just as passionately wanting to build their own. Many have predicted that startups will graduate from APIs to their own smaller, more efficient models as they grow. Companies with large and unique data stores will see clear advantages in training their own models as a competitive moat. Bloomberg's recent announcement of its bespoke LLM, which is focused on financial language processing, is a great case in point.
- The fast but far-sighted will survive as this wave of AI unleashes massive societal changes. The ability to adapt, pivot and take advantage of unforeseen opportunities will be key. Because this technology has huge potential to transform work, the only constant will be change.
Generative AI has turned many assumptions on their head. In the early days we thought AI would replace manual work, but robotics turned out to be harder than some parts of cognitive knowledge work. And, equally surprising, the approximate nature of generative models makes them better than expected at creative work and less than completely trustworthy on rote, mechanical tasks.
Achieving artificial general intelligence, that futuristic self-learning system that some fear could threaten humanity, is still a moving target. But there can be no doubt that the progress of large language models over the past year has been transformative—and their applications increasingly general. We’re seeing new use cases every day that demonstrate how AI will change the way we work, create and play.