I took a deep dive with Covariant co-founder and CEO Peter Chen at ProMat the other week. The timing was either perfect or terrible depending on who you ask. I’m sure the startup’s comms people are thrilled that I’m writing a follow-up a week later on the occasion of a new funding round.
Regardless, $75 million is a hard number to ignore, especially as the Series C extension brings the AI firm's total raise to $222 million. Existing investors Radical Ventures and Index Ventures led the round, joined by returning investors Canada Pension Plan Investment Board and Amplify Partners, along with new backers Gates Frontier Holdings, AIX Ventures and Northgate Capital. The extension follows the $80 million Series C announced in July 2021.
“This investment allows us to further develop the Covariant Brain as a more capable foundation model and to apply the Covariant Brain to even more use cases across a wide variety of sectors,” Chen tells TechCrunch. “With eCommerce demand continuing to grow and supply chain resilience becoming more important, we’ve made tremendous progress with global retailers and logistics providers, and are looking forward to revealing more details on these partnerships soon.”
I was also treated to a demo of Covariant's tech at the show last month. It's quite impressive once you understand what's going on under the hood to power the picking and placing. At the heart of the logistics play is the Covariant Brain, which builds a massive database of potential package sizes, shapes and materials based on real-world picks.
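To make that data flywheel concrete, here's a minimal sketch, assuming hypothetical names like PickRecord and PickDataset rather than anything Covariant has published: each production pick gets logged with the attributes the company describes, growing the shared dataset the Brain learns from.

```python
from dataclasses import dataclass, field

@dataclass
class PickRecord:
    # Attributes Covariant says the Brain learns from real-world picks
    # (field names here are illustrative, not Covariant's actual schema).
    size_mm: tuple[float, float, float]
    shape: str      # e.g. "box", "polybag", "cylinder"
    material: str   # e.g. "cardboard", "plastic film"
    success: bool

@dataclass
class PickDataset:
    records: list[PickRecord] = field(default_factory=list)

    def log(self, record: PickRecord) -> None:
        # Every deployed robot contributes its picks to one shared
        # dataset, which is then used to retrain the shared model.
        self.records.append(record)

dataset = PickDataset()
dataset.log(PickRecord(size_mm=(300.0, 200.0, 150.0), shape="box",
                       material="cardboard", success=True))
print(len(dataset.records))
```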
In our conversation, Chen used generative AI/ChatGPT as an analogy. It's more than a tenuous link to the latest hype cycle, however: three of the team's four co-founders have a direct connection to OpenAI.
Chen says:
Before the recent ChatGPT, there were a lot of natural language processing AIs out there. Search, translation, sentiment detection, spam detection: there were loads of natural language AIs. The approach before GPT was, for each use case, you train a specific AI for it, using a smaller subset of data. Look at the results now, and GPT basically abolishes the field of translation, and it's not even trained for translation. The foundation model approach is basically, instead of using small amounts of data that's specific to one situation or training a model that's specific to one circumstance, let's train a large generalized foundation model on a lot more data, so the AI is more generalized.
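To make the contrast in Chen's analogy concrete, here's a minimal toy sketch in Python (all names are hypothetical; neither Covariant nor OpenAI has published code like this). The old approach trains one narrow model per task on its own small dataset; the foundation approach trains one model on a far broader corpus and applies it across tasks.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    training_data: list[str]

    def run(self, prompt: str) -> str:
        # A real model would do inference here; this toy just
        # echoes which model handled the request.
        return f"[{self.name}] handling: {prompt}"

# Pre-GPT approach: one narrow model per use case, each trained
# only on its own small, task-specific dataset.
translator = Model("translation", training_data=["parallel sentence pairs"])
sentiment  = Model("sentiment", training_data=["labeled reviews"])
spam       = Model("spam", training_data=["labeled emails"])

# Foundation-model approach: one large model trained on far more
# data, then applied across all of those tasks without per-task training.
foundation = Model("foundation", training_data=[
    "web text", "parallel sentence pairs", "labeled reviews", "labeled emails",
])

for prompt in ("translate: bonjour", "sentiment: great product", "spam?: win $$$"):
    print(foundation.run(prompt))
```

The same logic underpins Covariant's pitch: one generalized model for picking, rather than a separate model per warehouse or per item type.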
The funding will be used to further the deployment of Covariant’s system to retailers and logistics firms.