
Modular closes $30 million seed round to simplify the process of developing AI systems

AI has transformative potential. But if you ask the co-founders of Modular, a startup emerging from stealth today, the software used to develop it is fragmented: fractured into silos and piled with layers of complexity. Big Tech companies have made helpful contributions, like TensorFlow and PyTorch, the AI development frameworks maintained by Google and Facebook, respectively. But these companies, the Modular co-founders posit, favor their own tooling and infrastructure at the expense of AI’s broader progress.

Modular aims to change that. Founded by former Apple and Google engineers and execs, the company today closed a large ($30 million) seed round led by GV (formerly Google Ventures), with participation from Greylock, The Factory and SV Angel, to realize its vision of a streamlined, platform-agnostic way to develop AI systems.

“The industry is struggling to maintain and scale fragmented, custom toolchains that differ across research and production, training and deployment, server and edge,” Modular CEO Chris Lattner told TechCrunch in an email interview. “Many of the world’s largest, non-Big Tech firms naively believe that the open-source community and the open-source infrastructure owned by Google, Meta, and Nvidia will eventually provide this, when their priorities and limitations show otherwise.”

Lattner has an impressive resume, having spearheaded the creation of Swift, the programming language that powers much of the Apple ecosystem. He was previously VP of Tesla’s self-driving division and president of engineering and product at SiFive, which provides intellectual property to chip design companies. During his tenure at Google, Lattner managed and built a range of AI-related products, including TPUs at Google Brain, one of Google’s AI-focused research divisions, and TensorFlow.

Modular’s other co-founder, Tim Davis, is accomplished in his own right, having helped set the vision, strategy and roadmaps for Google machine learning products spanning small research groups to production systems. From 2020 to early 2022, Davis was the product lead for Google machine learning APIs, compilers and runtime infrastructure for server and edge devices.


“The most pressing issue facing companies who aren’t ‘Big Tech’ is how to productionize AI within performance, cost, time, and talent bounds. The opportunity cost of this challenge is enormous. For individual companies, this means innovations not making it to market, inferior product experiences, and ultimately a negative impact on their bottom line,” Lattner said. “AI can change the world, but not until the fragmentation can be healed and the global developer community can focus on solving real problems, not on the infrastructure itself.”

Modular’s solution is a platform that unifies popular AI framework frontends via modular, “composable” common components. Details are a bit murky — it’s early days, Lattner cautioned — but the goal with Modular is to let developers plug in custom hardware to train AI systems, deploy those systems to edge devices or servers and otherwise “seamlessly scale [the systems] across hardware so that deploying the latest AI research into production ‘just works,’” Lattner said.
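Modular has not yet published technical details, so the concept is easiest to picture in the abstract: a single, framework-agnostic entry point that can lower the same model to interchangeable hardware backends. The minimal Python sketch below is purely illustrative of that idea; every name in it (register_backend, compile_model, the toy targets) is invented for this article and is not Modular’s API.

```python
# Hypothetical sketch of a "composable" AI toolchain frontend.
# None of these names come from Modular; they only illustrate the idea of
# one entry point that can target interchangeable hardware backends.

from typing import Callable, Dict

# Registry mapping a target name ("server-gpu", "edge-npu", ...) to a
# backend-specific compilation function.
_BACKENDS: Dict[str, Callable[[dict], str]] = {}


def register_backend(target: str, compiler: Callable[[dict], str]) -> None:
    """Plug a new hardware backend into the shared toolchain."""
    _BACKENDS[target] = compiler


def compile_model(graph: dict, target: str) -> str:
    """Lower the same framework-agnostic graph to whichever target is requested."""
    if target not in _BACKENDS:
        raise ValueError(f"No backend registered for target '{target}'")
    return _BACKENDS[target](graph)


# Two toy backends: the point is that the model definition does not change
# when the deployment target does.
register_backend("server-gpu", lambda g: f"<gpu binary for {g['name']}>")
register_backend("edge-npu", lambda g: f"<npu binary for {g['name']}>")

if __name__ == "__main__":
    graph = {"name": "resnet-ish", "ops": ["conv", "relu", "matmul"]}
    print(compile_model(graph, target="server-gpu"))
    print(compile_model(graph, target="edge-npu"))
```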

Modular stands in contrast to the emerging category of MLOps vendors, which deliver tools for gathering, labeling and transforming the data needed to train AI systems, as well as workflows for authoring, deploying and monitoring AI. MLOps, short for “machine learning operations,” seeks to streamline the AI life cycle by automating and standardizing development workflows, much as DevOps was meant to do for software.

Analytics firm Cognilytica predicts that, driven by the accelerating adoption of AI, the global market for MLOps solutions will be worth $4 billion by 2025, up from $350 million in 2019. In a recent survey, Forrester found that 73% of companies believe MLOps adoption would keep them competitive, while 24% say it would make them an industry leader.

“Modular’s main competition is the mindset that dominates AI software development within Big Tech, and Big Tech itself,” Lattner said. “The reason those companies are successful at deploying AI is that they amass armies of developers, incredibly talented AI tinkerers, and use their vast compute and financial resources to further their own efforts and products — including their own clouds and AI hardware. Despite their incredible contributions to the field, their self-preferencing highlights a deep chasm in AI and places an industry-limiting ceiling on the rest of the world’s ability to use this technology to fight some of our most significant socioeconomic and environmental problems.”

Lattner — without naming names — claims that Modular is already working with “some of the biggest [firms] in tech.” The near-term focus is expanding Modular’s 25-person team and readying the platform for launch in the coming months.

“Changing economic conditions mean that the world’s largest AI companies have spent billions on AI to focus on production — and making money — from AI, rather than tinkering,” Lattner said. “Many of the best and brightest computer scientists — effectively, the 100x engineers within organizations where 10x engineers are the norm — are fighting just to maintain and make these systems work for basic use cases — most of which are focused on revenue optimization projects, not changing the world. To that end, technical decision makers are looking for infrastructure that is more usable, flexible, and performant, streamlining e2e AI development and deployment and enabling AI research to move to production faster. They are really just looking to realize much greater value from AI at lower deployment cost.”
