
FedML raises $11.5 million to combine MLOps tools with a decentralized AI compute network

Interest in AI among enterprises continues to rise, with one recent survey finding that nearly two-thirds of companies plan to increase or maintain their spending on AI and machine learning this year. But these companies often run into blockers when moving AI into production.

A 2020 poll from Rexer Analytics found that only 11% of AI models are always deployed. Elsewhere, one Gartner analyst estimated that close to 85% of big data projects fail.

Inspired to tackle these challenges, Salman Avestimehr, the inaugural director of the USC-Amazon Center on Trustworthy Machine Learning, co-founded a startup to allow companies to train, deploy, monitor and improve AI models on the cloud or edge. Called FedML, it raised $11.5 million in seed funding at a $56.5 million valuation, in a round led by Camford Capital with participation from Road Capital and Finality Capital.

“Many businesses are eager to train or fine-tune custom AI models on company-specific or industry data, so they can use AI to address a range of business needs,” Avestimehr told TechCrunch in an email interview. “Unfortunately, custom AI models are prohibitively expensive to build and maintain due to high data, cloud infrastructure and engineering costs. Moreover, the proprietary data for training custom AI models is often sensitive, regulated or siloed.”

FedML overcomes these barriers, Avestimehr claims, by providing a “collaborative” AI platform that allows companies and developers to work together on AI tasks by sharing data, models and compute resources.

FedML can run any number of custom AI models or models from the open source community. Using the platform, customers can create a group of collaborators and sync AI applications across their devices (e.g. PCs) automatically. Collaborators can add devices to use for AI model training, like servers or even mobile devices, and track the training progress in real time.
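This collaborative training setup is rooted in federated learning, where each device trains on its own data and only model updates are shared with a coordinating server. The sketch below is a minimal, illustrative federated averaging (FedAvg) loop in plain NumPy; it is not FedML’s actual API, and the devices, model and data are hypothetical.

```python
# Illustrative federated averaging (FedAvg) sketch. NOT FedML's API;
# the devices, model and data here are hypothetical.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Each collaborator trains a tiny linear model on its own private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(global_weights, devices):
    """One round: every device trains locally, then the server averages the weights."""
    local_weights = [local_update(global_weights, X, y) for X, y in devices]
    return np.mean(local_weights, axis=0)

# Hypothetical setup: three devices, each holding its own data silo.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    devices.append((X, y))

weights = np.zeros(2)
for _ in range(20):
    weights = federated_round(weights, devices)
print("learned weights:", weights)  # approaches [2.0, -1.0] without pooling raw data
```

The key property is that raw data never leaves a collaborator’s device; only model parameters move, which is what lets siloed or regulated data stay put.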

Recently, FedML rolled out FedLLM, a training pipeline for building “domain-specific” large language models (LLMs) a la OpenAI’s GPT-4 on proprietary data. Compatible with popular LLM libraries such as Hugging Face’s and Microsoft’s DeepSpeed, FedLLM is designed to improve the speed of custom AI development while preserving security and privacy, Avestimehr says. (To be clear, the jury’s out on whether it accomplishes that, exactly.)
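As a rough illustration of what a “domain-specific” fine-tuning pipeline looks like at its core, here is a minimal sketch using Hugging Face Transformers. The base model, file name and hyperparameters are assumptions for the example, not FedLLM’s actual interface; DeepSpeed would typically be wired in through a config passed to the training arguments.

```python
# Minimal causal-LM fine-tuning sketch with Hugging Face Transformers.
# Model choice, corpus file and hyperparameters are illustrative assumptions.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from datasets import load_dataset

model_name = "gpt2"  # stand-in for whatever base LLM a team would fine-tune
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Hypothetical proprietary corpus stored as a plain-text file.
dataset = load_dataset("text", data_files={"train": "company_docs.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="domain-llm", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # in a federated setup, steps like this would run per data silo
```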

In this way, FedML doesn’t differ much from the other MLOps platforms out there (“MLOps” referring to tools for streamlining the process of taking AI models to production and then maintaining and monitoring them). There are Galileo and Arize, as well as Seldon, Qwak and Comet, to name a few. Incumbents like AWS, Microsoft and Google Cloud also offer MLOps tools in some form or another (see SageMaker, Azure Machine Learning, etc.).

But FedML has ambitions beyond developing AI and machine learning model tooling.

The way Avestimehr tells it, the goal is to build a “community” of CPU and GPU resources to host and serve models once they’re ready for deployment. The specifics haven’t been worked out yet, but FedML intends to incentivize users to contribute compute to the platform through tokens or other types of compensation.

Distributed, decentralized compute for AI model serving isn’t a new idea; Gensyn, Run.AI and Petals are among those that have attempted, or are attempting, it. Nevertheless, Avestimehr believes FedML can achieve greater reach and success by combining this compute paradigm with an MLOps suite.

“FedML enables custom AI models by empowering developers and enterprises to build large-scale, proprietary and private LLMs at less cost,” Avestimehr said. “What sets FedML apart is the ability to train, deploy, monitor and improve ML models anywhere and collaborate on the combined data, models and compute — significantly reducing the cost and time to market.”

To his point, FedML, which has a 17-person workforce, has around 10 paying customers, including a “tier one” automotive supplier, and a total of $13.5 million in its war chest, inclusive of the new funding. Avestimehr claims that the platform is being used by more than 3,000 users globally and is performing over 8,500 training jobs across more than 10,000 devices.

“For the data or technical decision-maker, FedML makes custom, affordable AI and large language models a reality,” Avestimehr said, with confidence. “And thanks to its foundation of federated learning technology, its MLOps platform and collaborative AI tools that help developers train, serve and observe the custom models, building custom alternatives is an accessible best practice.”
