Celestial AI raises $100M to transfer data using light-based interconnects
David Lazovsky and Preet Virk, technologists with backgrounds in semiconductor engineering and photonics, came to the joint realization several years ago that AI and machine learning workloads would quickly encounter a “data movement” problem. Increasingly, they predicted, it would become challenging to move data to and from compute hardware as AI models scaled past what could fit in the memory of any single chip.
Their solution, architected by Phil Winterbottom, previously a researcher at Bell Labs, was an optical interconnect technology for compute-to-compute, compute-to-memory and on-chip data transmission. Along with Winterbottom, Lazovsky and Virk founded a startup, Celestial AI, to commercialize the tech. And now, that startup is attracting big backers.
Celestial AI today announced that it raised $100 million in a Series B round led by IAG Capital Partners, Koch Disruptive Technologies and Temasek’s Xora Innovation fund. The tranche, which brings Celestial AI’s total raised to more than $165 million, will be used to support the production of Celestial’s photonics platform by expanding the company’s engineering, sales and technical marketing departments, according to CEO Lazovsky.
Celestial has around 100 employees at present — a number that Lazovsky expects will grow to 130 by the end of the year.
“Today, compute and memory are closely coupled. The only way to add more high bandwidth memory is to add more compute, whether the additional compute is required or not,” Lazovsky told TechCrunch via email. “Celestial’s tech enables memory disaggregation.”
In a data center, memory is often one of the most expensive resources — in part because it’s not always used efficiently. Because memory is tied to compute, it’s challenging — and sometimes impossible, due to bandwidth constraints and sky-high latency — for operators to “disaggregate” and pool the memory across hardware within the data center.
According to an internal Microsoft study, up to 25% of memory in Azure is “stranded,” or left over, after the servers’ cores have been rented to virtual machines. Reducing this stranded memory could cut data center costs by 4% to 5%, the company estimated — potentially significant savings in the context of a multibillion-dollar operation.
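For a rough feel of that math, here is a back-of-the-envelope sketch in Python. The 25% stranded-memory figure comes from the study above; the fleet size and the share of data center cost attributed to memory are illustrative assumptions chosen for the example, not figures from Microsoft or Celestial.

```python
# Illustrative arithmetic only: the 25% stranded-memory figure is from the
# Microsoft study cited above; the fleet size and memory cost share below
# are assumptions for the example.
total_memory_tb = 1_000            # assumed memory footprint of a server fleet
stranded_fraction = 0.25           # up to 25% stranded, per the study
memory_share_of_dc_cost = 0.18     # assumed share of data center cost tied to memory

stranded_tb = total_memory_tb * stranded_fraction
cost_reduction = stranded_fraction * memory_share_of_dc_cost

print(f"Stranded memory: {stranded_tb:.0f} TB")
print(f"Implied cost reduction: {cost_reduction:.1%}")  # ~4.5%, in line with the 4%-5% estimate
```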
Celestial — which began as a portfolio company of The Engine, the VC firm spun out of MIT in 2016 — developed an ostensible solution in its photonics-based architecture, which scales across multiple-chip systems. Using light to transfer data, Celestial’s tech can beam information both within chips and chip-to-chip, making both memory and compute available for AI — and other — workloads.
Celestial also claims that its tech can reduce the amount of electricity necessary for data movement, indirectly boosting a chip’s performance. Typically, chips devote a portion of the electricity they draw to data movement between their circuits, which takes away from the electricity that the chip can direct to computing tasks. Celestial’s photonics reduce the power required for data movement, allowing a chip to — at least in theory — increase its compute power.
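To see why that matters, here is a minimal sketch of the power-budget argument, assuming a fixed chip power envelope; every number in it is illustrative, not a Celestial specification.

```python
# Rough sketch of the power-budget argument: if data movement consumes a smaller
# share of a fixed chip power envelope, more of that envelope is left for compute.
# All numbers below are illustrative assumptions, not Celestial specifications.
chip_power_w = 700                 # assumed total chip power envelope
electrical_movement_share = 0.30   # assumed share spent shuttling data electrically
photonic_movement_share = 0.10     # assumed share with optical data movement

compute_budget_electrical = chip_power_w * (1 - electrical_movement_share)
compute_budget_photonic = chip_power_w * (1 - photonic_movement_share)

print(f"Compute budget with electrical I/O: {compute_budget_electrical:.0f} W")
print(f"Compute budget with optical I/O:    {compute_budget_photonic:.0f} W")
```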
Celestial’s photonics tech, which is compatible with most industry interconnect standards (e.g. CXL, PCIe), delivers 25x higher bandwidth and 10x lower latency and power consumption than optical alternatives, the company asserts.
“With the growth in AI, especially large language models (LLMs) and recommendation engine workloads, there is a shift towards accelerated compute,” Lazovsky said. “The key problem going forward is memory capacity, memory bandwidth and data movement — i.e. chip-to-chip interconnectivity — which is what we are addressing with Celestial’s photonic fabric.”
Celestial is offering its interconnect product through a licensing program, and says that it’s engaged with several “tier-one” customers, including hyperscalers and processor and memory companies.
The interconnect product appears to be priority number one for Celestial. The company also sells its own AI accelerator chip, dubbed Orion, built on its photonics architecture. But as investors told TechCrunch in a recent piece for TC+, AI photonics chips have yet to overcome the engineering challenges that would make them practical at scale. Unless Celestial has stumbled upon breakthroughs in digital-to-analog conversion and signal regeneration, the top stumbling blocks for today’s photonics chips, it’s unlikely that Orion is much further along than the competition.
Chip aside, Celestial has a number of competitors in a photonic integrated circuit market that could be worth $26.42 billion by 2027.
Ayar Labs, which makes chip solutions based on optical networking principles, has raised over $200 million in venture capital since its founding in 2015. Ranovus, another rival, recently landed a $73.9 million investment.
There could be consolidation ahead in the broader optical interconnection space, though. Around three years ago, Marvell bought Inphi, an optical networking specialist, for $10 billion. After a period of quiet, Microsoft last year acquired Lumenisity, a startup developing high-speed optical cables for data center and carrier networks.
Inphi and Lumenisity were targeting different use cases with their tech. But Big Tech’s enthusiasm around optics and photonics is worth noting.
Samsung Catalyst, Smart Global Holdings, Porsche Automobil Holding SE, The Engine Fund, imec.xpand, M Ventures and Tyche Partners also participated in Celestial’s Series B.