As generative AI startups race to court investors and Big Tech players make aggressive competitive moves, both parties are running into the same problem: extremely high computing demands coupled with sky-high energy consumption.
Celestial AI, which develops photonics-based chips and technology frameworks, just raised a $100 million Series B to address these challenges. The round was led by IAG Capital Partners, Koch Disruptive Technologies and Temasek's Xora Innovation fund. Samsung Catalyst and automaker Porsche also joined, among others.
The startup is the latest to capitalize on the generative AI wave and the need for massive computing power to train and run large language models like OpenAI's GPT-4 and Google's PaLM 2. Celestial AI also licenses its models and frameworks to clients so that they can develop their own photonics-based semiconductors.
"The demands for AI [are] having a ripple effect through the infrastructure that these workloads are running on," said Dave Lazovsky, the founder and CEO of Celestial AI. "The level of adoption and momentum driven for AI is unprecedented."
Photonics, which uses lasers and light to transmit data rather than electrical signals over traditional silicon-based wiring, has been lauded as a safer, faster and more energy-efficient way to move and process data. And investors are taking note: Just last month Boston-based photonics startup Lightmatter raised $154 million and tripled its valuation to $720 million.
AI's new energy demands are forcing the semiconductor industry to adopt different benchmarks, shifting the focus from raw compute performance to data transmission speed and energy efficiency, according to Lazovsky. And these new metrics position startups like Celestial AI, Ayar Labs and Lightmatter for growth.
"It's a whole different game," Lazovsky said.
Other photonic computing startups have closed sizable rounds. Ayar Labs raised $130 million in 2022 from backers like Intel and Nvidia. PsiQuantum raised $450 million in a 2021 round led by BlackRock for its photonics-based approach to quantum computing.
While photonics is more than 60 years old, it has recently attracted renewed attention as startups and Big Tech players hit a wall with what traditional chips can accomplish, in part because of the extreme energy demands of training these models.
OpenAI used about 1.287 gigawatt-hours of electricity to train GPT-3, The Information reported, more than the average American consumes in a century. And the company's losses reportedly doubled to around $540 million as it developed GPT-4. Chipmaker Nvidia, whose chips were previously popular for crypto mining, briefly hit a $1 trillion market cap as its hardware gained traction powering generative AI applications.
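For scale, a rough back-of-the-envelope check of that comparison (a sketch, assuming a US per-capita electricity figure of about 12,000 kWh per year, which is not a number from this article):

```python
# Back-of-the-envelope check: does 1.287 GWh come to roughly a century
# of one American's electricity use? The per-capita figure below is an
# assumption for illustration, not a number reported in the article.

gpt3_training_kwh = 1.287e6        # 1.287 GWh, converted to kWh
per_capita_kwh_per_year = 12_000   # assumed average US per-capita usage

years_of_consumption = gpt3_training_kwh / per_capita_kwh_per_year
print(f"~{years_of_consumption:.0f} years")  # ~107 years, on the order of a century
```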
Alphabet and Amazon have been creating AI-centric chips for their data centers, and Meta is said to be developing a competing chip.
This AI-driven shift in processing is only the beginning, according to Lazovsky.
"Behind all of the AI workloads in development right now is massive amounts of infrastructure," he said. "Our investors are excited since we're just at the beginning of a very long journey with artificial intelligence."