
Sam Altman, others invest $20 million in AI data center energy startup

OpenAI CEO Sam Altman during the OpenAI DevDay event on November 06, 2023 in San Francisco, California.
Photo: Justin Sullivan (Getty Images)

An energy startup aimed at cleaning up artificial intelligence’s massive use of electricity is getting a multimillion-dollar boost from investors, including the AI industry’s top leader.

Exowatt, a startup developing modules that store energy as heat and produce electricity for AI data centers, is counting OpenAI chief executive Sam Altman and venture capital firm Andreessen Horowitz among its investors — to the tune of $20 million, the Wall Street Journal reported.

A solar solution to AI’s energy use

Exowatt develops shipping-container-sized modules fitted with solar lenses that concentrate the sun’s energy into heat, which warms an inexpensive material and can be stored for up to 24 hours. The modules then generate electricity by running the stored heat through a heat engine. The company’s goal is to keep costs down by storing energy as heat.

“You don’t have to go back to fossil fuels to solve the data-center energy problem…That’s counterproductive,” Hannan Parvizian, chief executive of Exowatt, told the Journal.

The company is reportedly focused on using U.S.-made components for its modules, both to steer away from Chinese-made parts and to qualify for generous subsidies under the Inflation Reduction Act. Exowatt reportedly wants to offer electricity for as low as one cent per kilowatt-hour before subsidies, and hopes to deliver its modules later this year.

Tech and climate leaders alike have noted that generative AI consumes a mammoth amount of energy, and that the industry will have to reckon with its insatiable demand for electricity.

“We won’t be able to continue the advancements of AI without addressing power,” Ami Badani, chief marketing officer of semiconductor firm Arm, said at Fortune’s Brainstorm AI conference this month. “ChatGPT requires 15 times more energy than a traditional web search.”

According to Badani, data centers where AI models are trained account for 2% of global electricity consumption. She warned that the technology could consume a quarter of U.S. power by 2030.

Nine out of ten U.S. utility companies named data centers as their top area of customer growth in their first-quarter earnings, Reuters reported.

 
