The Zhitong Finance App learned that Chris Kelly, Facebook's former chief privacy officer, said the next phase of the artificial intelligence (AI) boom will focus on making the technology more efficient to run. Speaking on Monday, Kelly noted that the world's leading AI companies are currently racing to build infrastructure to meet AI computing demands, and argued that the industry must optimize this energy-intensive approach to building out capacity. "The human brain runs on only about 20 watts of energy. We reason and think through problems without needing a gigawatt-scale energy center to support it," he said. "I think unlocking that kind of efficiency will become one of the core areas of exploration for the leading AI companies." He added that the companies that achieve breakthroughs in cutting data center costs will ultimately be the winners in the AI field.
According to S&P Global statistics, major hyperscale cloud computing providers set off a global data center construction boom in 2025, with infrastructure-related transactions in the sector exceeding 61 billion US dollars for the year. OpenAI alone has finalized AI investment plans worth more than 1.4 trillion US dollars over the coming years, including a series of major partnerships with chip giant Nvidia (NVDA.US), infrastructure leader Oracle (ORCL.US), and CoreWeave (CRWV.US).
However, the current wave of data center construction is also drawing growing concern over how to supply sufficient power to these computing facilities, given that electric grids are already under strain. The cooperation plan that Nvidia and OpenAI jointly announced in September of this year involves computing infrastructure requiring at least 10 gigawatts of electricity, roughly equal to the total annual electricity consumption of 8 million American households. According to data from the New York Independent System Operator, 10 gigawatts is also roughly equivalent to New York City's total demand during the summer 2024 electricity-use peak.
After DeepSeek released a free, open-source large language model in December 2024, the AI industry's concerns about costs intensified further. DeepSeek said the model cost less than 6 million US dollars to develop, far less than comparable efforts by its US rivals.
Kelly said he expects to see "a number of Chinese companies emerge," especially after US President Trump recently approved the sale of Nvidia H200 chips to China. He added that open-source models, especially those from China, will give people "a basic level of computing power" and the ability to use generative and agentic AI.