Data Center Spending Surge: Nvidia, AI Vendors Welcome New Growth Wave

According to the latest forecast from Wall Street banking giant Citigroup, data-center capital expenditures at the four largest U.S. technology giants are expected to grow by at least 40% year-on-year in 2025. These outlays are largely tied to generative artificial intelligence, indicating that demand for the computing power behind AI applications such as ChatGPT remains substantial. Citigroup said this implies the giants' data-center spending is likely to keep expanding significantly on top of an already strong 2024 base. The bank expects the trend to be a powerful catalyst for the shares of NVIDIA (NVDA.US), the undisputed leader in AI GPUs, and of data center interconnect (DCI) technology providers, with NVIDIA's stock, up five sessions in a row, gearing up to challenge new highs.

The four technology giants in Citigroup's research report are the global cloud computing leaders Amazon (AMZN.US), Google (GOOGL.US), and Microsoft (MSFT.US), together with Meta Platforms (META.US), the parent company of Facebook and Instagram. In the report, Citigroup estimates that these four companies' data-center capital expenditures will grow 40% to 50% year-on-year in 2025. The surge in spending is expected to keep international capital flowing into NVIDIA, which occupies a "picks and shovels" position in AI infrastructure, and into data-center networking leaders such as Arista Networks.


Citigroup's analyst team, led by Wall Street star analyst Atif Malik, wrote in a note to investors: "We expect the growth opportunity for AI GPUs and AI networking to expand from connections between individual server systems within a data center to large-scale AI computing platforms linked through DCI across multiple flagship data centers."

With the widespread adoption of AI technology, especially large AI models and applications such as ChatGPT, data centers have become the core facilities for efficiently processing and storing massive amounts of data. The global ChatGPT craze of 2023, the high-profile launch of OpenAI's Sora text-to-video model in 2024, and NVIDIA's string of unparalleled quarters in AI all suggest that human society has been entering the AI era since 2024. Large data centers can be considered the defining infrastructure projects of that era, crucial both for running generative AI applications like ChatGPT and for updating and iterating large AI models such as GPT-4o.

The world's most important cloud service providers, which are also leaders in AI technology, such as Amazon's AWS, Google Cloud Platform, and Microsoft Azure, are accelerating investment in large data centers in North America to meet their clients' growing demand for cloud AI training and inference computing power. That demand is also the core logic behind the roughly 75% surge since 2023 in the value of Digital Realty, one of North America's largest data center REITs.

The Citigroup analysis team led by Malik emphasized that AI adoption is still in its early to middle stages. In particular, the red-hot category of "AI agents" is set to drive a surge in enterprise demand for AI applications, creating strong demand for computing resources and spurring technology giants to significantly expand and build new data centers. The team predicts that NVIDIA's gross margin for the quarter ending in January will temporarily dip to about 72%, and that once Blackwell-architecture AI GPU servers ship in volume and flow into NVIDIA's results, the long-term gross margin will stabilize at around 75% or even higher.

The Citigroup analysis team also weighed the choice between custom AI ASICs and AI GPUs, the two key processors that data-center AI servers rely on for training and inference computing power. It argued that AI GPUs have become the preferred option because of their hardware flexibility and general-purpose adaptability to rapidly evolving AI applications and model iterations. In particular, the combination of top-performing AI GPUs and the CUDA ecosystem barrier forms an exceptionally wide moat that should allow NVIDIA to hold an 80%-90% share of the data-center AI server market over the long term. The team reiterated its "buy" rating on NVIDIA's stock with a target price as high as $150.

DCI technology providers also benefit from the expansion of data center spending, with some providers even being direct partners of NVIDIA. The main DCI stock targets that Citigroup expects to benefit include Arista Networks (ANET.US), a leader in the Ethernet field and a partner of NVIDIA in the data center field, as well as network service leaders Ciena (CIEN.US), Cisco (CSCO.US), and Coherent (COHR.US).

NVIDIA's stock price is challenging new highs! Citigroup bets NVIDIA can soar to $150

Amid high-level fluctuations in global stock markets, the share price of NVIDIA, the AI GPU leader in AI infrastructure, has quietly climbed back toward its all-time high. The stock has risen strongly for five consecutive trading days, touching $133.480 in Tuesday's U.S. session for a cumulative five-day gain of 14%, drawing ever closer to its record high of $140.747.

On Wall Street, "buy the dip" sentiment toward NVIDIA is exceptionally strong. U.S. equity bulls firmly believe the sharp pullback in technology stocks since early August has squeezed out most of the "AI bubble," and that companies able to keep profiting from the AI wave, such as NVIDIA, AMD, TSMC, Advantest, and Broadcom, are poised to enter a new round of "main uptrend" surges. Chips are indispensable core infrastructure for popular generative AI tools like ChatGPT, making these chip stocks among the biggest winners of the AI boom. For NVIDIA in particular, the "CUDA ecosystem barrier plus high-performance data-center AI GPUs" together form an incredibly strong moat.

The CUDA ecosystem barrier can be called NVIDIA's "strongest moat." NVIDIA has spent many years building its position in high-performance computing worldwide, above all through its CUDA computing platform, which has become popular globally and is considered the preferred hardware-software stack for AI training/inference and other high-performance computing workloads. CUDA is a parallel-computing acceleration platform and programming model developed exclusively by NVIDIA; it lets software developers and engineers use NVIDIA GPUs to accelerate general-purpose parallel computing, and it supports only NVIDIA GPUs, with no compatibility with mainstream GPUs from AMD or Intel.
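To illustrate what the CUDA programming model described above actually looks like, here is a minimal, self-contained sketch of a vector-addition kernel, the canonical introductory CUDA example. It is illustrative only (the kernel and variable names are chosen for this sketch, not taken from any NVIDIA sample) and requires an NVIDIA GPU and the `nvcc` compiler to run:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes one element of the output vector. This
// data-parallel style is the core of CUDA's programming model.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;               // one million elements
    size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified memory is accessible from both CPU (host) and GPU (device).
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();             // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);         // expect 3.0 on a working device
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The `<<<blocks, threads>>>` launch syntax and the runtime calls here compile only with NVIDIA's toolchain, which is precisely the lock-in the "ecosystem barrier" refers to: code written this way cannot target AMD or Intel GPUs without a rewrite.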

CUDA is a platform that generative AI applications like ChatGPT heavily rely on, and its importance is on par with the hardware system. It is crucial for the development and deployment of large AI models. With its extremely high level of technical maturity, absolute performance optimization advantages, and extensive ecosystem support, CUDA has become the most commonly used and widely popularized collaborative platform in AI research and commercial deployment.

According to NVIDIA's official website, ordinary CUDA accelerated-computing programming on NVIDIA GPUs, along with the basic toolchain, is free. However, enterprise-scale CUDA applications and support (such as NVIDIA AI Enterprise), or renting NVIDIA computing power on cloud platforms such as Amazon AWS, Google Cloud, and Microsoft Azure, may require subscriptions to advanced CUDA-based microservices for building AI systems, which can incur additional costs.

Building on the deeply entrenched CUDA platform and its high-performance AI GPUs, NVIDIA has recently kept expanding its full-stack software-and-hardware ecosystem. At GTC in March, the company officially launched "NVIDIA NIM," a cloud-native microservice billed per GPU usage time. NIM focuses on inference optimization, aiming to shorten the time to market for generative AI applications built on large models and to simplify their deployment in the cloud, in data centers, and on GPU-accelerated workstations. It lets companies deploy AI applications on NVIDIA AI GPU cloud inference capacity with CUDA-platform acceleration, as NVIDIA seeks to build a full-stack AI application development ecosystem exclusive to its GPUs.

This is also why the well-known Wall Street investment firm Rosenblatt is more optimistic about the growth of NVIDIA's CUDA-centered software revenue than about its AI GPU revenue. Rosenblatt chip analyst Hans Mosesmann raised the firm's 12-month target price on NVIDIA from $140 to an astonishing $200 per share in a research report, the highest target for the stock on Wall Street.

Mosesmann argued that, given the expected boom in NVIDIA's CUDA-centered software business, the chip giant's shares will keep climbing over the next 12 months even after soaring over the past year. In other words, beyond the enormous GPU revenue driven by CUDA's tight coupling with NVIDIA's AI GPUs, and the revenue from enterprise-scale CUDA deployments, software derived from CUDA is itself becoming a major revenue engine for NVIDIA.

Data center network technology providers such as Arista usher in the "Golden Age"

In May 2024, Arista announced a deep partnership with NVIDIA aimed at improving AI data-center efficiency through unified management of AI clusters. The collaboration targets the compute and networking layers of AI clusters, integrating Arista's EOS operating system with NVIDIA SuperNIC network accelerators to provide end-to-end data-center network optimization and monitoring.

Through this technology, Arista's network devices can work in concert with NVIDIA's computing resources (including AI GPUs and NICs), shortening job completion times in AI supercomputing clusters and reducing network-induced latency. The partnership reflects both companies' push to optimize how AI computing resources are allocated across data centers, simplify the management of AI network systems, and improve performance. Arista's leading position in data-center Ethernet and its deep cooperation with NVIDIA are the core logic behind Citi's bullish view on the company.

Arista Networks is a highly influential company in Ethernet, particularly in large data-center networking and cloud servers. It provides high-performance Ethernet switches for data-center and cloud computing environments, featuring high throughput, low latency, and a rich networking feature set to support the demands of large-scale data centers and cloud service providers.

In the AI stock investment boom that has swept Wall Street since 2023, Arista Networks, the Ethernet leader with standout strength in data-center networking and cloud computing servers, has been one of the biggest winners. The global rush by enterprises to build out AI has sharply boosted demand for Ethernet, which is central to data-center network construction, driving Arista's share price to repeated new highs, up more than 230% since 2023.

The Citi analysis team led by Malik wrote in its research report: "In fact, with AI supercomputing clusters expected to reach 300K+ GPUs by 2025, training large models within a single data center site is unsustainable in the long run because of power, regulatory, and other constraints. Hyperscale technology companies are therefore adopting, and planning to further expand, training of large AI models across multiple data centers."

Citi pointed out that Google trained Gemini 1 Ultra in a novel way across multiple data centers, and that OpenAI, Microsoft, and Anthropic are adopting the same approach.

With global technology companies' DCI spending continuing to expand, Citi expects Arista's overall revenue to grow 25% year-on-year in 2025, with most of that growth coming from Ethernet switching in AI data-center networks.

The Citi analysis team significantly raised its 12-month target price on Arista from $385 to $460 and maintained a "Buy" rating. Arista closed at $400.22 on Tuesday in U.S. trading, near its all-time high. At the same time, Citi also maintained a "Buy" rating and a $68 target price on network services leader Ciena.