46% of Nvidia’s revenue last quarter came from four mystery clients

Nvidia’s incredible growth is increasingly dependent on a handful of customers.

At the beginning of 2023, Nvidia's (NVDA 0.52%) market capitalization was $360 billion. Less than two years later, it is worth more than $3.4 trillion. Although the company supplies graphics processing units (GPUs) for personal computers and even cars, the data center segment has been the main source of its growth during this period.

Nvidia's data center GPUs are the most powerful in the industry for developing and deploying artificial intelligence (AI) models, and the company is struggling to meet demand from AI startups and the world's biggest tech giants. While that is great news, there are potential risks lurking beneath the surface.

Nvidia's financial results for the second quarter of fiscal 2025 (which ended July 28) showed that the company is increasingly relying on a small group of customers to drive sales. Here's why that could create vulnerabilities in the future.

Owning GPUs is a game for rich companies

According to a study by McKinsey & Company, 72% of organizations worldwide use AI in at least one business function. That number continues to grow, but most companies don't have the financial resources (or expertise) to build their own AI infrastructure. After all, a single one of Nvidia's leading data center GPUs can cost up to $40,000, and it often takes thousands of them to train an AI model.

Instead, tech giants such as Microsoft (MSFT 1.26%), Amazon (AMZN 1.29%), and Alphabet (GOOG 1.66%) (GOOGL 1.77%) buy hundreds of thousands of GPUs and cluster them in centralized data centers. Enterprises can rent this computing power to implement AI in their operations at a fraction of the cost of building their own infrastructure.

Cloud companies such as DigitalOcean now make AI accessible to even the smallest businesses using the same strategy. DigitalOcean lets developers access clusters of between one and eight Nvidia H100 GPUs, which is enough for most basic AI workloads.

Availability is improving, too. Nvidia's new Blackwell-based GB200 systems can perform AI inference 30 times faster than the older H100 systems. Each individual GB200 GPU is expected to sell for between $30,000 and $40,000, roughly the same price as the H100 when it first launched, so Blackwell offers an incredible improvement in cost efficiency.
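Those figures imply a simple back-of-the-envelope calculation; here is a minimal sketch in Python, using the article's approximate speedup and price range (these are not official Nvidia specs):

```python
# Rough performance-per-dollar comparison using the article's figures.
# These numbers are approximations, not official Nvidia pricing or benchmarks.
h100_price = 40_000   # approximate H100 launch price, USD
gb200_price = 40_000  # top of the quoted $30,000-$40,000 GB200 range, USD
speedup = 30          # claimed GB200 AI inference throughput vs. the H100

# Same price, 30x the throughput -> roughly 30x the inference per dollar.
perf_per_dollar_gain = speedup * h100_price / gb200_price
print(perf_per_dollar_gain)  # 30.0
```

In other words, even at the top of the rumored price range, the chip's cost per unit of inference falls by roughly a factor of 30.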

This means the most advanced large language models (LLMs), those with upward of a trillion parameters, which were previously developed only by well-resourced tech giants and leading AI startups such as OpenAI and Anthropic, will become financially accessible to a far wider pool of developers. However, it could be years before GPU prices fall enough for the average business to support its own AI infrastructure.

The risk for Nvidia

Because only a small number of tech giants and leading AI startups are buying most of the AI GPUs, Nvidia's sales are extremely concentrated at the moment.

In the second quarter of fiscal 2025, the company’s total revenue was $30 billion, up 122% year-over-year. The data center segment contributed $26.3 billion of this revenue, a figure that grew by a whopping 154%.

According to Nvidia's 10-Q filing for the second quarter, four customers (which it doesn't identify) accounted for nearly half of its $30 billion in revenue:

| Customer | Share of Nvidia's second-quarter revenue |
| --- | --- |
| Customer A | 14% |
| Customer B | 11% |
| Customer C | 11% |
| Customer D | 10% |

Data source: Nvidia.

Nvidia only highlights customers that account for 10% or more of its revenue, so it’s possible that there were other significant customers for its GPUs that didn’t meet the reporting threshold.
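As a quick sanity check, the four disclosed shares sum to the figure in the headline; a minimal sketch in Python using the numbers above:

```python
# Disclosed customer revenue shares from Nvidia's fiscal Q2 2025 10-Q
# (only customers at or above the 10% reporting threshold are listed).
shares_pct = {"Customer A": 14, "Customer B": 11, "Customer C": 11, "Customer D": 10}
quarterly_revenue = 30e9  # approximately $30 billion in total Q2 revenue

combined_pct = sum(shares_pct.values())
dollars_from_four = quarterly_revenue * combined_pct / 100
print(combined_pct)             # 46 -- the "46%" in the headline
print(dollars_from_four / 1e9)  # 13.8 -> roughly $13.8 billion from four buyers
```

So just four buyers accounted for roughly $13.8 billion of the quarter's $30 billion in sales.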

Customers A and B accounted for a combined 25% of the company's revenue in the second quarter, up from 24% in the first quarter of fiscal 2025 just three months earlier. In other words, Nvidia's revenue is becoming more concentrated, not less.

Here's why that can be a problem. Customer A alone spent $7.8 billion with Nvidia over the last two quarters, and only a small number of companies worldwide can sustain that kind of chip and infrastructure spending. If even one or two of Nvidia's largest customers cut their spending, the company could suffer a revenue shortfall that can't easily be replaced.

Nvidia headquarters with a black Nvidia sign in the foreground.

Image source: NVIDIA.

Nvidia's mystery customers

Microsoft is a regular buyer of Nvidia GPUs, and a recent report from a Wall Street analyst suggests the tech giant is currently the biggest customer for Blackwell hardware (which will begin shipping later this year). As a result, I believe Microsoft is Customer A.

Nvidia's other top customers could be some combination of Amazon, Alphabet, Meta Platforms, Oracle, Tesla, and OpenAI. Here's how much some of those companies are spending on AI infrastructure, according to public information:

  • Microsoft committed $55.7 billion in capital expenditures (capex) in fiscal 2024 (which ended June 30), and most of that went to GPUs and data center construction. The company plans to spend even more in fiscal 2025.
  • Amazon expects its capital expenditures to exceed $60 billion in calendar 2024, fueled largely by its growing AI business.
  • Meta Platforms plans to spend up to $40 billion on AI infrastructure in 2024 and more in 2025 to create more advanced versions of its Llama AI models.
  • Alphabet plans to spend about $50 billion on capital expenditures this year.
  • Oracle committed $6.9 billion to artificial intelligence capex in fiscal year 2024 (which ended May 31) and plans to spend double that in fiscal year 2025.
  • Tesla just told investors that its total AI infrastructure spending this year will top $11 billion as it brings 50,000 Nvidia GPUs online to improve its self-driving software.

Based on this information, Nvidia’s revenue stream looks stable for at least the next year. The picture gets a little murkier if we look into the future because we don’t know how long these companies can sustain this level of spending.

Nvidia CEO Jensen Huang estimates that data center operators will spend $1 trillion building AI infrastructure over the next five years. If he's right, the company could continue to grow into the late 2020s. But competition is coming online, which could chip away at its market share.

Advanced Micro Devices launched its own AI data center GPUs last year, and it plans to release a new chip architecture to compete with Blackwell in the second half of 2025. In addition, Microsoft, Amazon, and Alphabet have all developed their own data center chips. While it may take time for that silicon to erode Nvidia's technological lead, it will eventually be more cost-effective for those companies to use their own hardware.

None of this is an immediate cause for concern for Nvidia investors, but they should keep an eye on the company’s earnings concentration in the coming quarters. If it continues to rise, it could pose a higher risk of a sharp decline in sales at some point in the future.

John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool's board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Anthony Di Pizio has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, DigitalOcean, Meta Platforms, Microsoft, Nvidia, Oracle, and Tesla. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.