As one of the leaders in the artificial intelligence (AI) revolution, Nvidia (NASDAQ: NVDA) is closely watched within the tech industry for insight into what’s next. The company’s graphics processing units (GPUs) are at the heart of AI processing, and being a component supplier can have significant implications for a company’s success or failure in AI.
Nvidia has been a major contributor to the success of Micron Technology (NASDAQ: MU). Unprecedented demand for Nvidia’s GPUs has fueled equally impressive demand for Micron’s high-bandwidth memory (HBM), DRAM, and NAND flash chips, as the company is a major memory supplier to Nvidia. However, a persistent shortage of these data center memory chips — which play a crucial role in AI processing — may be working against Micron. For its flagship Vera Rubin chip, scheduled for release in the second half of 2026, Nvidia is reportedly using memory chips from Micron’s biggest competitors.
SK Hynix and Samsung Electronics (OTC: SSNLF) will reportedly be the sole suppliers of sixth-generation high-bandwidth memory (HBM4) for the Vera Rubin processor, according to a report that first appeared in The Korea Economic Daily. The publication cited sources saying SK Hynix will supply more than half of Nvidia’s total HBM this year, while Samsung will be the leading supplier of HBM4 chips for Vera Rubin.
SK Hynix and Samsung Electronics are the world’s largest suppliers of HBM, with 34% and 33% of the market, respectively, according to Counterpoint Research. Micron controls 26% of the market, coming in third.
This could have serious implications for Micron’s future. Nvidia is the gold standard and market leader in data center GPUs, with an estimated 92% of the market, according to IoT Analytics. Furthermore, the biggest companies in technology — namely Alphabet, Microsoft, Amazon, and Meta Platforms — have announced plans to devote a combined $700 billion to capital expenditures in calendar 2026, with the vast majority of that spending earmarked for AI-centric data centers and servers.
However, the news isn’t all bad. Citi analyst Atif Malik estimates that the cost of some memory chips will skyrocket 171% this year, fueled by persistent data center demand and ongoing shortages. As a result, Micron likely won’t have difficulty finding a market for its other memory chips, even those that Nvidia doesn’t integrate into Vera Rubin. Furthermore, Micron may be tapped later in the production cycle as shipments of Vera Rubin increase, according to Radio Free Mobile independent analyst Richard Windsor.
