Micron Stock Up 120% YTD: What the HBM Memory Leader Plans for 2026
October 16, 2025
Beth Kindig
Lead Tech Analyst
Micron’s stock is up 120% YTD, roughly 3X the YTD gain of AI heavyweight Nvidia. The high-bandwidth memory content that Micron supplies has increased 3.5X across recent GPU generations, fueling a quiet memory boom across DRAM and NAND suppliers.
Memory is typically a cyclical, low-margin, lumpy industry, yet it is seeing a resurgence from AI strong enough to transform commoditized hardware into a secular trend as the AI economy is built out. AI servers use more DRAM and NAND than traditional servers, relying heavily on high-bandwidth memory (HBM) for training and inference.
In fiscal 2025, Micron’s HBM, high-capacity dual in-line memory modules (DIMMs) and low-power (LP) server DRAM revenue reached $10 billion, up more than fivefold from the prior year, while HBM alone reached $2 billion in revenue in Q4. Data center is already proving to be a strong driver of growth for Micron, accounting for 56% of sales in FY25, up from 35% in FY24. On a dollar basis, data center revenue surged 137% YoY to $20.75 billion.
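As a quick back-of-envelope check on those data center figures, the minimal Python sketch below recovers both the 56% and 35% share figures from the stated growth rates; the FY25 revenue total is cited later in this piece, and the FY24 bases are inferences rather than disclosed figures.

```python
# Back-of-envelope check of the data center share math ($B, figures from the text).
fy25_total = 37.38              # FY25 revenue, cited later in this piece
dc_fy25 = 20.75                 # FY25 data center revenue, up 137% YoY
dc_fy24 = dc_fy25 / 2.37        # implied FY24 data center revenue (~$8.75B)
fy24_total = fy25_total / 1.49  # FY25 grew 49% YoY -> implied FY24 total ~$25.1B

print(f"FY25 data center share: {dc_fy25 / fy25_total:.0%}")  # ~56%
print(f"FY24 data center share: {dc_fy24 / fy24_total:.0%}")  # ~35%
```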
Below, we look at how Micron has quietly outperformed some of the biggest players in AI YTD and if its ability to defy the odds can continue.
Micron Delivers 3.5× More Memory per GPU, Powering 34× Larger AI Models
HBM capacity per chip continues to rise with each new GPU generation, as its higher bandwidth, better performance, and lower latency are crucial for increasingly powerful large language models.
For example, we’ve seen a ~3.5x increase in HBM content on Nvidia’s GPUs within roughly three years:
- The H100 featured 80GB of HBM2e content per chip. This chip began shipping in Q4 2022 and ramped in early 2023.
- The H200 featured 141GB of HBM3e content per chip, 1.76x the H100’s capacity, which helped drive 1.4x to 1.9x faster inference on leading AI models.
- The B200 features 180GB of HBM3e content, more than double the H100 and a 28% increase versus the H200. In an 8-GPU server configuration, the B200 boasts 1.44TB of HBM content.
- The B300 boasts 288GB of HBM3e content, a 60% increase versus the B200 and over 3.5x more than the H100. In an 8-GPU server configuration, the B300 has 2.3TB of HBM content. This chip is beginning to ship in Q3-Q4 2025.
Putting Nvidia’s rack-scale solutions, the GB200 and GB300 NVL72, in context shows just how rapidly HBM content is increasing. The GB200 supports up to 13.4TB of HBM content, while the GB300 supports up to 21.7TB of HBM, nearly 34X the 640GB of HBM content in the 8-GPU DGX H100 server.
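These multipliers are easy to verify. Below is a minimal Python sketch using only the per-GPU and rack figures cited above; the 8-GPU and 72-GPU counts are the standard DGX and NVL72 configurations.

```python
# HBM content per Nvidia GPU, in GB, per the figures cited above.
hbm_gb = {"H100": 80, "H200": 141, "B200": 180, "B300": 288}

for gpu, gb in hbm_gb.items():
    print(f"{gpu}: {gb} GB ({gb / hbm_gb['H100']:.2f}x the H100)")  # B300 -> 3.60x

# Rack-scale comparison, using the rack totals cited in the text.
dgx_h100_tb = 8 * hbm_gb["H100"] / 1000  # 8-GPU DGX H100: 0.64 TB of HBM
gb300_nvl72_tb = 21.7                    # GB300 NVL72 total, per the text
print(f"{gb300_nvl72_tb / dgx_h100_tb:.1f}x")  # ~33.9x -> "nearly 34X"
```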
AMD is also showing surging memory requirements, to the tune of ~3.5X across recent generations, as the sketch after this list checks:
- The Instinct MI250 featured 128GB of HBM2e memory.
- The MI350X features 288GB of HBM3e memory, a 125% increase versus the MI250 and on par with Nvidia’s Blackwell Ultra.
- The MI400 series is expected to feature 432GB of HBM4 memory, a 50% increase versus the MI350X and the Blackwell Ultra. In the Helios rack configuration slated for 2026, the MI400 will boast 31.1TB of HBM content, 1.5x more than the GB300 NVL72.
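The same arithmetic applies to AMD’s roadmap. Here is a short sketch, assuming the 72-GPU Helios rack configuration, which matches the 31.1TB figure above:

```python
# AMD Instinct HBM content per GPU, in GB, per the figures cited above.
amd_hbm_gb = {"MI250": 128, "MI350X": 288, "MI400": 432}
print(f"{amd_hbm_gb['MI400'] / amd_hbm_gb['MI250']:.2f}x")  # ~3.38x -> the ~3.5X claim

# Helios rack (2026), assuming 72 MI400 GPUs per rack.
helios_tb = 72 * amd_hbm_gb["MI400"] / 1000  # ~31.1 TB, matching the text
print(f"{helios_tb:.1f} TB, {helios_tb / 21.7:.2f}x the GB300 NVL72")  # ~1.4-1.5x
```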
NVIDIA’s Grace CPU Boosts Demand for Micron’s LPDDR5X Memory
When thinking about Nvidia’s GPU platforms, it is often overlooked that the GB200, GB300 and the older GH200 generations are all paired with Nvidia’s Arm-based Grace CPU (hence the “G” in the nomenclature). The Grace CPU accelerates CPU-to-GPU connections with Nvidia’s NVLink-C2C and helps boost the performance and energy efficiency of AI workloads using its Arm Neoverse V2 cores and LPDDR5X (low-power double data rate 5X) memory.
For example, Micron tested inference performance on Meta’s Llama 3-70B with LPDDR5X memory on the GH200 versus DDR5 on the H100, and found that LPDDR5X delivered 73% less energy consumption with 5x higher throughput and 80% better latency.
The timing and ramp of the GB200 and GB300 throughout the second half of 2025 and into 2026 suggests the new LPDDR5X growth curve is at its strongest. To put this in perspective, shipments of ~30K GB200/GB300 racks in 2025 would require more than one million LPDDR5X modules, with each NVL72 rack featuring 17.3TB, or 36 480GB modules. If rack shipments double to ~60K in 2026, LPDDR5X module demand would double as well.
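The module math behind that estimate is straightforward; a minimal sketch using the rack figures above:

```python
# LPDDR5X module math for NVL72 racks, per the figures above.
modules_per_rack = 36
module_gb = 480
print(modules_per_rack * module_gb / 1000)  # 17.28 TB -> the ~17.3 TB per rack

racks_2025 = 30_000                         # ~30K GB200/GB300 racks in 2025
print(racks_2025 * modules_per_rack)        # 1,080,000 -> "more than one million"
# Doubling to ~60K racks in 2026 would double module demand to ~2.16 million.
```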
Driven by the Blackwell platform, Micron saw revenue from LPDDR5X for servers up 50% QoQ in fiscal Q4 to a record level, though the company did not disclose its exact revenue contribution.
Micron’s Expanding Role in AMD and NVIDIA’s HBM Supply Chain
The HBM market is highly competitive among Micron, Samsung, and SK Hynix, with Micron historically ranking third. However, Micron plays an increasingly important role in Nvidia’s supply chain, and to a lesser extent, AMD’s. Micron is expanding its presence within HBM, stating in Q4 that it has grown its HBM customer base from four customers in Q3 to six.
Micron has a range of products designed into Nvidia’s leading platforms:
- Micron’s HBM3e 8-high 24GB cubes are designed into the HGX B200 and GB200 NVL72 platforms.
- Nvidia’s HGX B300 NVL16 and GB300 NVL72 feature Micron’s HBM3e 12-high 36GB cubes.
- Micron’s LPDDR5X supports Nvidia’s GB300 Superchip, with Micron stating in Q4 that “since NVIDIA's launch of LPDRAM in their GB-product family Micron has been the sole supplier of LPDRAM in the data center.”
While Samsung remains a key HBM supplier for AMD, Micron has collaborated with the Nvidia challenger on the Instinct MI350 GPU family as well as its EPYC CPUs. Micron’s HBM3e 12-high 36GB cubes support the MI350X series, while its 128GB DDR5 RDIMM modules support AMD’s 4th gen EPYC CPUs, providing “up to 22% improved energy efficiency and up to 16% lower latency over competitive 3DS through-silicon via (TSV) products.”
Moving through 2026, the industry is shifting to HBM4 products, where Micron believes it outperforms Samsung and SK Hynix in performance and power efficiency. The chipmaker noted it is in active discussions with customers for HBM4 volumes and expects to sell out of 2026 capacity over the next few months.
This role of supplying both Nvidia and AMD with core memory products, and leading on performance and power in the upcoming HBM generation, positions Micron well for growth in 2026 and 2027.
Micron Reports Strong 46% Revenue Growth Driven by AI Memory Demand
- Surging AI data center demand drives record FQ4 revenue.
- Strong AI-demand for high-performance memory is creating tight supply, which in turn is driving higher DRAM and NAND prices.
- Data center market growth is complemented by improvement in other end markets.
Micron reported record FQ4 revenue of $11.32 billion. The primary driver of last quarter’s record revenue was the company’s DRAM segment, specifically high-bandwidth memory (HBM) products, which benefited from the rapid expansion of AI data centers.
Revenue growth accelerated 9.4 percentage points sequentially to 46% YoY, while sequential growth of 21.7% QoQ marked a solid 6.2-point acceleration. Micron guided for a fresh record in FQ1 of $12.5 billion at the midpoint, pointing to 43.5% YoY growth and roughly 10.5% sequential growth. Analysts expect revenue growth to accelerate to 60.6% in FQ2.
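For readers retracing the guidance math, here is a minimal sketch using the figures above; the year-ago FQ1 base is implied by the 43.5% growth rate rather than restated by the company here.

```python
# FQ1 guidance math, per the figures above ($B).
fq4_rev = 11.32   # record FQ4 revenue
fq1_guide = 12.5  # FQ1 guidance at the midpoint

print(f"{fq1_guide / fq4_rev - 1:.1%} QoQ")  # ~10.4%, in line with the text
print(f"${fq1_guide / 1.435:.2f}B")          # ~$8.71B implied year-ago FQ1 base
```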
FQ4 DRAM revenue grew 69% YoY and 27% QoQ to $8.98 billion, a second consecutive quarter of strong sequential growth, and accounted for 79% of total revenue. Growth was driven by bit shipments up in the low-teens percent sequentially, while prices increased in the low double-digit percent range.
FQ4 NAND revenue was down 5% YoY but up 5% sequentially to $2.25 billion. NAND bit shipments declined in the mid-single digits, while prices increased in the high single-digit percent range sequentially due to a favorable mix and tight supply.
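A quick mix check ties these segment figures back to the quarterly total:

```python
# FQ4 segment mix check, per the figures above ($B).
total, dram, nand = 11.32, 8.98, 2.25

print(f"DRAM share: {dram / total:.0%}")    # ~79% of total revenue
print(f"DRAM + NAND: ${dram + nand:.2f}B")  # $11.23B; the small remainder is other revenue
```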
Micron also benefits from other end markets such as smartphones and PCs, represented by the mobile and client business unit (MCBU), which rose 5% YoY and 16% sequentially to $3.76 billion.
For FY25, revenue rose 49% YoY to $37.38 billion, driven primarily by DRAM and HBM revenue, which rose more than 62% YoY to $28.58 billion. HBM reached an annualized run rate of $8 billion in FQ4, with HBM share expected to grow again in FQ1 and HBM4 capacity for calendar 2026, currently in customer discussions, expected to sell out soon.
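The annual figures check out the same way; a short sketch, with the $8 billion run rate derived from the $2 billion FQ4 HBM revenue cited earlier:

```python
# FY25 annual figures, per the text ($B).
fy25 = 37.38

print(f"Implied FY24: ${fy25 / 1.49:.1f}B")         # ~$25.1B, given 49% YoY growth
print(f"DRAM share of FY25: {28.58 / fy25:.0%}")    # ~76% of revenue
print(f"HBM annualized run rate: ${2.0 * 4:.0f}B")  # $2B FQ4 HBM x 4 quarters
```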
Micron has not provided a full-year revenue guide, but current consensus estimates call for 43% growth to $53.5 billion in revenue.
Micron Leads the AI Memory Supercycle with Record Data Center Growth
The AI-driven demand for memory, especially HBM and high-performance DRAM, is still in the early stages of a multiyear growth cycle. CEO and Chairman Sanjay Mehrotra said on the recent earnings call, “Memory is very much at the heart of this AI revolution. This means a tremendous opportunity for memory and certainly a tremendous opportunity for HBM.”
In fiscal 2025, Micron’s data center business reached a record 56% of company revenue, with growth primarily driven by DRAM products and aided by data center SSDs and NAND components. Overall, data center revenue increased 137% YoY to $20.75 billion.
Micron’s Cloud Memory Business Unit (CMBU), which consists of its HBM, high-capacity dual in-line memory modules (DIMMs), and low-power server DRAM solutions, saw revenue surge to ...