
Micron's HBM Revenue Surges Past $1 Billion, Fueled by AI Demand

TradingKeyApr 7, 2025 1:55 AM
  • Micron’s Q2 FY25 revenue hit $8.1B, with HBM sales surpassing $1B, growing 50% sequentially, reinforcing its AI memory leadership.
  • DRAM accounted for 76% of total sales at $6.1B, up 47% YoY, while NAND reached $1.9B, reflecting shifting memory market dynamics.
  • Micron’s 1-gamma DRAM leads in efficiency, reducing energy use by 20%, boosting density by 30%, and outpacing SK Hynix’s power performance.
  • With the Nvidia partnership secured and HBM4 sampling for 2026, Micron’s AI-driven structural margin expansion justifies a rerating in valuation.

TradingKey - In the crowded field of semiconductor giants, Micron Technology (NASDAQ: MU) has quietly emerged as one of the most crucial suppliers to the generative AI revolution. While Nvidia dominates headlines with its GPU dominance, Micron has made the crucial play of providing the memory bandwidth required to enable next-generation compute in the first place. Against narratives that treat memory as commoditized and cyclical, Micron's shift toward High Bandwidth Memory (HBM) and low-power DRAM in data centers positions it as a high-margin, structurally favored player. With a record $1 billion in HBM revenue in Q2 FY25, Micron is making a high-stakes transition from commodity provider to AI infrastructure kingmaker.

Institutional investors should understand that this shift is not simply the result of tighter supply or higher prices; it is fueled by deliberate capital discipline, IP-intensive innovation, and geopolitical positioning within U.S.-aligned semiconductor policy. At the heart of this shift is Micron's ability to marry compute power and memory bandwidth, a capability now indispensable to customers like Nvidia, Microsoft, and Amazon. The market has to a degree priced in AI upside, but we believe Micron remains undervalued relative to the resilience of its earnings power and its potential for valuation multiple expansion. In this note, we discuss the business model shift, competitive strength, and capital efficiency strategy that make Micron a core AI memory infrastructure holding.

chip market

Source: tweaktown.com

Memory Reimagined: Micron’s Business Model Drives the AI Backbone

Micron’s shift from commodity DRAM and NAND supply to high-performance, value-added memory solutions constitutes a fundamental change in its business model. Once a marginal participant in a commoditized cycle, Micron now leads in High Bandwidth Memory (HBM), low-power DRAM, and high-end NAND targeted at AI, data center, and automotive markets. Unlike the traditional commodity cycle, these verticals exhibit structurally expanding demand, driven by inference workloads, energy-efficiency constraints, and the explosive growth of edge and cloud deployments.

Essentially, Micron pursues a vertically integrated memory manufacturing strategy. It owns the complete stack, from wafer production and process node development to module production and custom packaging for end customers. Integration matters especially in the HBM market, where packaging and power efficiency dictate design wins. Notably, Micron alone is shipping low-power DRAM in volume for servers used in artificial intelligence applications, a technical feat that underscores its architectural leadership.

In Q2 FY25, DRAM generated 76% of revenue at $6.1 billion, up 47% year over year. NAND delivered $1.9 billion, up 18% year over year but down 17% sequentially as prices came under pressure. The headlines mask the real story: HBM revenue grew over 50% sequentially to more than $1 billion, with volume shipments ahead of expectations. This marks a fundamental inflection, because demand for HBM grows not linearly but exponentially with GPU cycles.

The integration of Micron’s HBM3E 8H and 12H modules into Nvidia’s GB200 and GB300 platforms places it solidly in the generative AI stack. The upcoming HBM4, now in customer sampling and set to ramp in calendar 2026, will offer a 60% bandwidth improvement over HBM3E. Notably, Micron’s plans to expand capacity at existing facilities, at a new Idaho DRAM fab, and at a Singapore advanced packaging facility offer near-term flexibility and long-term capacity.

Most importantly, Micron’s shift to AI-based memory goes beyond DRAM. Its QLC NAND G8 powers Pure Storage’s high-capacity modules and its G9 SSDs have secured vendor wins with Nvidia. The convergence between DRAM and NAND in AI servers, edge computing, and mobile devices provides a portfolio-level moat that competitors will struggle to replicate. Micron’s ability to optimize memory density, thermals, and energy consumption across segments enhances customer stickiness and gross margins.

MU earnings

Source: Q2-FY25 Deck

Leading the Competition in the AI Arms Race: Micron’s Competitive Position

In the global memory space, Samsung and SK Hynix have long held scale and first-mover advantages in process nodes. But Micron is gradually closing the gap, and in AI-applicable niches, overtaking them. Its 1-gamma DRAM node (its first to use EUV) delivers 20% lower energy consumption, 15% better performance, and over 30% higher bit density relative to 1-beta, beating SK Hynix’s comparable offerings on power characteristics. As energy efficiency becomes a binding constraint on AI server scalability, Micron’s position strengthens with every procurement cycle.

Most notably, Micron’s HBM roadmap has not only caught up with SK Hynix but now matches it in tier-1 design wins. The company has won multi-billion-dollar HBM contracts and is sold out through 2025. With HBM3E 12H in volume ramp and HBM4 already aligned to customer silicon tapeouts, Micron is positioned to match its overall DRAM market share (~24%) in HBM by Q4 2025. SK Hynix holds about 50% of the HBM market today, but capacity constraints and exposure to legacy technology may shift share in Micron’s direction.

On the NAND front, Micron’s QLC architecture and its leadership in automotive embedded memory remain unsung. Micron gained record share in data center SSDs in Q4 2024, with design wins in Nvidia’s GB200 NVL72 server platforms. In automotive, it rolled out the industry’s first LP5x DRAM qualified for automotive use and the first enterprise-grade SSD certified for in-vehicle deployment. These niche markets carry higher ASPs, higher gross margins, and lower price volatility than consumer-grade NAND.

Conversely, Samsung’s NAND leadership comes at the cost of overcapacity in consumer and low-end SKUs, and hence cyclical pricing pressure. SK Hynix leads HBM in volume but trails in power efficiency and in the breadth of its DRAM/NAND portfolio. Western Digital and Kioxia are NAND-centric, lack DRAM exposure, and hence have limited upside in AI-related workloads.

Micron’s real strength lies in its capital discipline and geopolitical positioning. As a U.S.-based producer with CHIPS Act funding, Micron is well placed to be a go-to supplier for hyperscalers that prioritize secure supply chains. The Idaho fab and Singapore packaging line reflect forward-thinking CapEx at a time when peers are deferring investment amid unclear demand visibility.

DRAM market

Source: yolegroup.com

Strategic and Financial Deep Dive: AI Growth, Operating Leverage and Cash Efficiency

Micron’s business enjoys a number of secular tailwinds: AI demand for memory, automotive digitization, and the replacement of HDDs with NAND. The company now expects mid-teens to high-teens growth in calendar 2025 DRAM bit demand and low double-digit growth in NAND demand. These trends reflect not only hyperscaler CapEx (which remains robust) but also the growing deployment of AI in enterprise and edge environments.

In FQ2 2025, Micron posted revenue of $8.1 billion (38% YoY growth), non-GAAP EPS of $1.56, and $3.9 billion in operating cash flow, 49% of revenue. Adjusted free cash flow was $857 million after $3.1 billion in CapEx. Gross margin recovered to 37.9%, a two-quarter high, on better ASPs and a mix shift toward HBM and LP DRAM. Operating income was $2.0 billion, with improving scale economics.

Micron's FY25 CapEx holds firm at ~$14 billion, with spending disciplined toward long-cycle items: advanced packaging, DRAM node ramps, and HBM scaling. Importantly, the company is repurposing idle NAND fabs for higher nodes and expects a structural reduction of more than 10% in NAND wafer capacity by end-FY25. This reinforces not only supply discipline but long-term margin normalization as well.

The best metric to watch here is Micron’s ability to expand gross margins irrespective of volume growth. DRAM bit shipments actually declined sequentially in Q2, yet ASPs rose by mid-single digits and HBM revenue increased 50% QoQ. This divergence between volumes and profitability reflects the scarcity value and pricing power of AI-grade memory.

Also, Micron’s HBM product cycle aligns well with Nvidia’s cadence: HBM3E in GB200, HBM3E 12H in GB300, and HBM4 in GB400 in 2026. As these design wins translate into multi-year shipment volumes, Micron’s revenue base becomes more recurring in nature. For long-term investors, memory-cycle risk is converted into AI-cycle upside.

Valuation Reversal: Micron Is Worth a Premium Multiple

Despite its margin recovery and HBM breakout, Micron still trades at a discount to peers. On forward EV/Revenue, Micron trades at ~2.9x, whereas SK Hynix trades at 4.2x and Nvidia at ~18x. On forward P/E, Micron trades at ~17x, well below the 20–22x deserved by high-margin, growth-oriented semis.

We estimate Micron will earn $34–36 billion in revenue and $6.5–7.2 billion in net income in FY25, based on ASP trends and a recovery in the NAND market. Using a midpoint estimate of $6.85 billion in net income and a 20x P/E multiple, we get a fair market capitalization of $137 billion, or about $125 per share, roughly 35% above the current price.
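The per-share arithmetic behind that figure can be reproduced directly; a minimal sketch, where the net-income midpoint and 20x multiple come from this note and the diluted share count of roughly 1.1 billion is an assumption used only to translate market cap into a per-share value:

```python
# P/E fair-value sketch. Net income midpoint and multiple are from the note;
# the share count is an assumption, not a figure stated in the article.
net_income_midpoint = 6.85e9   # FY25 net income estimate, midpoint ($)
pe_multiple = 20               # target P/E for high-margin growth semis
shares_outstanding = 1.1e9     # assumed diluted share count

fair_market_cap = net_income_midpoint * pe_multiple
fair_price = fair_market_cap / shares_outstanding

print(f"Fair market cap: ${fair_market_cap / 1e9:.0f}B")  # ~$137B
print(f"Fair value per share: ${fair_price:.0f}")         # ~$125
```

The per-share output scales inversely with whatever share count is assumed, so readers should substitute the current diluted count from Micron's filings.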

A DCF calculation using $6.8 billion in FY25 FCF, a 10% WACC, and 5% long-term growth gives an intrinsic value in the $120–130 per-share range, indicating that Micron is undervalued on both a relative and an absolute basis.
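With a single constant growth rate, this reduces to a growing-perpetuity (Gordon growth) valuation; a minimal sketch using the stated inputs, again assuming a diluted share count of ~1.1 billion (not given in the note):

```python
# Single-stage (Gordon growth) DCF sketch using the note's stated inputs.
# The share count is an assumption used only for the per-share translation.
fcf = 6.8e9      # FY25 free cash flow estimate ($)
wacc = 0.10      # discount rate
g = 0.05         # perpetual long-term growth rate
shares = 1.1e9   # assumed diluted share count

# Value of a growing perpetuity: next year's FCF / (WACC - g)
intrinsic_value = fcf * (1 + g) / (wacc - g)
per_share = intrinsic_value / shares

print(f"Intrinsic value: ${intrinsic_value / 1e9:.0f}B")  # ~$143B
print(f"Per share: ${per_share:.0f}")                     # ~$130
```

Under these assumptions the result lands at the upper end of the stated $120–130 range; a slightly higher share count or lower growth assumption pulls it toward the lower end.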

Micron’s competitive advantage, AI memory bandwidth management, warrants a multiple rerating. As investor attention shifts from chip performance to memory efficiency, Micron stands to become a structural core holding in AI infrastructure portfolios.


Source: Q2-FY25 Deck

Risk Factors: Cyclicality, Capacity, and Execution Gaps

The largest threat here remains demand cyclicality, particularly if hyperscaler CapEx dials back or AI inference efficiency improves faster than expected. While HBM demand appears sold out through 2025, pricing pressure may return in the second half of 2026 as competitors bring new capacity online.

Execution risk looms over the HBM4 ramp and the buildout of high-end packaging in Singapore. Delays in fab tooling or yield issues would compress margins. Geopolitical risks, such as China trade tensions or export controls, could also disrupt supply chains. Finally, NAND remains a drag: ASP declines in Q2 (roughly high-teens percent) indicate excess capacity, and a tepid demand recovery will temper margins in spite of managed CapEx.


Source: iiss.org

Conclusion: Micron Is No Longer Just a Memory Stock

Micron is rapidly becoming mission-critical AI-enabling infrastructure. Its leadership in HBM and LP DRAM, its capital discipline, and its design wins with Nvidia and hyperscalers set it apart from traditional memory peers. With structural margin growth, robust FCF generation, and an underappreciated valuation, Micron offers asymmetric upside for long-term investors seeking exposure to the compute-memory AI flywheel.

Reviewed by Tony
Disclaimer: The content of this article solely represents the author's personal opinions and does not reflect the official stance of Tradingkey. It should not be considered as investment advice. The article is intended for reference purposes only, and readers should not base any investment decisions solely on its content. Tradingkey bears no responsibility for any trading outcomes resulting from reliance on this article. Furthermore, Tradingkey cannot guarantee the accuracy of the article's content. Before making any investment decisions, it is advisable to consult an independent financial advisor to fully understand the associated risks.