The explosive growth of artificial intelligence is entering a critical new phase, shifting the investment spotlight from processing power to data storage. As AI models grow exponentially in size and complexity, the industry is hitting a memory bottleneck, making the hardware that stores and retrieves data the next major frontier. This has catalyzed a "memory supercycle," elevating companies that produce specialized high-bandwidth memory (HBM) and NAND flash from cyclical commodity players to essential pillars of the AI infrastructure. Analysts argue that the demand for memory in AI servers and data centers is creating an unprecedented, sustained boom, fundamentally altering the growth trajectory and valuation models for key firms in the sector.
At the forefront of this shift is Micron Technology, which has transformed into a cornerstone of the AI server stack. The primary catalyst is its execution in the high-margin HBM market, a specialized DRAM variant crucial for training AI models. Micron projects the total addressable market for HBM to reach $100 billion by 2028, representing a 40% compound annual growth rate. The complexity of HBM production is so high that it is consuming capacity traditionally used for consumer electronics, granting Micron significant pricing power and wider margins. Despite a 240% stock surge, its valuation remains at a steep discount compared to the broader market and peers like Nvidia, leading some analysts to view it as a uniquely undervalued play on the AI revolution.
While Micron is a U.S. favorite, many analysts see South Korea’s SK Hynix as the current undisputed leader in the memory boom. As the primary HBM supplier to Nvidia, it commands roughly 60% market share. Its advanced technological lead, however, presents a double-edged sword: severe capacity constraints. The bull case hinges on its ability to meet explosive demand for the next-generation HBM4, with UBS forecasting its share could grow to 70% by 2026 as it integrates with Nvidia’s upcoming Rubin platform. The key risk is that any failure to ramp production could cede ground to competitors in the critical 2026 timeframe.
Beyond the HBM giants, the memory narrative is expanding to include long-term storage. A surprise standout has been Sandisk, which soared over 800% following its spin-off from Western Digital. While AI discussions often focus on DRAM (short-term memory), Sandisk is a leader in NAND flash (long-term storage). This technology is becoming vital for "AI at the edge," powering autonomous devices like robots and cars that must process and store vast amounts of data locally. This trend underscores that the AI memory boom is not a single-market phenomenon but a comprehensive wave impacting all layers of the data storage hierarchy, from cutting-edge data centers to next-generation consumer devices.
This article was produced with Neural News AI (V1).