Marvell Develops Industry's First 2nm Custom SRAM for Next-Generation AI Infrastructure Silicon | MRVL Stock News

Jun 17, 2025
  • Marvell Technology (MRVL) unveils the industry's first 2nm custom SRAM, boosting AI infrastructure performance.
  • The new SRAM delivers up to 6 gigabits of memory, saves up to 15% of die area, and cuts standby power by up to 66%.
  • This innovation is part of Marvell's broader strategy to enhance AI and cloud infrastructure.

Marvell Technology, Inc. (MRVL), a leader in data infrastructure semiconductor solutions, has announced the development of the industry's first 2nm custom Static Random Access Memory (SRAM). The breakthrough is aimed at significantly enhancing the performance of custom XPUs and other devices used in cloud data centers and AI clusters. The new SRAM delivers up to 6 gigabits of high-speed memory operating at 3.75 GHz.

A key benefit of the 2nm custom SRAM is a reduction in total die area of up to 15%, letting chip designers use the reclaimed silicon to integrate more compute cores, expand memory, or shrink device size and cost. Marvell also says the SRAM achieves the industry's highest bandwidth per square millimeter and reduces on-chip memory standby power by up to 66%.

This development is a central piece of Marvell's broader custom technology platform strategy, which already includes innovations such as CXL technology and custom HBM aimed at improving memory-hierarchy performance in accelerated infrastructure. It also reflects the industry's transition beyond Moore's Law, as companies turn to custom silicon to optimize performance and power efficiency rather than relying on traditional transistor scaling.

Marvell’s comprehensive approach to enhancing memory performance at multiple levels—on-die, inside chip packages, and at the system level—is poised to transform the performance and economics of AI infrastructure. This positions Marvell well in the custom silicon era, where the focus is on workload-specific optimizations.

"Custom is the future of AI infrastructure," said Will Chu, senior vice president of Custom Cloud Solutions at Marvell. "The methodologies and technologies used by hyperscalers today to develop cutting-edge custom XPUs will percolate to more customers, more classes of devices, and more applications."

Disclosures

I/We may personally own shares in some of the companies mentioned above. However, those positions are not material to either the company or to my/our portfolios.