Micron Explores Stacked GDDR for AI Inference, Targets Prototype by 2027

Key Takeaways

Micron Technology is developing stacked GDDR for AI inference, aiming to deliver a lower-cost alternative to HBM with prototypes expected by 2027. While it offers advantages in cost and scalability, its success will depend on overcoming technical challenges and proving competitive performance against HBM from Samsung Electronics and SK hynix.

Micron Technology is reportedly taking a first-mover approach in developing stacked GDDR, marking a potential shift in the memory landscape for AI. According to ETNews, the company has initiated work on a new architecture that vertically stacks GDDR dies, with process testing expected to begin in the second half of 2026. Early versions are rumored to include around four layers, with initial prototypes possibly arriving as soon as 2027.

Traditionally, GDDR has been designed for graphics-intensive workloads such as video processing and 3D rendering, making it a staple in GPUs and gaming devices. However, its role is now expanding into AI accelerators, particularly in inference workloads. While GDDR cannot match the bandwidth of HBM, its significantly lower cost makes it an attractive alternative for specific use cases where efficiency matters more than peak performance.

By adopting a stacked design similar to HBM, Micron aims to bridge part of the performance gap while increasing memory capacity beyond conventional GDDR. This approach could position stacked GDDR as a middle-ground solution, offering improved performance without the high cost and complexity associated with HBM. Beyond AI, demand is also expected from the high-end gaming GPU segment, which continues to evolve toward higher performance requirements.

According to Global Economic, the key advantage of stacked GDDR lies in its simpler packaging and cost efficiency. In contrast, HBM relies on advanced technologies such as TSVs (through-silicon vias) and complex packaging methods like CoWoS, which create supply constraints and increase production costs. Stacked GDDR, with less demanding manufacturing requirements, could provide a more scalable and cost-effective alternative.

If successful, Micron could secure an early competitive edge in this emerging segment. Reports suggest the company is moving ahead of competitors like Samsung Electronics and SK hynix, betting on strong growth in AI-driven demand.

Notably, if stacked GDDR gains traction in inference workloads, it could challenge the current dominance of HBM. As highlighted by Global Economic, large-scale AI deployment often requires tens of thousands of servers, making cost a critical factor. With pricing estimated at only 5–10% of HBM's price per gigabyte, GDDR-based solutions could become a compelling option for scaling AI infrastructure.
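To illustrate the scale of that cost argument, here is a rough back-of-the-envelope sketch. Only the 5–10% per-GB price ratio comes from the report; the server count, memory capacity, and HBM price below are purely hypothetical placeholders:

```python
# Fleet-scale memory cost comparison (illustrative only).
# The 5-10% price ratio is from the report; everything else is assumed.

servers = 20_000            # "tens of thousands" of inference servers (assumed)
gb_per_server = 512         # memory per server in GB (assumed)
hbm_price_per_gb = 10.0     # hypothetical HBM price in $/GB

hbm_total = servers * gb_per_server * hbm_price_per_gb

for ratio in (0.05, 0.10):  # GDDR priced at 5-10% of HBM per GB
    gddr_total = hbm_total * ratio
    savings = hbm_total - gddr_total
    print(f"at {ratio:.0%} of HBM price: "
          f"GDDR ${gddr_total:,.0f} vs HBM ${hbm_total:,.0f} "
          f"(saves ${savings:,.0f})")
```

Whatever the actual prices turn out to be, the point is that a one-order-of-magnitude per-GB difference compounds across an entire fleet, which is why cost can outweigh peak bandwidth for inference deployments.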

Technical Challenges Remain

Despite its potential, stacked GDDR is still at an early stage of development. ETNews notes that there is no established path to mass production yet, and several technical hurdles remain unresolved.

Key challenges include achieving reliable die stacking, managing power efficiency, and controlling thermal output. In addition, maintaining a competitive cost-to-performance ratio will be crucial. Without clear advantages over HBM in real-world deployments, adoption could remain limited.

In short, while stacked GDDR presents a promising alternative in the AI memory landscape, its commercial success will depend on whether these technical and economic challenges can be effectively addressed.
