Micron has provided an update on its next-gen HBM4 and HBM4E memory, revealing that mass production is expected to begin in 2026.
Micron To Utilize TSMC’s Foundry Services For HBM4’s Logic Base Die, Matching SK Hynix’s Approach
HBM4 is widely seen as the next “holy grail” of the HBM market, mainly because the technology promises cutting-edge performance and efficiency figures, making it a gateway to scaling up AI compute. Micron, alongside the likes of SK Hynix and Samsung, is in the race for HBM4 dominance, and at its latest investor conference, the firm revealed that its HBM4 development is right on track and that “work is already underway” on HBM4E as well, which is exciting to see.
Leveraging the strong foundation and continued investments in proven 1β process technology, we expect Micron’s HBM4 will maintain time to market and power efficiency leadership while boosting performance by over 50% over HBM3E. We expect HBM4 to ramp in high volume for the industry in calendar 2026.
Development work is well underway with multiple customers on HBM4E, which will follow HBM4. HBM4E will introduce a paradigm shift in the memory business by incorporating an option to customize the logic base die for certain customers using an advanced logic foundry manufacturing process from TSMC. We expect this customization capability to drive improved financial performance for Micron.
– Micron
For those unaware, HBM4 is revolutionary in many ways, but one interesting point to note here is that the industry plans to integrate the memory and logic semiconductors into a single package. This removes the need for separate packaging between the two, and since the individual dies sit much closer together with this implementation, it proves considerably more performance- and power-efficient. This is why Micron mentions that it will use TSMC as its “logic semiconductor” supplier, similar to the approach SK Hynix employs.
Micron has also confirmed work on the HBM4E process, becoming one of the first, alongside SK Hynix, to reveal development of the technology. While we are currently uncertain about the specifications of Micron’s HBM4 lineup, the firm did reveal that HBM4 is expected to stack up to 16 DRAM dies, each with a capacity of 32 Gb, along with a 2048-bit wide interface, making the technology far superior to its previous-gen counterpart.
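For a rough sense of what those figures imply, here is a minimal back-of-the-envelope sketch in Python. The 16-high stack and 2048-bit interface come from the article; the 32 Gb per-die density and the 8 Gb/s per-pin data rate are illustrative assumptions, not numbers Micron has confirmed.

```python
# Back-of-the-envelope HBM4 stack math based on the figures above.
# The per-die density and per-pin data rate are assumptions for illustration.

DIES_PER_STACK = 16          # 16-high DRAM stack, per the article
DIE_DENSITY_GBIT = 32        # assumed 32 Gb per DRAM die
INTERFACE_WIDTH_BITS = 2048  # 2048-bit wide interface, per the article
PIN_RATE_GBPS = 8            # assumed per-pin data rate in Gb/s (illustrative)

# Capacity: 16 dies x 32 Gb = 512 Gb = 64 GB per stack
stack_capacity_gb = DIES_PER_STACK * DIE_DENSITY_GBIT / 8

# Peak bandwidth: 2048 pins x 8 Gb/s / 8 bits-per-byte = 2048 GB/s (~2 TB/s)
stack_bandwidth_gbps = INTERFACE_WIDTH_BITS * PIN_RATE_GBPS / 8

print(f"Capacity per stack:  {stack_capacity_gb:.0f} GB")
print(f"Bandwidth per stack: {stack_bandwidth_gbps:.0f} GB/s "
      f"(at an assumed {PIN_RATE_GBPS} Gb/s per pin)")
```

Under these assumptions, a single stack works out to roughly 64 GB of capacity and about 2 TB/s of peak bandwidth; actual figures will depend on the die densities and pin speeds vendors ship.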
In terms of adoption, HBM4 is expected to feature in NVIDIA’s Rubin AI architecture as well as AMD’s Instinct MI400 lineup, so the technology is set for widespread market uptake. HBM demand is at its peak right now, and Micron itself has revealed that its production capacity is already booked through 2025, so the outlook is only getting brighter.
Read the full article on Wccftech.