SINGAPORE/SEOUL – A version of Samsung Electronics’ fifth-generation high bandwidth memory (HBM) chips, or HBM3E, has passed Nvidia’s tests for use in its artificial intelligence (AI) processors, three sources briefed on the results said.
The qualification clears a major hurdle for the world’s biggest memory chipmaker, which has been struggling to catch up with local rival SK Hynix in the race to supply the advanced memory chips capable of handling generative AI workloads.
Samsung and Nvidia have yet to sign a supply deal for the approved eight-layer HBM3E chips but will do so soon, the sources said, adding that they expected supplies to start by the fourth quarter of 2024.
The South Korean technology giant’s 12-layer version of HBM3E chips, however, has yet to pass Nvidia’s tests, the sources said, declining to be identified as the matter remains confidential. Both Samsung and Nvidia declined to comment.
HBM is a type of dynamic random access memory (DRAM), first produced in 2013, in which chips are vertically stacked to save space and reduce power consumption. A key component of graphics processing units (GPUs) for AI, it helps process the massive amounts of data produced by complex applications.
Samsung has been seeking to pass Nvidia’s tests for HBM3E and preceding fourth-generation HBM3 models since last year but has struggled due to heat and power consumption issues, Reuters reported in May, citing sources.