Samsung Electronics' fifth-generation high bandwidth memory (HBM) chips, known as HBM3E, have passed Nvidia's tests for use in its AI processors, according to three informed sources. The milestone clears a major hurdle for the world's largest memory chip maker, which has trailed its domestic rival SK Hynix in supplying the advanced memory chips needed for generative AI workloads. Samsung and Nvidia have not yet signed a supply agreement for the approved eight-layer HBM3E chips, but a deal is expected to be concluded soon, with first deliveries anticipated by the fourth quarter of 2024, the sources added. The South Korean tech giant's 12-layer version of HBM3E has yet to pass Nvidia's evaluations, the sources said, requesting anonymity because the information is confidential. Both Samsung and Nvidia declined to comment on the matter.
HBM, a type of dynamic random access memory (DRAM) first introduced in 2013, stacks chips vertically to save space and reduce power consumption. It is a critical component of the GPUs used for AI, helping to process the vast amounts of data generated by complex applications. Samsung has been trying to meet Nvidia's requirements for HBM3E and its fourth-generation predecessor, HBM3, since last year but struggled with heat and power consumption issues, as Reuters reported in May. The company has since revised its HBM3E design to address those problems, according to the sources familiar with the situation. Samsung disputed the Reuters report, saying its chips had not failed Nvidia's tests because of thermal and power issues.
Dylan Patel, founder of semiconductor research firm SemiAnalysis, commented, "Samsung is still playing catch up in HBM. While they will start shipping 8-layer HBM3E in Q4, their rival SK Hynix is already shipping 12-layer HBM3E." Samsung's shares rose by 4.3 percent on Wednesday, outperforming the broader market's 2.4 percent increase, while SK Hynix's shares were up by 3.4 percent. This latest approval comes after Nvidia certified Samsung's HBM3 chips for less advanced processors targeted at the Chinese market, as reported by Reuters last month. The approval of Samsung's latest HBM chips by Nvidia coincides with the surging demand for sophisticated GPUs driven by the generative AI boom, a demand that Nvidia and other AI chipset manufacturers are struggling to fulfill.
HBM3E chips are expected to dominate the HBM market this year, with shipments peaking in the second half, according to TrendForce. SK Hynix, the market leader, estimates that demand for HBM chips could grow by 82 percent a year through 2027. Samsung expects HBM3E chips to make up 60 percent of its HBM sales by the fourth quarter, a target analysts say is achievable if its latest HBM chips win final approval from Nvidia by the third quarter. Samsung does not disclose revenue for specific chip products, but a Reuters survey of 15 analysts estimated its total DRAM chip revenue at 22.5 trillion won ($16.4 billion) for the first half of this year, with some estimating that about 10 percent of that could be from HBM sales. The HBM market is served by just three manufacturers: SK Hynix, Micron, and Samsung. SK Hynix has been Nvidia's main HBM supplier and began shipping HBM3E chips in late March to an undisclosed customer, which sources said was Nvidia. Micron has also said it plans to supply Nvidia with HBM3E chips.
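To put those figures in perspective, the short Python sketch below works through the arithmetic implied by the numbers cited above; it is illustrative only, and the three-year growth horizon is an assumption chosen for the example rather than a figure from any of the companies or analysts mentioned.

```python
# Back-of-the-envelope arithmetic using only the figures cited above; all values are estimates.
h1_dram_revenue_won = 22.5e12   # analysts' estimate of Samsung's H1 DRAM revenue (22.5 trillion won)
h1_dram_revenue_usd = 16.4e9    # the same estimate expressed in dollars ($16.4 billion)
hbm_share = 0.10                # "about 10 percent" of DRAM revenue attributed to HBM by some analysts

implied_hbm_won = h1_dram_revenue_won * hbm_share
implied_hbm_usd = h1_dram_revenue_usd * hbm_share
print(f"Implied H1 HBM revenue: {implied_hbm_won / 1e12:.2f} trillion won "
      f"(~${implied_hbm_usd / 1e9:.1f} billion)")   # ~2.25 trillion won, ~$1.6 billion

# SK Hynix's projection of 82 percent annual demand growth compounds quickly.
# The three-year horizon here is purely for illustration.
annual_growth = 0.82
years = 3
multiplier = (1 + annual_growth) ** years
print(f"Demand multiplier after {years} years at 82%/yr: {multiplier:.1f}x")  # ~6.0x
```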