South Korean semiconductor firm SK Hynix was until March the sole supplier of high-bandwidth memory chips to Nvidia. Photo: Shutterstock

Nvidia supplier SK Hynix says its high-bandwidth memory chips used in AI processors almost sold out for 2025

  • SK Hynix has already sold out its high-bandwidth memory chips for this year, as enterprises aggressively expand AI services
  • The South Korean memory chip maker forecast annual demand growth for HBM chips to be about 60 per cent in the mid- to long-term
South Korea’s SK Hynix said on Thursday that its high-bandwidth memory (HBM) chips used in artificial intelligence (AI) processors were sold out for this year and almost sold out for 2025, as businesses aggressively expand AI services.
The Nvidia supplier and the world’s second-largest memory chip maker will begin sending samples of its latest HBM chip, called the 12-layer HBM3E, in May and begin mass-producing them in the third quarter.
“The HBM market is expected to continue to grow as data and [AI] model sizes increase,” SK Hynix chief executive Kwak Noh-Jung told a news conference. “Annual demand growth is expected to be about 60 per cent in the mid- to long-term.”
SK Hynix, which competes with US rival Micron Technology and domestic behemoth Samsung Electronics in HBM, was until March the sole supplier of these chips to Nvidia, according to analysts, who added that major AI chip buyers are keen to diversify their suppliers to better maintain operating margins. Nvidia commands some 80 per cent of the global AI chip market.
Nvidia chief executive Jensen Huang unveils the company’s new graphics processing unit design, called Blackwell, which it says is several times faster at handling artificial intelligence workloads, at an event in San Jose, California, on March 18, 2024. Photo: AP

Micron has also said its HBM chips were sold out for 2024 and that most of its 2025 supply was already allocated. It plans to provide samples of its 12-layer HBM3E chips to customers in March.

“As AI functions and performance are being upgraded faster than expected, customer demand for ultra-high-performance chips such as the 12-layer chips appears to be increasing faster than for 8-layer HBM3Es,” said Jeff Kim, head of research at KB Securities.

Samsung, which plans to produce its HBM3E 12-layer chips in the second quarter, said this week that this year’s shipments of HBM chips are expected to increase more than three-fold and that it has completed supply discussions with customers. The company did not elaborate further.

Last month, SK Hynix announced a US$3.87 billion plan to build an advanced chip packaging plant with an HBM chip line in the US state of Indiana, as well as a 5.3 trillion won (US$3.9 billion) investment in a new DRAM chip factory at home with a focus on HBMs.
Memory chips made by South Korean semiconductor company SK Hynix are seen on a computer circuit board on February 25, 2022. Photo: Reuters

Kwak of SK Hynix said investment in HBM differed from past patterns in the memory chip industry because production capacity is being expanded only after demand has been confirmed.

By 2028, chips made for AI, such as HBM and high-capacity DRAM modules, are expected to account for 61 per cent of all memory by value, up from about 5 per cent last year, SK Hynix’s head of AI infrastructure Justin Kim said.

Last week, SK Hynix said in a post-earnings conference call that there may be a shortage of regular memory chips for smartphones, personal computers and network servers by the year’s end if demand for tech devices exceeds expectations.