Samsung has developed a 12-stack 36 GB memory chip to accelerate AI algorithms

This will significantly speed up AI training. Samsung has announced the development of a high-bandwidth memory (HBM) chip built to the HBM3E standard, with a record capacity of 36 GB per stack.

© Samsung Electronics

HBM is a type of computer memory built by stacking multiple layers of chips, with the dies' integrated circuits placed directly on top of one another. This vertical arrangement shrinks the chip's footprint while widening the memory bus and, as a result, raising data transfer speed.

Samsung said that the 12-layer layout will increase processing and data transfer speeds in AI algorithms by 50% compared with the current standard of 8-layer HBM3-generation devices. According to Bae Yoon-chul, executive vice president of computer memory product planning at Samsung Electronics, the new chip will raise the average training speed of AI models by 34%.
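A quick back-of-the-envelope check of the announced figures (a minimal sketch; the per-die capacity is inferred from the stated totals, not taken from Samsung's announcement):

```python
# Sanity check on the announced HBM3E figures.
# Assumption: capacity is split evenly across the dies in a stack.

GB_PER_STACK = 36      # announced capacity of the new 12-high stack
DIES_12_HIGH = 12      # layers in the new HBM3E stack
DIES_8_HIGH = 8        # layers in the current 8-layer generation

gb_per_die = GB_PER_STACK / DIES_12_HIGH        # 3 GB (24 Gb) per die
gb_8_high = gb_per_die * DIES_8_HIGH            # 24 GB for an 8-high stack

capacity_gain = GB_PER_STACK / gb_8_high - 1    # 0.5, i.e. 50% more capacity
print(f"{gb_per_die:.0f} GB per die; +{capacity_gain:.0%} capacity over 8-high")
```

Note that this 50% figure is the capacity gain from the extra layers; Samsung's 50% claim refers to processing and transfer speed.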

The gap between memory dies within a stack is reportedly no more than 7 microns. At the same time, Samsung's engineers were able to increase the density of the chip layout by 20%.

“Companies in the field of AI increasingly require high-performance HBM devices, and the new 12-layer chip will help them with this,” said Bae Yoon-chul.

