Samsung Shows Off In-Memory Processing For HBM2, GDDR6 And Other Memory Standards

3 years ago
Anonymous $drS9DEX_Sj

https://wccftech.com/samsung-shows-off-in-memory-processing-for-hbm2-gddr6-and-other-memory-standards/

Samsung announced that it plans to expand its processing-in-memory technology beyond HBM2 to DDR4, GDDR6 and LPDDR5X chipsets. This follows the company's report earlier this year that it was producing HBM2 memory with an integrated processor capable of up to 1.2 TFLOPS of compute for AI workloads, the kind of work usually handled by CPUs, FPGAs, and graphics ASICs. The move positions Samsung for its next-generation HBM3 modules.

Put simply, the chips embed an AI engine inside each DRAM bank, allowing the memory itself to process data. Because the system no longer has to shuttle data between the memory and the processor, it saves both time and power. There is a capacity tradeoff with current memory types, but Samsung says HBM3 and future memories will offer the same capacities as standard memory chips.
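The energy argument for processing-in-memory can be sketched with a toy cost model. The per-byte figures below are illustrative assumptions for the comparison, not Samsung's published numbers; the point is simply that a conventional path pays for moving data to the processor and back, while an in-bank engine pays only for the computation.

```python
# Toy model contrasting a conventional load/compute/store path with
# processing-in-memory (PIM). All constants are assumed for illustration.

BYTES = 1_000_000          # size of the working set, in bytes (assumed)
TRANSFER_NJ_PER_BYTE = 10  # assumed energy to move one byte memory <-> CPU
COMPUTE_NJ_PER_BYTE = 1    # assumed energy to process one byte

def conventional_cost(n_bytes: int) -> int:
    """Data is read out to the processor, processed, and written back."""
    return 2 * n_bytes * TRANSFER_NJ_PER_BYTE + n_bytes * COMPUTE_NJ_PER_BYTE

def pim_cost(n_bytes: int) -> int:
    """An AI engine inside each DRAM bank processes the data in place."""
    return n_bytes * COMPUTE_NJ_PER_BYTE

print(conventional_cost(BYTES) / pim_cost(BYTES))  # → 21.0
```

Under these assumed costs the conventional path spends 21x the energy of the in-memory path, and the ratio grows with the cost of data movement, which is why PIM targets memory-bound AI workloads.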

Aug 25, 2021, 12:16pm UTC