Samsung HBM-PIM memory expands to more applications
On August 24, Samsung showcased its latest advancements in processing-in-memory (PIM) technology at Hot Chips 33, a leading semiconductor conference where the most notable microprocessor and IC innovations are unveiled each year.
At the conference, Samsung announced the first integration of its PIM-enabled High Bandwidth Memory (HBM-PIM) into a commercialized accelerator system, and expanded PIM applications to DRAM modules and mobile memory as part of its push toward the convergence of memory and logic.
First Integration of HBM-PIM Into an AI Accelerator
Earlier in 2021, the Korean tech firm unveiled the world’s first HBM-PIM (Aquabolt-XL), which incorporates an AI processing function into Samsung’s HBM2 Aquabolt to enhance high-speed data processing in supercomputers and AI applications.
Samsung’s HBM-PIM memory has since been tested in the Xilinx Virtex Ultrascale+ (Alveo) AI accelerator, where it delivered nearly a 2.5X gain in system performance along with a more than 60% cut in energy consumption.
Mobile Memory That Brings AI From Data Center to Device
Samsung’s LPDDR5-PIM mobile memory technology can deliver independent AI capabilities on-device, even without data center connectivity. Simulation tests have shown that LPDDR5-PIM can more than double performance while reducing energy usage by over 60% in applications such as voice recognition, translation and chatbots.
Energizing the Ecosystem
According to Samsung, the company plans to expand its AI memory portfolio by working with other industry leaders to complete the standardization of the PIM platform in the first half of 2022.