Samsung Electronics unveiled in Silicon Valley on the 26th (local time) next-generation memory solutions that will lead the era of artificial intelligence (AI).

[Photo: MemCon 2024 homepage]

At the global semiconductor conference “MemCon 2024,” held in Mountain View, California, on the same day, Samsung Electronics introduced Compute Express Link (CXL)-based memory and high-performance, high-capacity HBM (high-bandwidth memory) that will lead the AI era.

“These solutions are leading the industry’s innovation. CXL technology will lead the future AI era in terms of memory capacity, and HBM in terms of bandwidth,” said Choi Jin-hyuk, vice president of Samsung Electronics. CXL is a technology that increases data processing speed and capacity by connecting central processing units (CPUs), graphics processing units (GPUs), and memory semiconductors. It is drawing attention as a next-generation memory technology as the amount of data has surged with generative AI. “CXL enables efficient management of memory and improves system stability, and memory capacity and bandwidth can be significantly improved through Samsung Electronics’ own CXL-based solutions,” Choi added.


Samsung Electronics introduced a number of CXL-based solutions, including CMM-D (DRAM), CMM-H (Hybrid), which uses NAND and DRAM together, and CMM-B (Box), a memory pooling solution. Samsung also highlighted its HBM solutions. HBM is a high-performance memory that dramatically speeds up data processing by stacking multiple DRAM dies vertically, and it is drawing attention as a key memory for driving generative AI.

“We will continue our leadership in high-performance, high-capacity memory in the AI era by mass-producing 12-layer 5th-generation HBM (HBM3E) and 32-gigabit-based 128GB DDR5 products in the first half of the year, following the 3rd-generation (HBM2E) and 4th-generation (HBM3) products already in mass production,” said Hwang Sang-joon, vice president of Samsung Electronics. Samsung Electronics plans to continue innovating memory semiconductors for the AI era by applying a buffer die, a control device, to the bottom of the stacked memory. MemCon, first held last year, is a conference for in-depth discussion of AI-related memory solutions; Samsung Electronics, SK Hynix, Microsoft, Meta, Nvidia, and AMD participated.

EJ SONG

US ASIA JOURNAL

