
SK hynix presented its leading memory solutions optimized for AI servers and AI PCs at Dell Technologies World (DTW) 2025 in Las Vegas from May 19–22.
Hosted by Dell Technologies, DTW is an annual conference that introduces future technology trends. In line with DTW 2025’s theme of “Accelerate from Ideas to Innovation,” a wide range of products and technologies aimed at driving AI innovation was showcased at the event.

Based on its close partnership with Dell, SK hynix has participated in the event every year to reinforce its leadership in AI. This year, the company organized its booth into six sections: HBM,1 CMM (CXL2 Memory Module)-DDR5, server DRAM, PC DRAM, eSSDs3, and cSSDs4. Featuring highly competitive products across all areas of DRAM and NAND flash for the AI server, storage, and PC markets, the booth garnered strong attention from visitors.
1High Bandwidth Memory (HBM): A high-value, high-performance product that significantly enhances data processing speeds compared to conventional DRAM by vertically stacking multiple DRAM chips. There are six generations of HBM, starting with the original HBM and followed by HBM2, HBM2E, HBM3, HBM3E, and HBM4.
2Compute Express Link (CXL): A next-generation interface that efficiently connects CPUs/GPUs and memory in high-performance computing systems, supporting large-scale, ultra-fast computations. Applying CXL to existing memory modules can expand capacity by more than 10 times.
3Enterprise Solid State Drive (eSSD): SSDs designed for enterprise use, typically installed in servers and data centers.
4Client Solid State Drive (cSSD): SSDs intended for consumer use, commonly found in personal electronic devices such as PCs and tablets.
In the HBM section, SK hynix presented samples of its 12-layer HBM4, which delivers industry-leading speed optimized for AI and can process over 2 terabytes (TB) of data per second, alongside the 12-layer HBM3E, the industry’s best-performing and highest-capacity HBM. Currently under development, HBM4 is a next-generation HBM set to offer enhanced base die performance and reduced power consumption. SK hynix recently became the first in the world to provide HBM4 samples to major customers, reaffirming its leadership in AI memory. At the exhibition, the company also displayed its HBM3E paired with the B100, NVIDIA’s latest Blackwell GPU, highlighting its strong partnerships with global customers.
The CMM-DDR5 section featured SK hynix’s high-capacity 96 and 128 gigabyte (GB) CMM-DDR5 products. As a DDR5 memory module based on CXL technology, CMM-DDR5 offers 50% greater capacity and 30% more bandwidth compared to conventional configurations. A video demo captured significant attention from visitors by showcasing how SK hynix’s CMM-DDR5 delivers stable performance while overcoming capacity limitations in systems equipped with server processors from major CPU manufacturers.
In the server DRAM section, SK hynix presented a lineup of DDR5-based memory modules for servers. The spotlight was on the DDR5 RDIMM5 and MRDIMM6 products built using the 1c node,7 the sixth generation of the 10 nm process technology. The company also showcased various DRAM solutions designed to maximize AI server performance and reduce power consumption, including its 96 GB DDR5 RDIMM, 256 GB 3DS8 DDR5 RDIMM, and DDR5 MRDIMM in both 96 GB and 256 GB capacities.
5Registered Dual In-line Memory Module (RDIMM): A server memory module product in which multiple DRAM chips are mounted on a substrate.
6Multiplexed Rank Dual In-line Memory Module (MRDIMM): A server memory module product in which multiple DRAM chips are mounted on a substrate. Speed is enhanced by operating two ranks—the basic operating units of the module—simultaneously.
710 nm DRAM process technology has progressed through six generations: 1x, 1y, 1z, 1a, 1b, and 1c. In August 2024, SK hynix became the first in the world to successfully develop 1c process technology.
83D Stacked Memory (3DS): A high-bandwidth memory product in which two or more DRAM chips are packaged together and interconnected using TSV technology.
Visitors to the PC DRAM section explored SK hynix’s key DRAM offerings for the on-device AI market. These included LPCAMM2, which integrates multiple LPDDR5X9 chips into a single module for power efficiency and high performance, and GDDR7, the industry’s best-performing graphics memory product.
9LPDDR5X: DRAM used in mobile products, such as smartphones, tablets and laptops, featuring low-voltage operation to minimize power consumption. The standard includes the prefix “LP,” which stands for “low power.” The LPDDR standard has been developed in the order of LPDDR1, 2, 3, 4, 4X, 5, and 5X, with LPDDR5X representing the seventh-generation LPDDR product.
Various next-generation NAND products were displayed in the eSSD/cSSD section. Notably, the booth featured a Dell server system (R7625) equipped with the PS1010, a PCIe10 5.0 eSSD that combines multiple 176-layer 4D NAND chips and is optimized for AI-driven applications and data centers.
10Peripheral Component Interconnect express (PCIe): A high-speed input/output interface with a serial structure used on the main board of a digital device.
The section also featured the PS1012 and PCB01, SSDs that are leading the AI server and PC markets. The PS1012 is a PCIe 5.0, QLC11-based eSSD with a substantial capacity of 61 TB, meeting the rapidly increasing demand for high-capacity eSSDs.
11Quad-level cell (QLC): A type of memory cell used in NAND flash that stores four bits of data in a single cell. NAND flash is categorized as single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), QLC, and penta-level cell (PLC) depending on how many data bits can be stored in one cell. As the number of bits stored per cell increases, more data can be stored in the same volume.
The PCB01, a cSSD product boasting sequential read speeds of 14 GB per second and sequential write speeds of 12 GB per second, was demonstrated during the exhibition. Offering excellent performance and compatibility in a desktop environment with an Intel PC processor, this cSSD is expected to play a key role in the on-device AI market, including AI PCs.
Meanwhile, SK hynix shared insights on AI memory and next-generation technologies through its presentation sessions.
Technical Leaders Gyoyoung Lee and Jungwon Park from SSD Product Planning gave a joint presentation on “Key Things to Consider for Your AI Storage.” Additionally, Technical Leader Santosh Kumar from DRAM Technology Planning (DRAM TP) at SK hynix America emphasized the future value of CMM products with his presentation on “Why Do We Need CMM Devices in the AI Era?”
At DTW 2025, SK hynix observed the rapidly growing demand for AI memory and storage across AI servers, PCs, and the broader industry. Moving forward, the company will continue to strengthen technological collaboration based on one-team partnerships with customers, providing high-performance products in a timely manner to meet market needs.