SK hynix presented a wide range of leading memory products for AI servers, PCs, and mobile devices at COMPUTEX Taipei 2025 from May 20–23.   

As one of Asia’s leading IT exhibitions, COMPUTEX Taipei brings together global tech companies to showcase their latest technologies and solutions. It also serves as a platform for sharing innovations and emerging trends in key industries, including AI and data centers. Held under the theme “AI Next,” this year’s show featured around 1,400 companies operating booths covering AI, robotics, next-generation technologies, and future mobility.

  • SK hynix’s booth at COMPUTEX Taipei 2025

At the exhibition, SK hynix presented groundbreaking products under the slogan “Memory, Powering AI and Tomorrow” at a booth featuring four sections: HBM for AI, Data Center, Mobile/PC, and Ethics & ESG, highlighting its diverse lineup of AI memory solutions.

  • The HBM for AI section

In the HBM for AI section, SK hynix showcased its latest HBM solutions, including the 12-layer HBM4 and 12-layer HBM3E. HBM4 — currently the industry’s fastest HBM with a speed of 2 terabytes per second (TB/s) — is being developed with enhanced base die performance to improve connectivity between the HBM stack and logic chips, while also reducing power consumption. In March 2025, SK hynix became the first in the industry to supply 12-layer HBM4 samples to major customers. The product is slated for mass production in the second half of this year. Furthermore, the company unveiled a roadmap for its 16-layer HBM4, targeted for launch in 2026. 

In addition, SK hynix presented its 36 GB 12-layer HBM3E, which is installed in NVIDIA’s GB200 AI server GPU module.

  • The Data Center section at SK hynix’s booth

In the Data Center section, the company introduced server DRAM modules and eSSD products that maximize AI server performance while reducing power consumption.  

The server DRAM lineup included RDIMM products with capacities from 64 GB to 256 GB, equipped with DRAM that can reach speeds of up to 8 gigabits per second (Gbps); MRDIMM modules with capacities from 96 GB to 256 GB featuring 12.8 Gbps-class DRAM; and a 128 GB SOCAMM module built with 7.5 Gbps-class LPDDR5X. Notably, the company’s DDR5-based high-performance modules, developed using the 1c node (the sixth generation of the 10 nm process technology), received significant attention from attendees.

In the eSSD zone, SK hynix presented storage solutions offering ultra-high capacities of up to 61 TB. In particular, the PCIe Gen5 PEB110, based on 238-layer 4D NAND, and the QLC-based 61 TB PS1012 deliver optimal performance for AI data centers.

  • The Mobile and PC section at SK hynix’s booth

Visitors to the Mobile/PC section could check out DRAM and NAND products optimized for on-device AI. Mobile solutions on display included LPDDR5X, which can reach speeds of up to 10.7 Gbps, and UFS 4.1 storage built on 238-layer 4D NAND. For PCs, the company presented various solutions including CSODIMM based on 1c DDR5 DRAM; LPCAMM2, which packages multiple LPDDR5X chips in modular form; the high-performance PCB01 SSD for AI PCs; and GDDR7, the world’s fastest graphics DRAM with speeds of up to 28 Gbps per chip.

SK hynix also introduced DRAM and NAND products installed in hardware from global partners such as GIGABYTE, ASUS, and Lenovo, demonstrating its portfolio’s ability to meet the demands of devices ranging from laptops to servers.

Lastly, in the Ethics & ESG section, the company highlighted its selection as one of the World’s Most Ethical Companies® for 2025, emphasizing its efforts to build trust with stakeholders through world-class ethical management.

  • SK hynix President and CMO Justin Kim (second from left) with NVIDIA CEO Jensen Huang (second from right)

At COMPUTEX Taipei 2025, SK hynix raised awareness of its AI memory leadership and strengthened partnerships with global players. Looking ahead, the company plans to continue delivering timely, optimized products through collaboration with customers across sectors ranging from AI servers and data centers to on-device AI, as it works to become a full-stack AI memory provider.