
SK hynix Announces the World’s Largest Capacity 16-Layer HBM3E Under Development at SK AI Summit 2024

November 6, 2024

The SK AI Summit was held at COEX in Seoul, South Korea

 

SK hynix participated in the SK AI Summit 2024, held at COEX in Seoul, South Korea from November 4–5. At the event, the company officially announced that it is developing the world’s largest capacity HBM1, the 48 GB 16-layer HBM3E, and shared other key achievements along with its vision to become a full stack AI memory provider.

Held under the slogan “AI Together, AI Tomorrow,” the event was formerly SK Group’s annual SK Tech Summit but has evolved into the SK AI Summit with a greater focus on AI. Global AI leaders gathered to explore strategies for thriving in the AGI2 era and discuss ways to strengthen the AI ecosystem.

Prominent business figures in attendance included SK Group Chairman Chey Tae-won, SK hynix CEO Kwak Noh-Jung, SK Telecom President and CEO Ryu Young-sang, OpenAI Chairman and President Greg Brockman, and Microsoft Corporate Vice President for Azure Hardware Systems and Infrastructure Rani Borkar. Major tech companies such as Amazon Web Services (AWS) and Microsoft, as well as members of the K-AI Alliance3, also participated in the event, operating booths and interacting with industry stakeholders and visitors.

1High Bandwidth Memory (HBM): A high-value, high-performance memory that vertically interconnects multiple DRAM chips using through-silicon via (TSV) and dramatically increases data processing speed in comparison to traditional DRAM products. HBM3E is the extended version of HBM3, the fourth-generation product that succeeds HBM, HBM2 and HBM2E.
2Artificial General Intelligence (AGI): A theoretical form of AI research that aims to create software with human-like intelligence and the ability to self-teach.
3K-AI Alliance: Established by SK Telecom, the K-AI Alliance consists of Korean tech companies which aim to lead the development and global expansion of the Korean AI industry.

As a key player in the AI industry, SK hynix showcased its leading AI memory products and achievements, with particular attention drawn to the announcement of the 16-layer HBM3E. Several SK hynix executives attended the product presentation, which underlined the company’s technological leadership. They included CEO Kwak; Vice President Uksong Kang, head of Next Generation Product Planning; Vice President Munphil Park, head of HBM Product Engineering (PE); Kangwook Lee, head of Package Development; Youngpyo Joo, head of Software Solution; and SK hynix America’s Vice President of Technology Paul Fahey.

SK Group Chairman Chey Tae-won delivers opening remarks at the SK AI Summit

 

SK Group Chairman Chey opened the summit by presenting the group’s vision for AI in line with the summit’s slogan. “AI is in its early stages and there are many unknowns, so the participation and cooperation of numerous stakeholders is crucial to solve problems and make progress,” said Chey. “SK is a global company involved in everything from chips to energy, data center construction and operation, service development, and technology commercialization. We are working with the best partners in each field to foster global AI innovation.”

SK hynix’s Keynote Address: CEO Kwak Announces 16-Layer HBM3E

On the first day, CEO Kwak delivered a keynote address titled “Next AI Memory: Hardware to Everywhere” in which he officially announced the 16-layer HBM3E is under development.

SK hynix CEO Kwak Noh-Jung delivers a keynote speech at the summit

 

“We stacked 16 DRAM chips to realize 48 GB capacity and applied Advanced MR-MUF technology4 proven for mass production. In addition, we are developing hybrid bonding5 technology as a backup process,” explained Kwak. “The 16-layer HBM3E can improve AI learning performance and inference6 performance by up to 18% and 32%, respectively, compared to the 12-layer HBM3E.” The 16-layer HBM3E is planned to be commercialized in 2025.

4Advanced mass reflow-molded underfill (MR-MUF): Next-generation MR-MUF technology with warpage control, which enables warp-free stacking of chips 40% thinner than conventional ones, and improved heat dissipation due to new protective materials.
5Hybrid bonding: A technology that directly bonds chips without forming a bump between them during stacking. This reduces the overall thickness of the chip, enabling high stacking. SK hynix is looking at both Advanced MR-MUF and hybrid bonding methods for 16-layer and higher HBM products.
6AI inference: The process of running live data through a trained AI model to make a prediction or solve a task.
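As a rough sanity check on the announced figures, the stack arithmetic works out as follows. This is a back-of-the-envelope sketch assuming capacity divides evenly across the 16 stacked dies, which the announcement implies but does not state:

```python
# Back-of-the-envelope check of the announced 16-layer HBM3E figures.
# Assumption (not stated in the announcement): total capacity is split
# evenly across the stacked DRAM dies.

LAYERS = 16
TOTAL_GB = 48

per_die_gb = TOTAL_GB / LAYERS    # capacity per stacked DRAM die
per_die_gbit = per_die_gb * 8     # in gigabits, the usual per-die metric

print(per_die_gb)     # 3.0 GB per die
print(per_die_gbit)   # 24.0 Gbit, i.e. 16 x 24 Gbit dies = 48 GB

# Quoted "up to" gains over the 12-layer HBM3E:
train_gain = 1.18     # +18% AI learning performance
infer_gain = 1.32     # +32% inference performance
```

In other words, the 48 GB figure is consistent with sixteen 24 Gbit dies, a common HBM3E die density.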

At the announcement, Kwak also revealed the company’s roadmap featuring three areas: World First (world-first developed and mass-produced products), Beyond Best (next-generation, high-performance products), and Optimal Innovation (system-optimized products for the AI era). “Following the development of the 16-layer HBM3E and the 1cnm DDR5 RDIMM, we plan to develop next-generation, high-performance products such as HBM4 and UFS7 5.0,” he said. “In the long term, we will commercialize custom HBM and CXL®8 optimized for AI to become a full stack AI memory provider.”

7Universal Flash Storage (UFS): A mobile storage standard that allows simultaneous read and write functions, unlike the existing embedded MultiMediaCard (eMMC). Developed up to version 4.0, it combines the high speed of PC storage (SSD) with the low power of mobile storage (eMMC).
8Compute Express Link® (CXL®): A next-generation interface for efficiently utilizing high-performance computing systems.

Kwak also gave more information about HBM4. “We are working with the world’s leading foundry to improve the performance of the base die and ultimately reduce power consumption,” said Kwak. “With our ‘one-team’ partnership, we will deliver the most competitive products and further solidify our position as the HBM leader.”

Presentations: Key Executives Deliver Insights on HBM & Next-Gen Memory

During the talk sessions, SK hynix executives were key presenters in the AI chip and service areas, providing insights from their respective fields.

(From the first image) Professor Sungjoo Yoo of Seoul National University; Munphil Park of HBM PE; Professor Gunjae Koo of Korea University; and Youngpyo Joo, head of Software Solution, deliver presentations

 

Munphil Park of HBM PE and Professor Sungjoo Yoo of Seoul National University’s Department of Computer Science and Engineering presented on accelerator trends and the outlook for HBM. In response to Professor Yoo’s prediction of rising AI inference costs, Park emphasized that custom HBM products are emerging for improved performance and cost optimization. He added that three-way collaboration between customers, foundries, and memory providers is driving this advancement.

Youngpyo Joo, head of Software Solution, and Professor Gunjae Koo of Korea University’s Department of Computer Science and Engineering discussed future architectures and new memory solutions. Professor Koo expressed the need for a new system software structure, and Joo explained how SK hynix meets this demand through its new CXL and PIM9 memory solutions.

9Processing-In-Memory (PIM): A next-generation technology that adds computational capabilities to memory to solve the problem of data movement congestion in AI and big data processing.

(From the first image) Vice President Uksong Kang, head of Next Generation Product Planning, and Paul Fahey, SK hynix America’s Vice President of Technology, participating in a fireside chat

 

SK hynix Vice Presidents Uksong Kang and Paul Fahey participated in a fireside chat on “Digital Neural Networks in the Era of Hyperconnectivity: AI and Memory Shaping the Future of the Industrial Landscape”. The pair spoke about how three technologies will play a significant role in the future: next-generation HBM with logic processes applied to base die; PIM, which will be crucial for on-device AI10; and CXL, which has memory sharing capabilities. They also revealed that SK hynix is working closely with partners to create next-generation AI memory products.

10On-device AI: A technology that enables AI processing directly on the device, improving responsiveness and delivering personalized, real-time services without the need for cloud-based computation.

Kangwook Lee, head of Package Development, taking part in a panel discussion with industry stakeholders

 

In a panel discussion, Kangwook Lee joined industry stakeholders to discuss the “Future Evolution of AI Chips and Infrastructure.”

(From the first image) Byungkyu Lee, Munuk Kim, and Senam Jang of DT; Jongoh Kwon, team leader of Package Development; Intae Whoang of P&T, Sunghyun Yoon of Infra Tech Center, and Junghan Kim of R&D; Jaeyung Jun and Kyungsoo Lee of Memory Systems Research, Sangsu Park of R&D; and Eunyoung Park, team leader of Safety Culture, giving presentations

 

In addition, Technical Leader Byungkyu Lee of Digital Transformation (DT) presented a talk titled “Eliminate Repetitive Tasks via E2E Automation and Shift to High Value-Added Work in Semiconductor Fabrication.” Meanwhile, Technical Leaders Munuk Kim and Senam Jang of DT delivered a joint presentation on “Improving Competitiveness of Production Quality by Intelligent Image Classification AI System Application.”

During another session, Jongoh Kwon, team leader of Package Development, along with Technical Leaders Intae Whoang of Package & Test (P&T), Sunghyun Yoon of Infra Tech Center, and Junghan Kim of R&D, covered the use of technologies in HBM. The presenters shared their technical insights on 16-layer HBM packaging technology, HBM production advancement using deep learning, and semiconductor research using AI. Meanwhile, Technical Leaders Jaeyung Jun and Kyungsoo Lee from Memory Systems Research, and Sangsu Park from R&D, spoke about future memory technologies. Their presentation covered the potential of heterogeneous memory, requirements for on-device AI, and next-generation computational memory.

In a talk on AI services, Eunyoung Park, team leader of Safety Culture, attracted significant attention for her presentation titled “The Story of SK hynix’s Guardian, A Fusion of 24-Hour Unmanned Patrol Robot Ga-on and Vision AI.”

Booth: Groundbreaking AI Memory Lineup With Major Announcements

Under the theme “Deep Dive Into AI,” SK Group operated a booth with its subsidiaries. SK hynix’s full AI memory lineup was on display, including its latest HBM3E products.

(From the first image) The SK Group joint booth featured SK hynix’s key products, including: CMM-DDR5; AiMX; DDR5 MCR DIMM; eSSDs; LPCAMM2; LPDDR5X; Beetle X31; and automotive solutions

 

SK hynix also presented its CMM (CXL Memory Module)-DDR511 and AiMX, which are set to play a key role in the future. CMM-DDR5 can theoretically expand system bandwidth by up to 50% and capacity by up to 100% compared to systems equipped with only DDR5. Meanwhile, AiMX is an accelerator card with multiple GDDR6-AiM12 chips that performs both storage and computation functions to improve AI performance.

11CXL Memory Module-DDR5 (CMM-DDR5): A next-gen DDR5 memory module based on CXL that enhances system bandwidth, speed, and performance for AI, cloud, and high-performance computing.
12Accelerator-in-Memory (AiM): SK hynix’s product name for PIM semiconductors, including GDDR6-AiM.
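Taken at face value, those “up to” figures for CMM-DDR5 translate into a simple scaling of a DDR5-only baseline. A minimal sketch, using a hypothetical baseline system since the announcement gives only relative numbers:

```python
# Hypothetical DDR5-only baseline; the announcement quotes only relative
# "up to" expansion figures for CMM-DDR5, so these absolute values are
# illustrative, not measured.
baseline_bandwidth = 100.0   # arbitrary bandwidth units
baseline_capacity = 512      # GB, illustrative only

expanded_bandwidth = baseline_bandwidth * 1.5   # up to +50% bandwidth
expanded_capacity = baseline_capacity * 2.0     # up to +100% capacity

print(expanded_bandwidth)  # 150.0
print(expanded_capacity)   # 1024.0
```

The point of the scaling is that CXL attaches additional memory over the CXL link rather than through the DDR5 channels alone, so both bandwidth and capacity can grow beyond what the motherboard’s DIMM slots provide.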

Visitors to the booth could also check out DDR5 MCR DIMM13, an ultra-fast memory module for high-performance computing (HPC) and AI servers which reaches speeds of up to 8.8 gigabits per second (Gbps). In addition, LPCAMM2, an LPDDR5X14-based module solution, was highlighted as a product that will play an active role in on-device AI. Enterprise SSDs (eSSDs) optimized for AI and data centers, including the PS1010 E3.S, were also on display.

13Multiplexer Combined Ranks Dual In-line Memory Module (MCR DIMM): A module product with multiple DRAMs bonded to a motherboard in which two ranks, basic information processing units, operate simultaneously, resulting in improved speed.
14Low Power Double Data Rate 5 eXtended (LPDDR5X): Low-power DRAM for mobile devices, including smartphones and tablets, aimed at minimizing power consumption and featuring low voltage operation. LPDDR5X is the seventh generation product and was succeeded by LPDDR5T. The eighth generation LPDDR6 is under development.
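The 8.8 Gbps figure for the MCR DIMM is a per-pin transfer rate; converting it to peak module bandwidth requires assuming the standard 64-bit DDR5 data path, which the announcement does not spell out:

```python
# Peak theoretical bandwidth of an 8.8 Gbps MCR DIMM, assuming the
# standard 64-bit (8-byte) DDR5 data bus. The announcement states only
# the per-pin transfer rate, so the bus width is an assumption here.
pin_rate_gbps = 8.8    # gigabits per second, per pin
bus_width_bits = 64    # standard DDR5 data bus width (assumption)

peak_gb_per_s = pin_rate_gbps * bus_width_bits / 8
print(peak_gb_per_s)   # ~70.4 GB/s per module
```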

The booth also featured AI-based systems and solutions. The company unveiled an AI-driven material quality prediction system and an automotive solution which attracted the attention of visitors.

“We provide AI memory solutions that span the entire spectrum of AI. Looking ahead, we are ready to create new experiences in the future with you,” said CEO Kwak. Following the successful announcement of the 16-layer HBM3E, SK hynix plans to further prepare for the future market in line with the vision presented at the SK AI Summit 2024.