
[SK hynix’s 41st Anniversary] “Celebrating 40+1” … Harnessing 40 Years of Technological Expertise to Stand Alone as the Global No. 1 AI Memory Provider

October 10, 2024

SK hynix entered the semiconductor business in 1983 and has ascended to become the global no. 1 AI memory provider following more than 40 years of relentless effort and innovation. Building on this longstanding technological expertise and entering a new chapter in 2024, the company is strengthening its leadership and marking the start of its “40+1 renaissance.” At the core of this success are AI memory solutions such as HBM, PIM, and CXL®, which are powered by advanced processes and packaging technologies. To mark SK hynix’s 41st anniversary, the newsroom reflects on the history, the technological achievements, and the employee dedication that have driven these innovative products.

SK hynix has embarked on a 41-year journey to become a leader in HBM and AI memory


The Rise of SK hynix & Its HBM Propelled by the AI Era

SK hynix’s rise to become the leader in the global memory market has been driven by the growth of the AI industry. Since the emergence of generative AI in 2022, a wide range of products and services have adopted AI as the technology has rapidly evolved. This has led to a surge in demand for high-performance memory, which is essential for processing massive datasets and enabling fast training and inference1. In response to this demand, SK hynix is providing advanced memory products and thereby playing a defining role in the development of the AI industry.

1AI inference: The process of running live data through a trained AI model to make a prediction or solve a task.

SK hynix has continually advanced its HBM lineup to reach new standards in performance


SK hynix solidified its capabilities even before the AI boom by focusing on developing the early generations of HBM, a high-bandwidth memory that rapidly transmits large volumes of data. The company then gained market leadership and expanded its influence with the third-generation HBM, HBM2E. Its successor HBM3, optimized for AI and high-performance computing (HPC), also drew significant attention. Most notably, the company established itself as a key partner in the AI and data center markets by supplying HBM products to NVIDIA. Around this time, SK hynix achieved a 50% market share in the HBM sector, strengthening its HBM leadership.

Moving into 2024, SK hynix has maintained its prominence in the AI memory market. The company began supplying the world’s best-performing 8-layer HBM3E, first developed in 2023, to leading global tech giants in March 2024. Offering maximum data processing speeds of around 1.2 terabytes (TB) per second, HBM3E helped SK hynix further bolster its status as the global no. 1 AI memory provider.

Next-Generation HBM: Utilizing 15 Years of HBM Technology Know-How

SK hynix’s HBM success story can be traced back to 2009. This was the year when the company began full-scale product development after discovering that TSV2 and WLP3 technologies could break memory performance barriers. Four years later, the company introduced the first-generation HBM, incorporating these TSV and WLP technologies. Although HBM was hailed as an innovative memory solution, it did not receive an explosive market response because the HPC sector had not yet matured enough for widespread HBM adoption.

2Through-silicon via (TSV): A technology that drills thousands of microscopic holes in the DRAM chip and connects the upper and lower chip layers with electrodes that vertically penetrate through these holes.
3Wafer-level packaging (WLP): A method that goes a step beyond conventional packaging, in which wafers are cut into individual chips before being packaged. In WLP, packaging is completed at the wafer level to produce finished products.

Building on its 15-year HBM history, SK hynix is well-placed to develop next-generation HBM products


Despite this, SK hynix pressed forward, focusing on developing the next generation of HBM and pursuing the goal of achieving the “highest performance.” During this period, the company applied MR-MUF4 technology, known for its high thermal dissipation and production efficiency, to HBM2E, which changed the market landscape. Building upon this, SK hynix developed Advanced MR-MUF technology, which excelled in thin chip stacking, thermal management, and productivity, and applied it to both HBM3 and HBM3E. Leveraging this technology, SK hynix set a series of industry-best performance records, successfully mass-producing the 12-layer HBM3 (24 GB) in 2023 and the 12-layer HBM3E (36 GB) in 2024.

4Mass reflow-molded underfill (MR-MUF): A technology that bonds stacked chips by melting the bumps between them in a single reflow step and then filling the gaps between the chips with a protective molding material, ensuring secure and reliable connections in densely stacked chip assemblies.

These achievements were driven by a strategy that precisely aligned with the rise of the AI revolution. SK hynix launched its AI memory products at the right time, fully meeting market demands. This was made possible through 15 years of accumulated technological expertise based on research and development, unwavering employee trust in the company’s know-how, and forward-looking strategic investments.

SK hynix has continued taking strategic steps to strengthen its AI leadership in 2024. In April, the company signed an investment agreement to build an advanced packaging production facility in the U.S. state of Indiana, which will produce next-generation HBM and AI memory. In the same month, SK hynix entered a technology agreement with TSMC. The deal aims to establish a collaborative three-way framework between the customer, foundry, and memory provider to overcome technological limits and secure an advantage in the AI market.

Beyond HBM: Relentless Innovation & Strengthening the AI Memory Lineup

SK hynix’s pursuits and innovations are unfolding across all areas of memory. The company has established its “memory-centric”5 vision and is developing a wide range of memory solutions based on over 40 years of accumulated technological knowledge. In 2024, SK hynix is making concerted efforts to strengthen its lineup with PIM, CXL, and AI SSD products to mark the first year of its renaissance.

5Memory-centric: An environment where memory semiconductors play the central role in ICT devices.

SK hynix’s AI memory lineup includes PIM, CXL, and AI SSD products


SK hynix is developing its lineup of processing-in-memory (PIM) products: intelligent semiconductor memory that breaks the boundary between storage and computation. PIM, which incorporates a processor for computational functions, is capable of processing and delivering the data required for AI computation. In terms of PIM-based products, SK hynix has launched the GDDR6-Accelerator-in-Memory (AiM) and last year introduced AiMX, an AiM-based accelerator card that boosts performance by connecting multiple AiM units. In 2024, the company drew attention by unveiling a 32 GB version of AiMX, which offers double the capacity of its predecessor.
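To make the data-movement argument behind PIM concrete, the sketch below models a memory-bound dot product in plain C: in a conventional system both the weights and the activations cross the memory bus to the processor, whereas a PIM device can keep the weights inside the DRAM banks and return only the computed result. The byte counts are illustrative assumptions; this is a conceptual sketch, not the actual AiM or AiMX programming interface.

```c
/* Conceptual sketch only: models the bus traffic that processing-in-memory
 * avoids for a memory-bound dot product. The arithmetic runs on the host;
 * the "PIM-style" figure assumes the multiply-accumulate happens inside the
 * memory device, so only the activations go in and one result comes back.
 * This is NOT the real AiM/AiMX interface. */
#include <stdio.h>
#include <stdlib.h>

#define N (1u << 20) /* one row of weights: ~1 million FP32 values */

int main(void) {
    float *w = malloc((size_t)N * sizeof *w);  /* weights, resident in memory    */
    float *x = malloc((size_t)N * sizeof *x);  /* activations sent to the device */
    if (!w || !x) return 1;

    for (size_t i = 0; i < N; i++) { w[i] = 1.0f; x[i] = 0.5f; }

    /* Conventional path: every weight and activation is read across the
     * memory bus into the processor before it can be multiplied. */
    double acc = 0.0;
    for (size_t i = 0; i < N; i++)
        acc += (double)w[i] * x[i];

    size_t conventional = 2ul * N * sizeof(float);            /* w and x both move */
    size_t pim_style    = N * sizeof(float) + sizeof(double); /* x in, result out  */

    printf("dot product              : %.1f\n", acc);
    printf("conventional bus traffic : %zu bytes\n", conventional);
    printf("PIM-style bus traffic    : %zu bytes (weights stay in memory)\n", pim_style);

    free(w);
    free(x);
    return 0;
}
```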

SK hynix is also actively investing in Compute Express Link (CXL), an interface that unifies the connections between devices such as CPUs and memory, making it possible to expand memory bandwidth and capacity. In May 2024, the company introduced the CXL Memory Module (CMM)-DDR5, which offers 50% greater bandwidth and double the capacity compared to systems equipped with standard DDR5 alone. Then in September, SK hynix integrated key features of its CXL-optimized software HMSDK6 into the open-source operating system Linux, setting a new standard for the use of CXL technology.

6Heterogeneous Memory Software Development Kit (HMSDK): SK hynix’s proprietary software development kit for heterogeneous memory, which enhances the performance of heterogeneous memory systems, including CXL memory, through effective memory control.
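As a rough illustration of how expanded CXL capacity is typically consumed on Linux, the sketch below uses the standard libnuma API: a CXL memory expander generally appears to the operating system as a CPU-less NUMA node, and an application (or a tiering layer such as HMSDK) can place large buffers on that node. The node number and build command are assumptions for the example; this is generic NUMA usage, not the HMSDK or CMM-DDR5 interface itself.

```c
/* Minimal sketch, assuming a CXL memory expander that Linux exposes as a
 * CPU-less NUMA node (node 1 is an assumed ID; check `numactl -H`).
 * Build (assumption): gcc cxl_alloc.c -o cxl_alloc -lnuma */
#include <numa.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "libnuma: NUMA is not available on this system\n");
        return 1;
    }

    const int cxl_node = 1;          /* assumed NUMA node backed by CXL memory */
    const size_t size = 256UL << 20; /* 256 MiB of expanded capacity */

    /* Place this buffer on the CXL-backed node so that capacity beyond the
     * directly attached DDR5 channels becomes usable by the application. */
    void *buf = numa_alloc_onnode(size, cxl_node);
    if (buf == NULL) {
        perror("numa_alloc_onnode");
        return 1;
    }

    memset(buf, 0, size); /* touch the pages so they are placed on the node */
    printf("Allocated %zu MiB on NUMA node %d\n", size >> 20, cxl_node);

    numa_free(buf, size);
    return 0;
}
```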

Ultra-high-speed, high-capacity enterprise SSDs (eSSDs) for AI servers and data centers are another area of focus for SK hynix. A prime example is the 60 TB Quad Level Cell (QLC) eSSD, co-developed with the company’s U.S. subsidiary Solidigm. This product stores 4 bits per cell while maintaining low power consumption. Looking ahead, the company is planning to develop a 300 TB eSSD and launch the product in 2025.

The company also offers a robust lineup for on-device AI. SK hynix developed the low-power DRAM LPDDR5T7 in January 2023 to enhance the performance of AI smartphones. In November of the same year, the company unveiled LPCAMM2, a modularized version of LPDDR5X, which is expected to deliver excellent performance in AI desktops and laptops. SK hynix has also completed development of PCB01, a high-performance client SSD (cSSD) for AI PCs, and Zoned UFS (ZUFS) 4.0, a mobile NAND solution for AI.

7Low Power Double Data Rate 5 Turbo (LPDDR5T): Low-power DRAM for mobile devices, including smartphones and tablets, aimed at minimizing power consumption and featuring low voltage operation. LPDDR5T is an upgraded product of the 7th generation LPDDR5X and will be succeeded by the 8th generation LPDDR6.

Shaping the Future of Total AI Memory

Today, AI is being used to write reports, generate images, and create various types of content. In healthcare, AI aids in making diagnoses, while in education AI serves as an assistant for teachers. These are just a small selection of the current applications of AI, and the possibilities for the future are expected to be almost limitless as the technology continues to advance.

SK hynix plans to develop customized AI memory and emerging memory products


At the center of this technological revolution is AI memory. Various AI memory solutions, such as HBM, PIM, CXL, and SSDs, transmit large volumes of data quickly with high bandwidth or send only the processed results directly to the processor, minimizing bottlenecks and enhancing AI training and inference performance. Furthermore, these technologies improve the energy efficiency of AI systems, contributing to the establishment of more sustainable AI infrastructure. These advanced AI memory technologies are expected to be applied across a wider range of industries such as the automotive and healthcare sectors, enabling faster and more efficient AI services.

To further AI’s development, SK hynix is continuously overcoming technological limitations. The company is focused on developing custom AI memory optimized for each customer in line with the growing diversification of AI services. SK hynix is also working on next-generation emerging memory based on new structures, operating principles, and innovative components, such as ReRAM8, MRAM9, and PCM10. By relentlessly investing in technology development, SK hynix aims to secure differentiated competitiveness with advanced technologies and establish a leading position in future markets.

8Resistive RAM (ReRAM): A type of emerging memory with a simple cell structure in which data is stored by applying a voltage that forms or breaks a conductive filament, changing the cell’s resistance. It offers large data storage capacity through process miniaturization and low power consumption.
9Magnetic Random Access Memory (MRAM): A type of emerging memory which utilizes both charge and spin, with resistance in the device changing based on the direction of the spin.
10Phase-Change Memory (PCM): Semiconductor memory which stores data by utilizing the phase change of a specific material. It combines the benefits of non-volatile flash memory, which retains data even when powered off, with the rapid processing speeds of DRAM.

The semiconductor market itself is poised for significant growth. The World Semiconductor Trade Statistics (WSTS) forecasts that the semiconductor market will expand by 16% year-on-year in 2024. In particular, the semiconductor memory sector is predicted to grow by an impressive 76.8% as demand surges for AI memory such as HBM.

Standing at the forefront of the huge AI wave, SK hynix is preparing for another leap forward by building on its past achievements. As SK hynix celebrates its 41st anniversary, the company aims to maintain HBM leadership while securing dominance in the next-generation semiconductor market to stand alone in an era where its products become “the heart of AI.”