From accelerating drug development pipelines to advancing autonomous vehicle performance and strengthening financial security, AI is driving paradigm shifts across industries. As AI rapidly reshapes daily life across work, education, home life and travel, understanding this changing ecosystem is essential for navigating the AI era. To shed light on this, the SK hynix Newsroom is running a new series, “Exploring the AI Ecosystem.”

The vast and complex AI ecosystem spans a wide range of areas including industry-specific applications; models and platforms; accelerators; and computing infrastructure. While competition in this space remains fierce, collaboration is also key as companies within the ecosystem evolve through technological interconnectivity and mutual dependence.

At the foundation of the entire AI ecosystem lies AI computing infrastructure, which features high-performance memory at its core. As a leader in the memory field, SK hynix plays a pivotal role in powering the AI industry. This second article in the Exploring the AI Ecosystem series will cover how SK hynix’s industry-leading HBM products have become a key driver of AI innovation.

HBM: A Core Technology Driving the AI Ecosystem

Following the evolution of AI, HBM has emerged as a key solution to data bottlenecks

In 2016, the historic Go match between Google DeepMind’s AlphaGo program and professional player Sedol Lee demonstrated the potential of AI. This pivotal event was the culmination of decades of technological evolution. Over time, the accumulation of algorithms for machine learning and deep learning, coupled with the advancement of artificial neural networks, has not only powered AI’s development but also increased AI’s computational power demands.

In the late 2000s, the computational performance of artificial neural networks began to improve after they were integrated with semiconductors such as graphics processing units (GPUs). Originally used for graphics processing in games, GPUs can simultaneously handle multiple inputs for neural network computations thanks to their unique parallel processing capabilities — helping to accelerate AI innovation. However, the data bottlenecks between processors and memory remain a hurdle for AI. Therefore, the speed at which data can be read and written, essentially the speed and efficiency of memory, has become a key area of focus for improving AI performance.
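Why memory speed now bounds AI performance can be illustrated with a back-of-the-envelope roofline check. The hardware figures below are hypothetical round numbers for illustration, not measurements of any specific GPU:

```python
# Roofline-style check: is a workload compute-bound or memory-bound?
# All hardware figures here are illustrative assumptions, not the
# specifications of any particular product.

def attainable_tflops(peak_tflops, mem_bw_tbs, arithmetic_intensity):
    """Attainable throughput = min(peak compute, bandwidth * intensity).

    arithmetic_intensity: FLOPs performed per byte moved from memory.
    """
    return min(peak_tflops, mem_bw_tbs * arithmetic_intensity)

# Hypothetical accelerator: 100 TFLOPS peak, 1 TB/s memory bandwidth.
peak, bw = 100.0, 1.0

# A low-intensity kernel (10 FLOPs per byte) is memory-bound:
low = attainable_tflops(peak, bw, 10)    # 1 TB/s * 10 = 10 TFLOPS

# A high-intensity kernel (500 FLOPs per byte) hits the compute roof:
high = attainable_tflops(peak, bw, 500)  # capped at 100 TFLOPS

print(low, high)  # 10.0 100.0
```

In the memory-bound case, the processor idles while waiting for data, so faster memory directly raises achievable throughput.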

In July 2025, NVIDIA made headlines after it became the first company in the world to briefly reach a market capitalization of USD 4 trillion during intraday trading. It was a moment that underscored the prominence of GPUs in the AI era. Today, many global companies are fiercely competing to secure NVIDIA’s GPUs for generative AI and accelerated computing.

One of the key technologies that determine GPU performance is HBM. This semiconductor memory product vertically stacks multiple DRAM chips using TSV (Through-Silicon Via) technology and provides significantly more bandwidth than conventional DRAM, dramatically boosting data processing speed. As a high-performance product that supports ultra-fast computation while reducing power consumption, HBM enables the smooth operation of AI systems, which must handle vast amounts of data.
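The bandwidth advantage of a stacked, wide-interface design can be sketched with simple arithmetic. The bus widths and per-pin rates below are illustrative round numbers, not the specifications of any particular SK hynix product:

```python
# Peak bandwidth of a DRAM interface = bus width (bits) * per-pin data
# rate (Gbit/s) / 8 bits per byte. Figures are illustrative only.

def peak_bandwidth_gbs(bus_width_bits, pin_speed_gbps):
    """Peak bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_width_bits * pin_speed_gbps / 8

# A conventional DRAM module with a 64-bit bus at 6.4 Gbit/s per pin:
conventional = peak_bandwidth_gbs(64, 6.4)   # 51.2 GB/s

# An HBM-style stack exposes a far wider 1024-bit interface; even at
# the same per-pin rate, the wide bus yields 16x the bandwidth:
stacked = peak_bandwidth_gbs(1024, 6.4)      # 819.2 GB/s

print(conventional, stacked)  # 51.2 819.2
```

Vertical stacking is what makes such a wide interface practical: the TSVs connect many DRAM dies to the processor over a short distance, which also helps keep power consumption down.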

SK hynix stands as the global leader in HBM technology. The company’s HBM provides a memory environment that supports high-performance computing devices such as GPUs. Such an environment is essential for LLMs (large language models), which require immense computational power for training and inference. Through its world-class HBM, SK hynix therefore provides the technological foundations for AI and plays a key role in advancing the AI ecosystem.
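Why LLMs lean so heavily on memory capacity and bandwidth can be seen from a rough weight-footprint estimate. The parameter count and precisions below are generic assumptions for illustration, not figures for any specific model:

```python
# Rough memory footprint of LLM weights: parameters * bytes per
# parameter. Figures are illustrative, not tied to a specific model.

def weight_footprint_gb(num_params, bytes_per_param):
    """Approximate memory needed to hold model weights, in GB."""
    return num_params * bytes_per_param / 1e9

# A hypothetical 70-billion-parameter model in 16-bit precision:
fp16 = weight_footprint_gb(70e9, 2)   # 140.0 GB

# The same model quantized to 8-bit precision halves the footprint:
int8 = weight_footprint_gb(70e9, 1)   # 70.0 GB

print(fp16, int8)  # 140.0 70.0
```

Every generated token requires reading these weights from memory, so both the capacity to hold them near the processor and the bandwidth to stream them quickly directly shape inference speed.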

Unrivaled Technological Leadership

Since launching the world’s first HBM in 2013, SK hynix has become the industry leader

SK hynix has established unrivaled technological leadership in the AI memory field. Since developing the world’s first HBM in 2013, the company has led industry advancements by introducing the second-generation HBM, HBM2, followed by HBM2E, HBM3, and HBM3E. In March 2025, SK hynix became the first in the world to deliver 12-layer HBM4 samples to major customers.

This technological leadership is also reflected in the company’s performance. Driven by HBM growth, the company achieved record-breaking results in 2024, posting KRW 66 trillion (USD 47.6 billion) in revenue and KRW 23 trillion (USD 16.6 billion) in operating profit. In the first quarter of 2025, the company became the leader of the global DRAM market in terms of revenue for the first time. On the product side, SK hynix is rapidly expanding supply of its latest mass-produced product, HBM3E, while also accelerating the development and mass production of HBM4. To support these efforts, SK hynix harnesses its “one-team spirit” to ensure all its employees, organizations, partners, and customers move in unison, further solidifying its tech leadership.

As the AI industry ecosystem matures, global tech giants are driving demand for customized semiconductors. Companies such as Amazon, Google, and Meta are emerging as new HBM customers in the field of application-specific integrated circuit (ASIC) design. In response, SK hynix is dedicating its R&D efforts to custom HBM. Beyond simply enhancing product performance, the company aims to redefine the boundary between memory and logic chips to provide tailored AI memory solutions optimized for customers’ specialized requirements. Through this approach, SK hynix is evolving into a full-stack AI memory provider, offering a comprehensive portfolio ranging from general-purpose products to solutions with diverse functions and specifications.

Strengthening AI Ecosystem Partnerships

SK hynix is helping to connect the broader AI ecosystem through its global partnerships

The AI ecosystem is evolving through interdependence and collaboration. Within this structure, where models, hardware, and infrastructure are organically connected, SK hynix is strengthening its collaboration with global partners.

SK hynix and NVIDIA have formed a key partnership in the AI memory industry. In March 2025, SK hynix unveiled a model of its 12-layer HBM4 at NVIDIA’s annual AI conference, GPU Technology Conference (GTC) 2025. Just two months later at COMPUTEX Taipei 2025, NVIDIA CEO Jensen Huang further underscored the strength of the partnership by visiting the SK hynix booth.

SK hynix is also working with TSMC to enhance CoWoS (Chip on Wafer on Substrate) packaging technology and maximize HBM performance. In addition, SK hynix is jointly pursuing R&D and commercialization of AI data center solutions with SK Telecom and Penguin Solutions. Building on such ecosystem-centered collaboration, the company is positioning itself as a central player that connects the broader AI industry.

Contributing to Scientific and Industrial Progress

SK hynix is also committed to advancing Korea’s scientific and technological capabilities and strengthening industrial competitiveness through talent-driven technological innovation. In March 2025, Vice President Taesu Jang of the Future Technology Research Center received the Presidential Commendation at the 52nd Commerce and Industry Day ceremony. He was recognized for achieving the world’s first application of sixth-generation 10 nm process technology to DDR5 DRAM within a short timeframe. “Through process miniaturization, we can expand the capacity and functionality of HBM while also achieving more effective thermal management,” said Jang.

In April 2025, Vice President Seungyong Doh, head of Digital Transformation (DT), was awarded the Bronze Tower Order of Industrial Service Merit at the 2025 Science and ICT Day Ceremony. Doh received the award for his contributions to strengthening memory manufacturing competitiveness, including for HBM, by building a smart factory powered by AI and digital transformation technologies.

To fuel progress in the semiconductor industry and strengthen technological competitiveness, SK hynix has long supported academia-industry collaboration for talent development and research. The company has established industry-partnered departments at Korean universities including KAIST, POSTECH, Korea University, Sogang University, and Hanyang University. Since 2013, it has also hosted the annual SK hynix Grand Prize of Outstanding Inventions in Industry-Academic Research to encourage the creation of new technologies. These initiatives have laid the foundation for SK hynix to enhance Korea’s global standing in the semiconductor industry.

Leading the Memory-Centric AI Era

As AI models develop more advanced learning and inference capabilities, it is becoming increasingly critical for memory to rapidly and efficiently process vast amounts of data in real time. With HBM emerging as a key factor in determining AI performance, memory is set to become another axis of innovation within the AI ecosystem. Leveraging its world-class technological expertise, SK hynix is leading the AI memory sector and is poised to power innovation across the AI industry ecosystem as a full-stack AI memory provider.

The final episode in the series will explore the future of the AI ecosystem and SK hynix’s vision.