In today’s digital world powered by binary code, the SK hynix Newsroom is launching a new series exploring the present and future of industries built on semiconductors. The Age of Semiconductors series offers in-depth analysis of the industry’s most pressing challenges, paired with expert insights into potential solutions and the path forward. Each article will explore how small yet revolutionary semiconductor technologies create a ripple effect, driving change far beyond their size. This first episode examines the climate challenges posed by AI data centers and spotlights the semiconductor innovations emerging as potential solutions.

Generative AI may now be capable of answering almost any question, but this revolutionary technology carries hidden costs. Even adding short phrases such as “please” and “thank you” to chats can increase power consumption, warming the planet and contributing to climate change. In light of concerns that increased AI usage could exacerbate global warming, the SK hynix Newsroom explores AI’s environmental impact and the potential solutions for mitigating it.

AI Servers Driving Power Consumption in Data Centers

The launch of ChatGPT in 2022 propelled generative AI into the spotlight, driving transformative changes across industries and everyday life. One of the most impacted areas has been data centers. As AI services rapidly expanded, computational workloads grew exponentially, leading to a significant surge in power consumption and carbon emissions.

Global data center power consumption is set to rise in the coming years

The 2025 Energy and AI report by the International Energy Agency (IEA) estimates that data centers consumed approximately 415 terawatt-hours (TWh) of electricity in 2024. This is comparable to charging 5.5 billion electric vehicles, each with a driving range of 500 kilometers. Even more concerning is the expected surge in demand, with data centers projected to consume almost 945 TWh by 2030.
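The EV comparison can be verified with quick arithmetic: dividing 415 TWh by 5.5 billion vehicles gives roughly 75 kWh per charge, a plausible battery capacity for an EV with a 500 km range. A minimal sketch of that back-of-envelope check, using only the IEA figures quoted above:

```python
# Back-of-envelope check of the IEA comparison above.
data_center_twh = 415              # estimated 2024 data center consumption (TWh)
ev_count = 5.5e9                   # electric vehicles in the comparison

wh_total = data_center_twh * 1e12  # 1 TWh = 10^12 Wh
kwh_per_ev = wh_total / ev_count / 1e3

# Roughly 75 kWh per vehicle, consistent with a ~500 km driving range
print(f"Energy per EV charge: {kwh_per_ev:.0f} kWh")
```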

Professor Byoung Hun Lee, of POSTECH’s Department of Semiconductor Engineering, explaining AI servers’ power consumption

This predicted growth in energy consumption can be traced to accelerated servers, commonly referred to as AI servers, which handle AI computations. Powered by GPUs, these servers consume significantly more energy than general CPU-based servers. Professor Byoung Hun Lee, from POSTECH’s Department of Semiconductor Engineering, explained: “AI servers use 7 to 8 times more power than general servers, particularly during GPU-centric AI computation processes.”

According to the IEA’s Energy and AI report, AI servers account for a substantial share of data center power consumption. By 2030, electricity consumption from AI servers alone is projected to reach 300 TWh, representing one-third of the total data center energy usage. In short, as the adoption of AI grows, the power consumption and associated carbon emissions from AI servers are expected to rise significantly.

Integrated Solution Needed for Carbon Emissions and Climate Change

Professor Sujong Jeong, Director of the Climate Tech Center at Seoul National University, warned: “Data centers’ power consumption is projected to reach 945 TWh by 2030, equivalent to Japan’s total electricity consumption today. This level of energy usage could potentially generate 2.5 billion tons of carbon emissions.”

Professor Sujong Jeong shares concerns about AI data centers’ carbon emission issues

Professor Jeong added that greenhouse gas emissions and other pollutants from data centers could accelerate global warming, triggering a range of negative consequences.

Since slowing down AI progress is not an option, experts are united in calling for integrated solutions to reduce carbon emissions. According to the IEA, AI-optimized software, such as energy management systems and workload scheduling, has the potential to cut energy consumption by up to 15%. Additionally, efforts are underway to improve data center efficiency, adopt eco-friendly renewable energy, and explore new alternative energy solutions such as small modular reactors and carbon capture, utilization and storage.
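One of the software approaches mentioned above, workload scheduling, can be illustrated in miniature: deferrable AI jobs are shifted to the hours when the grid’s carbon intensity is lowest. The sketch below is purely illustrative; the hourly intensity values are hypothetical, and this is not the IEA’s or any vendor’s actual scheduling system.

```python
# Carbon-aware workload scheduling, in miniature (all data hypothetical).
# Deferrable jobs run when the grid's carbon intensity (gCO2/kWh) is lowest.

intensity_by_hour = {0: 320, 6: 410, 12: 250, 18: 480}  # illustrative values

def best_start_hour(intensity):
    """Return the hour with the lowest grid carbon intensity."""
    return min(intensity, key=intensity.get)

hour = best_start_hour(intensity_by_hour)
print(f"Schedule deferrable AI training job at hour {hour:02d}:00")
```

In practice, real schedulers weigh job deadlines and electricity prices alongside carbon intensity, but the core idea, matching flexible compute to cleaner hours, is the same.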

As AI servers are the main contributors to power consumption in data centers and semiconductors are their core components, improving the performance and efficiency of semiconductors has emerged as one of the most effective solutions. This is why innovation in semiconductor technology is essential. Professor Byoung Hun Lee explained: “Research is currently underway to develop new semiconductor technologies that could reduce power consumption to less than 1/100 of current levels. New architectures designed to minimize data transfer volumes are also being explored.”

SK hynix’s AI Memory Technology Tackles Data Center Power Issues

To help enhance the power efficiency of data centers, SK hynix is also actively developing a range of AI memory technologies.

SK hynix memory products offering solutions to AI data center consumption issues

The company has enhanced its High Bandwidth Memory (HBM) with Advanced MR-MUF technology, delivering improved heat dissipation and stability. Moreover, its latest HBM4 samples showcase advancements in base die design and power efficiency.

In terms of SSDs, SK hynix has developed high-capacity eSSDs designed to reduce power consumption in AI data centers. The QLC (quadruple-level cell)-based PS1012 stores more data in a limited space and delivers faster transfer speeds than the previous generation, helping to reduce AI training time and lower power consumption.

SK hynix is also developing SOCAMM, a low-power DRAM-based memory module specialized for AI servers. With a smaller form factor than traditional server memory, this product is expected to contribute to improving energy efficiency in AI data centers.

As such, SK hynix is developing a variety of high-efficiency memory technologies that directly address the power challenges facing data centers.

The advancement of AI is expected to continue alongside developments in AI memory technology and industry investments. The Stargate Project in the U.S. plans to build gigawatt (GW)-scale data centers by 2029, while Taiwan has also announced a partnership with NVIDIA to establish AI data centers. In South Korea, SK Group is collaborating with Amazon Web Services (AWS) to construct a 100 megawatt (MW) eco-friendly AI data center in Ulsan, with plans to expand it to a 1 GW scale in the long term.

As the number of data centers continues to grow, it is not an exaggeration to suggest that the pace of AI development hinges on the efficiency of energy management. And perhaps, it all begins with a single semiconductor chip.