SK Group Chairman Forecasts Memory Chip Shortage Until 2030, Warns of Sustained Price Increases

SK Group Chairman Chey Tae-won predicts the global memory chip supply crunch could persist until around 2030, with wafer supply lagging demand by over 20% and prices continuing to rise.

via @kimmonismus

What Happened

SK Group Chairman Chey Tae-won has publicly stated that the current global shortage of memory chips may not be resolved until approximately 2030. The forecast, reported via a social media post citing his remarks, indicates a prolonged period of structural imbalance between supply and demand in the semiconductor market.

According to the statement, the core issue is a significant deficit in wafer supply, which is reportedly running "more than 20% behind demand." This persistent shortfall is expected to drive continued price increases for memory chips, a critical component for AI hardware, data centers, and consumer electronics.

Context

SK Group is the parent company of SK Hynix, the world's second-largest memory chip manufacturer and a major supplier of the high-bandwidth memory (HBM) used in AI accelerators such as NVIDIA's GPUs for training and running large AI models. Chairman Chey's prediction carries significant weight given his company's position at the center of the AI hardware supply chain.

The forecast extends previous industry warnings about chip shortages. While many analysts have pointed to cyclical factors and pandemic-era disruptions, Chey's timeline suggests a more entrenched, multi-year constraint rooted in fundamental capacity limitations and the immense capital and time required to build new semiconductor fabrication plants (fabs).

The statement implies that the explosive demand for AI accelerators—which consume vast quantities of advanced memory—is outpacing the industry's ability to scale production, creating a bottleneck that could last for the remainder of the decade.

AI Analysis

This forecast has direct, tangible implications for AI infrastructure and development. A memory chip shortage extending to 2030 fundamentally constrains the scaling of AI training clusters. If wafer supply remains 20% behind demand, the availability of HBM and DRAM will dictate the pace of new GPU production from companies like NVIDIA and AMD. This creates a hard ceiling on the total compute available for training frontier models, potentially slowing the rate of parameter scaling and forcing a greater focus on algorithmic efficiency over brute-force scaling.

For practitioners, this means the cost of training and inference will remain structurally elevated. Cloud GPU instance prices are unlikely to see significant deflation, and securing large-scale, long-term capacity commitments will become even more critical for AI labs. This environment favors well-capitalized incumbents and could stifle innovation from smaller players who cannot secure reliable supply. The industry may respond with increased investment in alternative architectures or memory technologies, but those solutions are years from volume production.
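As a rough back-of-the-envelope illustration (not from the article), the following toy calculation shows how a persistent 20% wafer shortfall would compound into cumulative unmet demand through 2030. All figures are normalized assumptions for illustration only:

```python
# Toy model: cumulative unmet memory demand if wafer supply runs
# 20% behind demand each year from 2025 through 2030.
# Demand is normalized to 100 per year; figures are illustrative
# assumptions, not data from the article.

demand_index = 100.0   # normalized annual demand
supply_gap = 0.20      # supply runs 20% behind demand

cumulative_shortfall = 0.0
for year in range(2025, 2031):
    supply = demand_index * (1 - supply_gap)
    shortfall = demand_index - supply
    cumulative_shortfall += shortfall
    print(f"{year}: supply={supply:.0f}, shortfall={shortfall:.0f}")

print(f"Cumulative unmet demand: {cumulative_shortfall:.0f} "
      f"({cumulative_shortfall / demand_index:.1f} years of demand)")
```

Under these simplified flat-demand assumptions, six years at a 20% gap leaves roughly 1.2 full years of demand unmet — a sense of why the chairman's timeline implies sustained upward price pressure rather than a brief squeeze.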
Original source: x.com
