A $725bn hyperscaler capex ramp and a 20% HBM price increase have made the South Korean chipmaker the second-most valuable company on the KOSPI. The bigger question is when supply will finally catch up with demand.
There is a running joke in semiconductor circles about the most expensive part of an AI server. For a long time, the GPU was the obvious answer. Over the past 18 months, that answer has increasingly become the memory sitting next to it.
The market reflected that reality on Monday.
SK Hynix, the South Korean memory specialist that supplies much of the world’s high-bandwidth memory for AI accelerators, rose as much as 12% in Seoul. Shares reached around 1.4 million won, or about $970, in morning trading.
The rally made SK Hynix the second-most valuable company on the KOSPI, behind only Samsung Electronics. The move was also fuelled by foreign buying, which followed strong earnings and renewed AI infrastructure spending plans from US hyperscalers the week before.
By any standard, this is a striking run for a company that most consumers have never heard of.
The main driver is simple. Big Tech’s combined 2026 capital expenditure is now expected to land somewhere between $650bn and $725bn, depending on the estimate. That would be about 77% higher than in 2025.
Microsoft has guided to as much as $190bn for the calendar year, with its CFO publicly saying that about $25bn of that is tied to rising memory-chip and component costs. Meta raised its own forecast to $125bn to $145bn, also pointing to similar pressure. Amazon’s Andy Jassy has committed roughly $200bn. Google has not been quiet either.
Most of this spending ultimately feeds AI training and inference clusters. A meaningful share also ends up in the bill of materials for high-bandwidth memory, where SK Hynix has a dominant position.
By late 2025, SK Hynix was estimated to hold about 57% of the global HBM market. That is unusually concentrated for a business that is still close to commodity hardware, and it helps explain why the company’s earnings now look more like those of a software platform than a traditional memory supplier.
Its first-quarter operating profit, reported on 23 April, was a record. Some sell-side estimates put operating margins on its memory line above 70%.
Margins like that do not last forever. They do, however, tend to hold as long as demand keeps outpacing supply.
Why supply is not catching up
HBM is not standard DRAM. It is stacked, 3D-packaged memory designed to feed bandwidth-hungry GPUs, and making it requires a set of advanced packaging steps that the industry has been slower to scale than buyers would like.
According to TrendForce, both Samsung and SK Hynix have raised HBM3E prices by roughly 20% for 2026. Supply is also being booked years in advance by hyperscalers and accelerator vendors.
Samsung’s memory chief warned earlier this year that major memory shortages could continue through 2027. The chair of SK Group has gone even further, saying the broader chip-wafer constraint may last until 2030.
Whether those timelines prove accurate or not, they help explain why long-term supply deals are becoming standard. In practice, hyperscalers are reserving output years ahead. SK Hynix and Samsung are also increasingly signing those agreements directly with Microsoft and Google.
In 2026, there are really two kinds of memory: the kind anyone can buy, and the kind your AI roadmap depends on. SK Hynix is in the second category.
There are, of course, skeptics. The CAPE ratio on US equities is now around 38, close to dot-com-era levels. That comparison still makes people uncomfortable, even though many of today’s major AI-exposed companies are actually profitable.
SK Hynix belongs to that profitable group, but it is also highly exposed to one product cycle. If hyperscaler capex slows, or if competing accelerator designs reduce HBM demand per chip, the same operating leverage that has driven record quarters could work in reverse.
There is also the question of whether the AI build-out will continue generating returns that justify current spending levels. Meta, for example, has been announcing record AI investment while also cutting roughly 8,000 jobs as it restructures around that spending. Investors are still willing to fund both sides of that story, but technology cycles have a habit of changing quickly.
The best way to read SK Hynix’s 12% jump is not as a prediction for AI’s long-term future. It is more like a live measure of how confident the market currently is that AI training clusters will keep being built at this pace. Every dollar of hyperscaler capex eventually becomes an order for a memory supplier. SK Hynix sits right in the bottleneck.
If that bottleneck eases, whether through Samsung scaling HBM4, Micron’s $25bn capacity push, or architectural changes that reduce dependence on HBM, the advantage gets shared and valuations can compress. None of that has happened yet.
For now, the market is saying that one of the biggest winners in the AI boom is the company selling the shovels. And it is not short of buyers.
The more interesting question is what the chart looks like in a year. That will say less about SK Hynix itself and more about whether the AI build-out still has momentum, and whether the people writing the cheques in Redmond, Menlo Park, and Seattle are as confident as they were last week.