The Next AI Bottleneck - High Bandwidth Memory
Fresh & Hot curated AI happenings in one snack. Never miss a byte 🍔
This snack byte will take approx 4 minutes to consume.
In the wild world of semiconductors, volatility isn’t a stranger, but lately, the ups and downs are even testing the veterans.
Just last month, ASML’s shares took a nosedive after reporting a disappointing Q3 order book, yet TSMC, the world’s top chip manufacturer, went on to celebrate a record profit. So, what gives?
In two words: Artificial Intelligence.
While the demand for AI-specific chips has skyrocketed, other chip segments have seen demand stagnate. In October, Samsung apologized for weak financial results due to sluggish chip demand outside the AI boom.
Meanwhile, SK Hynix, Samsung’s fierce rival in the memory market, reported record profits, with high-bandwidth memory (HBM) chips pulling their financial weight.
So, why is HBM suddenly the big player? Let’s dive into how these tiny but powerful chips are shaping the future of AI and becoming the latest “bottleneck” in the industry.
What’s the Big Deal with HBM?
Running an AI model involves more than just crunching numbers on a single chip. It requires constant back-and-forth between memory and processing chips, and moving that data can account for over 90% of a model's response time.
This is where High-Bandwidth Memory (HBM) shines. Designed to transfer data at lightning speeds while sipping power, HBM stacks DRAM dies vertically and places them right next to the logic chip doing the computing. That design makes it a game-changer for data-hungry AI workloads.
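The arithmetic behind that bottleneck is easy to sketch. Here's a rough, illustrative calculation; the model size and bandwidth figures below are assumptions for the sake of example, not vendor specs:

```python
# Back-of-envelope: why AI inference tends to be memory-bound.
# All figures below are illustrative assumptions.

params = 70e9            # hypothetical 70B-parameter model
bytes_per_param = 2      # 16-bit (FP16) weights
weight_bytes = params * bytes_per_param   # ~140 GB of weights

hbm_bandwidth = 3.35e12  # bytes/s, roughly an HBM3-class GPU memory system

# Generating one token requires streaming essentially every weight from
# memory once, so memory traffic alone sets a floor on per-token latency:
t_memory = weight_bytes / hbm_bandwidth
print(f"Memory-bound floor per token: {t_memory * 1000:.1f} ms")
```

Under these assumptions the floor is around 40 ms per token before any computation even happens, which is why faster memory, not faster arithmetic, is often the lever that matters.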
Research firm Arete projects that HBM sales will balloon to $18 billion this year, up from $4 billion last year, and reach $81 billion by 2026. And it's not just the market size that's impressive.
Profit margins on HBM are around five times those of standard memory chips. SK Hynix controls over 60% of the HBM market, and a whopping 90% of the market for HBM3, the most advanced version.
Their early bet on HBMs—and close partnerships with giants like TSMC and Nvidia—has catapulted them to a dominant position as AI demand ramps up.
The landscape is not just about SK Hynix and Samsung, though. Micron, an American player, is also going big on HBM production, preselling much of its 2024 output.
Both SK Hynix and Micron are investing heavily to ramp up capacity, but new fabs and packaging lines won't come online overnight. Samsung, which manufactures about 35% of the world's HBM chips, has meanwhile been grappling with production challenges; it reportedly plans to cut HBM output by 10% next year. Ouch.
Nvidia, meanwhile, continues to fuel the HBM frenzy, as its graphics processing units (GPUs) dominate the AI model landscape. Each new Nvidia product launch creates a demand surge for HBM, squeezing the already strained supply chain.
And competitors like AMD are also eyeing HBM, hinting that more firms are waking up to this technology’s potential.
The Geopolitical Game of Chip Poker
While companies scramble to secure their HBM supply, global politics is adding its own layer of tension. With AI considered a critical technology, governments are starting to eye HBM with national interest. The U.S. is pressuring South Korea to restrict HBM exports to China as part of the ongoing tech war. Rumors are flying that America’s next wave of chip sanctions may target some advanced HBM models, further tightening supply.
South Korea is caught in the middle, with both SK Hynix and Samsung’s future production plans potentially impacted by political decisions. As demand for these chips soars, global politics could steer the direction of the HBM market just as much as technological advances or corporate strategies.
Demand for HBM chips isn’t just a spike; it’s more like a tidal wave. While firms like SK Hynix and Micron are expanding as fast as they can, the current supply pipeline is already close to maxed out. Samsung’s plans to reduce output next year might make the shortage even more pronounced.
This crunch has many investors wondering: Will HBM production keep up with AI’s growth curve? Or will it become the next speed bump in AI development? Whatever the future holds, one thing’s clear: If you’re an AI company, you want your hands on all the HBM you can get.
The high-bandwidth memory frenzy is reshaping the semiconductor industry. With HBM chips positioned as the backbone of AI, the stakes are high, and the competition is fierce.
As the industry navigates production woes, market demand, and geopolitical moves, it’s clear the AI bottleneck is here—and HBM is the throttle.
Whether we’re heading toward a production windfall or a traffic jam in the memory lane, one thing’s for sure: the HBM market is worth watching.