SK hynix ships blazing fast HBM3E DRAM samples – but most customers have to wait

Everything in 2023 is about AI, which this silicon is said to speed


South Korean chipmaker SK hynix has shipped samples of HBM3E DRAM, claiming the memory can process up to 1.15 terabytes of data per second.
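
That figure is easy to sanity-check: assuming the standard 1,024-bit HBM interface, it works out to roughly 9 Gb/s per pin – 1,024 pins × 9 Gb/s ÷ 8 bits per byte ≈ 1.15 TB/s per stack.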

As described on our sister site Blocks and Files, HBM3E is the next generation of the High Bandwidth Memory (HBM) standard overseen by JEDEC. HBM matters because it is faster and uses less power than other forms of memory – such as Double Data Rate (DDR) or Graphics Double Data Rate (GDDR) memory.

Speed and power consumption matter more than ever during the world's current surge of interest in AI – which is why Nvidia recently promised to add HBM3E memory to a forthcoming version of its Grace Hopper superchip.

SK hynix's announcement of sample shipments describes the memory as "the highest-specification DRAM for AI applications currently available."

We get it: nobody wants their AI to crawl along. It's vital to employ 2023's most-discussed tech before investors frown, or competitors pounce.

But SK hynix's announcement asks would-be buyers to slow down – it will mass-produce HBM3E "from the first half of next year." That could mean as late as June – ten months from now. The chipmaker also omitted any mention of production volume, so this memory could be even scarcer than GPUs.

It will, however, be less hot to the touch than previous HBM kit. The silicon slinger claims heat dissipation is ten percent better thanks to "Advanced Mass Reflow Molded Underfill" – aka MR-MUF – a packaging technology the Korean chip champ developed in-house.

One factoid that may produce warm feelings is that HBM3E is backwards compatible with HBM3 – so buyers who splash out on the current generation of memory can do so safe in the knowledge they have an upgrade path.

SK hynix's announcement includes appreciative quotes from Nvidia, but does not explicitly state the GPU-maker is the customer testing the HBM3E samples.

Our sibling site Blocks and Files has more info on HBM3E and how it's built. ®
