Why embedded AI is shaking up HBM chip design

The adoption of on-device AI, that is, the ability to run artificial intelligence tasks directly on computers and smartphones rather than in the cloud, is reshaping hardware infrastructure, and that includes the machine's memory. As a result, we are seeing changes in high bandwidth memory (HBM) design.

This shift is driven by necessity: demand has grown significantly for products that prioritize low power consumption (LP), even at the expense of peak HBM performance. Some customers are therefore asking for HBM designs with low-power features aimed at the on-device AI market.

What is an HBM chip?

HBMs are memory chips made up of multiple DRAM dies stacked vertically. They are characterized by a significant increase in bandwidth, with 1,024 input/output (I/O) terminals allowing far more electrical signals to pass than conventional DRAMs (which have up to 32). And higher bandwidth means faster data processing.
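The effect of that wider interface is easy to quantify: peak bandwidth is roughly the I/O width in bytes multiplied by the transfer rate. A minimal sketch, using illustrative HBM2-class and DDR4-class data rates that are assumptions on my part, not figures from the article:

```python
# Rough per-stack peak bandwidth: I/O width (bytes) x transfers per second.
# The data rates below are illustrative assumptions (HBM2-class vs DDR4-class).

def peak_bandwidth_gbps(io_width_bits: int, data_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s = (I/O width / 8) bytes x GT/s."""
    return io_width_bits / 8 * data_rate_gtps

hbm = peak_bandwidth_gbps(io_width_bits=1024, data_rate_gtps=2.0)  # HBM stack
ddr = peak_bandwidth_gbps(io_width_bits=32, data_rate_gtps=3.2)    # conventional DRAM

print(f"HBM stack:         {hbm:.0f} GB/s")   # 256 GB/s
print(f"Conventional DRAM: {ddr:.1f} GB/s")   # 12.8 GB/s
```

Even at a lower per-pin transfer rate, the 1,024-bit interface gives the HBM stack an order-of-magnitude bandwidth advantage, which is exactly what AI workloads exploit.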

Demand for HBM chips is growing rapidly in the artificial intelligence industry, where there is a need to process large amounts of data. A typical example is AI accelerators that integrate multiple HBMs with GPUs (graphics processors) for servers designed by Nvidia and AMD.

But in addition to raw performance, HBM chipmakers must now emphasize energy efficiency.

HBM on cell phones and cars

“Recently, several customers have suggested that HBMs could be made more energy efficient, even at the expense of bandwidth,” said a semiconductor industry analyst. “Although demand is not as strong as for server chips, there is considerable demand for HBMs to be used on-device.”

On-device AI is a technology that performs AI functions on the device itself, bypassing cloud computing and the data center. Samsung Electronics, Qualcomm, Intel and others have recently released chips and products that highlight on-device AI capabilities.

These moves suggest that the actual operating environment for on-device AI will be a “hybrid AI” setup that combines cloud and on-device computing power.

An executive at an AI semiconductor company adds: “I think it’s entirely possible that AI chips aimed at mobile and automotive devices could be combined with HBM at the same time.”

Source: “ZDNet Korea”
