Dell APJ chief: Industry won't wait for Nvidia H100

Canalys mostly agrees, but thinks GPU giant still has a way to go

CANALYS APAC FORUM Buyers won't tolerate Nvidia's long lead times to deliver GPUs, opening the market to new entrants, according to Dell Asia Pacific and Japan president Peter Marrs.

Cloud service providers are currently selling GPUs as a service to smaller customers to enable businesses to build out AI products, explained Marrs.

Speaking at Canalys APAC Forums on Wednesday, he referred to Nvidia as "one big monster" and "the only one who's really in the game with GPUs."

However, this scenario is set to change. The president argued the lead times Nvidia currently offers just won't do for many customers – especially as both AMD and Intel are set to debut their own GPUs and neural processing units (NPUs) on the client side.

Marrs reckons customers simply "can't wait a year."

"There's going to be others. And you got this whole NPU technology that's going to be on the device. It's going to change and I don't want to use the word democratize but this will come where it's going to be pervasive. And everybody's going to be able to afford it," predicted Marrs.

The Dell exec further explained that while Nvidia has a range of GPUs, most customers still want to stick with the premium product – the H100 – which he referred to as "the biggest baddest thing in town."

"But as LLMs mature, I think this will change," he added.

Canalys vice president Alex Smith told The Register on Wednesday that new GPU market entrants will eventually offer more meaningful competition, though it will likely take years.

Smith detailed that other businesses like Intel and AMD have products in the pipeline – but it's not all about having a product. "The competitive advantage is more than hardware and silicon, it's in the software too," explained the Canalys veep. "Nvidia has the clear competitive advantage." He then ventured that while most of the focus was on the H100 and its performance and capacity, only around 20 to 30 existing companies actually need that scale of compute.

"What you'll see more is smaller LLMs that don't need that level of processing because they have much more narrow deployments and use cases that don't need H100s. With more of that, there will be more room for silicon companies to challenge and compete," explained Smith.

Smith told a media roundtable that the players most likely to challenge Nvidia are hyperscalers investing in their own silicon capabilities – like Google and Microsoft, with AWS quickly coming up.

And while Alibaba and Baidu do have formidable resources, a gap still exists between Chinese clouds and their US-based peers.

Meanwhile, Dell told investors last week that demand for AI servers has surged, but supply chain constraints have their customers waiting 39 weeks for their gear.

"The backlog that we've talked about is Nvidia-based 39-week lead time. We're working our behinds off every day to get more supply," revealed COO Jeff Clarke in an earnings call.

"As we look forward into calendar year '24, there's clearly alternatives coming," Clarke conceded. "There's work to be done in those alternatives – software stacks have to be taken care of, resolve the opportunities around them. But there's more options coming." ®
