Nvidia H100 Accelerators

Back in simpler times, Silicon Valley startups bragged about how much funding they had raised as a shorthand for comparing their importance with rivals. These days, for AI startups at least, the metric is the number of Nvidia H100 GPUs they can acquire. Because Nvidia's supply is finite, the way it allocates GPUs to AI companies has produced a pecking order that signals a company's standing in the industry. The infighting over GPU allocations has reportedly gotten so heated that one investor likened the scramble to the TV show Game of Thrones.
These days, Nvidia's AI accelerators are considered the gold standard for training large language models (LLMs), so any AI startup of consequence needs to acquire thousands of GPUs as soon as possible. Some firms are content with the older Ampere-based A100, but Nvidia's newer H100 Hopper GPUs are the most in demand because they are the company's most powerful accelerators. Demand is currently off the charts: some sellers are scalping H100s on eBay for over $40,000 each, and Elon Musk quipped that acquiring GPUs for AI training is harder than buying drugs.
To give you an idea of the current demand, VentureBeat flagged a lengthy post analyzing the "GPU shortage" currently riling the AI industry. The author estimates that OpenAI, the company behind ChatGPT, may need up to 50,000 Nvidia H100 accelerators to train the upcoming GPT-5, while Meta might need anywhere from 25,000 to over 100,000 GPUs.
The “big cloud” providers like Amazon, Oracle, and Microsoft will probably want around 35,000 GPUs each. Add another 10,000 GPUs each for a variety of smaller startups, and overall you’re looking at about $15 billion in GPUs alone, or a total of 432,000 H100 accelerators that need to be sold as quickly as possible.
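The figures above can be sanity-checked with some back-of-the-envelope arithmetic. The sketch below uses only numbers quoted in the article; the roughly $35,000 per-H100 price is an illustrative assumption inferred from dividing the $15 billion total by 432,000 units, not an official Nvidia price:

```python
# Rough tally of the H100 demand estimates quoted above.
# All figures come from the article; the per-unit price is an
# assumption back-derived from its $15 billion total.
openai = 50_000               # estimated H100s to train GPT-5
meta = 100_000                # top of the 25,000-100,000 range
big_cloud = 3 * 35_000        # Amazon, Oracle, and Microsoft at ~35,000 each
named_demand = openai + meta + big_cloud

total_estimate = 432_000      # the article's overall unit count
startups_and_others = total_estimate - named_demand  # the remainder

price_per_gpu = 35_000        # assumed USD per H100
total_cost_billions = total_estimate * price_per_gpu / 1e9

print(named_demand)           # 255,000 units for the named buyers
print(total_cost_billions)    # ≈ 15.1 (billion USD)
```

At an assumed ~$35,000 each, 432,000 accelerators comes to roughly $15.1 billion, which lines up with the article's "$15 billion in GPUs alone" figure.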
For its part, Nvidia says there is no GPU shortage per se; rather, it is being slowed by a global supply chain in which numerous vendors must have parts ready before a single H100 board can be assembled. Charlie Boyle, VP and GM of Nvidia's DGX Systems, told VentureBeat, "When people use the word GPU shortage, they're really talking about a shortage of, or a backlog of, some component on the board, not the GPU itself."
For now, things are looking rosy for Nvidia. In the first quarter of 2023, the company shattered Wall Street's expectations by pulling in $7.19 billion in revenue, and it projected that figure would rise to $11 billion in the second quarter. The company has reportedly all but halted production of its current GeForce 40-series GPUs, shifting some of its TSMC capacity from gaming cards to AI chips. That decision likely reflects both the higher demand for its AI accelerators and their much higher margins compared with gaming GPUs.
