Market Report


Jensen Huang's latest interview: Blackwell chips are priced at US$30,000-40,000; the rise of artificial intelligence has just begun, and the next 10 years will bring unprecedented progress.

The NVIDIA GTC conference, regarded as an "AI weather vane", kicked off yesterday, where Jensen Huang unveiled Blackwell, the latest generation of AI chip architecture. Huang then took questions from the media, addressing the Blackwell chips and the state of AI development in several interviews.


Here are the main points from the interview:


1. Blackwell chips are priced at roughly US$30,000 to US$40,000, but Nvidia does not sell standalone GPUs; it prefers to sell them together with supporting network equipment and software services.


2. AI development has only just begun and will disrupt thousands of industries in the future.


3. NVIDIA has created a new way of computing, and its technology is integrated into the products of every computer manufacturer, connecting the world together. This is why NVIDIA is everywhere.


4. AI running on NVIDIA hardware is changing daily life, much like the invention of electricity did. Even that comparison understates NVIDIA's importance.


5. There are essentially no discontinuities; the underlying logic of many industries is connected.


6. In the next ten years, we will see unprecedented advances in computing technology, and NVIDIA is researching a combination of traditional supercomputers and quantum computers.


Blackwell chip pricing: $30,000 to $40,000, sold with supporting services

Jensen Huang said in an interview with CNBC:


Blackwell GPUs are planned to be priced at US$30,000-40,000, but this is only a rough figure; pricing varies widely between systems depending on each customer's needs. NVIDIA does not sell chips, it sells data centers.


The global data center market reached about US$250 billion last year and is growing at 20% to 25% a year. This is where Nvidia's market opportunity lies.


In terms of cost, an analyst at Raymond James believes that the cost of manufacturing a B200 GPU accelerator is about $6,000.


Each Blackwell GPU is actually composed of two compute dies connected by 10 TB/s NVLink-HBI (High Bandwidth Interface) technology. The two compute dies are paired with eight 8-layer stacks of HBM3e memory, for a total capacity of up to 192 GB and bandwidth of up to 8 TB/s.
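The memory figures above are internally consistent; a quick sanity check of the arithmetic (the per-stack values below are derived from the reported totals, not official specifications):

```python
# Sanity check of the Blackwell memory figures quoted above.
# Per-stack values are derived from the reported totals, not official specs.
NUM_STACKS = 8             # eight HBM3e stacks across the two compute dies
TOTAL_CAPACITY_GB = 192    # reported total capacity
TOTAL_BANDWIDTH_TBPS = 8   # reported total bandwidth

capacity_per_stack_gb = TOTAL_CAPACITY_GB / NUM_STACKS
bandwidth_per_stack_tbps = TOTAL_BANDWIDTH_TBPS / NUM_STACKS

print(f"per stack: {capacity_per_stack_gb:.0f} GB, {bandwidth_per_stack_tbps:.0f} TB/s")
# → per stack: 24 GB, 1 TB/s
```

That works out to 24 GB and roughly 1 TB/s per stack, in line with 8-layer HBM3e stacks.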


The analysis points out that the dual-die B200 with 192 GB of HBM3e will cost significantly more to make than the single-die H100 processor with 80 GB of memory: each H100 costs about $3,100 to manufacture, while each B200 should cost around $6,000.
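Putting the analyst's cost estimate next to Huang's quoted price range gives a rough sense of the implied per-unit margin. A back-of-the-envelope sketch, using only the figures quoted above (illustrative arithmetic, not official numbers):

```python
# Implied gross margin from the figures quoted above: an estimated
# ~$6,000 manufacturing cost against a $30,000-$40,000 unit price.
# Illustrative only; excludes R&D, packaging, and other costs.
COST_USD = 6_000

for price in (30_000, 40_000):
    margin = (price - COST_USD) / price
    print(f"price ${price:,}: implied gross margin {margin:.0%}")
# → price $30,000: implied gross margin 80%
# → price $40,000: implied gross margin 85%
```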


Of course, this does not include Nvidia's R&D and design expenditures. Huang said:


The development of the GB200 was a huge undertaking, with Nvidia spending up to $10 billion on next-generation GPU architecture and design.


Another point worth mentioning about the B200 is that Nvidia prefers to sell supporting network equipment and software services rather than just the accelerator itself. For example, there are DGX B200 servers with eight Blackwell GPUs that cost millions of dollars each, and even a DGX B200 SuperPOD with 576 B200 GPUs.


Some analysts pointed out that "the gross margin on the GPU card itself is healthy, and volumes can exceed expectations. If the installed base of GPUs is made larger, the money can then be earned back through supporting network equipment and software services."


AI development has just begun and will disrupt thousands of industries in the future

Huang pointed out in an interview with CNBC:


The AI ramp has just begun and will continue for several years, and investment in AI is still in its early stages.


At the same time, Huang predicts growth in the coming years as AI technology impacts a range of industries, including healthcare.


Huang highlighted the ways in which artificial intelligence is enabling innovation in fields as diverse as science and healthcare, saying:


Artificial intelligence can help "understand the meaning of proteins, the meaning of life," thereby speeding up the development of new treatments.


We can use computers to simulate life so we don't have to do a lot of screening in the lab.


So whatever drug we ultimately decide to put in a trial, the likelihood of it actually passing the trial will be much higher.


Huang also said that in order to apply its chips to real-world tasks, Nvidia needs to have a deep understanding of many subject areas.


There are no chips that specifically synthesize proteins. You have to understand the proteins and the biology; you have to understand what the scientists are trying to do and how we can better automate their work. All of these different algorithms require quite a bit of research.


Huang said:


Over the next decade, we will see unprecedented advances in computing that will power fields such as climate technology, digital biology, general-purpose robotics…


While many of Nvidia's recent achievements are the result of investments spanning 5 to 15 years, they are all still a work in progress. At some point, quantum computers may outperform traditional supercomputers, such as those based on NVIDIA GPUs. But Huang predicts that this will take another decade or two, and he also firmly believes that the world's most powerful computers will be a hybrid of traditional and quantum, which is also an area of ongoing research at the company.


Will Nvidia be everywhere?

Huang also talked about Nvidia's list of customers, saying the company's technology has succeeded in greatly speeding up data processing and reducing costs.


We created a new way of computing, and our technology is integrated into products from all computer manufacturers, connecting the world together, which is why NVIDIA is everywhere. We're in every cloud, every data center.


It is a vivid metaphor: AI running on NVIDIA hardware is changing daily life, just as the invention of electricity did. Even that comparison understates NVIDIA's importance.


Even Nvidia's overwhelming dominance of the AI chip market, with a market share of about 85%, doesn't convey Nvidia's contribution to AI, according to Raymond James Financial.


There are essentially no discontinuities; the underlying logic of many industries is connected

Early in Nvidia's development, AI was not only far out of reach; the entire tech industry was skeptical of the technology's potential, leading to periods of disappointment and disinvestment known as AI winters.


But Huang stressed that much of the company's current success stems from choices made early in its history, before it fully grasped where those choices would lead. He said, philosophically:


As usual, the future unfolds in a continuous fashion with no real discontinuities in nature.


For example, even when GPUs were used only for graphics, Nvidia made them programmable so they could do more than just draw predefined pixels on the display.


That made them a computing platform in their own right, rather than just an aid to the computer's CPU.


The skills Nvidia picked up along the way prepared it for new challenges, Huang explained, illustrating how learnings from one industry can be applied to another.


Particle physics in video games is similar to fluid dynamics in molecular simulations, and the image processing we use for scene lighting in computer graphics is no different from that used in medical devices.


In many ways, these logics are very similar, so we can slowly and systematically expand our horizons.