In 1965, Gordon Moore, who would go on to co-found Intel, postulated a law that has since been confirmed empirically: roughly every 18 months, the number of transistors on a microchip doubles. Until now, it has served as a guide to the pace of technological advancement. But the arrival of artificial intelligence has every chance of changing this. And according to Jensen Huang, the CEO of Nvidia, it already has.
In a recent presentation at CES 2025, Huang claimed that the performance of his company’s AI chips is advancing faster than the historical rate set by Moore’s Law.
Huang says Nvidia’s AI chips are moving at a rapid pace of their own; the company claims its latest data center superchip runs AI inference workloads more than 30 times faster than its previous generation.
Huang also rejects the idea that AI progress is slowing down. Instead, he argues that there are now three active AI scaling laws: pre-training, the initial phase in which AI models learn patterns from large amounts of data; post-training, which fine-tunes a model’s responses using methods such as human feedback; and test-time compute, which occurs during the inference phase and gives the model more time to “think” about each question.
“We can build the architecture, the chip, the system, the libraries and the algorithms, all at the same time,” Huang adds. “If you do that, you can move faster than Moore’s Law, because you can innovate across the board.”
The Nvidia CEO’s bold statement comes at a time when many are wondering whether AI progress has stalled. Leading AI labs such as Google, OpenAI, and Anthropic use Nvidia chips to train and run their AI models, so advances in these chips would likely translate into further gains in the capabilities of AI models.
Huang’s statements come alongside the unveiling of his company’s new mini-sized supercomputer with built-in AI: a major advance… or a very clever marketing strategy.
Nvidia’s latest data center superchip, the GB200 NVL72, is 30 to 40 times faster at running AI inference workloads than Nvidia’s best-selling chip to date, the H100. On top of this jump in performance, the prices of AI models have also fallen, a trend that experts say will continue.
More generally, Huang stated that Nvidia’s current AI chips are 1,000 times better than the ones it made 10 years ago. That is a much faster pace than the standard set by Moore’s Law, and one that Huang says won’t stop anytime soon.
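To put that claim in perspective, a back-of-the-envelope calculation compares the two growth rates. The sketch below assumes the popular “doubling every 18 months” reading of Moore’s Law and takes Huang’s 1,000x-in-a-decade figure at face value; it is an illustration of the arithmetic, not an Nvidia benchmark.

```python
import math

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Cumulative performance multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

# Moore's Law pace (assumed: one doubling every 18 months = 1.5 years)
moores_law_factor = growth_factor(10, 1.5)   # roughly 100x over a decade

# Huang's claim: 1,000x over the same 10 years.
# The doubling period T that yields 2**(10/T) = 1000:
huang_claim_factor = 1000.0
implied_doubling_years = 10 / math.log2(huang_claim_factor)

print(f"Moore's Law over 10 years: ~{moores_law_factor:.0f}x")
print(f"Doubling period implied by a 1,000x decade: ~{implied_doubling_years:.2f} years")
```

In other words, 1,000x in ten years corresponds to performance doubling roughly every year, noticeably faster than the 18-month cadence, which compounds to only about 100x over the same period.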