The next "Moore's Law" could explain why NVIDIA wants ARM Holdings so badly

You have probably heard of Intel, but you might not be familiar with Gordon Moore. He is a co-founder of Intel and is reportedly worth a cool $12 billion. What Moore is best known for, though, is an observation he made in the 1960s: transistor densities were doubling every other year, which gave chip makers something of a roadmap and a goal. TSMC, the world's largest independent foundry, stuffed a little over 52 million transistors into each square mm of chips made using its 10nm process (like 2017's Snapdragon 835, for example). The Snapdragon 865 Mobile Platform, which powers many Android flagship models this year, is built using the 7nm process, which packs nearly 100 million transistors into each square mm.

Huang's Law could be behind Nvidia's proposed purchase of ARM Holdings

The new 5nm process, used to manufacture the Apple A14 Bionic chipset, delivers approximately 171.3 million transistors per square mm. Apple just introduced the chip, which powers the recently unveiled fourth-gen iPad Air and is also expected to be found inside all of this year's 5G iPhone 12 models. The A14 contains 11.8 billion transistors, compared to the 8.5 billion inside the A13 Bionic that powers the iPhone 11 family. You might wonder what the big deal is. Well, the more transistors on a chip, the more powerful and energy-efficient it can be.
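To put those figures in perspective, here is a rough back-of-the-envelope check of how close each process-node jump comes to a full doubling. It is only a sketch in Python, using nothing beyond the densities and transistor counts cited above:

```python
# Transistor densities cited above, in millions per square mm
densities = [("10nm", 52.0), ("7nm", 100.0), ("5nm", 171.3)]

# Compare each process node with the one before it
for (prev_node, prev), (node, curr) in zip(densities, densities[1:]):
    print(f"{prev_node} -> {node}: {curr / prev:.2f}x the density")

# Apple's chip-level transistor counts, in billions
a13, a14 = 8.5, 11.8
print(f"A13 Bionic -> A14 Bionic: {a14 / a13:.2f}x the transistors")
```

Run as written, the 10nm-to-7nm jump works out to roughly 1.9x and the 7nm-to-5nm jump to roughly 1.7x, so each node change lands close to, but a little short of, a full doubling.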
Both TSMC and Samsung, the top two foundries in the world, have roadmaps down to 2nm. But what happens when we get to the end of Moore's Law? The Wall Street Journal's Christopher Mims has an answer: Huang's Law. Mims named his observation after Nvidia Corp. chief executive and co-founder Jensen Huang. Huang's Law says that the chips that power Artificial Intelligence (AI) more than double in performance every other year, an improvement driven by gains in both software and hardware over time. Bill Dally, chief scientist and senior vice president of research at Nvidia, reports that between November 2012 and May 2020, the performance of certain Nvidia AI chips increased 317-fold. That's a far faster pace than the "law" laid down by Gordon Moore. TuSimple, a company working on self-driving trucks, says that performance is doubling every year on its Nvidia-powered systems.
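For a sense of scale, here is a quick calculation of what a 317-fold improvement implies. It is a sketch only, assuming the November 2012 to May 2020 window is roughly 7.5 years:

```python
import math

speedup = 317        # Dally's figure for certain Nvidia AI chips
years = 7.5          # roughly November 2012 through May 2020

doublings = math.log2(speedup)      # how many times performance doubled
doubling_time = years / doublings   # implied years per doubling
print(f"Performance doubled roughly every {doubling_time:.2f} years")

# Moore's Law pace for comparison: one doubling every two years
moore = 2 ** (years / 2)
print(f"Doubling every other year over the same span gives only about {moore:.0f}x")
```

That works out to a doubling roughly every 11 months, versus the roughly 13x total gain that a strict every-other-year doubling would have delivered over the same period.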

GPUs, or Graphics Processing Units, the kind of chips Nvidia is known for, can handle many different tasks simultaneously. CPUs, or Central Processing Units, are better at handling single tasks quickly. Some workloads, including many related to AI, can be sliced up and handled much faster by a GPU while using less power. And as AI moves from the cloud to on-device use, ARM Holdings is one of the leaders in supplying the underlying chip designs. That could explain Nvidia's $40 billion bid to buy the company.
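To illustrate the "sliced up" idea, here is a small analogy in Python: a plain loop processes values one at a time, the way a serial CPU-style program would, while a single vectorized operation processes the whole batch at once, the data-parallel style a GPU excels at. This runs on an ordinary CPU via NumPy; it is an analogy for the programming model, not a real GPU benchmark.

```python
import time
import numpy as np

# A toy AI-style workload: apply the same math to two million values
data = np.random.rand(2_000_000).astype(np.float32)

start = time.perf_counter()
one_at_a_time = [x * 0.5 + 1.0 for x in data]   # serial, element by element
serial_seconds = time.perf_counter() - start

start = time.perf_counter()
all_at_once = data * 0.5 + 1.0                  # the whole batch in one data-parallel operation
parallel_seconds = time.perf_counter() - start

print(f"element by element: {serial_seconds:.3f}s")
print(f"whole batch at once: {parallel_seconds:.3f}s")
```

On typical hardware the batched version finishes many times faster, and that is the same kind of gap that makes GPUs attractive for AI work.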

However, there are some caveats. The processing power available from GPUs can't be used in every situation. TuSimple's co-founder and Chief Technology Officer, Xiaodi Hou, notes that even in businesses that rely heavily on AI, like self-driving trucks, most of the system's code still requires the CPU. And like Moore's Law, Huang's Law will eventually run out of road. That still might leave close to a decade for Huang's Law to be useful, but it won't be as widely applicable as Moore's Law has been and continues to be. And by the time chip makers need to replace Moore's Law, hopefully something with broader reach and a long future ahead of it will have been developed. Still, for Nvidia to spend $40 billion to buy ARM, it must have a very good reason to do so.
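Hou's point about most of the code needing the CPU matters because the serial portion of a workload caps the overall gain, an effect usually described by Amdahl's Law. A small illustration, using hypothetical percentages rather than TuSimple's actual numbers:

```python
def overall_speedup(gpu_fraction: float, gpu_speedup: float) -> float:
    """Amdahl's Law: the share of work left on the CPU limits the total speedup."""
    return 1.0 / ((1.0 - gpu_fraction) + gpu_fraction / gpu_speedup)

# Hypothetical splits: even a 100x-faster GPU helps only as much as
# the GPU-friendly share of the workload allows.
for fraction in (0.5, 0.9, 0.99):
    print(f"{fraction:.0%} of the work on the GPU -> about {overall_speedup(fraction, 100):.1f}x overall")
```

With half the work stuck on the CPU, the overall system ends up barely twice as fast, no matter how quickly the GPU side improves.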


