Reminiscent of x86 vs. Arm.
- The key theme of Computex was the arrival of generative AI on the PC, and while most attention was on Copilot+ running on the NPU, Nvidia was quick to remind everyone that if you want real AI horsepower, you need an Nvidia graphics card.
- Copilot+ is a runtime that uses a number of small language models (SLMs) to provide specific generative AI services on the PC that developers can write to and incorporate into their apps.
- Typical examples are photograph enhancement, image generation from a sketch, real-time translation and so on.
- The Copilot+ runtime runs entirely on the PC and targets a dedicated processor called a neural processing unit (NPU), which is built to run inference on neural networks quickly and efficiently.
- Microsoft’s specification for this is an NPU with at least 40 TOPS, which Qualcomm is already shipping, with Intel and AMD hot on its heels.
- The NPU is demonstrably the most efficient way to run Copilot+ and there is a demonstration by Microsoft that shows this.
- Microsoft shows two PCs running Copilot+, one on a Qualcomm NPU and the other on an Nvidia RTX 4060, with the Nvidia PC measured at 50+°C while the Qualcomm X Elite device is measured at 30°C (see here), indicating that the Nvidia device is consuming far more power and so will drain its battery much more quickly.
- This indicates that for SLMs, the NPU is the way to go, but as always, there is more to this story than meets the eye.
- Nvidia’s RTX 4060 is capable of nearly 8x the throughput of the NPU in the X Elite (353 TOPS vs. 45 TOPS), so it is no surprise that it consumes 160 watts of power under load, which explains why it gets so hot in the task above (a rough TOPS-per-watt comparison is sketched after this list).
- Consequently, Nvidia was demonstrating a whole series of AI tasks at Computex that require far more compute than 45 TOPS, almost all of which were running on desktop PCs as opposed to laptops.
- It is becoming clear that the kind of AI task one is aiming for will determine where it runs, with larger models running on Nvidia hardware and smaller models on the NPU.
- This is very similar to the rivalry between x86 and Arm that existed in computing devices for many years, where top performance required x86 at the cost of high power consumption, while efficiency and battery life were far better on Arm, albeit with lower performance.
- This situation was put to bed several years ago by Apple, which demonstrated that Arm could do both, initiating the greatest threat to the x86 processor in its history.
- The trend at the moment (rightly or wrongly) is for AI models to get bigger as it is believed that this is the best way to make them perform better.
- Followed to its logical conclusion, this implies that as AI becomes more integrated with general computing tasks, the performance of the processor executing the task will need to keep rising.
- Furthermore, the fact that the really big tasks are likely to be carried out on desktops that have no batteries means that power consumption will be much less of an issue.
- Hence, it looks like tasks on the NPU will dominate in devices where battery life is important and Nvidia will have a strong position in high-performance tasks where power is less of an issue.
- I would expect the NPU to erode Nvidia’s position from the bottom over time but at the same time, if the models continue to get bigger, there will be plenty of space in the desktop market for Nvidia to sell into.
- I suspect that the moment when the NPU can offer Nvidia-level performance at a fraction of the power consumption is far away, and so this status quo is likely to persist for some years to come.
- The PC segment is now a rounding error in Nvidia’s financial performance, as 90% of revenues are coming from the data centre, and so I do not expect this issue to have any impact on Nvidia’s fundamental performance.
- I also do not see Nvidia impacting the case for running Copilot+ on the NPU or the case for Windows running on Arm processors in the laptop market.
- Qualcomm is going to have this market to itself for a few months, as Copilot+ for x86 is not yet ready to be released; the Intel and AMD laptops will not offer Copilot+ out of the box, but it will be available as a future upgrade.
- This is what “Copilot+ ready” means.
- What impact this has remains to be seen, but I still think that Windows on Arm has the potential to take a large amount of share from x86, and the best way to play that remains Qualcomm.
- I own Qualcomm and am quite happy to keep sitting on it.
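To put the efficiency argument above into rough numbers, here is a minimal sketch comparing TOPS-per-watt using the figures quoted in this post (45 TOPS for the X Elite NPU, 353 TOPS and 160 watts for the RTX 4060). The NPU's power draw is not stated in the post, so the single-digit-watt figure used below is an illustrative assumption rather than a measurement.

```python
# Rough TOPS-per-watt comparison for the NPU vs. GPU discussion above.
# The RTX 4060 figures (353 TOPS, 160 W) are taken from the post; the NPU
# power figure is an illustrative assumption (NPUs typically draw only a
# few watts), not a measured value.

parts = {
    "Snapdragon X Elite NPU": {"tops": 45, "watts": 5},     # watts assumed
    "Nvidia RTX 4060":        {"tops": 353, "watts": 160},  # figures from the post
}

for name, spec in parts.items():
    efficiency = spec["tops"] / spec["watts"]
    print(f"{name}: {spec['tops']} TOPS at {spec['watts']} W -> {efficiency:.1f} TOPS/W")
```

Under these assumptions the GPU delivers roughly 8x the raw throughput but at a far lower TOPS-per-watt ratio, which is the crux of the argument: the NPU wins on battery-powered devices, while the GPU wins where power is not a constraint.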