There's been a lot of buzz about the integrated graphics solutions in Intel's (INTC) Ivy Bridge CPUs and AMD's (AMD) Trinity chips, and how they're supposedly going to relegate NVIDIA (NVDA) to a niche corner of the market. Interestingly enough, most of the hype comes from the investor community and the general media. But if you talk to folks who design and build PCs (as I do for myself and for others as a hobby), one thing quickly becomes clear: integrated graphics are not enough to displace a discrete card in any situation where a discrete card would have been purchased anyway.
It seems that Intel and AMD would like everyone to believe that integrated graphics are something "new" and "revolutionary". They're not. Integrated solutions have been shipping for ages, and Intel has, for the longest time, shipped the most graphics chips year after year via these motherboard-integrated solutions. The difference now is that the graphics are integrated directly onto the CPU, which lowers overall platform costs. Such integration also yields power savings for systems that actually use the on-die graphics: power management can be done much more effectively, and at idle there are fewer chips in the system drawing power.
The next claim is that integrated graphics are becoming "good enough" and will eventually cannibalize the need for discrete graphics cards. Compounding this, there have been rumblings in the PC gaming community that big-budget games are increasingly targeted at the relatively outdated hardware in Microsoft's (MSFT) Xbox 360 and Sony's (SNE) PlayStation 3, supposedly removing the need for high-end graphics chips on the PC versions of those games.
Nothing could be further from the truth.
While integrated graphics solutions are now transitioning from "absolute trash" to "competent" for modern 3D games, they're still not enough for serious PC gaming. The graphics cores in AMD's Trinity and Intel's Ivy Bridge, while competent at low detail settings, are no match for even an entry-level discrete graphics chip from NVIDIA, let alone the higher-end offerings from both NVIDIA and AMD. So for gamers looking to play at high frame rates and higher quality settings, the current crop of integrated graphics processors simply isn't enough. NVIDIA and, to a lesser extent, AMD also invest in technologies -- generally in the laptop/Ultrabook space -- that allow their graphics chips to be turned off with minimal power waste when no graphics-intensive applications are running, so the user gets the best of both worlds.
Now that I've made the case that NVIDIA's discrete GPU business is not particularly threatened by integrated solutions on the PC side, it's important to see how NVIDIA can actually benefit from integrated graphics. One word: Tegra.
That's right! A desktop PC used for gaming -- or even a gaming laptop -- will likely carry a discrete GPU; NVIDIA has shown that their discrete solutions fit even in Ultrabooks. But what about a mobile device such as a smartphone or tablet, where the key is to sip as little power as possible? That's exactly where an SoC-style chip with integrated graphics becomes particularly valuable. Each year, phones and tablets can do more and more amazing things, especially when it comes to 3D graphics, and having a competitive integrated GPU here is key. NVIDIA's Tegra 3 is a good chip, but its graphics capabilities are not quite up to par with those of Apple's (AAPL) A5X SoC, whose graphics technology is licensed from Imagination Technologies.
As puzzling as the Tegra 3's graphics shortfall may seem, it is important to note that NVIDIA has the strongest technological background when it comes to efficient, high-performance GPUs. If NVIDIA can bring this expertise to their Tegra lineup going forward, it will become extremely difficult for phone vendors to ignore NVIDIA's offerings, especially as graphics and gaming see the same explosive growth on the mobile side that they did on the PC in the 1990s and early 2000s.
NVIDIA shouldn't fear integrated graphics, because they're perfectly positioned to deploy it where it counts most -- mobile SoCs.