A little over two years ago, I wrote about how integrated graphics were the future of gaming. I stand by what I said in that article: if anything, recent advances in the computer hardware industry have proven me right and have convinced me even more that we are witnessing the slow death of graphics cards.
That's right: I think the dedicated GPU is going to go the way of the dodo. It's almost heresy to say so as a long-time PC gamer and system builder; I have over 500 games on Steam alone and have built a ridiculous number of PCs for both work and personal use over the years. I'm not afraid of Reddit crucifying me for saying that I think, and even hope, GPUs will go away, though I'd completely understand if it did. It's a drastic statement.
Getting the best graphics card is the number one priority when it comes to building a gaming PC – it's almost invariably the most expensive component in your system – and it's a common aspiration among PC gamers to have a fully loaded, liquid-cooled rig with an RTX 4090 at the heart of it all. So why am I so convinced that we soon won't need them at all?
The great graphics revolution
The answer to that question comes in two parts: a look at the CPU industry, and a look at the AI industry. As anyone who knows me well will tell you, I think AI is a bit suspect, so let's start with the CPU part of the story.
Earlier this year, we saw the triumphant arrival of Qualcomm's Snapdragon X Elite chips at Computex 2024. A new challenger in the laptop processor arena, the X Elite finally challenges Intel's market dominance, something AMD has been trying and failing to do for years. It was a strong showing overall for the new chips, but what stuck in my mind the most was seeing an ultrabook without a graphics card running Baldur's Gate at 4K.
Yes, CPUs with integrated graphics are getting better and better, even if Qualcomm itself insists it has no real plans to take over the gaming market. And it's not just Snapdragon; Intel plans to fight back with powerful gaming performance in its upcoming Lunar Lake chips, and AMD has been enjoying great success with its custom chips for PC gaming handhelds like the Asus ROG Ally X, Lenovo Legion Go, and Valve's Steam Deck. Sure, these chips aren't going to rival the best 4K graphics cards when it comes to high-end gaming, but they're more than capable of providing a solid gaming experience.
There's one key reason gaming on integrated graphics is now actually viable, and that's upscaling software. Tools like Nvidia DLSS, AMD FSR, and Intel XeSS are what make this performance possible; my colleague John Loeffler saw an Asus ZenBook with an Intel Lunar Lake chip at IFA 2024 that hit an average of 60fps in Cyberpunk 2077, a notoriously demanding game, at 1080p on medium settings thanks to XeSS.
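It's worth spelling out why upscaling helps so much: the GPU renders internally at a lower resolution and the upscaler reconstructs the final image, so the number of pixels that actually need shading drops dramatically. Here's a back-of-the-envelope sketch in Python; the preset names and scale factors are illustrative approximations of how upscalers like DLSS, FSR, and XeSS are commonly configured, not exact vendor specifications:

```python
# Back-of-the-envelope: how much shading work an upscaler saves.
# The scale factors below are rough approximations of typical
# upscaler presets, not official vendor numbers.

PRESETS = {
    "native":      1.00,   # render every output pixel yourself
    "quality":     0.67,   # roughly two-thirds scale per axis
    "balanced":    0.58,
    "performance": 0.50,   # half resolution per axis
}

def shaded_pixels(width: int, height: int, scale: float) -> int:
    """Pixels actually rendered before the upscaler fills in the rest."""
    return round(width * scale) * round(height * scale)

# 1080p output, as in the Lunar Lake Cyberpunk 2077 demo
target_w, target_h = 1920, 1080
native = shaded_pixels(target_w, target_h, PRESETS["native"])

for name, scale in PRESETS.items():
    px = shaded_pixels(target_w, target_h, scale)
    print(f"{name:12s} {px:>9,} pixels  ({native / px:.2f}x less shading work)")
```

Halving the internal resolution cuts the shaded pixel count by roughly 4x, and even a conservative quality preset saves more than half the work. That's exactly the kind of headroom that lets an iGPU punch far above its weight.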
All in on AI
XeSS and DLSS (though notably not AMD's competing FSR upscaler) are both powered by AI hardware, which gives me a nice segue into my next point: AI is killing the gaming GPU industry, and if it continues at its current pace, it threatens to devour it entirely.
Nvidia has been making waves in the AI space for some time now. While a potential slowdown in AI expansion sent Nvidia’s stock tumbling last week, the company remains committed to its AI vision: CEO Jensen Huang’s keynote address at Computex was packed with planet-destroying AI plans, and the company continues to release new AI-powered tools and supply hardware for training AI models around the world.
Jensen isn’t alone, either. Earlier this week, AMD SVP Jack Huynh revealed in an interview that AMD is seriously targeting the AI market, and a side effect of this is that Team Red will be pulling out of the high-end GPU race, so we probably won’t be getting a Radeon RX 8900 XTX – at least not in the near future. Instead, AMD’s consumer efforts will focus on the mid-range and budget space, further narrowing the performance gap between its discrete graphics cards and its integrated GPUs (iGPUs).
An ignoble end for the humble graphics card?
Simply put, the growing demand for GPUs in AI projects is incompatible with a future where GPUs are also required for gaming PCs. It’s been clear for a while that the focus is no longer on consumer hardware (especially for Nvidia), and with iGPUs improving at a faster rate than traditional graphics cards, I won’t be surprised if the RTX 5000 series is the last generation of Nvidia GPUs aimed at gamers.
After all, nothing lasts forever. Sound cards and network adapters were integral parts of custom PCs for years, but were eventually phased out as motherboards improved and integrated those features. We're likely not far from CPUs that can handle everything the average gamer needs, even gaming at higher resolutions.
I won't cry for the dedicated GPU when it dies, either. For one thing, they're seriously expensive; for another, being able to improve my gaming performance by simply swapping out a single chip would make future system upgrades quicker and easier, as well as allowing for more compact PC builds. Yes, I love my beefy, RGB-filled tower, but it takes up too much space on my desk.