It's always easier to replace a video card than a CPU and motherboard, so it's not surprising to find people with a GTX 1060 or RX 480 surrounded by comparatively ancient components. These setups sacrifice some performance by bottlenecking their GPU, sure, but exactly how much is going to waste?
Matt Knuppel of Hardware Unboxed found himself in a position to test the effect of old parts on the latest mid-range cards -- NVIDIA's GTX 1060 and AMD's RX 480.
We all salivate over the likes of the new Titan X, but it's the aforementioned cards that are most likely to end up in the bulk of PCs.
Knuppel managed to dig up two popular AMD and Intel systems from yesteryear -- one sporting a 3.2GHz Phenom II X4 955 and the other a 2.67GHz i5-750 (which I myself have, though overclocked to 3.2GHz). These were compared to a modern setup -- a 4.0-4.2GHz i7 6700K.
No surprise, the new computer smacked the other two around, but in a few tests the older systems kept up admirably. In particular, Star Wars Battlefront wasn't really CPU-bound and neither was The Division, with all three configurations returning similar framerates.
Where the updated CPU architecture dominated was in the ever-demanding ARMA 3 and The Witcher 3, where the 6700K came out in front by margins of 30-40 per cent. In these cases, that was the difference between playable (30fps or more) and unplayable frame rates.
So, while a new GPU can certainly compensate for an ageing CPU, there is a limit. Given the rise of graphics APIs such as Direct3D 12 and Vulkan, aimed at reducing driver -- and therefore CPU -- overhead, you could hold out on a system upgrade for longer than you might expect.
Originally published on Kotaku Australia.