When you first get a new graphics card, your games run buttery smooth. Over time, you might notice they don't run as well, even the same games. What gives? This video explains what causes performance degradation over time. As you might guess, the graphics card itself generally isn't to blame. Some hardware, like hard drives and SSDs, can become less effective with age and use. Graphics cards, on the other hand, don't degrade the more they're used. Linus Tech Tips tested this by pitting a brand new, sealed EVGA GTX 480 against one the channel had used extensively for benchmarks and gaming. Both cards performed identically in the same machine.
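To make "performed identically" concrete: a comparison like that usually comes down to averaging frame times from each run and checking whether the difference falls within normal run-to-run variance. Here's a minimal sketch of that idea; the file names, the log format (one frame time in milliseconds per line), and the 2% tolerance are all assumptions for illustration, not details from the video.

```python
# Compare average FPS from two frame-time logs (one ms value per line).
# File names, log format, and the 2% tolerance are assumptions for this
# sketch, not details from the Linus Tech Tips test.

def average_fps(path: str) -> float:
    """Read per-frame times in milliseconds and return the average FPS."""
    with open(path) as f:
        frame_times_ms = [float(line) for line in f if line.strip()]
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

new_card = average_fps("gtx480_new.csv")    # hypothetical log file
used_card = average_fps("gtx480_used.csv")  # hypothetical log file

# "Identical" here means within normal run-to-run variance, e.g. 2%.
delta = abs(new_card - used_card) / new_card
print(f"New: {new_card:.1f} FPS, Used: {used_card:.1f} FPS, delta: {delta:.1%}")
print("Within normal variance" if delta < 0.02 else "Meaningful difference")
```

The point of the tolerance is that no two benchmark runs produce exactly the same numbers, so a difference smaller than the noise floor is evidence the used card hasn't slowed down at all.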
So what causes your machine to slow down over time, even on the same games? Any number of software issues can make the same hardware perform less effectively: upgrading to a new operating system with more overhead, accumulating malware (or, ironically, a bloated malware-fighting suite of applications), or simply installing too many junk apps you forgot to clean up. Other components can drag things down, too; notably, the hard drive your games are installed on can hurt load times and performance if you haven't replaced it recently. If you suspect the drive, the sketch below shows one quick way to check.
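A rough way to sanity-check an aging drive is to time a large sequential read and compare the throughput against what the drive should sustain. This is a sketch under stated assumptions: the file path is a placeholder for any large file on the drive your games live on, and the OS may serve cached data if you've read that file recently, so pick one you haven't touched.

```python
import os
import time

# Rough sequential-read speed check for the drive your games live on.
# TEST_FILE is a placeholder: point it at any large file (1 GB+) on that
# drive. The OS may serve cached data if the file was read recently.
TEST_FILE = r"C:\Games\some_large_file.bin"  # hypothetical path

CHUNK = 4 * 1024 * 1024  # read in 4 MiB chunks

size = os.path.getsize(TEST_FILE)
start = time.perf_counter()
with open(TEST_FILE, "rb") as f:
    while f.read(CHUNK):
        pass
elapsed = time.perf_counter() - start

print(f"Read {size / 1e6:.0f} MB in {elapsed:.1f} s "
      f"({size / 1e6 / elapsed:.0f} MB/s)")
# For reference, a healthy SATA SSD sustains roughly 500 MB/s; an old or
# heavily fragmented HDD can drop well below 100 MB/s, which shows up as
# long load times and stutter when the game streams assets.
```

If the number comes back far below what the drive managed when it was new, the drive, not the graphics card, is the likely culprit.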
While your graphics card doesn't degrade over time, that doesn't mean it can't wear out. As the video also points out, the more electrons you pump through your graphics card's circuits (especially if you're forcing it by overclocking), the sooner you're likely to see a failure. That failure, however, tends to be sudden: the card dies outright rather than gradually performing worse.
Performance degradation - is it real? [YouTube]