When Should Software Stop Being Supported?

Today's release of security updates for Windows XP, Windows 8 and Windows 8.1 signals an about-face by Microsoft. Issued in the wake of the WannaCry outbreak, the patches are intended to stem the spread of a virulent and damaging ransomware attack. But should the company keep patching an operating system that has been out of mainstream support for over eight years, and out of extended support for three?

In response to my story earlier today, rickinoz made the following comment:

This is an interesting dilemma between:

  1. patch XP to reduce the prevalence/impact of exploits globally
  2. don't patch XP as a lever to force users to update to a newer version

One of my friends also said "Microsoft folds and releases WannaCry fixes for Windows XP and Server 2003. I don't think they could have done anything else".

Microsoft did a lot to smooth the switch to Windows 10. The transition wasn't always painless, but the upgrade was free for some time, and the company pushed it out in a way that made it reasonably easy for people to act on.

As many experts have said, the two best defensive measures against malware attacks are keeping software up to date and maintaining tested offline backups.

While I maintain that most people learn the backup lesson the hard way, either through their own bitter experience or by witnessing a friend's angst first hand, we don't seem to get the message when it comes to software updates.

I think it's time for Microsoft to take a hard line and say: it's time. The best way to patch your old software is to ditch it.

Where that is not possible, isolate it from your network and the internet, and apply perimeter security commensurate with the value of the data on that system.
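As a rough sketch of what that isolation can look like, suppose the legacy XP box sits behind a Linux gateway at a fixed address (the addresses and the single allowed management host below are made-up examples, not a prescription); a pair of iptables rules can drop all forwarded traffic to and from the machine except for one designated host:

```shell
# Hypothetical addresses -- adjust to your own network layout.
LEGACY=192.168.50.10   # the isolated Windows XP machine
MGMT=192.168.50.2      # the only host allowed to reach it

# Drop anything the legacy host sends to any destination other than
# the management host (no internet, no rest of the LAN)...
iptables -A FORWARD -s "$LEGACY" ! -d "$MGMT" -j DROP

# ...and anything sent to it from anywhere except the management host.
iptables -A FORWARD -d "$LEGACY" ! -s "$MGMT" -j DROP
```

This only filters traffic routed through the gateway; a machine on the same switch segment can still reach the XP box directly, so a dedicated VLAN or physical segment is the stronger version of the same idea.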

Was Microsoft right to patch Windows XP again? Should they say "enough is enough"? Or should they continue to issue patches for critical flaws indefinitely?


Comments

    It's definitely time to move on. Windows XP was released in 2001. It's not unreasonable to expect businesses or individuals to upgrade their operating system once every 16 years.

    The problem was that XP was great. Vista was hilariously rubbish. Win 7 was good. 8 was a non-factor. I only updated to Win 7 (from XP) to get 64-bit memory access, and only upgraded to Win 10 because it was free.

    We're in uncharted territory here. There isn't really anything like computer operating systems in the modern world where we've faced this problem before. The closest thing I can think of is perhaps cars, which also get long-term use and can develop problems or have inherent flaws show up years after support/warranty ends. The difference is that cars can be fixed: there are many options out there, you can take it back to the dealer, take it to a generic mechanic, or even try to fix it yourself. It's not like that with operating systems. If you're using an old version of Windows XP or OS X and a critical bug comes up, no-one can fix it for you. (In theory you could get it fixed if you were running an open source OS, but in practice that might be a little impractical.)

    M$ supporting and helping folk they don't need to?

    Never thought I'd say this, but good on them!

    By attrition, apps and hardware will force the remaining user base to move off XP.

    But I agree with the other posters, there's no need to be using XP unless it's for some legacy application. Even then, I'd strongly consider other options.
    If your hand is forced, such OSes should be denied any internet access, with heavy restrictions/policies/filtering for both ingress and egress traffic.

    I'd still wince at the notion...

Join the discussion!