Flash has taken quite a beating lately from everyone from Apple (no Flash on the iPad or iPhone) to YouTube (transitioning to HTML5 video) to users sick of security exploits and sluggish browsers. Everyone's looking for the silver bullet that kills Flash, but is HTML5 it?
Video expert Jan Ozer decided to put the most common claim — that Flash video is a CPU hog and that HTML5 will fix this problem — to the test, with a pretty simple methodology:
Since the comparative efficiency of Flash vs. HTML5 seemed easy enough to quantify, I endeavoured to do so, using YouTube's new HTML5-based player as the test bed. Specifically, I played a YouTube video in the same browser twice, once via HTML5, once via Flash, and measured CPU utilization during playback.
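Ozer doesn't publish his measurement tooling (he read utilization off the OS's own activity monitors), but the arithmetic behind "CPU utilization during playback" is simple: CPU seconds consumed divided by wall-clock seconds elapsed. A minimal sketch, with a hypothetical helper name and a busy-wait standing in for video decoding:

```python
import time

def cpu_utilization_pct(cpu_start, cpu_end, wall_start, wall_end):
    """Average CPU utilization (%) over a wall-clock interval:
    CPU seconds consumed divided by wall seconds elapsed."""
    wall = wall_end - wall_start
    if wall <= 0:
        raise ValueError("wall-clock interval must be positive")
    return 100.0 * (cpu_end - cpu_start) / wall

# Demo: measure this process while it burns CPU for ~0.2 s.
# (A real test would sample the browser process externally instead.)
cpu0, wall0 = time.process_time(), time.perf_counter()
while time.perf_counter() - wall0 < 0.2:
    pass  # busy-wait stands in for video decoding work
cpu1, wall1 = time.process_time(), time.perf_counter()
print(f"{cpu_utilization_pct(cpu0, cpu1, wall0, wall1):.0f}% CPU")
```

Run the same sampling once with the HTML5 player and once with the Flash player, and the two averages are directly comparable, which is exactly the comparison Ozer performed.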
The results (in brief, emphasis ours):
When it comes to efficient video playback, the ability to access hardware acceleration is the single most important factor in the overall CPU load. On Windows, where Flash can access hardware acceleration, the CPU requirements drop to negligible levels. It seems reasonable to assume that if the Flash Player could access GPU-based hardware acceleration on the Mac (or iPod/iPhone/iPad), the difference between the CPU required for HTML5 playback and Flash playback would be very much narrowed, if not eliminated.
We also learned that not all HTML5 browsers/H.264 decoders are created equal. Significantly, with Flash 10.1 deployed, Google's HTML5 implementation required the most CPU horsepower of all playback scenarios — by far — on the Windows platform. On the Mac, Firefox and Safari with Flash required less CPU horsepower than Chrome's HTML5 implementation.
At least from a CPU utilization perspective, Flash isn't BAD and HTML5 isn't GOOD. It all depends on the platform and the implementation.
Be sure to check out the full post for all the details.