It seems obvious that the graphics chip on your video card would be better at handling pixels than your CPU. For certain tasks, that's definitely the case. No surprise, then, that Adobe has built hardware acceleration into its suite of products for years, with Lightroom the latest recipient. For some configurations, however, it could result in a performance drop.
As PetaPixel's Michael Zhang points out, Adobe has started a thread on its official forums explaining the GPU's new role in Lightroom. In it, Camera Raw engineer Eric Chan states that newer hardware -- less than three years old -- will see improvements, while systems with components older than five years may see no speed-ups at all.
Chan also outlines where GPU acceleration is used and why not everything is offloaded from the CPU:
...GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another ... GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.
Display resolution plays a role as well, with Chan saying that larger screens -- in the 4K range -- will see the most benefit from GPU acceleration.
Depending on your hardware, then, performance may well have suffered after the update. Fortunately, the feature is easy to disable: just untick the "Use Graphics Processor" box under Preferences -> Performance.