Google is updating the Pixel 2's software to enable the Pixel Visual Core, a custom-designed coprocessor that lets apps take advantage of the HDR+ capability in the Pixel 2's camera.
Google says the new coprocessor has eight custom-designed cores and bespoke software, giving the company full control over what it can do. This heavily vertically integrated approach is reminiscent of what Apple is doing with its custom silicon.
Based on the sample images Google has posted, the difference this new feature makes is quite startling. Images that take advantage of the Pixel Visual Core are much clearer, particularly in low-light settings.
In addition, the update employs RAISR (Rapid and Accurate Image Super-Resolution), which uses machine learning to produce higher-quality versions of low-resolution images. On the Pixel 2, that means zoomed-in images look sharper than before.
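Google hasn't published the Pixel's production RAISR code, but the general idea behind this family of super-resolution techniques is a cheap upscale followed by a content-dependent filtering pass learned from training data. The toy sketch below illustrates that two-stage shape in plain NumPy, with the learned per-patch filters replaced by a single fixed sharpening kernel; the function names and the kernel are illustrative assumptions, not Google's implementation.

```python
import numpy as np

def bilinear_upscale(img, factor):
    """Stage 1: cheap bilinear upscale of a 2D grayscale image."""
    h, w = img.shape
    H, W = h * factor, w * factor
    ys = np.clip((np.arange(H) + 0.5) / factor - 0.5, 0, h - 1)
    xs = np.clip((np.arange(W) + 0.5) / factor - 0.5, 0, w - 1)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img):
    """Stage 2 stand-in: a fixed 3x3 sharpening kernel where the real
    technique would apply filters learned per image patch."""
    k = np.array([[0., -1., 0.], [-1., 5., -1.], [0., -1., 0.]])
    pad = np.pad(img, 1, mode='edge')
    out = np.zeros_like(img)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def toy_super_resolution(img, factor=2):
    """Upscale then sharpen: the two-stage pattern described above."""
    return sharpen(bilinear_upscale(img, factor))

low = np.array([[0., 0., 0., 0.],
                [0., 1., 1., 0.],
                [0., 1., 1., 0.],
                [0., 0., 0., 0.]])
high = toy_super_resolution(low, factor=2)
print(high.shape)  # → (8, 8)
```

A real implementation replaces the fixed kernel with filters selected per patch based on local gradient statistics, which is what lets learned approaches recover detail a plain upscale would blur.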
These new features will be available in apps like Instagram, WhatsApp and Snapchat, as well as through the native camera app. And any app developer who wants to snap photos can build an app that accesses these new features.