Google’s Photos Assistant Is Freaking Me Out

Image: iStock

Google’s Photos Assistant is an amazing tool, most of the time. For starters, it’s not too obtrusive, offering users a different take on their photos and videos when it thinks it has spotted a special event or image that is worth a little extra effort. But there’s also a problem.

Most of the offers are in the form of stylised photos:

Picture: Peter Farquhar

Or animations:

The moments are well chosen, and well executed to the point where Photos Assistant is welcome on my phone any time.

Until yesterday, when it offered me a “Then and Now” photo for the first time:

Picture: Peter Farquhar

Obviously, that’s a collage of two pics of two of my kids at different moments in time. Which is nice, because it’s something I’d never think of searching around for.

But I spent too much time thinking about what it took for Google’s AI bot to find it, and I just can’t get comfortable with it.

The pictures were taken about 10 months apart. The one on the right was taken just hours before Photos Assistant offered me the collage.

So after the one on the right was taken, Google’s AI immediately scanned ten months of photos (i.e. hundreds) and identified both my children in a similar pose.

That’s impressive, given they’re not exactly staring down the lens. Although having covered AI developments for years, I’m also aware it’s probably a fair way down the scale of achievements compared to, say, where SenseTime is up to in China right now.

But, whatever. It’s still an AI program that can almost instantly identify facial features across hundreds of images, and it’s right in the palm of my hand, and millions of others’, right now.

And that’s a step too far for me.


  • Recently Google identified one of my friends’ children several years apart in one such collage.

    Assistant has lots of other more mundane problems
    – most of its photo “improvements” are extreme saturation boosts, and the subject matter is generally not interesting
    – it makes collages where most of the subjects’ heads are cut off
    – it makes movies where half the images are rotated 90 degrees or are chopped to fit into a widescreen format

    That’s a lot of energy going into making undesirable content.

  • How is this creepy? The AI runs facial recognition as soon as the picture is uploaded. You make it sound like Google specifically went through all the other photos after you uploaded this new pic.

    • I guess the implication is that Google is creating shadow facial profiles, which it could theoretically cross-link between user accounts … but only if it was being evil.

      This is similar to what Facebook was doing with shadow user profiles, compiling dossiers of information about people who didn’t have FB accounts, but were friends of FB account-holders. In that instance Facebook was sharing this information between accounts without the consent of the subject or their friends AND you could find that information when downloading data from your profile.

      So imagine you have a situation where a victim of crime is in hiding from a past aggressor. They have their current photos stored online. The aggressor has old photos of the victim online. Google has its facial recognition and profile-matching engine churning away, and through some bug, the GPS information from the current photos is linking through a user/facial profile to the old photos….

      • If you go into Google Photos (I think it’s under the ‘people’ tab), it has already recognised and grouped faces for everyone in all your photos. You can then go in and tag them with names if you want (although Google probably already knows who they are).

        If I look through all the photos of myself that Google has auto-sorted, I see photos of me as a kid too!

        Your point about victim and aggressor doesn’t really apply: both the ‘then’ and ‘now’ photos were taken by the same person.
        It wasn’t a case of the aggressor taking the ‘then’ photo, the victim taking the ‘now’ photo, and Google creating this combined collage from the two accounts and sharing it with both of them.

        • @pukoh My point about victim and aggressor is not that the photos were taken by the same person, but that they are of the same person, with the photos stored in different accounts.

          Shadow profiles means there is potential for information transfer to occur between the accounts with images relating to those profiles. This was what happened with Facebook contact information.

  • No amazing AI going into making the match, though the facial recognition behind it is pretty cool. Really, when the first photo was uploaded it would have been tagged with IDs for both faces and maybe some other tags; then when you uploaded the second photo it would have had matching (or similar enough) tags to get put into a group with the first image, and from there it’s just a matter of generating the ‘then & now’.

    If you are worried about the facial recognition then I wouldn’t be uploading photos to a cloud service in the first place!
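
    The matching the commenter describes can be sketched roughly like this. Everything here is illustrative: the embeddings are made-up vectors (a real system would get them from a trained face-recognition model), and the similarity threshold is arbitrary. The point is just that once each photo carries a face “fingerprint”, grouping and picking a then-and-now pair is trivial bookkeeping, not a fresh scan of the whole library.

    ```python
    import math

    # Hypothetical photos: (photo_id, timestamp, face_embedding).
    # Embeddings are invented 3-d vectors standing in for a model's output.
    photos = [
        ("then.jpg", 1, [0.90, 0.10, 0.20]),
        ("other_person.jpg", 5, [0.10, 0.90, 0.30]),
        ("now.jpg", 10, [0.88, 0.12, 0.22]),
    ]

    def cosine(a, b):
        """Cosine similarity between two embedding vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    def group_faces(photos, threshold=0.95):
        """Greedily cluster photos whose embeddings are similar enough."""
        groups = []
        for photo in photos:
            for group in groups:
                if cosine(photo[2], group[0][2]) >= threshold:
                    group.append(photo)
                    break
            else:
                groups.append([photo])
        return groups

    def then_and_now(group):
        """Return the earliest and latest photo IDs for one face group."""
        ordered = sorted(group, key=lambda p: p[1])
        return ordered[0][0], ordered[-1][0]

    groups = group_faces(photos)
    pair = next(then_and_now(g) for g in groups if len(g) > 1)
    ```

    On this toy data the two similar embeddings end up in one group and `pair` comes out as `("then.jpg", "now.jpg")`, while the unrelated face stays in its own group. A real pipeline would do the grouping incrementally at upload time, which is why the collage can appear within hours of a new photo.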

  • Running out of content? This article, supposedly filed on Aug 17, 2018, 12:15pm, is clearly not new. Just look at the date of my last comment.

    Nothing to see here, move along.
