Facial Recognition Tech Needs Some Work

Image: iStock

We've all seen movies or TV shows where the good guys use some smart software to analyse footage and find a single face in a crowd of thousands. The South Wales police force in the UK tried it for real last June, scanning every face that attended the UEFA Champions League final. But the outcomes were less than stellar: the scanning proved very error prone.

When you're scanning a large crowd for bad guys, false positives are a big problem. You end up falsely accusing people and over-working your already stretched resources - almost exactly what the tech is meant to help you avoid.

South Wales police explain their facial recognition program in glowing terms, pointing to lots of arrests achieved through the system. When they scanned that crowd at the Champions League final, they identified 2470 potential criminals.

But of those 2470 people, just 103 were actual bad guys - a failure rate of around 96 per cent.
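The arithmetic behind that failure rate is simple enough to check. A minimal sketch using the figures quoted above (2470 people flagged, 103 genuine matches):

```python
# Rough check of the failure-rate arithmetic reported for the
# South Wales Police scan at the Champions League final.
# Figures are from the article; "alerts" means people the system flagged.
alerts = 2470          # people flagged as potential criminals
true_positives = 103   # flagged people who were genuine matches

false_positives = alerts - true_positives
failure_rate = false_positives / alerts
print(f"False positives: {false_positives}")
print(f"Failure rate: {failure_rate:.1%}")
```

That works out to 2367 false alarms, or roughly 96 per cent of everyone the system flagged.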

They explain that the system has two distinct functions. ‘Identify’ allows officers to load images of the people they are looking for and compare them against a database of 500,000 "custody images". They then use ‘Locate’ to find people using live feeds from fixed and mobile closed circuit TV cameras.

But that fail rate is a concern. Fortunately, the system's purpose is to identify potential suspects and then refer the information to human officers so the false positive IDs don't result in Robocop at the door or a nasty letter. But it's high enough to suggest the tech still needs some work.

With facial recognition systems becoming a part of our personal computing fabric - it's in Apple's Photos program, Google's image management software and it's been a part of Facebook for some time - its accuracy needs to step up quite a long way before it's ready for use in law enforcement.


Comments

    The failure rate depends on how you look at it. From another article it scanned 170,000 faces and came up with 2,470 potentials. If you look at it that way it's about a 1.45% rate. In other words nearly 99% of people were actually correctly recognised.
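The commenter's alternative framing is also easy to reproduce. A quick sketch, assuming their figure of 170,000 faces scanned (which comes from the comment, not the article body):

```python
# The flip-side framing: of ~170,000 faces scanned, 2,470 were flagged,
# so the per-face flag rate looks small even though most flags were wrong.
faces_scanned = 170_000   # commenter's figure, not from the article body
flagged = 2470            # potential matches the system raised

flag_rate = flagged / faces_scanned
print(f"Flag rate: {flag_rate:.2%}")
```

That gives about 1.45 per cent, which is where the "nearly 99% correctly recognised" reading comes from.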

    I'd like to see some comparisons between the false positives and their potential "matches". It'd be very interesting to see just how close the people were in appearance. Obviously that's not going to happen because of privacy rules.

    Also, to get the flip side, we don't know how many criminals it missed.

