London Metropolitan Police’s Facial Recognition Still Has an 81% Error Rate
Any advancement in the technology world needs to be applauded. This is true even when it comes to disruptive yet controversial technologies such as facial recognition. The London Metropolitan Police has been using this technology for some time now and is noting some progress: now roughly eight in ten technology-based “hits” are incorrect, a notable improvement compared to several months ago.
Anyone who lives in the United Kingdom may have kept tabs on how the London Metropolitan Police has been experimenting with facial recognition technology. While the technology was heralded as a great step toward streamlining investigations, it has not been a successful venture by any means. In fact, the initial report issued in May 2019 confirmed that 96% of the “matches” generated by the facial recognition system were incorrect, an astonishing figure.
Although it was evident things would improve over time, there is still a lot of work to be done. The latest data shared by Sky News confirm the technology still fails spectacularly. While the rate of incorrect matches has dropped from 96% to 81%, the vast majority of hits must still be discarded. This is no way to improve upon existing police investigations, as it only costs officers more time to sift through the incorrect information.
The most worrisome aspect of these findings is that police investigations will see many ordinary people’s faces flagged incorrectly. People who are not on any wanted list or part of an ongoing investigation will suddenly become persons of interest. That is not how facial recognition technology is supposed to work. It is only natural that these new statistics raise major concerns which need to be addressed sooner rather than later. Doing so is easier said than done, unfortunately.
One has to keep in mind the London Metropolitan Police is still working with a very small sample size. As such, the impact of people being identified incorrectly should not pose a major problem just yet. However, one also has to wonder whether this technology will ever be useful on a broader scale. A failure rate that drops from 96% to 81% represents genuine progress, but that progress may not come quickly enough for the technology to ever work the way it is intended to.
At the same time, one has to wonder whether the London Metropolitan Police should continue to stand by this failing project. Tenacity is a wonderful and often undervalued trait, but in this case it might be better to pull the plug on the venture altogether. The force’s own claim that the technology is wrong in just 0.1% of cases, a figure reached by counting mistakes against every face scanned rather than against the matches it actually flags, sends a bad signal to everyone who knows better.
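The gulf between the 81% and 0.1% figures comes down to the choice of denominator. A minimal sketch, using illustrative numbers assumed for this example rather than the Met’s official statistics, shows how the same trial can be reported both ways:

```python
# Illustrative only: these counts are assumptions for the sake of the
# arithmetic, not the Metropolitan Police's published figures.
flagged_matches = 42     # faces the system flagged as possible suspects
correct_matches = 8      # flags later verified as genuine matches
faces_scanned = 34_000   # total faces the cameras processed

incorrect = flagged_matches - correct_matches

# Independent researchers' measure: wrong flags as a share of all flags.
error_vs_matches = incorrect / flagged_matches   # roughly 81%

# The police's measure: wrong flags as a share of every face scanned.
error_vs_scans = incorrect / faces_scanned       # roughly 0.1%

print(f"Error rate vs. matches flagged: {error_vs_matches:.0%}")
print(f"Error rate vs. faces scanned:   {error_vs_scans:.2%}")
```

Both numbers are arithmetically defensible, which is exactly why quoting the smaller one without naming the denominator misleads the public.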
For the time being, it remains to be seen how this situation evolves. The general public is clearly unamused by this failing technology, especially since it could directly affect them if it is ever deployed on a broader scale. Anything that misidentifies innocent people as criminals should be put on ice indefinitely, yet the Metropolitan Police appears to have no plan to do so in the near future. Government intervention would certainly help, but no progress should be expected in that department either.