What’s “dangerously inaccurate” and puts the burden of proof on citizens who are incorrectly matched as persons of interest? The automated facial recognition technology being used by police in the UK. The facial recognition software runs alongside surveillance cameras on the street, scanning crowds and public spaces to match faces in real time against people on watchlists.

How wildly inaccurate is the technology? The Metropolitan Police, for example, have managed to correctly identify only two people, according to a new report by Big Brother Watch (pdf). However, neither of those two was a criminal — one had been incorrectly placed on a watchlist, and the other was on a “mental health-related watchlist.”