UK police forces told to address concerns over facial recognition systems or face legal action
The UK’s police forces must address concerns over the use of facial recognition technology or they may face legal action, according to the country’s privacy watchdog.
Information Commissioner Elizabeth Denham has said the issue is a priority for her office.
Campaign group Big Brother Watch suggested the technology has misidentified a “staggering” number of innocent people as criminal suspects.
The group made freedom of information requests to every UK police force; two acknowledged that they are currently testing such cameras.
The Metropolitan Police used the technology at the Notting Hill Carnival in 2016 and 2017 as well as at a Remembrance Sunday event, incorrectly flagging up 102 people as potentially having committed crimes.
Of 2,685 matches made between May 2017 and March 2018, South Wales Police said 2,451 were made in error.
Dr Nick McKerrell, lecturer in law at Glasgow Caledonian University, warned against the technology.
Speaking to Scottish Legal News he said: “The use of this discredited and frankly useless technology by the police and other public authorities seems mercifully limited in Scotland. However we cannot rest on our laurels as CCTV is now fairly universal. Although this does not immediately raise the same privacy issues as recognition software it still assumes consent for our image being captured hundreds of times per day.”
Dr McKerrell added: “The other worrying aspect of this is the lack of clear structures of accountability for our national police force. If they were to decide to invest in such materials the Scottish public would not know until after the fact.
“This has been seen with the expansion of taser use and the use of ‘cyber kiosks’ to download materials from arrested individuals’ phones and laptops. This is back to front, and we would need much more public discourse if facial recognition software was going to be used by our police force the way it seems to be in some areas of England.”