Facial-recognition technology enabling people’s expressions and moods to be picked up in CCTV footage will be trialled in a police force.
Lincolnshire Police has received Home Office funding to pilot the technology, whose introduction has been delayed by privacy concerns.
Police will be able to enter searches for people wearing hats or glasses, as well as for certain moods and expressions, using the system, to be trialled in Gainsborough.
The system will identify these factors in the force’s CCTV footage.
While police and crime commissioner Marc Jones has secured funding, the force has not decided what searches it will use, nor which supplier will provide the system.
The scans are not completed live and all footage will be deleted within 31 days, a spokesperson told The Times.
A human rights and privacy assessment will also be carried out first.
Director of civil liberties campaign group Big Brother Watch Silkie Carlo said: ‘There’s a huge amount of money from the Home Office for this technology and they’re getting themselves into legal trouble, breaching human rights and expanding state surveillance while no one is watching.’
The Home Office said: ‘We are committed to empowering the police to use new technologies like facial recognition safely, in a strict legal framework.’
Just this week the Court of Appeal ruled that a police force's use of facial recognition technology breached privacy rights and data protection law.
Civil rights campaigner Ed Bridges, 37, brought a legal challenge against South Wales Police arguing their use of automatic facial recognition (AFR) had caused him ‘distress’.
He had his face scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018.
In a ruling on Tuesday, three Court of Appeal judges ruled the force's use of AFR was unlawful, allowing Mr Bridges' appeal on three of the five grounds he raised in his case.
In the judgment, the judges said that there was no clear guidance on where AFR Locate – the system being trialled by South Wales Police – could be used and who could be put on a watchlist.
It ruled that ‘too much discretion is currently left to individual police officers’.
In a statement, Mr Bridges said he was ‘delighted’ the court had found that ‘facial recognition clearly threatens our rights’.
‘This technology is an intrusive and discriminatory mass surveillance tool,’ he added.
‘For three years now, South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge.
‘We should all be able to use our public spaces without being subjected to oppressive surveillance.’
The court also found that a data protection impact assessment of the scheme was deficient and that the force had not done all it could to verify that the AFR software 'does not have an unacceptable bias on grounds of race or sex'.
The judgment notes that there was no clear evidence that the software was biased on grounds of race or sex.
Mr Bridges took his case – believed to be the world’s first over police use of such technology – to the Court of Appeal after his case was previously rejected by the High Court.
South Wales Police said the test of their ‘ground-breaking use of this technology’ by the courts had been a ‘welcome and important step in its development’.
Chief Constable Matt Jukes said: ‘The Court of Appeal’s judgment helpfully points to a limited number of policy areas that require this attention.
‘Our policies have already evolved since the trials in 2017 and 2018 were considered by the courts, and we are now in discussions with the Home Office and Surveillance Camera Commissioner about the further adjustments we should make and any other interventions that are required.’
Mr Jukes added: ‘We are pleased that the court has acknowledged that there was no evidence of bias or discrimination in our use of the technology.
‘But questions of public confidence, fairness and transparency are vitally important, and the Court of Appeal is clear that further work is needed to ensure that there is no risk of us breaching our duties around equality.’
At a three-day Court of Appeal hearing in June, lawyers for Mr Bridges argued the facial recognition technology interferes with privacy and data protection laws and is potentially discriminatory.
They said the technology, which is being trialled by the force with a view to rolling it out nationally, is used to capture, in real time, the facial biometrics of large numbers of people and compare them with people on a 'watchlist'.
The force does not retain the facial biometric data of anyone whose image is captured on CCTV but does not generate a match, the court heard.
Mr Bridges’ case was dismissed at the High Court in September last year by two senior judges, who concluded the use of the technology was not unlawful.
Lord Justice Haddon-Cave and Mr Justice Swift said they were ‘satisfied’ the current legal regime is adequate to ‘ensure appropriate and non-arbitrary use of AFR’ and that the force’s use to date of the technology has been ‘consistent’ with human rights and data protection laws.
Mr Bridges, who the force confirmed was not a person of interest and has never been on a watchlist, crowdfunded his legal action and is supported by civil rights organisation Liberty, which is campaigning for a ban on the technology.
AFR technology maps faces in a crowd by measuring the distance between features then compares results with a ‘watchlist’ of images – which can include suspects, missing people and persons of interest.
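The matching step described above can be sketched in a toy example. This is only an illustration under simplified assumptions: the feature vectors, names, numbers and the threshold below are all invented, and real AFR systems use learned embeddings rather than hand-measured distances, though the compare-against-a-watchlist logic is conceptually similar.

```python
import math

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_against_watchlist(probe, watchlist, threshold=0.5):
    """Return the closest watchlist name if within threshold, else None.

    probe     -- feature vector for the face seen in the crowd
    watchlist -- dict mapping a name to its stored feature vector
    """
    best_name, best_dist = None, float("inf")
    for name, template in watchlist.items():
        d = euclidean(probe, template)
        if d < best_dist:
            best_name, best_dist = name, d
    # Only report a match if the best candidate is close enough.
    return best_name if best_dist <= threshold else None

# Hypothetical watchlist entries (suspects, missing people, etc.).
watchlist = {
    "suspect_A": [0.42, 0.31, 0.77],
    "missing_B": [0.55, 0.29, 0.61],
}

print(match_against_watchlist([0.41, 0.33, 0.75], watchlist))  # close to suspect_A
print(match_against_watchlist([0.10, 0.90, 0.20], watchlist))  # no match: None
```

The threshold is the critical policy knob: set too loose, it produces false matches against innocent passers-by; set too tight, it misses genuine ones.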
South Wales Police has been conducting a trial of the technology since 2017.
The force added that it is not intending to appeal against the judgment.