Police facial recognition used to help catch criminals questioned

A police facial recognition system used to help catch criminals ‘risks damaging public trust and could be unlawful’

  • Police CCTV system which scans faces for criminals in crowds could be unlawful 
  • Facial recognition technology (FRT) has come under fire from a watchdog
  • The independent Information Commissioner has questioned ‘intrusive’ new tech

A controversial facial recognition system to help the police catch criminals risks damaging public trust and could even be unlawful, a watchdog has warned.

Elizabeth Denham, the independent Information Commissioner, said the software, which scans crowds for a ‘match’ on a database of possible offenders, could be ‘particularly intrusive’.

She spoke out as a damning report into automated facial recognition technology (FRT), used in conjunction with CCTV cameras, found it was ‘almost entirely inaccurate’.

Miss Denham said there was a ‘real risk’ that the public safety benefits of the state-of-the-art technology would be lost. She also announced a separate investigation into the ‘proportionality’ of police forces holding more than 19 million mugshots.

Known as custody images, they are taken at police stations of people who are arrested or questioned – but the database includes hundreds of thousands of innocent people.

Facial recognition software used by the police has captured the faces of innocent people and held them on file, according to a watchdog 

Facial recognition technology (FRT) has been used to compare faces with people on police wanted lists 

Frontline officers use the software to scan crowds at major events such as the Notting Hill Carnival, sporting fixtures, music concerts and festivals, as well as at shopping centres.

Photographs and video footage can then be checked in real time against specially created watch-lists or a database of custody images to see if there is a match.

But a study by civil liberties campaigners found that a staggering 98 per cent of ‘matches’ by one of the three police forces that have used the technology were wrong.

It means thousands of innocent people are stopped by police, arrested or asked to prove their identities.

This has led to concerns that Britain is sleepwalking into a surveillance state, says Big Brother Watch.

In a blog post published yesterday, Miss Denham said: ‘There may be significant public safety benefits from using FRT – to enable the police to apprehend offenders and prevent crimes from occurring.

‘But how facial recognition technology is used in public spaces can be particularly intrusive. It’s a real step change in the way law-abiding people are monitored as they go about their daily lives.

‘There is a lack of transparency about its use and a real risk that the public safety benefits will not be gained if public trust is not addressed.’

Miss Denham added: ‘For the use of facial recognition technology to be legal, police forces must have clear evidence to demonstrate the use… in public spaces is effective in resolving the problem that it aims to address.’

She has written to the Home Office and the National Police Chiefs’ Council ‘setting out my concerns’. She said: ‘Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public.’

She also said she was studying the ‘transparency and proportionality’ of mugshots kept on the Police National Database, especially for those arrested but not charged.

Big Brother Watch’s report said there was a greater proportion of false ‘matches’ in areas with high ethnic minority populations. Three forces have deployed FRT – the Metropolitan Police, South Wales Police and Leicestershire Police.

Figures released by Scotland Yard revealed 102 innocent people were wrongly matched to a photo. Only two matches were correct, and neither was a wanted criminal. In two cases, innocent women were matched with men.

For South Wales Police, 2,451 out of 2,685 matches were incorrect – 91 per cent. Of the remaining 234 correct matches, only 15 led to arrests.

The force, which received a £2.6 million Home Office grant, used it at the Champions League final in Cardiff in 2017 and at concerts by Liam Gallagher and Kasabian.

Silkie Carlo, director of Big Brother Watch, said: ‘It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate.’
