Facial-recognition software inaccurate in 98% of cases, report finds ...
https://www.cnet.com/.../facial-recognition-software-inaccurate-in-98-of-metropolitan...
May 13, 2018 - Facial recognition may not be the high-tech policing solution it's purported to be, with new figures showing facial-recognition software used by ...
Face recognition police tools 'staggeringly inaccurate' - BBC News
https://www.bbc.com/news/technology-44089161
May 15, 2018 - Police must address concerns over the use of facial recognition systems or may face legal action, the UK's privacy watchdog says. Information ...
Report: facial recognition software inaccurate in up to 98% of cases
https://www.fastcompany.com/.../report-facial-recognition-software-inaccurate-in-up-t...
May 14, 2018 - Report: facial recognition software inaccurate in up to 98% of cases. The Independent has published figures it gained under a freedom of information request that show that the facial recognition software that U.K. police forces are trialing is mostly inaccurate.
Facial recognition is not just useless. In police hands, it is dangerous ...
https://www.theguardian.com/.../facial-recognition-useless-police-dangerous-met-inaccur...
May 16, 2018 - In one trial by the Met, the results were 98% inaccurate. People ... UK police use of facial recognition technology a failure, says report. Wed 16 ...
Police face-recognition technology branded 'dangerous and ... - The Sun
https://www.thesun.co.uk/tech/6290315/face-recognition-technology-accurate-police/
May 15, 2018 - Automated facial recognition software dubbed “dangerous and inaccurate” due to its inability to identify people correctly.
Facial recognition technology is "dangerously inaccurate" | IT PRO
www.itpro.co.uk › Policy & legislation › data protection
May 15, 2018 - Facial recognition is also used by South Wales Police, but 91% of its system's matches were inaccurate, despite the Home Office providing £2.6 ...
UK's facial-recognition tech is 'wrong 98 percent of the time'
https://nypost.com/2018/05/15/uks-facial-recognition-tech-is-wrong-98-of-the-time/
May 15, 2018 - Civil liberties group Big Brother Watch branded the automated facial recognition software as “dangerous and inaccurate” due to its inability to identify people correctly. The organization added that its use is “lawless” and could breach the right to privacy protected by the Human Rights Act.