Toronto police have had access to facial recognition software since last March, and say it can find a possible suspect from a surveillance-camera or witness image of a crime scene approximately 50 per cent of the time.

A report to be considered by the Toronto Police Services Board this Thursday says police bought the system from NEC America at a cost of $452,000.

It compares photos and screengrabs from videos against the TPS database of approximately 1.5 million mugshots.

In one instance, the new facial recognition system was used to identify Dean Lisowick, a victim of convicted serial killer Bruce McArthur, whom investigators had difficulty positively identifying after his remains were discovered.

The use of facial recognition software in law enforcement has met significant criticism.

A study co-authored by a University of Toronto researcher earlier this year found that the error rate of Amazon’s facial recognition system increased as the subject’s skin shade grew darker. It also had more trouble with women’s faces than with men’s.

It found that Amazon’s system misidentified dark-skinned women as men close to a third of the time.

Amazon markets its system to law enforcement agencies, but it is not the system used by Toronto police.

In the report, police say the use of facial recognition is supported by criminal legislation and the Charter, and that they will not use the system to actively “conduct real-time facial recognition comparisons at major events, such as concerts or sporting events.”

They say the system is used to generate “potential candidates,” and arrests are made only after additional supporting evidence is secured. Police also say they will not use any mugshot image database beyond their own.

The system cannot provide a suggested identity when an image is too blurry or a suspect’s face is concealed; where it did return usable matches, officers were able to use them to make arrests.

During the last nine months of 2018, the system was used to analyze about 5,000 crime images. In about 3,000 of those, it identified a potential candidate for further investigation.

Of those 3,000 candidate images, approximately 2,400 “led to the identification of the criminals responsible for these criminal offences.”
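The reported figures line up with the roughly 50 per cent success rate cited at the top of this article. A minimal sketch of the arithmetic, taking the rounded “about” values at face value (the exact counts are not in the report summary):

```python
# Approximate figures reported for the last nine months of 2018.
# Assumption: the rounded "about" values are used as-is for illustration.
images_analyzed = 5000      # crime images run through the system
candidates_returned = 3000  # images that produced a potential candidate
identifications = 2400      # candidate images that led to an identification

candidate_rate = candidates_returned / images_analyzed       # 0.6
id_rate_among_candidates = identifications / candidates_returned  # 0.8
overall_rate = identifications / images_analyzed             # 0.48

print(f"Candidate rate: {candidate_rate:.0%}")                 # 60%
print(f"ID rate among candidates: {id_rate_among_candidates:.0%}")  # 80%
print(f"Overall identification rate: {overall_rate:.0%}")      # 48%
```

The overall rate of roughly 48 per cent is consistent with the police estimate that the system finds a possible suspect “approximately 50 per cent of the time.”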

The offences where the system was used in 2018 included four homicides, numerous sexual assaults and a large number of shootings.