Canada’s privacy commissioner today issued a report after its investigation into the RCMP’s use of controversial facial recognition technology Clearview AI, finding the force violated the Privacy Act.
And it was revealed in a press conference afterwards that the force could rarely account for why it used the software.
The RCMP has maintained that the licences it purchased for Clearview AI were to be used by child exploitation units in their investigations. The force made that claim explicit in the software requisition, obtained by The Tyee.
But according to the privacy commissioner, the RCMP could only attribute six per cent of the software’s use to its child exploitation units. The force was unable to account for 85 per cent of its Clearview AI searches reviewed by the watchdog agency, said privacy commissioner Daniel Therrien.
The privacy commissioner’s report says the RCMP has now promised to apply oversight and assess potential privacy impacts before implementing new technologies. However, the government already required such practices of all departments, and the RCMP had not followed them, Therrien confirmed in response to a Tyee question at a press conference following the report’s release.
The new agreements remain non-binding and unenforceable, Therrien also confirmed, noting that the only consequence for continued violations would be harm to the RCMP’s reputation.
NDP MP Charlie Angus held his own press conference, in which he noted that Bill C-11 and other privacy law changes currently under consideration in Parliament will not address the RCMP’s repeated failures to follow existing policies on technology use.
Documents obtained by The Tyee show that the software, which sifts billions of cached facial images, was approved for installation by a branch of the RCMP that houses three other teams pursuing investigations unrelated to child exploitation enforcement.
The force has previously admitted that at least three other units were using Clearview AI, but has repeatedly refused to reveal which units.
In applying to in-house higher-ups for permission to acquire the powerful facial recognition software, the RCMP stated that if its request were to be denied, “Children will continue to be abused and exploited online.” And: “There will be no one to rescue them because the tool that could have been deployed to save them was not deemed important enough.”
The form also said that the RCMP would share information obtained using the software with internet child exploitation units in police forces across Canada.
Which other units within the RCMP had access to Clearview AI? That remains unclear. However, the requisition form said Clearview AI would be installed on the servers of the Sensitive and Specialized Investigative Services, which operates three other teams in addition to the child exploitation unit.
Those teams are Behavioural Sciences Investigative Services, the National Centre for Missing Persons and Unidentified Remains, and the Truth Verification Section, according to an RCMP hiring advertisement.
According to government documents, Behavioural Sciences Investigative Services conducts research, develops policy and provides consultation to RCMP and other police services in “Criminal Investigative Analysis, Geographic Profiling and Truth Verification.”
The branch also maintains the Violent Crime Linkage Analysis System as well as the National Sex Offender Registry, the documents say.
An additional key section of the RCMP’s Sensitive and Specialized Investigative Services, according to the hiring ad, is Strategic and Operational Services which “provides services across all program areas.”