When Vancouver’s ABC-dominated city council voted in December in favour of outfitting frontline police officers with body-worn cameras by 2025, proponents were quick to herald the decision as a win for both law enforcement and the public.
The Vancouver Police Department would gain a powerful new tool to “make sure that everyone is safe,” as Mayor Ken Sim told The Tyee in an interview during the 2022 municipal race in which he was elected.
At least that’s the argument, pitched by decision-makers across North America and echoed in Sim’s election promises.
Among their many problems, body cameras don’t level the power imbalance between police and communities disproportionately harmed by police violence. As we have seen in body camera footage released Jan. 27 of Tyre Nichols’ fatal Jan. 7 beating by police in Memphis, Tennessee, the cameras record police violence — but they don’t stop it.
Body cameras mean more surveillance. The evidence gathered by constant recording will provide additional ways to criminalize people who already have increased contact with police, and each arrest that footage enables becomes a new justification for the surveillance state, in a self-reinforcing cycle.
A VPD report to the Vancouver Police Board in 2013 flagged civil-liberty concerns and storage costs as potential drawbacks of a body camera program. But by the end of 2022, the Vancouver Police Board budgeted $200,000 for a body cameras pilot program, as reported by The Tyee’s Jen St. Denis.
Tech won’t save us
Body cameras are a colossal expense, and they introduce the potential for further harms associated with facial-recognition technology. Canadian police departments already employ FRT and have often been dishonest about its use. One month after denying it used the technology, the Toronto Police Service admitted to deploying facial-recognition software in 2019. Amid the resulting scandal, police departments in Calgary and Vancouver, along with some RCMP units in B.C., likewise admitted to using FRT. In this context, the potential for body camera data to be fed into facial-recognition systems is a reasonable concern.
FRT is notoriously inaccurate. A University of Essex study of police trials of the technology found it produced accurate matches in just 19 per cent of cases. Moreover, as Black American researchers Joy Buolamwini and Timnit Gebru demonstrate, FRT is far less accurate at identifying Black people, especially Black women. There are also concerns about FRT’s accuracy in identifying Indigenous people, trans people and people who have had facial surgery.
If FRT is used to identify people recorded in body camera footage, whether in real time or retroactively, the bias inherent in facial-recognition systems could produce false positive matches: incidents of “mistaken identity” with real consequences for the people misidentified.
In cities like Toronto, Vancouver and Halifax, where Black people are already racially profiled, funnelling more data into police databases could introduce additional ways for Black people to be targeted, fuelled by inaccurate tech.
Reimagining public safety and freedom
The answer is not to make FRT better at identifying everyone. FRT, like many surveillance tools, poses a dire threat to human rights, including the rights to privacy, non-discrimination, peaceful assembly and free expression. As a former Black Lives Matter Vancouver organizer, I am intimately familiar with the chilling effects of surveillance.
In 2017, after a year of activism and protests, we learned that the RCMP had monitored our group since at least 2016. It was an unsettling discovery that has, at times, made me hesitant to exercise my right to public protest.
My experiences are the inspiration for the first season of Amnesty International Canada’s debut podcast, Rights Back at You, which premiered on Feb. 1 to mark the beginning of Black History Month.
As creator and host, I had the privilege of interviewing Black activists speaking out against increased police power and control. I hope that, by amplifying the stories of people reimagining public safety and freedom, the podcast will inspire listeners to think critically about the role of policing and surveillance in perpetuating anti-Black, anti-Indigenous and other forms of systemic racism.
As the stories in Rights Back at You reveal, we cannot fall into the trap of techno-solutionism — the idea that if we just have better technology and more data, we can easily solve complex social problems. Treating emerging technology as a panacea for social ills and gathering more data can usher in harmful unintended consequences, even if it sounds like a promising idea at the start.
There is certainly a role for technology to play in the betterment of our world.
However, we need to interrogate each application’s purpose and whether it truly safeguards human rights and promotes public safety — not just for ABC Vancouver supporters and people with power, but for all.