OTTAWA—NWA really produced a timeless banger with their seminal song about the police.
Yeah, you know the title.
As discussed in this column last week, “Surveillance is the way white supremacy has controlled populations by controlling their movement, who they associate with, and the spaces they occupy.” Surveillance is also intended to create labels for people considered a threat to white supremacy in an effort to cull, isolate, weaken, and then reduce or eliminate that threat. As The Intercept reports, the Royal Colonialist Militarized Police (a.k.a. the Royal Canadian Mounted Police) “has closely monitored the Unist’ot’en camp since its inception, labeling those involved with the Wet’suwet’en resistance as extremists.” These dudes (and I do mean men) were so up in Unist’ot’en’s business, they even knew when the camp constructed a root cellar.
Your tax dollars hard at work.
This criminalization of Indigenous rights activists—and frankly Indigenous dissent—is nothing new.
In 2014, the RCMP drafted an internal document called the Critical Infrastructure Intelligence Assessment report, which suggested that “growing opposition movements against pipelines should be seen and treated as criminal security threats.”
Around this time, the RCMP launched Project SITKA, a surveillance program, to “identify key individuals ‘willing and capable of utilizing unlawful tactics’ during Indigenous rights demonstrations.” The force compiled a list of 89 activists it considered a criminal threat, built a unique profile for each, and made those profiles available to front-line officers, other law enforcement agencies, and two policing databases. There is no confirmation that any of these 89 people actually committed a crime.
Another report, this time on Project SITKA, revealed that Indigenous activists were classified as terrorists and extremists, a classification not even neo-Nazi groups enjoyed at the time. In fact, it was only in 2019 that the Canadian government added two neo-Nazi groups to the Terrorist Entities list (the federal list of terrorist organizations): Blood & Honour, and its more militant wing, Combat 18 (C18).
Even in Donald Trump’s America, the FBI recently announced that it is equating far-right extremist violence with international terrorism. Three years after Alexandre Bissonnette massacred worshippers at a Québec City mosque, Canada’s response is to add two neo-Nazi groups to an anti-terrorism list.
That’s what systemic racism looks like. Funny, they can monitor Indigenous protesters but not neo-Nazis who commit actual violent crimes. Wonder what the difference is…
What is even more concerning is that as technologies advance at an exponential rate, and government remains legislatively impotent from a mix of technological ignorance and a law-and-order hard-on, nothing will be done to enforce Charter rights such as privacy and freedom of expression.
Last week, The Toronto Star reported that more than 30 police forces around the country, including the RCMP, the Ontario Provincial Police, and the Toronto Police, have used Clearview AI, a facial recognition tool that matches photos of “persons of interest” against images scraped from millions of websites, with no oversight whatsoever as to how they are using the data, how they are storing it, or with whom they are sharing it. Currently, Canada is Clearview AI’s largest market outside of the U.S.
With our tax dollars, but not with our consent.
Facial recognition is a technology that identifies faces by extracting facial features from a digital image and comparing them to images in a database; many of us open our iPhones using some form of facial recognition. Unfortunately, this technology is unreliable, especially if you’re Black or Indigenous—or of any race other than white—a woman, non-binary, or transgender.
The Toronto Star, in its reporting on Clearview AI, highlights this fact: “The systems falsely identified African-American and Asian faces 10 times to 100 times more than Caucasian faces. Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found. The technology also had more difficulty identifying women than men.”
Basically, if you’re not a white male, this technology will misidentify you, as Amazon’s facial recognition tool, Rekognition, has shown. In 2018, the ACLU tested Amazon’s new toy on members of Congress, and the results were racist. The test compared the lawmakers’ photos to a database of mug shots, disproportionately misidentifying Black and Latino legislators as criminals. In response, the lawmakers wrote a letter to Amazon’s Jeff Bezos, stating that there are “serious questions regarding whether Amazon should be selling its technology to law enforcement at this time.”
Only, this technology is being used by law enforcement without public oversight, and its tendency to disproportionately misidentify people of colour, LGBTQ2+ people, and women makes it inherently biased.
Last week, NDP MP Charlie Angus, in response to the revelations of law enforcement’s pervasive use of Clearview AI, tweeted: “I have received all party support to launch a parliamentary investigation into the use/abuse of facial recognition technology: its use by police, corporations, individuals, potential impact on civil society, privacy rights, racialized communities, vulnerable populations.”
An investigation is only the first step. Let’s see what comes out of it. What is clear, however, is that law enforcement has been allowed immense power to criminalize peaceful protesters—who are most likely from vulnerable communities (or else they wouldn’t be protesting) and who most likely have a contentious relationship with police—and to surveil them (which marginalizes them even more), resulting in the continued over-policing of marginalized communities and increased arrests (watch someone long enough and intensely enough, and the likelihood you’ll find something to arrest them for goes up). Law enforcement then uses those arrest statistics to acquire more funding from law-and-order legislators to buy new toys with which to curtail our civil rights. And the cycle repeats.
As more and more companies and governments use this technology, the result will be widespread, disproportionate targeting of populations already made vulnerable by their experiences with law enforcement. If this technology continues to spread, what we will create is structural discrimination with ramifications that fall along racial fault lines. If you think Canada is divided now, just wait.
Erica Ifill is a co-host of the Bad+Bitchy podcast.