OTTAWA—Do the Liberals choose their cabinet through varying degrees of anti-Blackness, or is that just a bonus they throw in? Given that the Toronto Police Service recently apologized for its systemic anti-Black racism, reporters should be asking Bill Blair what his contribution to the cause was. Nevertheless, Blair is not in my crosshairs this week; my targets are oft-forgotten Immigration Minister Sean Fraser and Minister of Innovation, Science and Industry François-Philippe Champagne.
This week, Toronto is hosting one of the most important tech conferences of the year, the Collision Conference. In addition to drawing my podcast mate Erin Gee, the conference hosts some of the greatest startups worldwide and is an incredible opportunity to find venture capital or angel funding for some of the most innovative technologies, many not yet even in beta. Speakers include Alicia Garza, principal at Black Futures Lab and co-creator of #BlackLivesMatter; Lupita Nyong’o; Tope Awotona, president and CEO of Calendly; and even Champagne.
But Black Africans and South Americans were pushed out of the conference by visa backlogs, as is the way Canada treats immigrants who aren’t white. I don’t need to rehash the arguments about the backlog of temporary visas that Justin Trudeau promised to clear, propped up by an $85-million investment revealed in last fall’s Economic and Fiscal Update; or how the launch of the Canada-Ukraine authorization for emergency travel (CUAET) has added to the pile, bringing the backlog to 2.4 million applications without a plan for clearing it. White people get special treatment in Canadian immigration, whereas African academics are having their visas rejected at rates so high they have raised questions about discrimination, notably anti-Blackness, within Canada’s immigration system.
Fraser is demonstrably in lockstep with the Liberals’ penchant for anti-Blackness. According to the National Post, this is a chronic problem that seems to mainly afflict people from countries with dark, racialized populations: nearly two dozen African runners were barred from Ottawa Race Weekend, and Sky Sports Formula One commentator Karun Chandhok, an Indian citizen, couldn’t cover the Montreal Grand Prix on June 19 despite having a Canadian wife and child. The opportunity cost is that Canada is left behind in innovative circles because Fraser and the Liberals can’t get their shit together.
A particular problem is that this conference is specifically relevant to Minister Champagne’s recently introduced Bill C-27, more innocuously titled “An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.” Basically, this is the bill that will set the rules for trade, commerce, and regulation of artificial intelligence (AI). Here’s the problem with artificial intelligence: it is biased against the usual suspects, including BIPOC, non-binary people, and even women in some instances, and that bias operates at scale. The researchers shut out of this conference could propose solutions to that bias, but without it they would not otherwise get time with the minister or be able to platform those solutions at scale. And these problems are huge ones, whose discriminatory implications could reverberate through every part of society for generations.
The Georgetown Security Studies Review states unequivocally the systemic and structural harm that can result from AI systems: “But just as AI offers advancements, there is also the potential for a bleak future—machines are prone to bias and racism. These machines learn by running training data through algorithms, each crafted by human handlers. Therefore, if the data inputted into the system is biased, the result will be biased, too.” The Review goes on to say: “This is dangerous and the threat needs to be addressed before biased AI systems become ubiquitous. Ultimately, AI systems have the potential to deepen existing systemic inequalities, particularly in industries like health care, employment, and criminal justice.” Yet there is no evidence that this threat was taken seriously by either Innovation, Science, and Economic Development (ISED) Canada or Champagne. In fact, the gender-based analysis plus (GBA+) section of the memorandum to cabinet for this bill should be made public so we can see who ISED is actively discriminating against. Given the lack of expertise this government and its public service have exhibited in using GBA+, this ignorance, enshrined in official government documents, is no mere formality; it has wide-ranging consequences that would have been foreseeable had everyone done their jobs properly, which has never been the case at ISED.
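The Review’s point about biased inputs producing biased outputs can be made concrete with a toy sketch (entirely hypothetical data and a deliberately naive model, not any real system): an algorithm trained on discriminatory historical decisions simply learns to automate the discrimination.

```python
from collections import defaultdict

def train(examples):
    """Learn, per group, the majority historical decision (a naive 'AI')."""
    counts = defaultdict(lambda: [0, 0])  # group -> [rejections, approvals]
    for group, approved in examples:
        counts[group][approved] += 1
    # Predict approval only where approvals outnumbered rejections.
    return {g: int(c[1] > c[0]) for g, c in counts.items()}

# Hypothetical history in which group "B" was systematically rejected.
# The labels encode human bias, not merit.
history = [("A", 1), ("A", 1), ("A", 0), ("B", 0), ("B", 0), ("B", 0)]
model = train(history)
print(model["A"], model["B"])  # the bias is now automated, at scale
```

Nothing in the training step inspects why group “B” was rejected; the model faithfully reproduces the pattern, which is precisely the scaling risk the Review describes.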
The scary part of Bill C-27 is how sweeping it is, even as it refuses to acknowledge AI’s capacity for systemic bias, including racism, sexism, and heteronormativity. The legislation focuses only on harm to individuals, and its definition of harm is restrictive. It would’ve been a boon for Minister Champagne to consult leading AI researchers of colour to mitigate the impending deluge of harm that will befall BIPOC, trans, and non-binary people if this bill is passed as is. But they did not, and they will not.
In addition, law enforcement agencies, immigration, health-care services, border services—all of which have been shown to be racially biased throughout the pandemic—can be exempt from any sort of government oversight, thereby giving an already systemically racist system carte blanche to double down on enforcing that racism. The only oversight in this regime is undemocratic—it’s political—meaning the minister of ISED alone has the authority to identify these harms and address them. Good luck with that.
This is not democracy; this is a power grab—backed by a Liberal government already seen to have overstepped on the Emergencies Act—of one of the most corrosive technologies to come online in generations. If you think social media is bad, wait until this government unleashes systemically biased and discriminatory AI.
Erica Ifill is a co-host of the Bad+Bitchy podcast.