AI’s Use in Federal Courts Isn’t ‘Potential.’ It’s Happening

Without proper adoption, ‘AI could erode the role of Canadian judges,’ say experts.

Bryce Casavant, Andrea Menard and Siomonn Pulla | 17 Sep 2024 | The Conversation

Bryce Casavant is an associate lecturer at Royal Roads University, where Andrea Menard is a PhD candidate and Siomonn Pulla is an associate professor. This article was originally published in The Conversation.

Canadian society is progressing deeper into the digital age. Artificial intelligence technologies — like the generative AI ChatGPT and the legal platform Harvey — are increasingly shaping judicial processes and legal systems, including in the adjudication of intricate cases.

Like other jurisdictions around the world, Canada is not immune to these shifting intersections of AI technology and their impacts on the administration of justice.

2024 marks the first full year of implementing Canada’s recent AI policy for the Federal Court. As it stands today, not a single Chief Justice in Canada has firmly said “no” to the use of AI in the courts.

The Federal Court merely lightly salted its AI policy statement with a commitment to further “public consultation,” without describing what that would entail.

A delicate dance

Rather than prevent the use of AI — as was the recent case in British Columbia with fake AI-generated cases advanced in argument — the Federal Court has embarked on a delicate dance.

The focus has been on minimizing the known risks of “automated decision-making” in the judiciary while embracing the potential for business efficiencies. These include translating court texts, performing legal research and administrative tasks, addressing case management issues, assisting self-represented litigants and supporting alternative dispute resolution.

Under the Bangalore Principles of Judicial Conduct, this is the equivalent of playing technological footsie.

As these technologies become ubiquitous, a snarling question is raised from the shadows of the Federal Court’s judicial closet: is it even the court’s role to decide such a critical matter, or should this be left with the parliamentary branches of government?

Guiding AI use

The Federal Court AI policy states the intent is “to guide the potential use of AI by members of the Court and their law clerks.”

But it then provides: “the Court will begin investigating and piloting potential uses of AI for internal administrative purposes through its Technology Committee.”

There is no “potential” use — AI is actually being used by the court, albeit not yet in formal adjudications. And the Chief Justice has delegated their own supervisory functions to an unelected committee, thereby circumventing Parliament’s role in developing legislation for significant changes to judicial operations.

This is not a matter to be left to committees or under the sole direction of a single Chief Justice not elected by the Canadian public.

While the policy authors state that they are merely investigating the potential uses of AI, the Federal Court also bluntly admits that AI “can save time and reduce workload for judges and Court staff, just as it can for lawyers.”

In fairness, the court also “acknowledges the potential for AI to impact adversely on judicial independence” and concedes there may be a “risk that public confidence in the administration of justice might be undermined by some uses of AI.”

But the court does not say how it plans to implement and enforce checks and balances on specific tools, such as ChatGPT itself.

Eliminating reviews

Another federal initiative was launched during COVID-19 by the Treasury Board of Canada. In that case, the Treasury Board sought to ensure “responsible” deployment of automated decision-making to minimize risks to clients, federal institutions and Canadian society.

This raised many questions among legal scholars about AI and its role in administrative decision-making, including when machines replace a human decision-maker.

Improperly adopted, AI could erode the role of Canadian judges and limit the function of courts in judicial review, although some believe this is still far away.

The Federal Court has said that it “will consult the relevant stakeholders before implementing [AI].” But when the federal government is a stakeholder, there is a serious question about the executive branch’s influence on the judicial branch’s operational policies.

Lack of research on the impacts in courts

The Federal Court’s AI policy is poorly structured, favouring potential efficiencies over inherent risks and leaving open an alarming possibility of machine-learning bias. It also ignores the risk of legal diversity erasure and popular-culture bias, such as the erosion of Indigenous legal customs and traditions in favour of Eurocentric legal norms and processes.

This raises further questions about how the Federal Court policy will address progressive machine learning over time, as well as the physical and psychological relationships between judges, court staff, lawyers and machines, relationships that could eventually pave the way for the removal of human judges from our courts.

While the intersections of AI and broader legal contexts are woefully understudied, it is the legal profession’s duty to ensure we are governed and heard by the humans we entrust our freedoms to, not the machines others make. Business efficiency has nothing to do with the true role of our courts: upholding the rule of law and constitutional protections.
