Trudeau’s paradoxical definition of Indigenous consent

The federal government’s skewed view of Indigenous consent, and its apparent conflict of interest on the pipeline, could pose a legal problem.

Photo: Indigenous drummers perform a drum circle prior to a demonstration against the approval of the Trans Mountain pipeline, in Victoria on June 22, 2019. THE CANADIAN PRESS/Dirk Meissner

The latest cabinet approval of the Trans Mountain pipeline came less than a day after the federal government declared a climate emergency. While the irony was a dream for satirists, it wasn’t the biggest contradiction of the day. Instead, it was Prime Minister Justin Trudeau’s bizarre definition of free, prior, and informed consent (FPIC) with regard to projects that will impact Indigenous land and rights: “[FPIC] is what we engaged in doing with Indigenous communities over the past number of months. It is engaging, looking with them, listening to the issues they have and responding meaningfully to the concerns they have wherever possible.”

By Trudeau’s definition, consent is: listening to issues, responding to concerns wherever possible, and then forging ahead. As Indigenous lawyer and scholar Pam Palmater pointed out, imagine if that definition of consent were applied in the context of sexual relations.

The prime minister’s comments largely went unnoticed in the mainstream media, but his government’s skewed understanding of FPIC and half-hearted attempts at consultations with Indigenous communities remain the core reason it will be unable to move the project forward. Moreover, Ottawa’s purchase of the pipeline created an inherent conflict of interest as it purported to sit down for meaningful consultations.

“Listening to the issues”

So, what exactly was the government “engaged in doing” with Indigenous communities since last August, when the Federal Court of Appeal found that “Canada did not fulfil its duty to consult” on the pipeline and quashed the National Energy Board’s approval of it?

Many of the First Nations that had appealed to the court expressed their dissatisfaction with the renewed Stage III consultation process that the court had mandated.

The Squamish First Nation said it had been assured there were no time limits for the consultations, only to discover that cabinet did have an end date in mind. Khelsilem, a Squamish Nation spokesperson, told a news conference that they had been sent documents for feedback after May 22, the federal government’s self-imposed deadline for comments.

“What we experienced was a shallow attempt at consultation that resulted in a failure to address our concerns,” said Khelsilem. “The failure to meaningfully engage with rights holders means this government is either not serious about building this pipeline or not serious about respecting Indigenous rights.”

Chief Lee Spahan of Coldwater Indian Band said, “The meaningful dialogue that was supposed to happen never happened.” A study of the community’s aquifer had not yet occurred, and a spill from an existing pipeline had yet to be remediated.

Chief Leah George-Wilson of the Tsleil-Waututh Nation said that consultation once again fell well below the mark set by the Supreme Court of Canada in a number of key decisions, including Tsilhqot’in. This constitutional obligation of the Crown’s was re-emphasized in the Federal Court of Appeal ruling. George-Wilson also noted that the federal government was in a conflict of interest – that its multiple hats as proponent, decision-maker, enforcer of laws and fiduciary to First Nations and all Canadians made it impossible to make an open-minded, unbiased decision.

Alexandria Ocasio-Cortez Says Algorithms Can Be Racist. Here’s Why She’s Right.


Alexandria Ocasio-Cortez recently said that algorithms can perpetuate racial inequities.

Last week, newly elected U.S. Rep. Alexandria Ocasio-Cortez made headlines when she said, as part of the fourth annual MLK Now event, that facial-recognition technologies and algorithms “always have these racial inequities that get translated, because algorithms are still made by human beings, and those algorithms are still pegged to basic human assumptions. They’re just automated. And automated assumptions — if you don’t fix the bias, then you’re just automating the bias.”

Does that mean that algorithms, which are theoretically based on the objective truths of math, can be “racist”? And if so, what can be done to remove that bias?

It turns out that algorithms can indeed produce biased results. Data scientists say that computer programs, neural networks, machine-learning algorithms and artificial intelligence (AI) work because they learn how to behave from the data they are given. Software is written by humans, who have biases, and training data is also generated by humans, who have biases.
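The mechanism described above can be illustrated with a minimal sketch. The data, group names and the simple per-group estimator below are all hypothetical, invented for illustration; real systems are far more complex, but the way bias flows from training data into predictions is the same.

```python
# Hypothetical historical decisions: (group, approved).
# Group A was approved 3 times out of 4; group B only once out of 4.
training_data = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def fit(data):
    """'Learn' each group's historical approval rate from the data."""
    counts = {}
    for group, approved in data:
        yes, total = counts.get(group, (0, 0))
        counts[group] = (yes + approved, total + 1)
    return {g: yes / total for g, (yes, total) in counts.items()}

def predict(rates, group):
    """Approve an applicant if their group's historical rate exceeds 50%."""
    return rates[group] > 0.5

rates = fit(training_data)
print(predict(rates, "group_a"))  # True  -- the historical advantage is automated
print(predict(rates, "group_b"))  # False -- the historical disadvantage is automated
```

Nothing in the code mentions race or intent; the disparity in the output comes entirely from the disparity in the data it was trained on.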

The two stages of machine learning show how this bias can creep into a seemingly automated process.
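Reading the two stages as training followed by automated prediction (an assumption, since the article is truncated here), the sketch below shows how bias entering at the first stage is replayed at scale in the second. The hiring scenario, group names and threshold rule are all hypothetical.

```python
import random

random.seed(0)

# Stage 1: training. Hypothetical biased history: candidates from
# group_b were hired only at much higher qualification scores.
def make_example(group):
    score = random.random()  # qualification score in [0, 1]
    hired = score > (0.4 if group == "group_a" else 0.8)
    return group, score, hired

train = [make_example(g) for g in ["group_a", "group_b"] * 500]

# "Learn" a per-group bar: the lowest score that was ever hired.
thresholds = {}
for group, score, hired in train:
    if hired:
        thresholds[group] = min(score, thresholds.get(group, 1.0))

# Stage 2: automated decisions on new applicants with IDENTICAL scores.
applicants = [("group_a", 0.6), ("group_b", 0.6)]
decisions = {g: s >= thresholds[g] for g, s in applicants}
print(decisions)  # identical qualifications, different outcomes
```

The second stage never consults a human: the assumptions baked in during training are simply applied, automatically, to every new case.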