Judge rules in favour of Somali refugee claimants after study finds shortcomings in facial-recognition technology


      A Federal Court of Canada judge has upheld a judicial-review application by two female Somali refugee claimants who questioned the Canadian government's use of facial-recognition software.

      Through their lawyer, Asha Ali Barre and Alia Musa Hosh submitted a study published in Proceedings of Machine Learning Research on facial-analysis algorithms.

      According to the July 20 ruling by Justice Avvy Yao-Yao Go, this research "found that darker-skinned females are the most misclassified group with error rates of up to 34.7%, as compared to the error rate for lighter-skinned males at 0.8%".

      Citing this research, Barre and Hosh maintained that, in denying their claims for refugee status, the federal government had failed to establish that they were Kenyan students rather than citizens of Somalia.

      The case was recently covered in Lexbase, which is a newsletter on immigration issues published by Vancouver lawyer and policy analyst Richard Kurland.

      Go's decision noted that applicants also submitted that "there are sufficient differences between the photos, that there are difficulties in comparison when one person is wearing a hijab, and that ethnic Somalis in both Kenya and Somalia can share similar features".

      In their refugee claim, Barre and Hosh insisted that they were practitioners of Sunni-Sufi Islam and were daughters of farmers in Buulo Mareer. It's a town in southwestern Somalia that's been a base for the Al-Shabaab extremist group.

      "They both based their refugee claims on fear of sectarian and gender-based violence from Al-Shabaab and other militant Islamist groups in Somalia," Go wrote in her decision.

      The Refugee Protection Division, however, alleged that they had entered Canada with study permits under different names. They based this conclusion on the use of Clearview AI facial-recognition software, which generated photo comparisons.

      The Minister of Public Safety and Emergency Preparedness (then Bill Blair) filed an application to "vacate the Convention refugee status" conferred on the two women "on the basis that they had misrepresented and withheld material facts relating to relevant matters before the RPD".

      The Refugee Protection Division granted that application and vacated the women's refugee status.

      "The RPD also based its decision on the Global Case Management System [GCMS] notes with respect to the Kenyan students, which according to the RPD, suggested that they did not attend classes at their intended educational institutions," Go noted. "The RPD further accepted affidavits submitted by CBSA [Canadian Border Services Agency] indicating that searches of the Integrated Customs Enforcement System [ICES] did not confirm the Applicants’ alleged entry to Canada under their respective aliases."

      The applicants, on the other hand, suggested that the two Kenyan students are "presumably still living in Manitoba on validly-granted study permits".

      They also maintained that denying them refugee status was "unreasonable" because the Refugee Protection Division had breached procedural fairness.

      "They argue that the RPD should not have admitted photographic evidence of the Kenyan students because this evidence likely came from questionable facial recognition software such as Clearview AI," Go wrote. "I find the Decision unreasonable as the RPD erred in relying on the Privacy Act to admit the photo comparisons and to exempt the Minister from disclosing how the photo comparisons were made. I also find the Decision unreasonable because the RPD ignored evidence that ran contrary to its conclusion, and provided inadequate reasons for its findings with respect to the facial similarities between the Applicants and the Kenyan students."

      Coincidentally, the Vancouver Art Gallery is hosting an exhibition called The Imitation Game: Visual Culture in the Age of Artificial Intelligence.

      Earlier this year, the exhibition's co-curator, Bruce Grenville, highlighted how facial-recognition technology isn't always reliable.

      "Racial bias in artificial intelligence is a big problem," Grenville told the Straight, "and unless that's addressed, we've got serious problems."