U.S. tech firm accused of “mass surveillance” in Canada

Privacy Commissioner’s report says Clearview AI “actively marketed” its facial recognition technology to law enforcement in Canada, including the RCMP


      An investigation by the Office of the Privacy Commissioner of Canada has found that New York–based technology company Clearview AI contravened federal and provincial privacy laws by engaging in “mass surveillance” of Canadians and sharing information from its facial recognition technology with law enforcement, including Toronto police.

      In a report released on Wednesday, Privacy Commissioner Daniel Therrien says Clearview allowed law enforcement and commercial organizations “to match photographs of unknown people against the company’s databank of more than 3 billion images,” including children, scraped from the internet.

      The Privacy Commissioner’s report says that under Canadian privacy laws people have a “reasonable expectation” that photographs they post on the internet, or that are posted by others, will not be used by third parties for “identification purposes”. The Privacy Commissioner says consent is required. But his report says Clearview made no attempt to obtain consent for the use of the images it collected and made available to users of its services.

      The investigation, which included the offices of the privacy commissioners of Québec, British Columbia, and Alberta, found that “Clearview collected, used and disclosed Canadians’ personal information for inappropriate purposes”. Those purposes include the “creation of biometric facial recognition arrays” for law enforcement.

      The Privacy Commissioner’s office says it presented its findings to Clearview, and that Clearview argued, among other things, that the information it was using was “publicly available” and that “significant harm is unlikely to occur for individuals” from its sharing of information with law enforcement.

      But the Privacy Commissioner’s report states that “Information collected from public websites, such as social media or professional profiles, and then used for an unrelated purpose, does not fall under the ‘publicly available’ exemption in privacy laws.” 

      Clearview also argued, according to the Privacy Commissioner, that Canadian laws do not apply to the company because it “does not have a ‘real and substantial connection’ to Canada.”

      But the Privacy Commissioner’s report says the company “actively marketed” its services to law enforcement in Canada, including the RCMP. (A related investigation by the Office of the Privacy Commissioner into the RCMP’s use of Clearview AI’s facial recognition technology remains open.) According to the Privacy Commissioner’s report, Clearview had accounts with some 48 law enforcement “and other organizations” in Canada.

      Toronto police were among those found to be using the technology back in February 2020. But then-chief Mark Saunders said he was unaware that officers were availing themselves of the technology until a Star reporter called him to ask about it.

      It’s not clear if the technology was employed to sidestep new provincial restrictions imposed at the time on carding, the police practice of street checks that was widely viewed as racial profiling.

      Clearview stopped providing its services to the Canadian market after the Privacy Commissioner’s office began its investigation into the company.

      The Privacy Commissioner’s report says the company was “prepared to consider” remaining outside of the Canadian market for a further two years, telling the Privacy Commissioner’s office that it “would be willing to take steps, on a best-efforts and without prejudice basis, to try to limit the collection and distribution of the images that it is able to identify as Canadian.” The company asked the Privacy Commissioner to suspend the release of its report in return.

      But the Privacy Commissioner’s office says that it would be “inappropriate” to discontinue its investigation after “Clearview did not demonstrate a willingness” to follow its recommendations for complying with privacy regulations.

      The Privacy Commissioner’s office says that “it will pursue other actions available under their respective Acts to bring Clearview into compliance with Canadian laws”.

      On that front, Vito Pilieci, a senior communications advisor with the Privacy Commissioner’s office, says the office has “not ruled out taking the matter to the Federal Court” to get Clearview to comply with Canadian regulations.

      “Unfortunately, our office does not currently have order-making powers. We will continue to cooperate with our Canadian colleagues to address the matter. In addition, we will be in contact with international data protection authorities,” Pilieci says, noting that counterparts in the U.K. and Australia have announced their own joint investigation into Clearview.

      The Privacy Commissioner’s office began its probe after reports began surfacing in January 2020 that Clearview “was populating its facial recognition database by collecting digital images from a variety of public websites, including but not limited to, Facebook, YouTube, Instagram, Twitter and Venmo, in apparent violation of those organizations’ terms of service and without the consent of individuals.”

      A representative from Clearview was not immediately available for comment. (The link for media inquiries on the company’s website does not work, and the site does not list a phone number.) In a video posted on the company’s website to “dispel misconceptions about Clearview AI”, cofounder Hoan Ton-That says that more than 600 law enforcement agencies have used the company’s services. Clearview, whose motto is “Computer vision for a safer world”, boasts on its website that its technology has been used to “track down hundreds of at-large criminals, including pedophiles, terrorists and sex traffickers”.

      The company also has some clients in the banking sector. But it has tried to remain mostly out of the public spotlight since it was founded in 2017. Google joined Twitter and YouTube last year in ordering the company to stop scraping photos from its sites.

      Clearview responded with a blog on its site saying its app has “built-in safeguards to ensure trained professionals only use it for its intended purpose: to help identify the perpetrators and victims of crimes”. The blog says the company “strictly” enforces its Code of Conduct and will “suspend or terminate users who violate it”.
