Federal Court of Appeal Remits Proposed Class Action Against Clearview AI
The Federal Court of Appeal of Canada has remitted a proposed class action lawsuit against Clearview AI back to the Federal Court, citing an error in how the certification process was handled. This decision revolves around a high-stakes case involving privacy, copyright, and the use of biometric data. Clearview AI, a U.S.-based tech company, faced allegations of improperly collecting and using millions of images from the internet for its facial recognition tools.
The lawsuit, filed by a Canadian citizen from Quebec, accuses Clearview AI of violating copyright and moral rights by scraping images from online sources without consent. The images were stored in a database used to build facial recognition services offered primarily to law enforcement and security agencies. The plaintiff argued that Clearview AI’s conduct breached the Copyright Act, R.S.C. 1985, c. C-42, and infringed moral rights by using the images without permission.
The case gained attention due to its implications for privacy and digital rights in the age of advanced technology. Clearview AI operated in Canada from 2019 until July 2020, when it ceased operations amid privacy investigations by Canadian authorities.
The Federal Court of Appeal’s ruling focused on a procedural issue: how class members would be identified. The lower court judge had rejected the certification of the class action, arguing that requiring individuals to take steps to determine their status as class members effectively created an “opt-in” process. This, the judge believed, was inconsistent with Canada’s class action framework, which typically operates on an “opt-out” basis.
Justice Elizabeth Walker of the Federal Court of Appeal disagreed. She held that the fact that some individuals might not engage with the process of determining their class membership does not transform the case into an opt-in scheme; class members remain part of the class unless they actively choose to opt out. Justice Walker emphasized that the proposed method for identifying class members does not undermine the objectives of the class action regime, which is designed to promote access to justice.
As a result, the Federal Court of Appeal found that the certification judge had applied the wrong legal test when evaluating the process for identifying class members. The case was remitted to the Federal Court for further consideration, with instructions to apply the correct legal framework in determining whether the lawsuit can proceed as a class action.
This ruling does not address the merits of the copyright or privacy claims but sets an important precedent for how class actions involving technology and privacy issues should be handled in the future. It also highlights the ongoing challenges of balancing innovation with legal and ethical considerations in the digital age.
Clearview AI’s business model, which relies on scraping photographs from the internet to build its facial recognition database, has sparked controversy worldwide, raising concerns about privacy, consent, and the ethical use of technology. In Canada, those concerns prompted the regulatory investigations that preceded the company’s 2020 withdrawal.
The outcome of this case could have far-reaching implications for how companies like Clearview AI operate in Canada and beyond. While the Federal Court of Appeal’s decision does not resolve the underlying allegations, it clarifies the legal test the Federal Court must apply when it reconsiders whether the lawsuit can be certified as a class action.
Implications of the Ruling and Clearview AI’s Practices
The plaintiff in the case sought several key remedies, including certification of the lawsuit as a class proceeding, her appointment as the representative plaintiff, and the removal of the collected images from Clearview’s database. Additionally, she requested damages for the alleged copyright and moral rights infringements, as well as declaratory and injunctive relief to prevent Clearview from engaging in similar conduct in the future.
The core of the appeal centered on the certification judge’s treatment of the class member identification process. The lower court was concerned that requiring class members to take proactive steps to determine their status—such as querying whether their images were included in Clearview’s database—effectively created an “opt-in” scheme at odds with Canada’s opt-out model. Justice Walker rejected this reasoning: a class member who never inquires into their status is not thereby excluded from the class, and the proposed identification process does not undermine the regime’s goal of promoting access to justice.
On remittal, the Federal Court will apply the correct legal test in deciding whether the lawsuit can proceed as a class action. Meanwhile, Clearview AI’s practice of scraping photographs from the internet, including images protected by copyright, continues to draw ethical and legal scrutiny. Although the company withdrew from Canada in 2020 following the privacy investigations, its practices remain the subject of investigations and lawsuits in other jurisdictions.
The eventual outcome could have far-reaching implications for how companies that build facial recognition databases operate in Canada and beyond, particularly where privacy, copyright, and biometric data intersect.
Conclusion
The Federal Court of Appeal’s ruling in the Clearview AI case underscores the evolving landscape of privacy and copyright law in the digital age. By addressing the class action framework and emphasizing the importance of access to justice, the decision sets a precedent for handling cases involving technology and privacy. While the ruling does not resolve the underlying allegations against Clearview AI, it highlights the need for companies to balance innovation with legal and ethical considerations. The case serves as a reminder of the ongoing challenges in protecting individual rights in an era where biometric data and digital privacy are increasingly at risk.
Frequently Asked Questions
What was the main issue in the Clearview AI class action case?
The underlying lawsuit alleges copyright and moral rights infringement, but the issue on appeal was the certification of the lawsuit as a class proceeding, specifically whether the proposed process for identifying class members aligned with Canada’s opt-out class action framework.
What are the implications of the Federal Court of Appeal’s ruling?
The ruling clarified that class members remain part of the class unless they actively opt out, ensuring the class action framework remains accessible and effective for addressing complex technological and privacy issues.
What is the difference between an opt-in and opt-out class action framework?
In an opt-out framework, class members are automatically included unless they choose to exclude themselves, while an opt-in framework requires individuals to actively join the class. Canada’s system typically follows the opt-out approach.
How does this ruling impact companies like Clearview AI?
The ruling emphasizes the need for companies to adhere to legal and ethical standards when handling biometric data and digital privacy. It sets a precedent for how such cases will be handled in the future, potentially influencing operations globally.