LLM Privacy

Recommendations for policy-makers

The OPC's key recommendations concerning Bill C-27, from April 2023, address many of the policy points raised in this report, and we endorse those recommendations here. Below, we suggest some clarifications and additions to those recommendations.

  1. Recognize privacy as a fundamental right
     - Recommendation 1 is presumably intended to mean, among other things, that implied consent and click-wrap privacy agreements are not acceptable means of obtaining consent for data collection; a more explicit statement to this effect would make the implication clearer. Another implication of privacy being a fundamental right that could be made more explicit is what a reasonable expectation of privacy means. Given how quickly privacy is being eroded, an individual who is educated about these trends may no longer expect privacy anywhere. Does that make the expectation of privacy unreasonable?
  2. Protect children’s privacy and the best interests of the child
  3. Limit organizations’ collection, use and disclosure of personal information to specific and explicit purposes that take into account the relevant context
     - Recommendation 3 would limit secondary use of data by the organizations that collect it, but it could be expanded to explicitly cover cases in which data is collected by third parties, made available online (as in the CommonCrawl dataset), and then used by researchers or industry.
  4. Expand the list of violations qualifying for financial penalties to include, at a minimum, appropriate purposes violations
  5. Provide a right to disposal of personal information even when a retention policy is in place
     - Perhaps the most contentious and important issue is whether or not trained AI models should be treated as containing their training data. We argue that neural-network-based models do contain their training data, in the form of distributed representations. Recommendation 5 proposes a right to disposal of personal information; whether disposal is possible without discarding the entire model is a question the EU is already grappling with under the GDPR’s analogous right, and one that US copyright suits are likewise adjudicating. The main argument against the implication that the right to disposal should lead to the entire model being discarded is that the companies that built these models would suffer economic losses if the models were banned. This claim of economic loss seems to us wildly speculative and quite possibly false given the cost of developing these models; previous technological innovations that trample on rights have struggled to become profitable. We recommend making the implication explicit: if personal data can be leaked from a model (see the memorization probe sketched after this list), then individuals have the right to disposal, even if exercising that right requires discarding the model.
  6. Create a culture of privacy by requiring organizations to build privacy into the design of products and services and to conduct privacy impact assessments for high-risk initiatives
     - Recommendation 6 suggests that privacy guardrails should be built into the design of products and services. Once again, a more explicit description of which measures are expected is called for: not all privacy protections are created equal.
  7. Strengthen the framework for de-identified and anonymized information
     - While Recommendation 7 goes partway toward addressing the problem of information becoming identifiable when combined with other information (illustrated in the linkage sketch after this list), it does not fully cover privacy violations that do not involve personally identifying information.
  8. Require organizations to explain, on request, all predictions, recommendations, decisions and profiling made using automated decision systems
  9. Limit the government’s ability to make exceptions to the law by way of regulations
  10. Provide that the exception for disclosure of personal information without consent for research purposes only applies to scholarly research
     - Recommendation 10 specifies that the exception to consent applies only to scholarly research, but it leaves unclear whether and when that exception ends when scholars work in partnership with industry. Greater clarity is needed on whether, and under what circumstances, the exception ceases to apply once data moves from public to private hands in the course of a research partnership.
  11. Allow individuals to use authorized representatives to help advance their privacy rights
  12. Provide greater flexibility in the use of voluntary compliance agreements to help resolve matters without the need for more adversarial processes
  13. Make the complaints process more expeditious and economical by streamlining the review of the Commissioner’s decisions
  14. Amend timelines to ensure that the privacy protection regime is accessible and effective
  15. Expand the Commissioner’s ability to collaborate with domestic organizations in order to ensure greater coordination and efficiencies in dealing with matters raising privacy issues
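
To make concrete the claim under Recommendation 5 that personal data can sometimes be leaked from a trained model, the following is a minimal, hypothetical sketch of a verbatim-memorization probe for a causal language model. The model name, prompt prefix, and personal string are illustrative placeholders rather than findings about any particular system, and absence of verbatim output from one prompt is not evidence that nothing was memorized.

```python
# Hypothetical sketch: probe whether a causal LM reproduces a memorized string verbatim.
# The model name, prefix, and "personal" string are illustrative placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any causal language model would do
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prefix = "Contact Jane Doe at "    # context the model may have seen during training (hypothetical)
secret = "jane.doe@example.com"    # personal string we suspect was memorized (hypothetical)

inputs = tokenizer(prefix, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=False,  # greedy decoding: memorized continuations tend to surface here
    pad_token_id=tokenizer.eos_token_id,
)
# Decode only the newly generated tokens, not the prompt.
continuation = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:])

if secret in continuation:
    print("Model reproduced the personal string verbatim: evidence of memorization.")
else:
    print("No verbatim reproduction for this prefix (not proof of non-memorization).")
```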
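Similarly, the concern under Recommendation 7, namely that information which is not identifying on its own can become identifying when combined with other information, can be illustrated with a small linkage example. All records and fields below are invented for illustration.

```python
# Hypothetical sketch: re-identification by joining a "de-identified" dataset
# with a public record on shared quasi-identifiers. All data is invented.
import pandas as pd

# De-identified health records: no names, but quasi-identifiers remain.
health = pd.DataFrame({
    "postal_code": ["K1A 0B1", "M5V 2T6", "K1A 0B1"],
    "birth_date": ["1980-03-14", "1992-07-01", "1975-11-30"],
    "sex": ["F", "M", "F"],
    "diagnosis": ["diabetes", "asthma", "hypertension"],
})

# Public record (e.g., a voter roll): names alongside the same quasi-identifiers.
public = pd.DataFrame({
    "name": ["Jane Doe", "John Roe"],
    "postal_code": ["K1A 0B1", "M5V 2T6"],
    "birth_date": ["1980-03-14", "1992-07-01"],
    "sex": ["F", "M"],
})

# Joining on the quasi-identifiers attaches names to the "anonymous" diagnoses.
reidentified = health.merge(public, on=["postal_code", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])
```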