15 November, 2023

Children's privacy in Ontario is not expressly addressed under the Personal Information Protection and Electronic Documents Act (PIPEDA), the federal privacy law that applies to the commercial use of personal information.
The Privacy Commissioner has dealt with this by issuing guidance documents regarding how children’s privacy should be handled. “Putting best interests of young people at the forefront of privacy and access to personal information” is a resolution of several privacy commissioners that sets out some expected practices.
Those include:
- Conducting privacy impact assessments and building children's privacy by design.
- Being transparent about privacy risks and providing privacy information in a clear manner.
- Setting defaults to the most privacy-protective settings.
- Limiting tracking and profiling.
- Limiting disclosure of personal information.
The proposed Consumer Privacy Protection Act, which would replace PIPEDA if passed, has a few provisions that refer to “minors”, but it has been criticized for not going far enough.
Contrast this with the approach taken in the UK, which created the Children’s code that “sets standards and explains how the General Data Protection Regulation applies in the context of children using digital services.” It is about keeping children safe and not exploiting them, rather than age-gating them off the internet.
Elizabeth Denham, the former UK Information Commissioner who was instrumental in developing the Children’s code, explains the age-appropriate design rules in this article titled “Kids’ and teens’ online privacy and safety: 8 compliance considerations.” She also refers to legislation in a few U.S. states that takes various approaches to children’s privacy. California has a law similar to the Children’s code. The 8 considerations are:
- Conduct youth impact assessments.
- Implement age estimation, verification or assurance mechanisms.
- Set “high privacy” by default.
- Make legal language clear, concise and comprehensible.
- Adopt data minimization practices.
- Use algorithms and artificial intelligence responsibly.
- Provide parental controls.
- Implement robust security measures.
Come to think of it, some of these would be good practices in general.
The Irish Data Protection Commission recently rendered a decision against TikTok for infringing “… the GDPR’s principle of fairness when processing personal data relating to children between the ages of 13 and 17.” It imposed a fine of €345 million. If you want to wade through the 126-page decision, you can follow the link above.
It’s interesting to note that “A third of GDPR fines for social media platforms [are] linked to child data protection.”
Several Canadian Privacy Commissioners are currently investigating TikTok’s privacy practices. It will be interesting to see the result of that investigation.
David Canton is a business lawyer and trademark agent at Harrison Pensa with a practice focusing on technology, privacy law, technology companies and intellectual property. Connect with David on LinkedIn and Twitter.
Image credit: ©InputUX – stock.adobe.com