I attended an event last night where Duncan Stewart of Deloitte talked about their TMT predictions for 2016.
It reinforced for me that the future of tech, and what it will do for us, is potentially awesome. But at the same time, the amount of information being collected and stored about each of us is staggering. That creates real privacy challenges, and real possibilities for abuse. And because the information is there, government and business alike are tempted to use it.
One scary aspect is that the more we get used to information being collected about us, the more complacent we get. Our personal freaky line – the line at which we stop using services because we are concerned about privacy issues – moves a little farther away. That is in spite of the fact that the more information there is about us, the riper it is for abuse, and the more we temper or alter our behaviour because we know we are being watched.
Think for a moment about all the information that is increasingly being collected about us.
- Smartphones that know our every move and the most intimate and personal aspects of our lives.
- Intelligent cars that know where we go and how we drive.
- The internet of things where the stuff we own collects information about us.
- Wearable tech that collects information about our fitness, and increasingly our health.
- The trend for services to be performed in the cloud rather than locally, with our information stored in various motherships.
- Big data that functions by saving as much information as possible.
- Artificial intelligence and cognitive learning tools that can turn data into useful information and make inferences based on seemingly unconnected information.
- Blockchain technology that has the potential to record surprising things about us.
On top of all this, it is becoming increasingly hard to understand when our info stays on our device, when it goes somewhere else, how long it stays there, who has access to it, when it is encrypted, and who has access to the encryption keys.
It is in this context – and given that we just don't have the time to understand and make all the privacy choices we need to make – that the Privacy Commissioner of Canada last week released a discussion paper titled Consent and privacy: A discussion paper exploring potential enhancements to consent under the Personal Information Protection and Electronic Documents Act.
The introduction states in part:
PIPEDA is based on a technologically neutral framework of ten principles, including consent, that were conceived to be flexible enough to work in a variety of environments. However, there is concern that technology and business models have changed so significantly since PIPEDA was drafted as to affect personal information protections and to call into question the feasibility of obtaining meaningful consent.
Indeed, during the Office of the Privacy Commissioner’s (OPC’s) Privacy Priority Setting discussions in 2015, some stakeholders questioned the continued viability of the consent model in an ecosystem of vast, complex information flows and ubiquitous computing. PIPEDA predates technologies such as smart phones and cloud computing, as well as business models predicated on unlimited access to personal information and automated processes. Stakeholders echoed a larger global debate about the role of consent in privacy protection regimes that has gained momentum as advances in big data analytics and the increasing prominence of data collection through the Internet of Things start to pervade our everyday activities.