Part 2: The Future of Data Privacy
The following article is the second piece in a two-part series.
In our modern world of digital commerce, the global nature of the Internet undermines much of the power of conventional state borders. It has become economically undesirable and nearly impossible for a country to isolate itself from the international flow of data and digital transactions. While regulation is possible through data protection laws, countries with weak data privacy regimes endanger not only the privacy of their own citizens but also that of people abroad. Even strict data laws prove insufficient if international corporations with an interest in foreign data are not subject to the same regulation.
The European Union has taken the first step toward international policy coordination with the General Data Protection Regulation (GDPR), which restricts transfers of European personal data to companies in countries deemed to have adequate data protection legislation. In force since 2018, the GDPR is an extensive update of the 1995 Data Protection Directive and acknowledges that governments themselves may put user data at risk. Canada is currently deemed adequate, but this status comes under review every four years, and Canadian corporations may lose access to EU data by 2022 unless the Personal Information Protection and Electronic Documents Act (PIPEDA) is updated to keep pace with technological advancement. The GDPR is considered one of the strictest privacy regimes in the world because it governs all uses of personal data in the EU, not just those by private companies. It also contains provisions applicable to EU member states themselves, specifying that the state may interfere with a person’s privacy only under certain conditions, an aspect missing from other laws.
Danielle Miller Olofsson, the Chief Privacy and Knowledge Officer at BCF Business Law in Montreal, stresses the need for data privacy laws in light of new technology. “Technology moves a lot faster than the legislator, so by the time you’re [at sufficient regulation], the technology is already five years ahead.” Olofsson cites artificial intelligence as an example of a new, constantly evolving technology that poses a threat to data privacy. “It is challenging the principles upon which data protection around the world is based.”
Artificial intelligence (AI) and other deep learning tools require massive amounts of user data to work properly, as they are meant to “learn” from the data and apply that knowledge to devise solutions to specific problems, much as humans do. Although AI can bring many benefits, ethical considerations remain necessary, yet they are seldom discussed by legislators and CEOs alike. Aware of this, many consumers distrust the companies that collect their data and have started falsifying their information, entering fake names, employers, locations, and birthdays on their profiles.
In discussing the rising prominence of AI and its perception among average consumers, Olofsson notes that this is emblematic of “a tremendous mistrust towards government and business. We don’t trust they will protect our privacy properly, so we lie. We don’t trust that the government has the regulation necessary to protect us, so we lie to protect ourselves.”
This encompasses the central dilemma: on one side of the data privacy divide are private corporations profiting from unrestricted access to their users’ personal information. On the other side are governments that are not only unwilling to adequately regulate such corporations but also stand to benefit from the current absence of strong data privacy regimes. The consumer is therefore caught in the middle, treated as a commodified vehicle for their personal data rather than as an autonomous agent.
This dynamic, and the growing frustration among Europeans with technology corporations, has led to decreasing engagement with online businesses. Although the GDPR was created to combat this distrust, Europeans have yet to show confidence in corporations’ ability to handle their data. While the GDPR has inspired many companies in non-EU countries to change their data protection practices, it has failed to inspire greater confidence in digital corporations. It has, however, led to an increased understanding of the rights consumers have regarding their personal data.
“An individual should be allowed to have a greater say in how their data is used,” explained Olofsson, “if only to sensitise the individual to the idea that their data is worth something.”
However, consent-based data protection is not enough to assure data privacy. Since Edward Snowden revealed that the U.S. government was mining its own citizens’ data with neither consent nor warrant, there has been greater awareness that personal information is likely unprotected. Unfortunately, subsequent conversations on data protection have mostly dealt with how it might stifle innovation, or whether protection is even the government’s responsibility, rather than with personal privacy rights, and, as Olofsson notes, this is short-sighted.
“Deciding what kind of regulation is enough is difficult because we don’t know what enough is. We’re not really thinking about what we’re going to do with our data, how we’re going to manage it, and where this is all going,” said Olofsson. “In France, there are people actively working on thinking about ethics and morality and an ethical distribution. What would a morally acceptable use of data be? Here, it’s as if we’re afraid of these words: ethics, morality. Someone has to think this stuff through.”
Although legal definitions of privacy refer to the use of personally identifiable information, there is no definition of what a right to privacy entails beyond its violation. Little in current law connects digital privacy to one’s freedom, yet the link between privacy and freedom has clear precedent. Although there are laws protecting one’s bodily autonomy, financial information, speech and beliefs, and non-violent actions that occur within one’s home, such as the consumption of pornography, the laws currently in place to protect one’s digital information are weak and inadequate. In Canada, the law lacks an enforcement mechanism, and in the United States, federal protection barely exists at all.
While the GDPR does have enforcement mechanisms, which are a promising start, it has failed to make people feel safe on the Internet. If international regulations are to be successful in protecting privacy, they need to inspire trust and they must be upheld universally. Data protection laws must be based on protecting the right to privacy, rather than simple consent or who owns what data. They need to be informed by ethics, developed through conversations about what we—consumer, government, and corporation alike—want the future of technology to be. Regulation that protects data is possible, but the approach needs to change from casual negligence to a serious discussion about how privacy is defined in the 21st century and how we want to defend it.
Edited by Chanel MacDiarmid.