Are You Being Manipulated Online? How Dark Patterns Can Make You Share More Personal Data Than You Intended
Have you ever been surprised to find that personal data you thought was visible only to you was actually being publicly shared on a social network? Have you been overwhelmed by the number of privacy choices offered by a website or app and simply given up trying to understand them? Have you felt that some services constantly urge you to share more personal data with other users, when you would rather be more private? If you answered yes to any of these questions, you have been targeted by Dark Patterns in Data Protection (DPDP).
The general concept of dark patterns was first coined by Harry Brignull in 2010. On his website, he defines them as “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.” They reflect the idea that the interface of an online product or service can be designed to manipulate users into doing things they did not intend to do.
When it comes specifically to dark patterns that target personal data (DPDP), research is scarcer and has only recently started to gain weight and visibility among academics and consumer and data protection organizations. Regulatory efforts are starting to appear, especially in the United States (US). Within the European Union (EU), however, the General Data Protection Regulation (GDPR), despite its global impact, does not specifically prohibit DPDP, and it is still unclear whether the current framework can be used to curb them.
There are numerous challenges involved in regulating DPDP, such as their variability, the risk of over-regulating design and the ever-evolving nature of online environments. Nevertheless, their negative impact on data subjects’ decision making and agency raises concerns and should not be neglected. The next paragraphs discuss the characteristics of DPDP and a taxonomy for them, as well as possible legal ways to tackle them.
Dark Patterns in Data Protection
DPDP can be defined as “user interface design choices that manipulate the data subject’s decision making process in a way detrimental to his or her privacy and beneficial to the service provider.”[i] In a nutshell, they are design practices applied to the interface of an online service that affect data subjects before or during the moment of data collection. DPDP manipulate data subjects, especially through the exploitation of cognitive biases, into sharing more (or more in-depth) personal data. They do so by pressuring, hindering, misleading or misrepresenting, thereby negatively impacting data subjects’ privacy choices.
DPDP affect data subjects’ decision making capacity and their related ability to consent, or not, to a certain form of data collection. The taxonomy presented below – here in a concise form – is therefore inspired by the contract law categories of consent defects expressed in the Principles of European Contract Law (PECL): threat, excessive benefit, mistake and fraud. These categories reflect the ways in which consent can be vitiated, invalidating a transaction. As we are dealing with the field of data protection law, they were adapted accordingly.
- Pressure: pressuring the data subject to share more (or more in-depth) personal data than intended in order to continue using a product or service. E.g. pressure to share: requiring the data subject to reveal personal data to other users in order to use a service (e.g. a running app that automatically shares geolocation with other users, with no option to hide it).
- Hinder: delaying, hiding or making it difficult for the data subject to adopt privacy protective actions. E.g. difficult deletion: making it difficult (e.g. having to correctly navigate through multiple drop-down choices) or inconvenient (e.g. requiring the data subject to speak with a representative over the phone) to delete an account, resulting in continued data collection.
- Mislead: using language, forms and interface elements to mislead the data subject while he or she takes privacy related actions. E.g. ambiguity: using confusing language, such as a pop-up stating “do not share my data with third parties” with the options “yes” and “no.” It is unclear whether “yes” means “share” or “do not share” (a minimal sketch of such a dialog follows this list).
- Misrepresent: misrepresenting facts to induce data subjects to share more (or more in-depth) personal data than intended. E.g. false necessity: stating that collecting certain types of data is legally necessary, or required for the performance of a task or for system functioning, when it is not.
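To make the “mislead” category more concrete, here is a minimal sketch of the ambiguous pop-up described above, written in TypeScript purely for illustration; the dialog texts and type names are hypothetical, not taken from any real product.

```typescript
// Hypothetical sketch of the ambiguous consent pop-up described above.
// The question is phrased as a negative ("Do not share..."), so the bare
// "Yes"/"No" answers have no unambiguous mapping to a data-sharing outcome.

interface ConsentDialog {
  question: string;
  options: string[];
}

// The dark pattern: a negatively phrased question with bare yes/no options.
// Does "Yes" affirm the sentence (don't share) or consent to sharing?
const ambiguousDialog: ConsentDialog = {
  question: "Do not share my data with third parties",
  options: ["Yes", "No"],
};

// A clearer alternative: each option states its outcome directly, so no
// mental double negation is required from the data subject.
const clearDialog: ConsentDialog = {
  question: "Should we share your data with third parties?",
  options: ["Share my data", "Do not share my data"],
};
```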
DPDP rely on cognitive biases to manipulate data subjects in a direction that is favorable to the service provider and detrimental to the data subject’s privacy. Some of the cognitive biases most commonly involved in DPDP are anchoring, the bandwagon effect, the contrast effect, the default effect, the false uniqueness effect, the framing effect, functional fixedness, hyperbolic discounting, loss aversion, optimism bias and restraint bias. To exemplify DPDP’s mechanism of action, an app with a privacy invasive default exploits the default effect: research shows that, for different reasons, people tend to stick with the default option, even when, on closer reading, it is not the option most favorable to them. In the context of the taxonomy presented above, the invasive default would fall under the “hinder” category, for making it more difficult for the data subject to adopt a privacy protective alternative.
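As an illustration of the default effect, consider the following sketch in TypeScript. All setting names are invented for the example and do not refer to any real app; the point is only the contrast between data-hungry and privacy-protective starting configurations.

```typescript
// Hypothetical settings illustrating the default effect; all names are
// invented for illustration and do not refer to any real app's API.

interface PrivacySettings {
  shareLocationWithOtherUsers: boolean;
  profileVisibility: "public" | "friends" | "private";
  personalizedAdTracking: boolean;
}

// Dark-pattern variant: the most data-hungry options are pre-selected,
// exploiting users' tendency to keep whatever the default happens to be.
const invasiveDefaults: PrivacySettings = {
  shareLocationWithOtherUsers: true,
  profileVisibility: "public",
  personalizedAdTracking: true,
};

// Data protection by default (GDPR Article 25) points the other way: the
// most privacy-protective configuration should be the starting point.
const protectiveDefaults: PrivacySettings = {
  shareLocationWithOtherUsers: false,
  profileVisibility: "private",
  personalizedAdTracking: false,
};
```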
DPDP and the law
One of the first data protection laws with global impact to mention DPDP is the California Consumer Privacy Act (CCPA), recently amended by the California Privacy Rights Act (CPRA). It states that an “(…) agreement obtained through the use of dark patterns does not constitute consent.” The challenge of framing and understanding the full spectrum of practices that should be considered dark patterns still needs to be tackled, as companies will look for ways to circumvent legislation and still collect additional data. Nevertheless, acknowledging the existence of dark patterns and their negative influence is a first step in the right direction.
The GDPR, on the other hand, despite its detailed rules and conditions for consent, makes no reference to DPDP and does not invalidate consent obtained through manipulative design practices. Additionally, the GDPR contains important principles that aim at protecting the data subject, such as lawfulness, fairness, transparency, purpose limitation, data minimization, accuracy, storage limitation, integrity and confidentiality, and accountability. However, according to current doctrine and the clarifications in the GDPR’s Recitals, none of these principles can be said to outlaw DPDP. Lastly, the GDPR also codified privacy by design as data protection by design and by default (Article 25). However, precisely because it is broad and overarching, this framework does not offer the specificity that would help identify and curb DPDP.
Fairness as a guiding principle to regulate DPDP
Among the challenges and first steps to efficiently regulate DPDP are:
- Their variability: they can assume multiple forms and exploit different cognitive biases, so the legal apparatus would have to be broad yet consistent to tackle the full spectrum.
- The difficulty of regulating design practices: rules risk being too rigid and unnecessarily hindering improved functionality, so they would have to identify and outlaw DPDP precisely, without impacting innovation.
- The constantly evolving nature of the online environment, which might quickly transform DPDP and soon leave a more targeted regulation outdated.

This last challenge underlines the importance of having a central guiding principle when regulating DPDP, one that focuses on their core negative aspect, irrespective of the means or techniques deployed. I propose that this principle should be fairness.
Fairness is already mentioned in five Articles (5, 6, 13, 14 and 40) and seven Recitals (4, 39, 42, 45, 60, 71 and 129) of the GDPR. However, it is the only principle without a clear definition or clarification of how it should be interpreted or applied in the data protection context. Fairness is a promising tool to tackle DPDP, as it highlights their unfair character: they manipulate data subjects, suppressing their agency and their chances of better privacy outcomes. The relationship between data subjects and data controllers needs to be fair; therefore, power and knowledge asymmetries between the two sides, such as those exacerbated by DPDP, need to be properly addressed and mitigated by data protection legislation.
In conclusion, DPDP will continue to linger in digital environments as long as data protection laws make no direct attempt to tackle them. These efforts must include identifying and banning DPDP and manipulative design practices, as well as guaranteeing that the relationship between data subjects and data controllers is fair, irrespective of where and when their interactions happen.
[i] Luiza Jarovsky, “Dark Patterns, Privacy and the LGPD”, Global Privacy Law Review, vol. 2, issue 2 (2021), forthcoming.