Dark Patterns (Deceptive Design) in Data Protection
Luiza Jarovsky
Co-founder of the AI, Tech & Privacy Academy, LinkedIn Top Voice, Ph.D. Researcher, Polyglot, Latina, Mother of 3. Join our AI governance training (1,100+ participants) & my weekly newsletter (55,000+ subscribers)
Learn how to identify and stop them from dictating your online choices
Dark patterns (or deceptive design), according to Harry Brignull - the designer who coined the term - are “tricks used in websites and apps that make you do things that you didn't mean to, like buying or signing up for something.” Some common examples are websites or apps that make it almost impossible to cancel or delete an account, almost invisible links to unsubscribe from newsletters you never requested, and insurance products that are surreptitiously added to your shopping cart. You can check more examples here, or tweet and expose your own findings using the hashtag #darkpattern (they might be retweeted by this account - it's worth checking out some outrageous examples there).
[Would you like to receive daily privacy and data protection insights? Follow me on Twitter and on LinkedIn]
One of the chapters of my ongoing Ph.D. on data protection, fairness, and UX design is about dark patterns in the context of data protection (you can download the full article I wrote on the topic here). I defined them as “user interface design choices that manipulate the data subject’s decision-making process in a way detrimental to his or her privacy and beneficial to the service provider.” In simple terms, they are deceptive design practices used by websites and apps to collect more, or more sensitive, personal data from you. They are everywhere, and you have most probably been encountering some form of dark pattern on a daily basis while navigating online. Below are two examples of practices I call dark patterns:
1- Screenshot from the TikTok sign up page:
In this example, you cannot tell whether the “Yes” and “No” buttons answer the “are you over 18” question or the “do you allow TikTok to show personalized ads” question. According to my taxonomy, this is a “mislead” type of dark pattern, as it misleads users into accepting personalized ads (the user is forced to say “yes” to confirm being over 18).
2- Screenshot from the website groopdealz.com:
Here, through manipulative language (read the underlined text under the field for entering the email address), the website pressures the user to add his or her email and subscribe to the newsletter, so the category of dark pattern is “pressure.” To read more about the taxonomy and its different categories and sub-categories, click here.
In the full article on the topic, I discussed dark patterns’ mechanism of action and the behavioral biases they exploit (showing how they manage to manipulate us into doing something that was not our initial intention). I also presented their legal status under the European General Data Protection Regulation (GDPR) - no, they are not explicitly illegal - and offered a taxonomy to help us understand what is and what is not a dark pattern. Lastly, I proposed paths for regulatory change that could help us move forward and improve the protection offered to users.
My goal in discussing them in this newsletter is to raise awareness of the topic and to show that the design of websites and apps is not harmless or neutral. Design is a powerful tool for manipulating behavior, and sometimes - particularly because of behavioral biases - manipulative tricks are difficult to detect and avoid, especially online.
What you can do as an individual is try to be critical about your behavior online and ask why you are interacting with certain platforms in a certain way (what is your goal? What will you gain from it? What will the platform gain from it? Is it possible that you are being tricked into behaving in a way that is actually harmful to you?).
Online platforms that offer services we love - such as Facebook, Twitter, TikTok, Amazon, Netflix, Spotify, Tinder, and so on - are not “neutral.” They are companies working for (a lot of) profit, and an important input for making money is your personal data. Their foremost goal is not doing good for people and the world, but pleasing shareholders (hopefully while following the law). The law is rarely (perhaps never) at the same pace as the technologies it applies to (and the related manipulative techniques that make them more profitable), so it is important that you are aware of what is happening and decide what is best for you.
There is a lot to unpack here, and I hope to talk more about the topic in future posts.
See you next week. All the best, Luiza Jarovsky