In 2010, the Electronic Frontier Foundation was fed up with Facebook's pushy interface. The platform had a way of coercing people into giving up more and more of their privacy. The question was, what to call that coercion? Zuckermining? Facebaiting? Was it a Zuckerpunch? The name that eventually stuck: Privacy Zuckering, or when "you are tricked into publicly sharing more information about yourself than you really intended to."
A decade later, Facebook has weathered enough scandals to know that people care about those manipulations; last year, it even paid a $5 billion fine for making "deceptive claims about consumers' ability to control the privacy of their personal data." And yet researchers have found that Privacy Zuckering and other shady tactics remain alive and well online. They're especially rampant on social media, where managing your privacy is, in some ways, more confusing than ever.
Here's an example: A recent Twitter pop-up told users "You're in control," before inviting them to "turn on personalized ads" to "improve which ones you see" on the platform. Don't want targeted ads while doomscrolling? Fine. You can "keep less relevant ads." Language like that makes Twitter sound like a sore loser.
Actually, it's an old trick. Facebook used it back in 2010, when it let users opt out of Facebook partner websites collecting and logging their publicly available Facebook information. Anyone who declined that "personalization" saw a pop-up that asked, "Are you sure? Allowing instant personalization will give you a richer experience as you browse the web." Until recently, Facebook also cautioned people against opting out of its facial-recognition features: "If you keep face recognition turned off, we won't be able to use this technology if a stranger uses your photo to impersonate you." The button to turn the setting on is bright and blue; the button to keep it off is a less eye-catching grey.
Researchers call these design and wording decisions "dark patterns," a term applied to UX that tries to manipulate your choices. When Instagram repeatedly nags you to "please turn on notifications" and doesn't present an option to decline? That's a dark pattern. When LinkedIn shows you part of an InMail message in your email but forces you to visit the platform to read more? Also a dark pattern. When Facebook redirects you to "log out" when you try to deactivate or delete your account? That's a dark pattern too.
Dark patterns show up all over the web, nudging people to subscribe to newsletters, add items to their carts, or sign up for services. But, says Colin Gray, a human-computer interaction researcher at Purdue University, they're particularly insidious "when you're deciding what privacy rights to give away, what data you're willing to part with." Gray has been studying dark patterns since 2015. He and his research team have identified five basic types: nagging, obstruction, sneaking, interface interference, and forced action. All of them show up in privacy controls. He and other researchers in the field have noticed the dissonance between Silicon Valley's grand overtures toward privacy and the tools for actually managing it, which remain riddled with confusing language, manipulative design, and other features designed to leech more data.
Those privacy shell games aren't limited to social media. They've become endemic to the web at large, especially in the wake of Europe's General Data Protection Regulation. Since GDPR went into effect in 2018, websites have been required to ask people for consent to collect certain types of data. But some consent banners simply ask you to accept the privacy policies, with no option to say no. "Some research has suggested that upwards of 70 percent of consent banners in the EU have some kind of dark pattern embedded in them," says Gray. "That's problematic when you're giving away substantial rights."
Recently, sites like Facebook and Twitter have begun to give their users more fine-grained control of their privacy on the website. Facebook's newly rolled-out Privacy Checkup, for instance, guides you through a series of choices with brightly colored illustrations. But Gray notes that the defaults are often set with less privacy in mind, and the sheer number of checkboxes can overwhelm users. "If you have a hundred checkboxes to check, who's going to do that?" he says.
Last year, US senators Mark Warner and Deb Fischer introduced a bill that would ban these kinds of "manipulative user interfaces." The Deceptive Experiences To Online Users Reduction Act, DETOUR for short, would make it illegal for websites like Facebook to use dark patterns that involve personal data. "Misleading prompts to just click the 'OK' button can often transfer your contacts, messages, browsing activity, photos, or location information without you even realizing it," Senator Fischer wrote when the bill was introduced. "Our bipartisan legislation seeks to curb the use of these dishonest interfaces and increase trust online."
The problem is that it becomes very difficult to define a dark pattern. "All design has a level of persuasion to it," says Victor Yocco, the author of Design for the Mind: Seven Psychological Principles of Persuasive Design. By definition, design encourages someone to use a product in a particular way, which isn't inherently bad. The difference, Yocco says, is "if you're designing to trick people, you're an asshole."
Gray has also run into difficulty drawing the line between dark patterns and plain bad design.
"It's an open question," he says. "Are they defined by the designer's intent, or the perception in use?" In a recent paper, Gray looked at how people on the subreddit r/AssholeDesign make ethical calculations about design choices. The examples on that subreddit range from the innocuous (automatic updates on Windows software) to the truly evil (an ad on Snapchat that makes it look like a hair has fallen on your screen, forcing you to swipe up). After combing through the examples, Gray created a framework that defines "asshole design" as design that takes away user choice, controls the task flow, or entraps users into a decision that benefits not them but the company. Asshole designers also use strategies like misrepresentation, nickel-and-diming, and two-faced interactions, such as advertising an ad blocker that itself contains ads.
Many of these dark patterns are used to juice metrics that signal success, like user growth or time spent. Gray cites an example from the smartphone app Trivia Crack, which nags its users to play another game every two to three hours. Social media platforms have used those kinds of spammy notifications for years to induce the FOMO that keeps you hooked. "We know if we give people things like swiping or status updates, it's more likely that people will come back and see it again and again," says Yocco. "That can lead to compulsive behaviors."
The darkest patterns of all arise when people try to leave these platforms. Try to deactivate your Instagram account and you'll find it's exceptionally hard. First, you can't even do it from the app. From the desktop version of the site, the setting is buried inside "Edit Profile" and comes with a series of interstitials. (Why are you disabling? Too distracting? Here, try turning off notifications. Just need a break? Consider logging out instead.)
"It's putting friction in the way of attaining your goal, to make it harder for you to follow through," says Nathalie Nahai, the author of Webs of Influence: The Psychology of Online Persuasion. Years ago, when Nahai deleted her Facebook account, she found a similar set of manipulative strategies. "They used the relationships and connections I had to say, 'Are you sure you want to quit? If you leave, you won't get updates from this person,'" and then displayed the pictures of some of her close friends. "They're using this language which is, in my mind, coercion," she says. "They make it psychologically painful for you to leave."
Worse, Gray says, the research shows that most people don't even know they're being manipulated. But according to one study, he says, "when people were primed ahead of time with language to show what manipulation looked like, twice as many users could identify these dark patterns." At least there's some hope that greater awareness can give users back some of their control.