What are dark patterns and how are they tricking me?

Security researcher Ophir Harpaz was browsing a travel website to book a flight when she noticed a little prompt many of us are familiar with. It said: “38 people are looking at this flight.” At first glance, this may seem like a helpful warning that the flight might soon sell out. Except it wasn’t.

In a tweet that has since gone viral, Harpaz explains that she inspected the website’s code. It turned out the site was designed to claim, at any moment, that a random number of people between 28 and 45 were looking at the same flight. Such design tricks have a name: dark patterns.
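
Based on Harpaz’s description, the trick needs only a few lines of code. The sketch below is a hypothetical reconstruction in TypeScript, not the travel site’s actual source; only the 28 to 45 range comes from her tweet, and the function name is made up for illustration.

```typescript
// Hypothetical sketch of a fake "live viewers" counter.
// Assumption: only the 28 to 45 range is taken from Harpaz's tweet;
// everything else here is illustrative.
function fakeViewerCount(min = 28, max = 45): number {
  // Return a random integer in [min, max], regardless of real traffic.
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Each page load shows a different number with no connection to actual demand.
console.log(`${fakeViewerCount()} people are looking at this flight.`);
```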

While physical shops have their own tricks, including music, scents and layout, the online world offers new ways to lure people into spending more money or giving up more personal data. That’s how we end up with premium subscriptions and newsletters we never intended to sign up for.

The many types of dark patterns

In 2010, user-experience researcher Harry Brignull coined the term “dark patterns” to describe subtle interface-design ploys used on websites and apps to get you to do things you would not otherwise do. He says dark patterns tend to be “manipulative, typically quite deceptive and usually rather frustrating when you realise you’ve been taken in”.

To expose such design techniques, Brignull co-founded the Dark Patterns website and promoted the eponymous hashtag, which document examples of such tactics and shame the companies involved.

There are many kinds of dark patterns. “Sneak into basket”, for example, happens when you’re shopping online and the site slips an additional item into your basket.

With so-called “confirmshaming”, the option to decline is worded so as to shame the user into compliance. A typical example is a pop-up dialog box that asks for your email address in exchange for a discount, where declining the offer is framed as a shameful choice: “No thanks, I like paying full price” or “No thanks, I hate saving money”.

Meanwhile, a “roach motel” is a design that makes it very easy to sign up for a service, such as a premium subscription, but very hard to cancel. Then there is “forced continuity”, where your free trial with a website comes to an end and your credit card is charged without warning.

Dark patterns rely on our automatic or emotional responses. Psychologist and economist Daniel Kahneman famously described two modes of thinking: fast and slow. The fast kind is unconscious, automatic and less rational, while the slow kind is conscious and rational. Dark patterns exploit our reliance on the former.

New study shows how common they are

A new study by researchers at Princeton University and the University of Chicago analysed 11,000 shopping websites and found 1,800 instances of dark patterns. “At best, dark patterns annoy and frustrate users,” the researchers say. “At worst, they can mislead and deceive users.” This includes causing financial loss, tricking people into giving up vast amounts of personal data or inducing compulsive and addictive behaviour in both adults and children.

The prevalence of dark patterns would likely have been even higher had the study analysed colour as well as text-based interfaces. Light grey can be used to hide the option to decline, and colours we associate with particular actions, such as green and red, can be swapped to wrong-foot us.

“Colour can be used to guide someone in the right direction to accomplish a task effectively and efficiently,” says Priscilla Esser of the Interaction Design Foundation. “When companies want to trick users into selecting the more expensive option of a subscription or want to make finding certain ‘unsubscribe’ options more difficult, the designers can use the same knowledge about colours in a dark way.”

Dark patterns are more likely to appear on popular websites, as ranked by Alexa top sites, according to the Princeton and Chicago researchers. What’s more, there are 22 third-party entities that offer dark-pattern design services to shopping websites. Nudges are weaponised and sold.

Given that advertisers have access to our digital identity and process much of our personal data, could this mean dark patterns can be personalised too? Brignull says he hasn’t come across any instances of personalised dark patterns, but concedes that it’s possible in theory. It is a potential risk on the horizon.

What the law says about dark patterns

So far, there have been some attempts to tackle dark-pattern design with legislation. In the European Union, which tends to be at the forefront of consumer and privacy protection, the Consumer Rights Directive, in force in the UK, outlaws the “sneak into basket” dark pattern.

In the United States, the proposed DETOUR Act seeks to curb manipulative dark-pattern behaviour by prohibiting the largest online platforms, those with more than 100 million monthly active users, “from relying on user interfaces that intentionally impair user autonomy, decision-making or choice”.

For legislation to be effective, a central issue that needs to be resolved is where we draw the line; where does aggressive advertising end and manipulation begin? And what kind of information do we expect from companies?

Even if it’s true that 38 people are looking at the same flight as you, should the company let you know? It could potentially be helpful and it could save you from missing out. But we also know that so-called “social proof” makes people spend more. So should companies nudge us with that information, even if it’s true?

For now, it’s still largely up to consumers to protect themselves. In the short run, a third-party provider could rank and shame shopping websites according to their use of dark patterns. In the long run, we need to reconsider our relationship with online retailers.