Dark patterns 101

Recently, I met up with my fellow Interintellect Clo S to talk about so-called ‘mindful UX’ design. We all might recognise instances where websites or tools are designed in a really annoying way, making it hard to find what we’re looking for. I was surprised (although I really shouldn’t have been) to hear that there is a term for such practices: dark patterns.

Dark patterns are really intriguing to me. Apart from the fact that the term sounds like something Kylo Ren (a.k.a. Ben Swolo) might use, the notion that digital tech is sometimes designed to mislead users is very relevant to the work I’m currently doing in my PhD (see for instance this previous post). So, I set out to dive into these dark patterns, and here’s what I found out.

The term ‘dark patterns’ was coined a little over a decade ago, in 2010, by a guy called Harry Brignull. Having completed a PhD in cognitive science, he was apparently one of the first to notice all the sneaky back-alley practices being implemented by companies online. Normally, user experience (UX) design is meant to create an intuitive, easy-to-understand environment for the user. Going one step beyond that, there’s a behavioural technique called ‘nudging’, which is often used in consumer science and marketing. Nudging moves past simply being intuitive and kind of dips its toes in the pool of manipulation. A good example of nudging is the default option: a certain, less desirable (but less desirable for whom?) outcome takes place unless you actively opt out. This might sound murky, but it can actually also be used for good, as we’ve seen here in the Netherlands, where the government has implemented an opt-out strategy for organ donation: you can opt out if you want, but if you don’t indicate your preference, after a while you will be listed as ‘having no objection’ to organ donation, drastically increasing the potential for much-needed organ transplants.

The answer to the question posed earlier (who benefits?) is important. While most of us may feel that an opt-out strategy for organ donation is a good thing (i.e., many, many people benefit), when online tech companies stealthily make a choice for you, it is generally they who benefit, not you. Here are some examples of dark patterns as listed by Harry Brignull on his dark-pattern awareness site.

  • Trick questions: confusing language tricks you into thinking you’re being asked one thing, when in fact it’s the opposite (for instance, a checkbox about agreeing to what is basically spam; see the sketch after this list).
  • Hard-to-cancel subscriptions: subscriptions, especially paid ones, are usually very easy to get up and running, but once you want out… good luck finding the page where you can cancel. I’ve even had the personal experience that some services (Wall Street Journal, I’m looking at you) go as far as requiring you to call an international number to cancel, even though setting things up takes just a couple of mouse clicks.
  • Disguised ads: another example I’ve come across a lot personally is ads disguised as elements of a website or its navigation. For instance, a news site that lists sponsored articles (which are basically ads) among real news articles with exactly the same layout. Clearly biased information posing as objective reporting is a big problem.
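
To make the trick-question example a bit more concrete, here’s a tiny TypeScript sketch (the wording and function names are hypothetical, purely for illustration) of how the pattern works: the exact same unticked checkbox is read in opposite ways depending on how the question is phrased.

```typescript
// A minimal sketch (hypothetical labels and function names) of the
// trick-question pattern: the same unticked checkbox means opposite
// things depending on how the question is worded.

// Honest wording: "Send me marketing emails" – ticked means opted in.
function wantsEmailsHonest(checked: boolean): boolean {
  return checked;
}

// Trick wording: "Tick this box if you do not wish to receive offers
// from our partners" – the negation flips the meaning, so leaving the
// box blank silently opts you in.
function wantsEmailsTrick(checked: boolean): boolean {
  return !checked;
}

// A hurried user skips both boxes and leaves them unticked:
console.log(wantsEmailsHonest(false)); // false – no spam
console.log(wantsEmailsTrick(false));  // true – signed up without realising it
```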

Pretty recognisable, eh? The examples are numerous, and I’ll definitely be keeping an eye out for more dark patterns in my own daily life. Which ridiculous ones have you come across? I’m curious to hear all about your experiences, and about the consequences these shady design features may have had for you!
