
The illusion of choice

This is how your digital behaviour is designed



Diego Lopez Yse

3 years ago | 4 min read

Do you think your actions are the result of your own free choices? What if those actions are the inevitable and necessary consequence of antecedent states of affairs? What does this mean for your free will?

In a deterministic world, where there’s only one possible future for all our actions, digital users can become more predictable and monetizable than ever.

In fact, by using creative designs and deceptive strategies, companies can create deterministic worlds and exploit the fact that humans are hardwired to choose the path of least resistance.

Site and app design becomes highly relevant in this scenario, because if you know how people think, you can design choice environments that make it easier for people to choose what you want them to choose.

Several companies observe and collect an incredible amount of information about user behaviour in order to refine what are called “choice architectures”: discrete design elements intended to influence human behaviour through how decisions are presented.

There are choice architectures all around you, and they are never neutral: they always influence user behaviour, even when they fail to accomplish their objective or there’s no explicit strategy behind them.

Nudging

“Nudging” refers to how users can be driven towards making certain choices by appealing to psychological biases. A nudge is any aspect of the choice architecture that alters people’s behaviour in a predictable way without forbidding any options or significantly changing their economic incentives. It represents a small feature of the environment that alters people’s behaviour but does so in a non-enforced way.

It’s true that choice architectures influence by default (even when there’s no apparent intention behind them), so a nudge is best understood as an intentional attempt to influence choice.

Example of nudging: the pre-selected amount suggests it is the ‘right’ one. Source: The Internet Patrol
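
To make the mechanism concrete, here is a minimal TypeScript sketch of a default-amount nudge. The option set and amounts are hypothetical, not taken from any real donation form; the point is that the choice set is identical either way, but the pre-selected option is what a low-effort user ends up submitting.

```typescript
// Minimal sketch of a default-amount nudge (illustrative only; the
// amounts are hypothetical, not taken from any real site).

interface DonationOption {
  amount: number;       // value in dollars
  preselected: boolean; // the "default" the user lands on
}

// The choice set is identical in both versions; only the default differs.
const neutralChoices: DonationOption[] = [
  { amount: 10, preselected: false },
  { amount: 25, preselected: false },
  { amount: 50, preselected: false },
];

const nudgedChoices: DonationOption[] = [
  { amount: 10, preselected: false },
  { amount: 25, preselected: true }, // pre-selecting $25 signals it as the "right" amount
  { amount: 50, preselected: false },
];

// A user who takes the path of least resistance submits whatever is
// already selected; with no default they must make an active choice.
function submittedAmount(options: DonationOption[], userPicked?: number): number | null {
  if (userPicked !== undefined) return userPicked;
  const preset = options.find(o => o.preselected);
  return preset ? preset.amount : null; // null: the form forces an explicit choice
}

console.log(submittedAmount(nudgedChoices));  // 25, the default carries through
console.log(submittedAmount(neutralChoices)); // null, no default to fall back on
```

Note that neither version forbids any option or changes the prices, which is exactly what makes this a nudge rather than a restriction.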

Nudges work without limiting the user’s original set of choices, and can produce positive results.

Whether through reminders, personalized notifications, awards or default settings, nudges can steer people to make better choices. But the question is: better for whom? What if there were opposing interests behind the act of driving users’ behaviour?

The dark side

There are tricks that go beyond nudges to influence decision making and cause users to do things they might not otherwise do. Dark patterns are subtle ploys many companies use to manipulate you into doing something, like buying or signing up for a service, or disclosing personal or financial details.

A design that delivers the best conversion rate might not be the one that delivers the best user experience.

Example of a dark pattern applied to hide extra costs. Source: Shopify
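
As a rough illustration of how hidden costs work, the TypeScript sketch below models this kind of “drip pricing” with hypothetical prices (kept in cents so the arithmetic stays exact): the listing page shows only the headline price, and the extra fees appear only once the user reaches checkout.

```typescript
// A minimal sketch of drip pricing (hypothetical fees, prices in cents so
// the arithmetic stays exact): charges that only appear at the final step.

const advertisedCents = 4999; // the headline price shown on the listing page

const lateFees = [
  { label: "Service fee", cents: 650 },
  { label: "Processing fee", cents: 399 },
];

function priceShownCents(step: "listing" | "checkout"): number {
  // Early steps show only the headline price; the fees are "dripped" in later.
  return step === "listing"
    ? advertisedCents
    : advertisedCents + lateFees.reduce((sum, fee) => sum + fee.cents, 0);
}

console.log(priceShownCents("listing") / 100);  // 49.99, what the user commits to mentally
console.log(priceShownCents("checkout") / 100); // 60.48, what they are actually asked to pay
```

By the time the real total appears, the user has already invested enough effort in the flow that abandoning the purchase feels costly.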

Dark patterns may exploit timing to make it harder for users to be fully in control of their faculties during a key decision moment. In other words, the moment when you see a notification can determine how you respond to it, or whether you even notice it.

Dark patterns also leverage a very human characteristic: we’re lazy by design, so producing friction is always an effective strategy. Designs that require many more clicks, taps and interactions discourage users from engaging with that content.

In this example, the user must actively unselect the extra product, otherwise it will appear in the checkout. Source: Econsultancy
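
The sketch below models this opt-out pattern in TypeScript, with hypothetical items and prices in cents: the add-on is placed in the basket by default, so doing nothing means paying for it, while avoiding the charge requires an extra deliberate action.

```typescript
// A minimal sketch of an opt-out add-on (hypothetical items, prices in
// cents): the extra product starts in the basket and stays there unless
// the user actively removes it.

interface LineItem {
  name: string;
  cents: number;
  addedByDefault: boolean;
}

function buildBasket(removedByUser: string[] = []): LineItem[] {
  const basket: LineItem[] = [
    { name: "Flight ticket", cents: 12000, addedByDefault: false },
    // The dark pattern: the insurance is added without the user asking for it.
    { name: "Travel insurance", cents: 1499, addedByDefault: true },
  ];
  // Only items the user explicitly unticked are removed.
  return basket.filter(item => !removedByUser.includes(item.name));
}

const totalCents = (items: LineItem[]) =>
  items.reduce((sum, item) => sum + item.cents, 0);

// Path of least resistance: do nothing and pay for the add-on.
console.log(totalCents(buildBasket()) / 100);                     // 134.99
// Avoiding the charge requires an extra, deliberate action.
console.log(totalCents(buildBasket(["Travel insurance"])) / 100); // 120
```

The honest version of this flow would start with the add-on unselected and let the user opt in.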

The site Darkpatterns.org details some of these deceptive mechanisms, like the “roach motel” (when the design makes it simple to sign up but hard to cancel), “disguised ads” (ads masquerading as content that isn’t trying to sell you something), or “privacy Zuckering” (named after Facebook CEO Mark Zuckerberg: the trick of getting you to overshare data).

These strategies extend everywhere on the internet, and although some countries try to regulate this behaviour, the truth is that not all governments seem to be taking action.

As digital users, we are in a very hard spot. The fact that many US banking sites are harder to read than Moby-Dick (leaving 58% of US bank content unreadable for the average American), and that about 11% of retail websites contain dark patterns, reveals that we are continuously under siege. Is there any way out?

Final thoughts

Manipulation through behavioural techniques can occur quietly and leave no trace. Since companies can drive customers’ decisions through heavy analytics and user interfaces, it’s easy to imagine a digital future in which social platforms employ algorithms to predict the virality and monetizability of each post, only accepting or highlighting the ones that could generate sufficient revenue.

Companies are super focused on testing and experimenting with different techniques to get the most desirable responses, and since they are experts in the discipline, it seems hard to avoid being deceived.

The good news is that education is a powerful tool, and by knowing some of their strategies and trying to be aware of our cognitive biases, we might be able to sidestep some of the traps.

Sites like Darkpatterns.org or Princeton’s Web Transparency have lots of examples of dark patterns and awesome material. I suggest you take some time and go through them. It will pay off. You can also explore @darkpatterns or use the hashtag #darkpatterns on Twitter to call out what you’ve found or discover what others have found. Social media engagement is a good way to put pressure on companies to stop using these practices.

If you can detect deceptions, you’re more likely to avoid them.

Interested in these topics? Follow me on Medium, Linkedin or Twitter
