Let us imagine that we want to start doing more exercise as a springtime resolution. Just then, someone hands us a flyer on the street advertising a gym with a one-week trial, no strings attached. It seems like a clear sign to us, and besides, if we do not like it, we do not have to pay anything, so we sign up. And, to sign up, we only have to raise a victorious thumbs-up to the man who handed us the flyer, like a senator from ancient Rome.
At first glance, it seems like an innocent promotion, but the fine print—printed in very tiny letters and written in technical terms, far removed from the language we use in our everyday lives—indicates that, once that trial week is over, the subscription will renew automatically. In addition, it sets out a deliberately difficult cancellation process: it can only be done in person, on Wednesdays, and between 9:30 and 10:16. A very limited time slot. On top of that, it is not canceled at the gym itself, but rather we have to go to another one, the central gym, which is relatively close, but we get lost on the way there (a very difficult route, by the way).
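Just to see how narrow that slot really is, here is a hypothetical sketch in Python (all names invented for illustration) of the gym's cancellation rule: in person, Wednesdays only, between 9:30 and 10:16, a window of 46 minutes out of the 10,080 in a week.

```python
from datetime import datetime

# Hypothetical sketch of the gym's cancellation rule described above:
# Wednesdays only, between 9:30 and 10:16.
def cancellation_allowed(moment: datetime) -> bool:
    """Return True only inside the deliberately narrow window."""
    is_wednesday = moment.weekday() == 2  # Monday == 0, so Wednesday == 2
    minutes = moment.hour * 60 + moment.minute
    return is_wednesday and (9 * 60 + 30) <= minutes <= (10 * 60 + 16)

# A Wednesday at 9:45 works; the same time on Thursday does not.
print(cancellation_allowed(datetime(2024, 4, 3, 9, 45)))  # True
print(cancellation_allowed(datetime(2024, 4, 4, 9, 45)))  # False
```

Every minute outside that tiny window, the rule quietly answers "come back another day," which is exactly the point of the design.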
It may be that none of this is a problem because we do not want to cancel; we love the gym, it is clean, it does not smell like sweat, the machines are not permanently occupied by very strong men who yell a lot (it is a magical gym), the price is affordable, and we do not want to leave. But what if the amount goes up without prior notice? What if they start charging us extra for every machine we use? The Pilates classes stop being included or, simply, we get bored. We want to cancel, but it is very difficult, so they charge us for another month.
This type of design, with a (very) easy sign-up process, but one in which canceling turns out to be very difficult, would be a “real-life” (non-virtual) example of something I want to talk about today: dark patterns, very present in the world of the internet.
Another common example occurs in the in-person contracting of household services (electricity, gas, telephone service). A salesperson may loudly highlight a very low promotional price and assure us that "there is no commitment," while the actual contract (which we sign quickly, barely reading it, because we trust that very pleasant man) includes a clause requiring commitment or imposing a financial penalty for early cancellation. This is what is known in English as bait and switch: they attract us with an appealing offer, but the real terms of the contract are different, so we are not actually making an informed decision. Something similar happened to me many years ago; the girl selling it to me (very young) was extremely pleasant, so I switched gas companies. That switch penalized me enormously, and I ended up paying "extra" costs whose origin I did not know. In another case, it was with insurance: I was offered a cheaper rate, but it only lasted a few months, and I ended up paying much more.
These kinds of stratagems can also be found in door-to-door sales aimed at older people, where they ask us to "just sign to receive information," when in reality the signature activates a binding contract. This practice combines elements of coercion (social pressure in the moment, which makes it enormously difficult for us to say no to that veeeery nice man) and deception through ambiguity (the document is not presented clearly). A bit like preferred shares, but with less serious consequences, although still important ones.
In all of these cases, the objective is the same: to take advantage of the lack of clear information, the pressure of the moment, or the difficulties involved in processing complex details, in order to induce decisions that the person would not make voluntarily. In the digital world, as I was saying, these are known as dark patterns.
What are these "dark patterns," which sound like something out of an intergalactic movie? They are design tricks that lead us, without realizing it, to make decisions that we do not really want to make. Instead of helping us choose clearly, these designs confuse us or pressure us into accepting something that is not in our interest: subscriptions, purchases, or the use of our personal data. Nowadays they appear almost everywhere: the European Commission published a study indicating that almost 97% of websites contained some kind of dark pattern. In addition, they are becoming increasingly difficult to detect, because automated systems adapt them to what we do. This means that even with a high level of digital literacy, we still fall for them. For example, according to the Eurobarometer, 74% of citizens say they are concerned about the use of their personal data on the internet. So why do we so routinely give up our data online?
Reading about this, I realized that I regularly fell for several of these patterns. Perhaps not the most obvious ones (although I am not safe from them either), but other more subtle ones, giving away personal data, for example, but also buying unnecessary things or failing in my attempt to cancel subscriptions to services I no longer wanted.
There are two major types: “soft” dark patterns and “aggressive” dark patterns. The soft ones consist of subtle tricks that play on our very common little human weaknesses. For example, leaving checked by default the option that is most beneficial to the company. Research in behavioral economics (in which Richard Thaler, who by the way was at our Longevity Congress, is an expert) has shown that whatever appears selected by default greatly influences our decisions, largely simply because changing it requires a small additional effort. Sometimes digital training and experience can help us “not fall for it.” In the case of aggressive patterns, no amount of training helps; that is where the stronger tactics come in: messages that rush you, screens that hide the option you are interested in, colors or sizes that push you to press the wrong button, or emotional messages that want to make you feel bad if you do not accept. This type of design is so manipulative that it can deceive even people with a great deal of digital experience. In fact, a classic experiment on these mechanisms showed that users could be persuaded to accept conditions they had initially rejected simply by modifying the interface design.
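The default effect described above can be sketched in a few lines of Python (a hypothetical example, with all field names invented): a sign-up form whose marketing-consent box starts out checked, so a user who never touches it is opted in purely by inaction.

```python
# Hypothetical sketch of a pre-checked default: the marketing-consent box
# starts as True, so doing nothing means opting in.
DEFAULTS = {"email": "", "accept_terms": False, "marketing_consent": True}

def submit(user_fields: dict) -> dict:
    """Merge only the fields the user explicitly touched over the defaults."""
    return {**DEFAULTS, **user_fields}

# This user filled in an email and accepted the terms; they never changed
# (perhaps never even noticed) the consent box, so the default decides for them.
result = submit({"email": "ana@example.com", "accept_terms": True})
print(result["marketing_consent"])  # True: inherited, not chosen
```

Notice that nothing here is technically hidden; the manipulation lies entirely in which value requires effort to change.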
Why do they work even if we are used to moving around in the digital realm? Because they are combined with one another and because we often use cell phones, where small screens and the speed with which we browse make it easier for us to press what we do not want. And the more tired we are or the less accustomed we are to technology, the easier we are to manipulate.
Some of the most common tricks used to influence us, according to the literature, are these:
- Leaving checked by default the option that is most beneficial to companies (which means many people do not change it).
- Creating a sense of fake urgency (“Last spots!”, “3 minutes left!”). This has made me end up with things I was not really sure I wanted. A pair of shoes I only wore once, much to my regret.
- Showing exaggerated testimonials or reviews to convince us that "everyone is doing it" or that something is excellent, even though our experience ends up telling us otherwise. This happens a lot with the supposed usefulness of certain products (which seemed like they were going to change our lives and end up abandoned in a corner).
- Strategically raising or lowering prices so that something seems cheap. This also happens in the “real world” (I will not name companies, but it happens with the famous “VAT-free day”).
- Filling the screen with information so that we get tired and accept the first thing we see. In other words: exhausting us.
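As an illustration of the fake-urgency trick in the list above, here is a hypothetical sketch (names invented) of a countdown that looks like a deadline but simply restarts for every visitor:

```python
# Hypothetical sketch of a fake-urgency timer: the "3 minutes left!" counter
# is computed from each visitor's own page load, so the offer never expires.
def seconds_remaining(page_load_time: float, now: float, length_s: int = 180) -> int:
    """Seconds 'left' shown to this visitor, counted from when they arrived."""
    return max(0, length_s - int(now - page_load_time))

# Two visitors arriving an hour apart both start at "3 minutes left."
print(seconds_remaining(page_load_time=0.0, now=0.0))        # 180
print(seconds_remaining(page_load_time=3600.0, now=3600.0))  # 180
```

The clock measures nothing about the offer, only about us, which is why reloading the page often resets it.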
The result is usually similar: we end up accepting terms we did not want, buying more than we need, or keeping subscriptions we no longer use. And spending much more money than we think (and would want). For example, according to a Consumer Reports study in the United States, hidden fees and other similar mechanisms, common in dark patterns, can cost an average American family more than $3,200 a year. More than 40% of consumers say they have suffered unexpected financial consequences from dark patterns in online shopping.
Although these practices affect the entire population, they can have a particular impact on older people. Not because they are naive, but because many of these systems are designed to take advantage of very common situations in everyday life: less familiarity with certain digital environments, greater interpersonal trust, or greater difficulty in quickly processing large amounts of information on small screens, which makes them more vulnerable to making harmful decisions without realizing it. Again, neither youth nor digital knowledge will keep us safe, but it is important that we be able to understand these patterns (or, at least, know that they exist) in order to build more autonomous lives in the digital world as well. And again, as with cans of pineapple, I place the responsibility on companies, not on users.
For me, the existence of these designs raises one important question (or two). On the one hand: to what extent are our decisions still really ours when the environments in which we choose are designed to push us in a specific direction? Dark patterns affect not only consumption, but also everyday autonomy. And although they can harm anyone, their effects may be greater for those who face less familiar or more complex digital environments. But the problem (and here is the second important question) cannot be reduced to "learning to use the internet better." It also means asking ourselves what kind of digital environment we want to build and what limits should exist to protect decisions that, in principle, ought to be free.