The Ethical Problems That “The Social Dilemma” Exposes

Technology reconfigures the way we perceive.


Aranza Sánchez


The advantages that technology has brought to the world are undeniable. Technological advances not only allow us to communicate but also to face different crises with the best tools.

However, excess and the pace of change can lead to catastrophic consequences, which is why it becomes crucial to reflect on the reach of technology and social networks.

The whole world is going crazy for the new Netflix documentary “The Social Dilemma”, and for good reason.

Technology is not a passive structure waiting to be used. Rather, it behaves like an entity that grows ever stronger and demands our attention.

Artificial intelligence has thoroughly reconfigured the way we perceive ourselves, others, and the whole world.

The ethical problems.

After reflecting on the documentary, I tried to identify a single problem from which all the others derived. I couldn’t find it.

The fact that artificial intelligence raises so many problems at once makes finding a solution far more complex.

However, there is a way to classify multiple problems:

1. Surveillance capitalism.

The French philosopher Michel Foucault famously developed the theory of the panopticon.

Basically, we can summarize what he meant by this concept in one sentence:

We are watched.

At the end of the 18th century, Jeremy Bentham designed an architectural structure to be installed in prisons. It consisted of a tower located in the center, with the cells arranged around it in a circle.

This allowed those inside the tower to watch over each of the cells without any obstacles.


Today there is talk of a digital panopticon, and perhaps this form of surveillance is far more dangerous.

Foucault’s panopticon had the ability to shape people’s behavior precisely because they knew they were being watched and would be punished for going against the rules.

In the digital panopticon, the situation is completely different. The behavior of individuals also changes, with the difference that no one perceives that they are being observed and manipulated at the same time.

Since it is almost invisible surveillance, it is not perceived as violent.

On the contrary, those of us on the other side of the screen believe that we are free, because we are presented with endless streams of information and different options to choose from.

But those options have been meticulously selected according to our preferences, which guarantees that we consume that content.
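To make this concrete, here is a toy sketch, purely illustrative and not any platform’s actual code, of how preference-based selection works: candidate posts are scored against a user’s inferred interest profile, and only the top of that ranking is ever shown. The function name, post fields, and profile weights are all invented for the example.

```python
# Illustrative sketch of preference-based feed ranking.
# None of these names come from a real platform's codebase.

def rank_feed(posts, interest_profile):
    """Sort posts by predicted engagement for this user.

    posts: list of dicts, each with an 'id' and a set of 'topics'
    interest_profile: dict mapping topic -> inferred affinity weight
    """
    def score(post):
        # Sum the user's affinity for every topic the post touches.
        return sum(interest_profile.get(topic, 0.0) for topic in post["topics"])

    # Highest predicted engagement first. The user only ever sees the
    # top of this ordering, which is what their "choice" amounts to.
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "a", "topics": {"politics", "outrage"}},
    {"id": "b", "topics": {"cooking"}},
    {"id": "c", "topics": {"outrage", "memes"}},
]
profile = {"outrage": 0.9, "memes": 0.4, "cooking": 0.1}

feed = rank_feed(posts, profile)
print([p["id"] for p in feed])  # the options presented are pre-ranked
```

The point of the sketch is that the user never chooses from a neutral pool: the menu itself is the output of an optimization run on their own behavior.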

The messages that are transmitted through social networks are so powerful that they not only achieve the goal of manipulating us but also make us contribute to our own self-exploitation.

2. The means above the ends.

Consumerism has been in the making for a very long time; however, discoveries and advances are now giving it a digital character.

“If you don’t pay for the product, then you are the product” — Tristan Harris

This premise, which Harris repeats throughout the documentary, is the one that prevails within the industry of digital consumerism, and it is so common that we tend to overlook it.

The point is that we become mere transactions. We go from being people — with desires, aspirations, rights — to things that can be modified and sold.

Kant identified precisely the problem with seeing people as things rather than as ends in themselves: human dignity is left out of the equation.

We cannot afford to lose sight of human dignity, because along with it other things are bracketed as well, such as justice and freedom.

3. Who is responsible?

This question is also asked throughout the documentary, but it is difficult to answer with certainty.

On the one hand, the owners and employees of all these big companies, like Instagram, YouTube, or Facebook, would have to answer these ethical questions.

“The algorithm has a mind of its own, and even if one person writes it, it is written in such a way that you build the machine and then the machine changes itself” — Bailey Richardson.

In other words, there is a point where we may lose track of what we create. When it comes to an algorithm that ends up manipulating millions of people, who should be held accountable? You will probably say the one who created it.

But we can also argue that no one had foreseen that the algorithm would change in such a way that it would affect the lives of millions of people.

Is there a moral dilemma here?

4. Digital Regulation.

The massification of the Internet, together with the massification of new technologies, has brought about dramatic changes in the way individuals and companies interact and in how transactions are carried out.

We live in a digital economy in which people communicate quickly and efficiently thanks to the rapid interconnection, a situation that facilitates access and consumption of huge amounts of information.

This has made it possible for companies to exploit data on a large scale, and for information to be manipulated to a degree that was unthinkable until a few years ago.

Personal information allows companies to improve their services, but also to use such information in ways that potentially conflict with the human right to privacy and human dignity.

The regulation of personal data is currently a hot topic in the international debate. Much remains to be done and regulated.

Netflix itself is not exempt from this topic: it also uses algorithms that show us whatever fits our preferences.

The question here is, can we make use of the way the algorithms work so that they benefit us and don’t manipulate us? If the answer is yes, how can we do it?

