We spend a lot of time on social media, but to what extent does it condition our knowledge and our interpretation of the facts?
Social network algorithms act as a bias on reality. / Photo: Pexels
LatinAmerican Post | Ariel Cipolla
We are trapped on social media. Especially during the pandemic, when options for socializing shrank, new features kept emerging to attract users. For example, the website El Periódico wrote about the "secret of TikTok's success," noting that its number of users grew by 80% during the quarantine.
While social networks can be a great way to detach from the outside world and all the uncertainty surrounding the virus, we should think seriously about how they influence us, that is, about the extent to which they condition our understanding of things. Let's take a closer look at this topic.
Social networks and our perception of reality
The arrival and explosion of social networks made endless content available for users to see and share. While this is a huge advance for society, allowing a kind of informational democracy, it may hide a trap: to what extent do the applications' algorithms actually let us see everything?
The prestigious Argentine neurologist and neuroscientist Facundo Manes commented in the newspaper Diario Popular that there is an important concept for understanding how we think and perceive the world: cognitive biases. In his view, these are mental shortcuts that let the brain process information and respond to situations where we must act quickly.
One of the best known is confirmation bias. According to the outlet En Naranja, it is an active attitude in which we seek out information that confirms our beliefs and opinions. In other words, we rarely look for ideas far from our own; instead, we seek out the opinions that seem to reinforce our reasoning.
This is increasingly evident in journalism, and even more so in politics. The newspaper El Universal points out that "we live surrounded by fake news, conspiracy theories and very polarized in political terms." Debate, then, is not meant to enrich points of view but to show who is right. The problem is that these discussions are almost never fruitful; instead, each side walks away believing it has won.
This false sense of moral authority stems from the idea that everyone thinks as we do. Since we constantly consume content that reinforces our pre-existing ideologies, we automatically discredit ideas coming from outside our circle.
It is what the outlet Página 12 calls an "illusion of horizontality," which locks us into small corners of the network with people who share our ideas or tastes. This can modify our perception of reality, and it shows up in many areas. For example, the fans of a soccer team will follow people connected to that team and come to believe they "dominate" social networks, even though the same happens among rival fans.
Although we constantly complain about the lack of transparency of social networks, we also like to believe that everyone thinks as we do, so, to a certain extent, we are what we criticize. Take the Twitter algorithm: according to Rock Content, it recommends "ideal" tweets for us based on the people we follow and what they consume, though the resulting picture ends up being a mirage.
What ends up happening, then, is that we become more and more locked into our opinions. At first, we follow a growing number of people with ideas similar to ours. Then the networks themselves suggest others we will probably like, because our own follows already interact with them, generating a bias in our view of reality.
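The feedback loop described above can be illustrated with a minimal sketch of a friend-of-friend suggestion heuristic. The account names and follow graph below are invented for illustration, and real platforms' recommendation systems are proprietary and far more complex; the point is only that suggesting "accounts your follows already follow" tends to keep recommendations inside your own cluster.

```python
from collections import Counter

# Hypothetical follow graph: two clusters of accounts ({ana, ben, cam, dan}
# and {eli, fer}) with no follows between them.
follows = {
    "ana": {"ben", "cam"},
    "ben": {"cam", "dan"},
    "cam": {"ben", "dan"},
    "dan": {"ben", "cam"},
    "eli": {"fer"},
    "fer": {"eli"},
}

def recommend(user, k=2):
    """Suggest up to k accounts most followed by the user's own follows."""
    counts = Counter()
    for followee in follows[user]:
        for candidate in follows[followee]:
            # Skip the user and accounts they already follow.
            if candidate != user and candidate not in follows[user]:
                counts[candidate] += 1
    return [name for name, _ in counts.most_common(k)]

print(recommend("ana"))  # → ['dan']: a suggestion from ana's own cluster
```

Because "ana" and her follows have no ties to "eli" or "fer", those accounts can never be suggested to her by this heuristic: the recommendations reinforce the existing circle rather than widening it.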
Something similar appears in a report from the specialized website Strategies and Business, which notes that three researchers from the University of Michigan asked whether social networks can create content "bubbles" for users. They concluded that users' individual choices condition the information they are shown.
Knowing this, we will probably not stop consuming opinions that resemble our own; we would do the same offline. However, to reduce the bias a bit and get a vision closer to reality, we should also consume news and ideas that oppose ours, so that we can contrast them and reach much richer conclusions.