Welcome to Algocracy: The Power of Algorithmic Bias

Algorithmic bias refers to the false sense of control that humans have over some of the decisions we make.

Large companies that program algorithms may not be transparent about their goals and mechanisms.

Looking for a job. Filing a complaint with customer service. Applying for a loan or another banking product. Booking a flight or a hotel. The news, the information, and the ads you see when you log into social media… Algorithms sit behind countless daily tasks, in more industries than we imagine, giving rise to a phenomenon known as algorithmic bias.

These silent mechanisms increasingly drive the world without our realizing it. Most striking of all, they learn to become ever more effective from the data we provide them. They constantly adapt to the human being coded as a "user," striving to deliver an ever more personalized, fast, and satisfying experience with minimal effort on our part.

Their tentacles reach into dating apps, where they can even mediate our choice of partner by presenting a narrow set of candidates supposedly matched to our preferences. This foray into the digital world is one example of algorithmic bias, which is believed to influence even our political decisions.

Algorithms can trap us in a bubble of information.

What is algorithmic bias?

Algorithms can make our lives easier. If you are a nature lover and an environmental advocate, for example, you will likely see more and more content on those topics on social media. That is not necessarily a bad thing. But things change when your interests are not entirely healthy.

Let's not forget the terrible case of Molly Russell. This teenager searched for topics related to suicide, and eventually everything her social networks showed her was content on that theme. Almost without realizing it, we can become trapped in an information bubble that other trends and more diverse content no longer penetrate.
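The mechanism behind such a bubble can be sketched in a few lines. The following is a toy illustration, not any real platform's code: a recommender that boosts the weight of whatever topic the user clicks and renormalizes, so that a handful of repeated clicks crowds every other topic out of the feed.

```python
def update_weights(weights, clicked_topic, boost=1.5):
    """After each click, boost the clicked topic's weight and renormalize,
    so the feed tilts further toward what the user already engaged with."""
    new = dict(weights)
    new[clicked_topic] *= boost
    total = sum(new.values())
    return {topic: w / total for topic, w in new.items()}

# Start with an even feed across four topics.
weights = {"nature": 0.25, "sports": 0.25, "news": 0.25, "music": 0.25}

# The user clicks "nature" ten times in a row.
for _ in range(10):
    weights = update_weights(weights, "nature")

# "nature" now takes about 95% of the feed; the rest has nearly vanished.
print(round(weights["nature"], 2))  # prints 0.95
```

No platform uses anything this crude, of course, but the feedback loop is the same: engagement feeds the weights, and the weights narrow the next round of recommendations.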

Algorithmic bias, as used here, refers to our false sense of control over the information we receive and the decisions we make, produced by code that quietly infiltrates our daily lives and insists on presenting us with data that is almost never neutral. And behind that code there are almost always hidden interests.

You don’t decide; the algorithm decides for you
There is a phenomenon we see more and more often as AI becomes normalized: it gives us a false sense of control and self-efficacy. This feeling will only grow as tools like ChatGPT, which help us with theses, academic work, and endless daily tasks, become routine.

We will feel more productive, but in reality it will be the chatbot performing the tasks that concern us. This is not always negative, but it amplifies the algorithmic bias described above: the perception that we decide and act without any intervention when that is not the case.


Algorithms are unfair

Cathy O'Neil is a mathematician who wrote a very popular book, Weapons of Math Destruction (2016). In it, she describes certain algorithms as "weapons of math destruction." For a start, these computational models are not immune to moral and cultural biases, to say nothing of the vested interests behind them.

The book tells the story of a teacher who was fired after an algorithm gave her a negative evaluation, one that drew on data such as personal messages and medical reports. The same thing happens when algorithms assess mortgage applications or aid allocations: certain ethnic groups end up systematically disadvantaged.

Yet most companies and organizations accept these quick analyses at face value. Algorithmic bias leads them to conclude that whatever an algorithm produces must be valid, even when it is unfair, and often when no person has even reviewed the underlying data.

Algocracy: Algorithms in the service of politics

Politicians are sometimes thought to be distant from people's real problems. Another criticism is that they overspend on advisors and still make mistakes in their decisions.

A study recently published by the consulting firm Deloitte suggests something striking: in the future, algorithms and artificial intelligence could take over a significant share of these tasks. They can readily analyze the data that big tech companies collect about us through our mobile phones, and so identify our needs and shape more appropriate policy responses.

Likewise, AI could be trained to prevent administrations from committing fraud. Its analytical capacity would replace a large number of consultants and free public institutions from endless workloads. Algocracy, understood as the rule of algorithms over tasks once done by people, may sound dystopian, but it is a real possibility.

A study from Utrecht University suggested it could be beneficial to let algorithms handle the bureaucratic side of government agencies. The reason? Citizens tend to place more trust in work executed by a machine (another obvious bias).


Algorithmic bias is here to stay, and it is growing stronger. We will keep believing that most of our purchases, the people we follow on social networks, and the opinions we hold are products of our own will. We will keep perceiving ourselves as free minds while, quietly, we become more and more conditioned.

We see this especially in young people, who are increasingly unhappy living in a digital universe built on social comparison. We need to understand that algorithms are not self-created entities. Big companies program them, and that programming always has a purpose.

If we are moving toward a future where humans and AI work together, those who train and program AI must be transparent and guided by more ethical, fair, and healthy values. We need to regulate these mechanisms that are gradually changing users' behavior.

This content is for informational and educational purposes only. It is not a substitute for diagnosis, advice, or treatment by a specialist. If you have any doubts or concerns, consult a trusted professional.

Michael Stepansky

Conducts studies in the field of political science. Writes articles based on media research.
