
How to Get Rid of Sexist Algorithms

Since algorithms are created by human beings, we cannot help but transmit our intentions, beliefs, practices and thoughts to them. Like any other human creation, technology is a reflection of what a society thinks and believes in. We’re unable to develop software from a neutral point of view; we’re inevitably biased.

What’s an Algorithm?
Algorithms are sequences of steps built to get a certain result, and they’re everywhere. A cooking recipe can be considered an algorithm, and so can the route we take to work every day. They affect our daily lives in many different ways, and they often determine what happens to us. They define the ads you see on Facebook or Instagram, the first results that come up in a Google search, or whether you’re eligible for a loan. These kinds of “decisions” are based on our personal consumption and our profile (ethnic background, gender, social status). With that information, algorithms reach a conclusion (they produce a result) that they assume is the most “accurate”.
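To make that concrete, here’s a minimal, purely hypothetical sketch of a profile-based decision. The feature names, weights and threshold are invented for illustration; no real credit-scoring system is being described.

```python
# A purely hypothetical profile-based "decision": feature names, weights and
# the threshold are invented for illustration, not taken from any real
# credit-scoring system.
from dataclasses import dataclass


@dataclass
class Profile:
    monthly_income: float    # consumption / financial data
    years_at_address: int    # a "stability" proxy
    postcode_risk: float     # 0.0 (low) to 1.0 (high), derived from where you live


def loan_eligible(p: Profile) -> bool:
    """Return True if the profile clears an arbitrary score threshold."""
    score = (
        0.5 * min(p.monthly_income / 5000, 1.0)    # income, capped at a ceiling
        + 0.2 * min(p.years_at_address / 10, 1.0)  # reward "stability"
        + 0.3 * (1.0 - p.postcode_risk)            # penalize "risky" neighbourhoods
    )
    return score >= 0.6


# The postcode term shows how a seemingly neutral input can stand in for
# ethnic background or social status: the "decision" just follows the data.
print(loan_eligible(Profile(monthly_income=3000, years_at_address=2, postcode_risk=0.8)))
```

Even in this toy rule, a feature as innocent-looking as the postcode quietly encodes where you live, and with it your social status or ethnic background.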

But… accurate according to whom?

Lucía: “A friend is about to become a father, so I’ve been browsing websites on the subject. Now my social media feeds are full of ads for baby products. And here’s an example related to credit scores: https://www.nytimes.com/2019/11/15/us/apple-card-goldman-sachs.html.”

Maru: “Since algorithms represent real life, relying too much on technology means we have to trust rules of the game that are deeply biased. Sometimes algorithms hurt our chances; sometimes they even violate our rights and liberties, which is what happens when we leave it to algorithms to ‘predict’ who might commit a crime and where (https://www.lavanguardia.com/tecnologia/20190318/461013536935/inteligencia-artifical-vigilancia-predictiva-policia.html).”

Diversity is Key for Algorithms
In some cases, it’s clear that the lack of diversity in a development team has a direct impact on the algorithms it creates. Let’s look at a few examples:

  • Artificial Intelligence and Machine Learning

As a rule, the segment of Caucasian men is the one these systems treat most favorably. Not long ago, Google Photos labeled a black man as a gorilla (a toy sketch below shows how skewed training data leads to exactly this kind of failure). This can be partly explained by the fact that the developers creating technology are mostly cis-gender Caucasian men. In Argentina in 2015, only 15% of the students enrolled in programming-related degree programs were women, according to research by the group ChicasEnTecnología, and we don’t even know how other variables are distributed.

It’s worth saying that this percentage wasn’t always so low. Back in the ’60s, when the University of Buenos Aires opened its computer science program, 67% of the students were women, some of whom became very influential in the history of computing in Argentina: Rebeca Cherep de Guber, Cecilia Berdichevsky and Victoria Bajar Simsolo, just to name a few. In the ’70s, that percentage grew, and three out of four students in the program were women. With time, those numbers only declined.
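Here’s the toy sketch mentioned above. It is not the Google Photos system, just a small, invented illustration of the underlying mechanism: when one group dominates the training data, the model learns that group’s pattern and fails on everyone else.

```python
# Toy illustration of training-data skew: group A dominates the data, so the
# model learns group A's pattern and gets group B mostly wrong.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# 950 samples from group A, only 50 from group B.
x_a = rng.normal(size=(950, 1))
y_a = (x_a[:, 0] > 0).astype(int)   # in group A, positive x means label 1
x_b = rng.normal(size=(50, 1))
y_b = (x_b[:, 0] < 0).astype(int)   # in group B, the pattern is reversed

model = LogisticRegression()
model.fit(np.vstack([x_a, x_b]), np.concatenate([y_a, y_b]))

print("accuracy on group A:", model.score(x_a, y_a))   # close to 1.0
print("accuracy on group B:", model.score(x_b, y_b))   # close to 0.0
```

The model reports excellent overall accuracy while being almost useless for the underrepresented group, which is exactly how this kind of failure goes unnoticed.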

  • Selection of Personnel

Gender discrimination is also very noticeable in software that screens job candidates. Algorithms are based on real-life cases. To explain this, the American mathematician Cathy O’Neil uses the case of Fox News founder Roger Ailes, who in 2016 was accused of sexual harassment by numerous women at the network. Those women couldn’t stand his behavior for long and left the company after short periods of service. If, when screening candidates, the algorithm gives a higher ranking to employees who stay in one job for long stretches of time, those women would be screened out.
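A minimal sketch of that tenure heuristic, with invented names and numbers: ranking purely on how long people stayed in previous jobs looks neutral, but it inherits whatever pushed them out of those jobs, harassment included.

```python
# Hypothetical candidates; the names and numbers are made up for illustration.
candidates = [
    {"name": "Candidate 1", "avg_years_per_job": 6.0},   # stayed long at previous jobs
    {"name": "Candidate 2", "avg_years_per_job": 1.5},   # left a hostile employer quickly
    {"name": "Candidate 3", "avg_years_per_job": 4.0},
]

# "Longer tenure first" looks like a neutral rule, but it silently screens out
# people who left toxic workplaces after short periods of service.
ranked = sorted(candidates, key=lambda c: c["avg_years_per_job"], reverse=True)
shortlist = ranked[:2]

for c in shortlist:
    print(c["name"], c["avg_years_per_job"])
```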

Moreover, when we think about it, who puts their career aside to take care of their family? Who takes on the task of looking after loved ones when a family emergency comes up? Today, care work still falls mostly on women (in heterosexual couples). So the algorithm would just be reflecting the practices of our society.

In truth, the algorithm doesn’t discriminate; the developer does. If the need for diversity were taken into account when selecting candidates (what’s called “positive discrimination”), the program could follow that rule and handle applications differently. As we see it, the problem is that when we develop, we can only evaluate our own experience (or just what the task requires), often without questioning it.
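One way that rule could look, as a minimal sketch: the shortlist is still driven by the score, but it must include a minimum number of candidates from underrepresented groups. The names, scores and quota here are invented for illustration.

```python
# A sketch of a diversity-aware shortlist: invented data, arbitrary quota.
def shortlist(candidates, size, min_underrepresented):
    """candidates: list of dicts with 'score' and 'underrepresented' keys."""
    by_score = sorted(candidates, key=lambda c: c["score"], reverse=True)

    picked = by_score[:size]
    quota_met = sum(c["underrepresented"] for c in picked)

    # Swap in the best remaining underrepresented candidates until the quota holds.
    pool = [c for c in by_score[size:] if c["underrepresented"]]
    while quota_met < min_underrepresented and pool:
        # Drop the lowest-scoring pick that isn't underrepresented...
        drop = next(c for c in reversed(picked) if not c["underrepresented"])
        picked.remove(drop)
        # ...and add the highest-scoring underrepresented candidate still available.
        picked.append(pool.pop(0))
        quota_met += 1
    return picked


people = [
    {"name": "A", "score": 0.9, "underrepresented": False},
    {"name": "B", "score": 0.8, "underrepresented": False},
    {"name": "C", "score": 0.7, "underrepresented": True},
    {"name": "D", "score": 0.6, "underrepresented": True},
]
print(shortlist(people, size=2, min_underrepresented=1))  # [A, C] instead of [A, B]
```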

  • Language

Some websites let you ask a representative questions in real time, but on the other side there’s often just a bot answering. Such bots produce automatic response messages (“Contact Support”, for example), like the error messages that appear when a website crashes. In all these cases, the algorithm needs to consider what kind of language to use and how to communicate, because the user at the other end might, for example, be visually impaired. If the algorithm has a sexist bias, it isn’t communicating well and it’s also reproducing prejudices. For example, Google Translate assumes a person’s gender based on their profession, and there are chatbots that always address the user as if they were male. We shouldn’t take this lightly, because language isn’t innocent. We don’t create it; quite the contrary, it defines us.
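A small sketch of that last point, with invented strings: a support bot that defaults to the masculine form assumes every user is male, while a neutral formulation makes no assumption at all.

```python
# Invented strings for illustration. Defaulting to the masculine "Bienvenido"
# assumes every user is male; when we don't know, a neutral form works for everyone.
from typing import Optional


def greet(name: str, gender: Optional[str] = None) -> str:
    if gender == "female":
        return f"Bienvenida, {name}."
    if gender == "male":
        return f"Bienvenido, {name}."
    # Gender unknown: don't guess, use a formulation that doesn't assume it.
    return f"Te damos la bienvenida, {name}."


print(greet("Alex"))                   # Te damos la bienvenida, Alex.
print(greet("Alex", gender="female"))  # Bienvenida, Alex.
```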

Is it Possible to Create Algorithms with a Gender Perspective?
There is no simple solution or recipe for how to achieve this; in fact, it can’t even be measured. Algorithms reflect how our society is. What we can do is train those who develop algorithms to have a gender perspective: to question things, ask, learn and take all of that into account when working. Having more diverse teams is the first step. The presence of women on development teams doesn’t by itself guarantee less sexist creations, because we’ve all grown up with the same rules and in the same context, but having different experiences and points of view is always a plus.

Algorithms represent the real world, and each of us sees and experiences it differently. So if we want to get rid of sexist algorithms, we can begin by breaking the stereotype that there are careers for boys and careers for girls, buying fewer pink gifts for girls and instead making room for adventure and experimentation.

Lucía Capón Paul

Lucía Capón Paul has been an iOS developer at intive since 2015, where she started as a trainee. A student of Computer Engineering at the Universidad de Buenos Aires (UBA), Lucía is a member of the intive-FDV team and is deeply committed to social causes related to technology for inclusion and women’s equality in the IT world. A cat lover and amateur dancer, she has taken classes in styles as diverse as tango, swing, salsa and rock.

Mariana Silvestro

Mariana Silvestro has been a full-stack developer at intive since December 2017 and leader of the Backend team since October 2018. She holds a degree in Computer Science from the Universidad Atlántida Argentina and is also a Senior Technician in Information Systems from the Universidad Tecnológica Nacional (UTN). A member of the LasDeSistemas community, she is a committed activist for feminism and gender equality inside and outside the IT industry. Intensely passionate about reading and writing, she has published four poems and three micro-stories.
