Although artificial intelligence has been around for many years, it has gained unprecedented popularity in recent months. With the launch of OpenAI's ChatGPT, something of a technological race has unfolded among the industry's giants. Companies are desperate to incorporate more and more of these tools to keep up. This is how artificial intelligence is permeating all sectors: it handles banking systems, hospitals, corporate recruitment processes, and even government entities. But before we rely on artificial intelligence to make increasingly complex decisions, we should remember that these tools are created and programmed by people with their own biases, and are not as objective, perfect, or impartial as we would like to believe.
"The machine is made by man, and it is what man makes of it," says Jorge Drexler in a song. And this phrase applies perfectly to one of the problems we have today with artificial intelligence. Those who design and develop these machines are overwhelmingly men. "If systems are not developed by diverse teams, they are less likely to address the needs of diverse users," says a UNESCO article titled "The Effects of AI on Women's Working Lives." It's problematic that the technologies that will shape our futures are designed by an unrepresentative group of society. But how did we get here? Why are women underrepresented in the technology industry? Let's start at the beginning.
The machine is made by man
At this point, it's ridiculous to think it's a lack of interest or willingness. There are hundreds of examples of women who made significant advances in technology (does Ada Lovelace ring a bell?) but who have consistently been erased from the history of scientific discovery. Mariana Costa, co-founder and president of Laboratoria, believes it's a problem of cultural upbringing: "It's as if it's not a job for women, and that mentality begins to be imposed from childhood."
According to a 2020 study by Spain's National Observatory of Telecommunications and the Information Society, less than 20% of positions in technology are held by women. In Argentina, although 60% of university students are women, they represent only 15% of those in science and technology fields.
To ensure equal opportunities, it is essential that both girls and boys be encouraged in these fields from early childhood education, incorporating the stories of women scientists into the curriculum and using gender-ambiguous terms. This is also important in recruitment processes: how do we expect women to feel drawn to the job when all job postings are written in exclusively masculine terms?
It's such a masculinized field that if you Google "data engineer," it suggests a typo and that you actually meant "data engineering." Thousands of results for courses, articles, and job postings exclusively use the term "data engineer." It may seem like a small thing, but many small things like these contribute to a common perception that technology isn't a field for women.
And that’s what man does with it
Have you ever wondered why most robotic assistants (like Alexa or Siri) have a woman's voice? It's no coincidence. Nor is it a coincidence that an algorithm learns to associate images of household appliances with women. Or that AI tools rate photos of women as more sexually suggestive than those of men. "Algorithms are opinions embedded in code," says Cathy O'Neil. It's clear that the biases of those who program and design these tools are reflected in the technologies.
In other cases, the problem isn't so much the programmers' biases, but our own history. Gemma Galdón, algorithm auditor and founder of Eticas Research and Consulting, explains that, for example, to decide who to grant a mortgage to, the algorithm will rely on historical data: "Based on this data, men will have a better chance than women, because we've been granted fewer mortgages historically and because the system assigns us a risk profile."
In 2018, Amazon had to reverse the implementation of an AI tool it was using to optimize its recruiting processes. The problem was that the algorithm had been trained primarily on men's resumes and learned to prioritize them and penalize those that included the word "woman."
Our data also reflects our history of injustice toward women and other minorities. And this is a more difficult problem to solve. "We can't fix the algorithms by introducing better data because there isn't better data," says AI journalist Meredith Broussard. Technology can't fix the system's injustices. "The underlying problem is society," Broussard says.
Towards a more inclusive AI
Data tells us how we were, but not how we want to be. To build artificial intelligence models that stop perpetuating injustice, we must first become aware of this problem. If we hand over the reins to AI to make decisions based on our past without any filter, our future won't be much better. And isn't that the point? To build a better future?
For Broussard, this also means making the decision not to use artificial intelligence for some particularly sensitive processes. She gives the example of the use of facial recognition technology by police forces: Broussard believes this is a very risky use, as it can lead to unjust arrests. "It's not the end of the world if you don't use a computer for something. You can't assume a technological system is good just because it exists." So a first conclusion we could draw is that these technologies should be applied with great caution and discretion, and that sometimes it's even better to do without them.
On the other hand, as we mentioned before, it's important to encourage the inclusion of women and other minorities in the sector so that our technologies are developed with diverse perspectives, for diverse users.
In this regard, the work of public and private organizations committed to inclusion is essential. Girls in Technology and Women in Technology (MeT) are two organizations that provide training, talks, initiatives, workspaces, and opportunities for women and diverse groups in technology. In the private sector, companies such as Globant and Moove It offer training and other initiatives for women around the world.
Above all, as Spain's Secretary of State for Digitalization and Artificial Intelligence, Carme Artigas, said: "It is essential that we not let our guard down. We cannot leave 51% of our female talent out of the design of a national model and a global environment that is yet to be developed."
And we need to treat the issue with the seriousness it deserves. Currently, ChatGPT, the world's most popular artificial intelligence tool, includes the following disclaimer on its product: "May occasionally produce harmful instructions or biased content." We can no longer treat "biased content" as an acceptable accident of an imperfect machine. Gender discrimination is not a system "bug," a mere error to be corrected: it is a human rights issue.