When artificial intelligence discriminates, it looks like this
Article originally published in March 2018
Artificial intelligence (AI): we’ve been reading those words in every tech article for a while now, and even my grandmother is interested! Because this technology is becoming ubiquitous in our lives, it’s important to make sure it evolves in the right direction, which hasn’t always been the case.
Me Gabrielle Paris Gagnon, a technology lawyer with Propulsio conseillers d’affaires 360, explains that although AI is a great leap forward for humanity, it is not yet fully mature.
“Over the last ten years, deep learning has made unprecedented breakthroughs thanks to the use of graphics cards, the availability of massive datasets and scientific advances,” she reports. “This period has also produced its share of failures, which demonstrate the risks associated with a lack of diversity within work teams. Think, for example, of voice recognition systems that tend to perform better with male voices, or facial recognition systems that fail to detect Black users.”
As a reminder, here are a few cases where artificial intelligence has failed (and I’m putting that mildly). Note that many of these failures have since been corrected.
- Apple released an app in 2014 to track our health. A great, much-needed idea, except that the designers hadn’t thought to include menstrual cycle tracking.
- In 2015, a 12-year-old girl who loves video games (which, as a reminder, are played as much by women as by men) couldn’t find an avatar that looked like her: in many mobile games, the avatars were all boys.
- An international beauty contest was organized in 2016 with an artificial intelligence algorithm as judge. Only white women were considered beautiful because that was the data the system had been fed.
- In 2017, a Google Images search for the term “doctor” brought up only pictures of men, because apparently a woman can’t be a doctor…
- In the early days of Apple’s personal assistant, back in 2011, Siri wasn’t capable of helping you find the morning-after pill. In fact, it didn’t even understand the request.
- Still on the subject of assistants, it has been shown time and again in recent years that giving them female voices merely reflects certain sexist ideals in Silicon Valley.
There are many other cases, and I haven’t even touched on situations where racialized people are overlooked. One more example for the road: a “racist” soap dispenser that fails to work when a Black hand is held under it. Yes, nobody thought to teach the machine to see anything other than white skin!
Why can’t we create inclusive systems? Like all technologies, artificial intelligence reflects the values of its creators. AI “eats” the data its creators feed it, and therefore passes on their prejudices. When a team is set up to develop an AI, it is crucial that the team examines whether elements of human and cultural diversity are properly taken into account as the algorithm is built. It’s not up to the AI to ask the questions; it’s up to the designers to think about how to nurture the system. “Artificial intelligence is not like any other technology,” insists Me Paris Gagnon.
“What’s important to remember is that with traditional technologies, programmers code the machine’s entire decision-making process,” she stresses. “We know in advance what the machines will do. With AI, however, programmers have no control over the outcome, which makes the training phase crucial. The programmers who train these systems tend to reproduce their own worldview, and therefore their own biases.”
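To make this concrete, here is a deliberately toy Python sketch (not from the article, and far simpler than real deep learning) of how a model that simply learns from skewed training data faithfully reproduces the bias baked into that data:

```python
from collections import Counter

# Hypothetical, deliberately skewed training data: 9 out of 10
# "doctor" examples are labeled "man", mimicking a biased image corpus.
training_data = [("doctor", "man")] * 9 + [("doctor", "woman")] * 1

# "Training": count which label co-occurs with each query.
counts = {}
for query, label in training_data:
    counts.setdefault(query, Counter())[label] += 1

def predict(query):
    """Return the label seen most often for this query."""
    return counts[query].most_common(1)[0][0]

print(predict("doctor"))  # prints "man": the model only ever answers "man"
```

Nothing in the code is “sexist” by design; the model does exactly what it was trained to do, which is why what goes into the training set (and who decides what goes in) matters so much.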
Unless team members represent all facets of diversity, it’s impossible to ask the necessary questions: if something isn’t part of your reality, you won’t even think about it.
So what can we do about it? If you read this blog regularly, you’ll know that the answer lies in diversity. It’s an opinion shared by Me Paris Gagnon. “These experiences quickly made researchers and industry people realize the importance of having diverse teams. It’s a way of reducing risk: in the training phase, it ensures that the experiences of diverse groups, including women, racialized people, and people of different social classes and sexual orientations, are represented.”
Joëlle Pineau, who heads up Facebook’s artificial intelligence research lab in Montreal, drives home the point about the need for women in the industry. In her view, it’s everyone’s duty to shoulder this responsibility.
“In AI research, while we have made progress, women and minorities are still not well represented, and women in particular are leaving the field at a faster rate than men,” she writes in a blog post. “We all need to do a better job of communicating how diverse and fascinating AI can be, to encourage more people to get involved. And we need to encourage the careers of young scientists and provide an interesting, safe and rewarding working environment for all members of the community.”
Artificial intelligence is becoming an increasingly important part of our daily lives. It conditions the way we act and think, as Facebook’s filters did during the US presidential election. So it’s imperative to have more women playing a role in the development of these technologies. And not just as virtual assistant voices…