Sunday, October 1, 2023

Is artificial intelligence sexist and racist?

Examples

In 2014, Amazon developed software to analyze the CVs of applicants for positions at the company. The software was trained on a database containing the profiles of employees hired or promoted over a 10-year period. In 2015, however, the company discovered that the system had a clear preference for male candidates.

A 2017 study by a University of Washington researcher showed that Google’s voice recognition software was less effective at processing female voices. According to the author, the difference could be explained by the software performing worse on high-pitched voices.


In 2018, researchers associated with Microsoft tested applications used to analyze images and determine a person’s gender. They noted that these apps had a very low error rate (0.8%) when identifying white men, but an error rate of 34.7% for black women.

Finally, certain artificial intelligence algorithms developed to predict the risk of recidivism among young offenders exhibit racial bias, according to a 2019 study carried out in Spain. The researchers observed that a man of foreign origin was twice as likely as a man of Spanish origin to be incorrectly classified as at high risk of recidivism.

Stereotyped word families

These examples and many others are well known. And the reason is also known: the vast majority of artificial intelligence software uses the machine learning approach.


The idea is to let the computer figure out how to perform a task efficiently by training it on a large amount of data. The problem is that biases, stereotypes and prejudices can creep into this immense mass of data.

One particular branch of machine learning is called natural language processing. Among other things, this technology uses statistics to predict the next word in a sentence from the words that precede it. This is the branch behind the success of chatbots like ChatGPT since late 2022: they predict sequences of words at phenomenal speed.
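To make the principle concrete, here is a minimal sketch in Python of the statistical idea described above: a toy bigram model that simply counts which word most often follows each word in a tiny invented corpus. The corpus and everything else here are hypothetical; systems like ChatGPT rely on large neural networks trained on billions of sentences, not raw counts, but the "predict the next word from the previous ones" principle is the same.

```python
from collections import Counter, defaultdict

# Tiny invented corpus; real systems train on billions of sentences.
corpus = [
    "the doctor examined the patient",
    "the nurse examined the patient",
    "the doctor wrote the prescription",
    "the doctor left the room",
]

# Count how often each word follows each other word (a bigram model).
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word`."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))       # 'doctor': it follows 'the' most often
print(predict_next("examined"))  # 'the'
```

Notice that whatever regularities appear in the corpus, stereotypes included, are exactly what the model learns to reproduce.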

To do this, the words must first be converted into numbers, using an approach called word vectorization. This representation allows terms with similar meanings to be grouped together.
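As a hedged illustration of what "grouping by meaning" looks like, the sketch below assigns invented three-dimensional vectors to three words and compares them with cosine similarity. Real embeddings are learned from text, not set by hand, and typically have hundreds of dimensions.

```python
import numpy as np

# Invented 3-dimensional vectors; real embeddings are learned from corpora.
vectors = {
    "doctor": np.array([0.8, 0.6, 0.1]),
    "nurse":  np.array([0.7, 0.7, 0.2]),
    "banana": np.array([0.1, 0.0, 0.9]),
}

def cosine(a, b):
    """Similarity of direction: close to 1.0 means 'similar meaning'."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vectors["doctor"], vectors["nurse"]))   # ~0.98: related terms
print(cosine(vectors["doctor"], vectors["banana"]))  # ~0.19: unrelated
```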

However, as early as 2016, an American study showed that these sets of word vectors contained biases reflecting the stereotypes present in society: the woman as "homemaker" and the man as "computer programmer," for example. Another study, conducted in 2017, likewise concluded that word vectorization tends to associate women with words related to family and the arts, and men with words related to career, science and technology.
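The bias those studies measured can be caricatured with the same toy machinery. The sketch below runs the classic analogy test, "man is to computer programmer as woman is to ...?", on hypothetical two-dimensional vectors deliberately constructed so that one axis encodes a career/home stereotype; the 2016 study obtained "homemaker" the same way from real embeddings learned from news text.

```python
import numpy as np

# Hypothetical 2D "embeddings": axis 0 ~ career/tech, axis 1 ~ home/family.
# The values are invented to caricature the stereotype the studies measured.
vectors = {
    "man":        np.array([1.0, 0.0]),
    "woman":      np.array([0.0, 1.0]),
    "programmer": np.array([0.9, 0.1]),
    "scientist":  np.array([0.85, 0.15]),
    "artist":     np.array([0.3, 0.7]),
    "homemaker":  np.array([0.1, 0.9]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Analogy by vector arithmetic: programmer - man + woman = ?
query = vectors["programmer"] - vectors["man"] + vectors["woman"]
candidates = {w: v for w, v in vectors.items()
              if w not in ("programmer", "man", "woman")}
answer = max(candidates, key=lambda w: cosine(query, candidates[w]))
print(answer)  # 'homemaker': the stereotype is baked into the geometry
```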

You can observe this phenomenon yourself with Google Translate. If you ask the algorithm to translate the French phrase "le médecin et l’infirmière" into English, you get "the doctor and the nurse," genderless words. But if you translate that English phrase back into French, you again get "le médecin et l’infirmière": a male doctor and a female nurse.


Biased data from the start?

In a recent article on the ethical issues of artificial intelligence, the International Peace Institute (IPI) expressed concern about this problem of bias and stereotyping. Such biases, the text recalls, can arise at different stages of an application’s creation: during the design of the algorithm, during its training on databases, or at the stage where the AI makes decisions.

But the database used to train the computer is one of the biggest sources of bias: if it contains biases, these will be built into the algorithm. In the case of the Amazon software, for example, the negative bias toward women reflected the company’s hiring patterns in previous years. The software had been trained almost exclusively on the profiles of male employees.

In the case of the Microsoft study on facial recognition, the researchers eventually noticed that the databases used to train the algorithms were made up of almost 80% light-skinned faces. British researchers reached a similar conclusion in 2021 when they analyzed algorithms developed to detect skin cancer: dark skin was substantially underrepresented in the training databases.
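A minimal sketch of how such imbalance propagates, assuming entirely synthetic data: below, a classifier is fitted to a training set that is 80% group A and 20% group B, where the two groups obey slightly different "true" rules. The single threshold that minimizes overall error ends up tuned to the majority group, and the underrepresented group gets markedly lower accuracy, much like the error gaps the audits reported.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic, hypothetical data. Group A dominates the training set 80/20,
# echoing the ~80% light-skinned faces found in the audited datasets.
x_a = rng.normal(0.5, 1.0, 800)   # group A: feature values
y_a = (x_a > 0.0).astype(int)     # group A's true rule: positive above 0.0
x_b = rng.normal(1.5, 1.0, 200)   # group B: feature values
y_b = (x_b > 1.0).astype(int)     # group B's true rule: positive above 1.0

x = np.concatenate([x_a, x_b])
y = np.concatenate([y_a, y_b])

# "Training": pick the single threshold that minimizes overall error.
thresholds = np.linspace(-2.0, 3.0, 501)
errors = [np.mean((x > t).astype(int) != y) for t in thresholds]
best = thresholds[int(np.argmin(errors))]

# Evaluate the one-size-fits-all model separately on each group.
acc_a = np.mean((x_a > best).astype(int) == y_a)
acc_b = np.mean((x_b > best).astype(int) == y_b)
print(f"threshold={best:.2f}  accuracy A={acc_a:.1%}  accuracy B={acc_b:.1%}")
# The majority group pulls the threshold toward its own rule (near 0.0),
# so group B, underrepresented in training, is misclassified far more often.
```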

Little diversity among programmers

The person who designs the algorithm can also transfer their own biases to it, consciously or not. The programmer can set rules that contain implicit biases, notes Brian Uzzi, a professor at Northwestern University, in a paper on ways to reduce bias in AI.


This phenomenon is illustrated by a 2020 study by American researchers. They used several image categorization programs to analyze photographs of elected members of the US Congress. These algorithms had been trained on image banks that were themselves categorized by programmers using a set of words dating back to 1980. The result: the labels assigned to images of women generally referred to their appearance, while those assigned to men were linked to their occupation.

A possible solution

One solution for reducing bias in artificial intelligence therefore seems to be diversifying the pool of programmers. Women are indeed underrepresented in programming, information technology, engineering, mathematics and physics. A 2018 survey by Wired magazine, for example, found that only 12% of leading machine learning researchers were women. In 2022, women held only 25.8% of technical positions at Meta, and black people only 2.4%.

Verdict

It is almost inevitable that artificial intelligence software will exhibit sexist or racist biases. Because of the way AI works, by feeding it immense databases, these biases reflect those that exist, consciously or unconsciously, in our society, and that are therefore found in those databases and among the programmers.

Photo: Suriya Phosri | Dreamstime.com
