
Artificial Intelligence and the reproduction of racism

By Fundación de Cultura Islámica · On 26 November, 2021


Alfonso Casani – FUNCI

The Spanish newspaper La Vanguardia recently reported the results of an experiment designed to measure artificial intelligence's capacity to make moral judgments, which ended up exposing the machine's racist and misogynist bias. The experiment is a paradigmatic case of a wider problem: the knowledge on which artificial intelligence relies is conditioned by the people who produced it. If we are racist, so will our machines be.

In simple terms, artificial intelligence refers to the capacity of machines and their processors to learn from the knowledge and databases they are given and to issue autonomous responses that adapt to their environment. As the European Parliament itself explains, the aim is for machines to be able to reason, learn, create and plan their actions.

Source: donnachaod.wordpress.com

The experiment reported by the newspaper analyzed a computer's responses to different situations and behaviors, based on knowledge acquired from online networks and forums. To the surprise of the scientists involved, the machine's responses reflected racist behavior (mistrust of black people, for example) and sexist behavior (for instance, considering it acceptable to kiss a woman even though she had expressed her refusal).

The experiment highlights one of the great problems of artificial intelligence: these systems must rely on databases collected by humans, which implicitly transmit many of the judgments (and prejudices) of their designers. As the article in La Vanguardia concludes, in this case the machine merely offers "a condensation of our behavior as a digital society".
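
To make that mechanism concrete, the sketch below shows how a system that learns only from human-written text reproduces whatever associations that text already contains. The toy corpus, group names, word lists and scoring rule are all invented for illustration; this is not the system used in the experiment described above.

```python
# A minimal, hypothetical sketch of how a model trained on human-written text
# simply reproduces the associations present in that text. The "corpus" stands
# in for forum posts; the scoring rule is a naive co-occurrence count.

corpus = [
    "group_a people are friendly and trustworthy",
    "group_a neighbours are kind",
    "group_b people are dangerous",         # prejudiced statements in the data
    "group_b strangers are untrustworthy",
]

POSITIVE = {"friendly", "trustworthy", "kind"}
NEGATIVE = {"dangerous", "untrustworthy"}

def sentiment_towards(group: str) -> int:
    """Sum +1 for every positive word and -1 for every negative word
    that co-occurs with the group label in the corpus."""
    score = 0
    for sentence in corpus:
        words = set(sentence.split())
        if group in words:
            score += len(words & POSITIVE) - len(words & NEGATIVE)
    return score

# The "model" has learned nothing but the prejudice already in its data:
print(sentiment_towards("group_a"))  # positive score (+3)
print(sentiment_towards("group_b"))  # negative score (-2)
```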

White designers and racist products

The example above is only the latest in a long list of products and studies showing how easily patterns of discrimination are transmitted to the machines and programs we develop. On the one hand, programs built on existing data or records can end up repeating the same behavior.

In 2016, researchers reported that the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) program, a criminal recidivism risk assessment tool, discriminated against the black American community. The program, it was reported, was twice as likely to flag a black person as a potential recidivist as a white person. Although the specific calculations behind the program are not public, it revealed one basic problem: the program relied on historical statistics of the U.S. prison population, in which black Americans are disproportionately represented.
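
A minimal sketch of that basic problem follows, with invented numbers (this is not COMPAS, whose model is proprietary): a risk score estimated from skewed historical records assigns different risks to otherwise identical people purely on the basis of the group label attached to them.

```python
# Hypothetical illustration: a risk score fit on historical records inherits
# whatever enforcement bias produced those records. All figures are invented.

# (group, rearrested) pairs standing in for historical prison statistics.
# Group "B" is over-represented because it was policed more heavily.
history = [("A", 0)] * 80 + [("A", 1)] * 20 + [("B", 0)] * 50 + [("B", 1)] * 50

def base_rate(group: str) -> float:
    """Naive 'model': predicted risk = historical re-arrest rate of the group."""
    records = [r for g, r in history if g == group]
    return sum(records) / len(records)

# Two defendants with identical circumstances get different scores
# purely because of the group label attached to them:
print(f"risk if labelled A: {base_rate('A'):.2f}")  # 0.20
print(f"risk if labelled B: {base_rate('B'):.2f}")  # 0.50
```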

Equally shocking are the many complaints raised against facial recognition systems in recent years. Particularly notable was the bias shown by the Google Photos application, whose people-and-object classification software labelled black people as gorillas, as reported in 2015.

In the same vein, Joy Buolamwini, a student at the renowned MIT (Massachusetts Institute of Technology), denounced that same year the difficulty facial recognition programs had in recognizing the faces of black people. After discovering that these programs could not detect her face as a person of color, Buolamwini put on a white mask, only to find that its features were identified without difficulty. The problem was even worse for women than for men. Her study was applied to the artificial intelligence systems of the major technology giants Amazon, IBM and Microsoft. On average, all three showed an error rate of less than 1% when the faces analyzed belonged to white men; the rate rose to 35% when analyzing the faces of black women. The problem, again, leads back to the databases used, in which black people and women are underrepresented: far more photographs and data from white male populations are included to help the machines identify the features being analyzed.
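
The following sketch, with invented figures rather than the study's actual data, shows why this kind of per-group breakdown matters: an aggregate accuracy figure can look acceptable while nearly all of the errors fall on the group that is underrepresented in the training data.

```python
# Hypothetical per-group evaluation: overall accuracy can look fine while one
# subgroup bears almost all of the errors. The outcomes below are invented.

# (group, correctly_recognised) for a hypothetical face-recognition test set.
results = (
    [("white men", True)] * 99 + [("white men", False)] * 1 +
    [("black women", True)] * 65 + [("black women", False)] * 35
)

def error_rate(group: str) -> float:
    outcomes = [ok for g, ok in results if g == group]
    return 1 - sum(outcomes) / len(outcomes)

overall = 1 - sum(ok for _, ok in results) / len(results)
print(f"overall error:     {overall:.0%}")                     # 18%
print(f"white men error:   {error_rate('white men'):.0%}")     # 1%
print(f"black women error: {error_rate('black women'):.0%}")   # 35%
```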

Underlying this is a second problem that helps explain the bias: ten of the largest technology companies in Silicon Valley had no black female employees, and three of them had no employees of color at all. Subconsciously, the designers of these systems reproduce their own environment, favoring the representation of the white, male population over that of minorities.

Source: racounteur.net

A similar case went viral several years ago, when an automatic soap dispenser failed to detect the hand of a black man while responding to that of his white companion.

A social problem

These examples are not intended to deny the many virtues of technological development and artificial intelligence, but to reflect on their lack of neutrality and on the risk of reproducing the discriminatory patterns that lie behind them. This article mentions only two of the many problems these systems face: the reproduction of discrimination against minority groups, and the reproduction of their lack of representation. Both are social problems, not technological ones, and they expose shortcomings that already exist within society. The inevitable conclusion is that it is not the computer systems that need to be corrected, but societies themselves.

Tags: prejudices, racism, social experiment, technology

Fundación de Cultura Islámica

The Fundación de Cultura Islámica has promoted intercultural dialogue and development education in an independent, scientific and rigorous way for more than three decades.
