Reimagining intersectional AI fairness: A decolonial feminist approach to mitigating auto-essentialization in facial recognition technologies
Facial recognition technologies (FRT), widely used in applications such as identity verification, surveillance, and access control, often exhibit algorithmic bias, resulting in the inaccurate identification of women, people of color, and gender-nonconforming individuals. The prevalence of such biases raises concerns about the fairness of facial recognition systems. Current AI fairness methods have attempted to address these issues through an intersectional framework, but the problem of auto-essentialization adds a critical dimension to the bias problem in FRT. Auto-essentialization refers to the perpetuation of racial and gender inequalities through automated technologies rooted in historical and colonial notions of difference, particularly racialized gender. This deep-seated, historical bias challenges the dominant algorithmic "de-biasing" approach of current AI fairness methods. If these historical systems of inequality are not recognized and addressed, they may perpetuate and reinforce bias in FRT. The objective of this study is therefore two-fold: first, to analyze algorithmic bias in FRT as a process of auto-essentialization rooted in historical colonial projects of gendered and racialized classification; and second, to examine the epistemological and ethical limitations of the prevailing intersectional AI fairness approach in addressing racial and gender bias within FRT as an auto-essentialist tool. In pursuing these objectives, the study argues for a reimagined intersectional AI fairness approach to mitigating bias in FRT, one that incorporates decolonial feminist perspectives into existing AI fairness frameworks to comprehensively address the problem of auto-essentialization that contributes to racial and gender bias in FRT.
Main Author: | Domingo, Rosallia M. |
---|---|
Format: | text |
Language: | English |
Published: | Animo Repository, 2024 |
Subjects: | Face perception; Artificial intelligence; Decolonization; Feminist theory; Philosophy |
Online Access: | https://animorepository.dlsu.edu.ph/etdd_philo/14 https://animorepository.dlsu.edu.ph/context/etdd_philo/article/1014/viewcontent/2024_Domingo_Full_text.pdf |
Institution: | De La Salle University |
id | oai:animorepository.dlsu.edu.ph:etdd_philo-1014 |
---|---|
record_format | eprints |
institution | De La Salle University |
building | De La Salle University Library |
continent | Asia |
country | Philippines |
content_provider | De La Salle University Library |
collection | DLSU Institutional Repository |
language | English |
topic | Face perception; Artificial intelligence; Decolonization; Feminist theory; Philosophy |
author | Domingo, Rosallia M. |
title | Reimagining intersectional AI fairness: A decolonial feminist approach to mitigating auto-essentialization in facial recognition technologies |
format | text |
publisher | Animo Repository |
publishDate | 2024 |
url | https://animorepository.dlsu.edu.ph/etdd_philo/14 https://animorepository.dlsu.edu.ph/context/etdd_philo/article/1014/viewcontent/2024_Domingo_Full_text.pdf |
_version_ | 1797546054737985536 |