
Write a PREreview

Evaluating Dataset Bias in Biometric Research

Published
Server
Preprints.org
DOI
10.20944/preprints202508.1479.v1

Biometric technologies are rapidly becoming central to identity verification, national security, and personalized services. However, as the reliance on biometric systems grows, so do the concerns around dataset bias, a silent but significant threat to fairness, accuracy, and inclusivity. This paper investigates the presence and impact of dataset bias in biometric research, shedding light on how skewed data representation can lead to discriminatory outcomes, particularly for underrepresented groups. We explore the roots of bias, ranging from limited demographic diversity in training datasets to socio-technical factors influencing data collection practices. Through real-world case studies and critical analysis, this study urges researchers and developers to adopt more ethical, transparent, and inclusive data strategies. The goal is not just to improve biometric system performance but to ensure that these technologies serve all individuals equally, regardless of race, gender, or age. Tackling dataset bias isn't just a technical issue; it's a matter of social justice and trust in an increasingly digital world.
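The abstract's core claim is that skewed demographic representation in training data shows up as unequal error rates across groups. A common way to make that concrete is to compare a matcher's false non-match rate (FNMR) per demographic group and report the ratio between the worst- and best-served groups. The sketch below is illustrative only: the function names, record format, and synthetic data are assumptions, not anything from the paper.

```python
# Illustrative sketch: per-group false non-match rates for a
# biometric matcher. Record format and data are hypothetical.
from collections import defaultdict

def per_group_fnmr(records):
    """FNMR per group: genuine pairs rejected / genuine pairs seen.

    records: iterable of (group, is_genuine_pair, matcher_accepted).
    """
    genuine = defaultdict(int)
    rejected = defaultdict(int)
    for group, is_genuine, accepted in records:
        if is_genuine:
            genuine[group] += 1
            if not accepted:
                rejected[group] += 1
    return {g: rejected[g] / genuine[g] for g in genuine}

def disparity_ratio(rates):
    """Max/min error-rate ratio across groups; 1.0 means parity."""
    vals = list(rates.values())
    return max(vals) / min(vals)

# Synthetic data: group "B" is underrepresented (10 genuine pairs
# vs. 100 for "A") and sees more false rejections.
data = ([("A", True, True)] * 95 + [("A", True, False)] * 5
        + [("B", True, True)] * 8 + [("B", True, False)] * 2)

rates = per_group_fnmr(data)
ratio = disparity_ratio(rates)
```

On this toy data group A's FNMR is 5/100 and group B's is 2/10, a fourfold disparity; audits of this shape are one way to turn the paper's fairness concern into a measurable quantity.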

You can write a PREreview of Evaluating Dataset Bias in Biometric Research. A PREreview is a review of a preprint and can range from a few sentences to a lengthy report, similar to a journal-organized peer-review report.
