AI Article Synopsis

  • Machine learning systems affect our everyday lives, so it's important that they make fair and unbiased decisions.
  • The field of fairness in machine learning is growing rapidly, studying how these systems pick up biases both from the data they are trained on and from society at large.
  • The authors share ideas for improving these systems by examining the training data, how the models are built, and the diversity of the team of developers.

Article Abstract

Machine learning systems influence our daily lives in many different ways. Hence, it is crucial to ensure that the decisions and recommendations made by these systems are fair, equitable, and free of unintended biases. Over the past few years, the field of fairness in machine learning has grown rapidly, investigating how, when, and why these models capture, and even potentiate, biases that are deeply rooted not only in the training data but also in our society. In this Commentary, we discuss challenges and opportunities for rigorous posterior analyses of publicly available data to build fair and equitable machine learning systems, focusing on the importance of training data, model construction, and diversity in the team of developers. The thoughts presented here have grown out of the work we did, which resulted in our winning the annual Research Parasite Award that GigaScience sponsors.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8685850
DOI: http://dx.doi.org/10.1093/gigascience/giab086

Publication Analysis

Top Keywords

machine learning: 16
fairness machine: 8
challenges opportunities: 8
learning systems: 8
fair equitable: 8
training data: 8
relationship parasites: 4
parasites fairness: 4
machine: 4
learning: 4
