Towards Accountability for Machine Learning Datasets: Practices from Software Engineering and Infrastructure

Ben Hutchinson, Andrew Smart, Alex Hanna, Emily Denton, Christina Greer, Oddur Kjartansson, Parker Barnes, Margaret Mitchell
Tags
Accountability
Algorithm Development
Data
Evaluation
Humanistic Theory & Critique
This paper can be seen in Paper Session 8. The live session has already passed; a recording will be posted below.
Mar 08, 2021, 22:00–23:45 UTC
Session page: https://2021.facctconference.org/conference-agenda/towards-accountability-for-machine-learning-datasets-practices-from-software-engineering-and-infrastructure
Abstract

Rising concern for the societal implications of artificial intelligence systems has inspired demands for greater transparency and accountability. However, the datasets which empower machine learning are often used, shared, and re-used with little visibility into the processes of deliberation which led to their creation. Which stakeholder groups had their perspectives included when the dataset was conceived? Which domain experts were consulted regarding how to model subgroups and other phenomena? How were questions of representational biases measured and addressed? Who labeled the data? In this paper, we introduce a rigorous framework for dataset development transparency which supports decision-making and accountability. The framework uses the cyclical, infrastructural and engineering nature of dataset development to draw on best practices from the software development lifecycle. Each stage of the data development lifecycle yields a set of documents that facilitate improved communication and decision-making, as well as drawing attention to the value and necessity of careful data work. The proposed framework is intended to contribute to closing the accountability gap in artificial intelligence systems, by making visible the often overlooked work that goes into dataset creation.

Live Q&A Recording
The recording of this live session has not been uploaded yet. Check back soon.