Fairness Through Awareness. Dwork, C., Hardt, M., Pitassi, T., Reingold, O., & Zemel, R. In ITCS '12, pages 214–226. ACM.
We study fairness in classification, where individuals are classified, e.g., admitted to a university, and the goal is to prevent discrimination against individuals based on their membership in some group, while maintaining utility for the classifier (the university). The main conceptual contribution of this paper is a framework for fair classification comprising (1) a (hypothetical) task-specific metric for determining the degree to which individuals are similar with respect to the classification task at hand; (2) an algorithm for maximizing utility subject to the fairness constraint, that similar individuals are treated similarly. We also present an adaptation of our approach to achieve the complementary goal of "fair affirmative action," which guarantees statistical parity (i.e., the demographics of the set of individuals receiving any classification are the same as the demographics of the underlying population), while treating similar individuals as similarly as possible. Finally, we discuss the relationship of fairness to privacy: when fairness implies privacy, and how tools developed in the context of differential privacy may be applied to fairness.
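The framework the abstract describes is, at its core, an optimization: maximize the classifier's utility subject to the Lipschitz condition that individuals close under the task-specific metric d receive similar treatment. For binary outcomes this reduces to a small linear program over acceptance probabilities. The sketch below is a minimal illustration of that shape, not the authors' code: the metric d and the per-individual losses are random placeholders, and the pairwise constraints |p_x - p_y| <= d(x, y) stand in for the paper's general Lipschitz constraint.

    # Minimal sketch of the fairness LP for binary outcomes (illustrative only).
    # Variables: p_x = probability that individual x receives the positive outcome.
    # Objective: minimize expected loss (i.e., maximize utility).
    # Constraint: |p_x - p_y| <= d(x, y) for all pairs (similar individuals
    # are treated similarly under a hypothetical task-specific metric d).
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    n = 5

    # Placeholder task-specific metric d(x, y), clipped to [0, 1].
    features = rng.random((n, 2))
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    d = np.clip(d, 0.0, 1.0)

    # loss[x] = (cost of positive outcome for x) - (cost of negative outcome for x);
    # minimizing sum_x loss[x] * p_x maximizes the classifier's utility.
    loss = rng.uniform(-1.0, 1.0, size=n)

    # Lipschitz constraints: p_x - p_y <= d(x, y) and p_y - p_x <= d(x, y).
    rows, rhs = [], []
    for x in range(n):
        for y in range(x + 1, n):
            row = np.zeros(n)
            row[x], row[y] = 1.0, -1.0
            rows.append(row)
            rhs.append(d[x, y])
            rows.append(-row)
            rhs.append(d[x, y])

    res = linprog(c=loss, A_ub=np.vstack(rows), b_ub=np.array(rhs),
                  bounds=[(0.0, 1.0)] * n, method="highs")
    print("acceptance probabilities:", np.round(res.x, 3))

The "fair affirmative action" adaptation mentioned in the abstract adds a statistical-parity requirement on top of this, constraining the average acceptance probability to be (approximately) equal across demographic groups while keeping the pairwise treatment as close as possible.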
@inproceedings{dwork_fairness_2012,
	location = {New York, {NY}, {USA}},
	title = {Fairness Through Awareness},
	url = {https://doi.org/10.1145/2090236.2090255},
	doi = {10.1145/2090236.2090255},
	abstract = {We study fairness in classification, where individuals are classified,
e.g., admitted to a university, and the goal is to prevent discrimination
against individuals based on their membership in some group, while
maintaining utility for the classifier (the university). The main
conceptual contribution of this paper is a framework for fair
classification comprising (1) a (hypothetical) task-specific metric for
determining the degree to which individuals are similar with respect to
the classification task at hand; (2) an algorithm for maximizing utility
subject to the fairness constraint, that similar individuals are treated
similarly. We also present an adaptation of our approach to achieve the
complementary goal of "fair affirmative action," which guarantees
statistical parity (i.e., the demographics of the set of individuals
receiving any classification are the same as the demographics of the
underlying population), while treating similar individuals as similarly as
possible. Finally, we discuss the relationship of fairness to privacy:
when fairness implies privacy, and how tools developed in the context of
differential privacy may be applied to fairness.},
	eventtitle = {Proceedings of the 3rd Innovations in Theoretical Computer Science Conference},
	pages = {214--226},
	booktitle = {{ITCS} '12},
	publisher = {{ACM}},
	author = {Dwork, Cynthia and Hardt, Moritz and Pitassi, Toniann and Reingold, Omer and Zemel, Richard},
	urldate = {2017-01-11},
	date = {2012},
	keywords = {Algorithmic Fairness, Privacy and Fairness, Statistical Fairness}
}
