Exploring Discrimination: A User-centric Evaluation of Discrimination-Aware Data Mining. Berendt, B. & Preibusch, S. In 2012 IEEE 12th International Conference on Data Mining Workshops (ICDMW), pages 344--351, December 2012. IEEE.
@inproceedings{berendt_exploring_2012,
title = {Exploring {Discrimination}: {A} {User}-centric {Evaluation} of {Discrimination}-{Aware} {Data} {Mining}},
booktitle = {2012 {IEEE} 12th {International} {Conference} on {Data} {Mining} {Workshops} ({ICDMW})},
isbn = {978-1-4673-5164-5 978-0-7695-4925-5},
shorttitle = {Exploring {Discrimination}},
url = {http://ieeexplore.ieee.org/document/6406461/},
doi = {10.1109/ICDMW.2012.109},
urldate = {2016-12-05},
publisher = {IEEE},
author = {Berendt, Bettina and Preibusch, S{\"o}ren},
month = dec,
year = {2012},
keywords = {FATML.org Bibliography, machinelearning, fairness, DADM, visualization, interpretability},
pages = {344--351},
annote = {Author keywords: Discrimination Discovery, Evaluation, User studies, Responsible data mining, Mechanical Turk
This paper critically re-evaluates key assumptions underlying the deployment of discrimination-aware data mining (DADM) and presents a user study of DADM.
The authors offer a distinction between constraint-oriented and exploratory DADM.
The authors take data mining to mean "knowledge discovery" and therefore include pre-processing and deployment in their analysis.
The authors distinguish between discrimination as making distinctions on the basis of some feature and discrimination as making distinctions unjustly on features such as race or sex, which they call discrimination-indexed features [protected categories?]. They point out that the latter is not itself an imbalance but "a property of a decision that may lead to such an imbalance".
"In its descriptive role, data mining may detect discrimination in a data set, when statistical imbalances originate in earlier decisions."
"In its prescriptive role, the very point of data mining is to create discrimination… a decision rule by definition makes distinctions based on some features"
DADM therefore seeks to prevent selection of features based on "bad patterns", filtering these out so that only "good" patterns remain.
The authors describe DADM as acting as a constraint on "decision rules that discriminate against customers with features that have been found to be predictive of undesirable outcomes", in which sacrificing some selection features is outweighed by the "legal or otherwise" imperative to prevent discrimination.
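The constraint-oriented filtering described above can be sketched in a few lines. This is an illustrative assumption, not the authors' implementation: it uses the extended-lift (elift) measure from the discrimination-discovery literature DCUBE builds on, with a hypothetical threshold `alpha`, to drop rules whose confidence rises sharply once a discrimination-indexed feature is added to the premise.

```python
# Sketch only: "bad patterns" here are rules whose confidence for a negative
# outcome jumps when a discrimination-indexed feature is added. The elift
# measure and alpha threshold are assumptions, not taken from this paper.

def confidence(records, premise, outcome):
    """P(outcome | premise) over a list of dict records."""
    matching = [r for r in records
                if all(r.get(k) == v for k, v in premise.items())]
    if not matching:
        return 0.0
    return sum(1 for r in matching if r.get(outcome[0]) == outcome[1]) / len(matching)

def elift(records, context, protected, outcome):
    """Extended lift: factor by which the protected item boosts confidence."""
    base = confidence(records, context, outcome)
    if base == 0.0:
        return 0.0
    return confidence(records, {**context, **protected}, outcome) / base

def filter_bad_patterns(records, rules, alpha=1.25):
    """Keep only rules whose elift stays below the threshold alpha."""
    return [(ctx, prot, out) for ctx, prot, out in rules
            if elift(records, ctx, prot, out) < alpha]
```

A rule such as (zip = A) and (race = black) implies deny would be discarded if denial rates for that zip code rise by more than the chosen factor when the protected item is added.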
The paper discusses discrimination as operative over a "static and well-defined" set of features, but notes that it also manifests in indirect ways, such as through red-lining: an apparently neutral feature should not be allowed in data mining if it "is highly correlated with or predicts a discrimination-indexed one (such as being black)."
Therefore, DADM should apply an explicit definition of bad patterns and guarantee "that it either does not find any such patterns or finds them all and filters them out."
Doing so requires exploratory analyses to find indirect features correlated with discrimination-indexed ones.
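That exploratory step could be sketched as a correlation scan over candidate features. This is an assumption for illustration, not the paper's method (the paper relies on the interactive DCUBE-GUI tool instead); the phi coefficient and the feature names are hypothetical.

```python
# Sketch only: flag apparently neutral binary features that are strongly
# associated with a discrimination-indexed one (potential red-lining proxies).
# Uses the phi coefficient, i.e. Pearson correlation for two binary features.
from math import sqrt

def phi(records, feat_a, feat_b):
    """Phi coefficient between two binary features in a list of dict records."""
    n = len(records)
    a = sum(1 for r in records if r[feat_a])
    b = sum(1 for r in records if r[feat_b])
    ab = sum(1 for r in records if r[feat_a] and r[feat_b])
    denom = sqrt(a * (n - a) * b * (n - b))
    return 0.0 if denom == 0 else (n * ab - a * b) / denom

def proxy_candidates(records, protected, features, threshold=0.5):
    """Features whose association with the protected one exceeds threshold."""
    return sorted(f for f in features
                  if abs(phi(records, f, protected)) >= threshold)
```

On data where a hypothetical `in_redlined_zip` flag tracks the protected attribute closely, the scan would surface it as a proxy even though it is not itself discrimination-indexed.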
The remainder of the article discusses using a visual data mining tool [DCUBE-GUI] to visualize the effects of discriminatory features on risk factors, asking non-expert users recruited via Mechanical Turk to explore the effects of discrimination. This supports the authors' argument that static, constraint-oriented DADM should be supplemented with exploratory tools such as DCUBE-GUI.
},
}