Auditing for Discrimination in Algorithms Delivering Job Ads. Imana, B., Korolova, A., & Heidemann, J. Technical Report arXiv:2102.07433v2 [cs.NI], arXiv, April 2021. Ad platforms such as Facebook, Google and LinkedIn promise value for advertisers through their targeted advertising. However, multiple studies have shown that ad delivery on such platforms can be skewed by gender or race due to hidden algorithmic optimization by the platforms, even when not requested by the advertisers. Building on prior work measuring skew in ad delivery, we develop a new methodology for black-box auditing of algorithms for discrimination in the delivery of job advertisements. Our first contribution is to identify the distinction between skew in ad delivery due to protected categories such as gender or race, from skew due to differences in qualification among people in the targeted audience. This distinction is important in U.S. law, where ads may be targeted based on qualifications, but not on protected categories. Second, we develop an auditing methodology that distinguishes between skew explainable by differences in qualifications from other factors, such as the ad platform's optimization for engagement or training its algorithms on biased data. Our method controls for job qualification by comparing ad delivery of two concurrent ads for similar jobs, but for a pair of companies with different de facto gender distributions of employees. We describe the careful statistical tests that establish evidence of non-qualification skew in the results. Third, we apply our proposed methodology to two prominent targeted advertising platforms for job ads: Facebook and LinkedIn. We confirm skew by gender in ad delivery on Facebook, and show that it cannot be justified by differences in qualifications. We fail to find skew in ad delivery on LinkedIn. Finally, we suggest improvements to ad platform practices that could make external auditing of their algorithms in the public interest more feasible and accurate.
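The abstract does not specify the paper's exact statistical tests, but the core idea of comparing the gender breakdown of delivery for two concurrent, similar job ads can be sketched with a standard two-proportion z-test. All counts below are hypothetical, and the test choice is an assumption for illustration only:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(male_a: int, total_a: int, male_b: int, total_b: int):
    """Two-sided z-test for a difference in the male fraction of the
    delivered audiences of two paired ads (illustrative; the paper's
    actual test procedure is not given in the abstract)."""
    p_a = male_a / total_a
    p_b = male_b / total_b
    # Pooled proportion under the null hypothesis of equal delivery rates
    p_pool = (male_a + male_b) / (total_a + total_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical delivery counts for two concurrent ads for similar jobs:
z, p = two_proportion_z(male_a=620, total_a=1000, male_b=540, total_b=1000)
```

With these made-up counts the difference in male fraction (62% vs. 54%) is statistically significant, which is the kind of evidence the methodology uses to rule out chance; the paper's additional step is controlling for qualification by pairing ads for similar jobs at companies with different de facto gender distributions.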
@TechReport{Imana21b,
author = "Basileal Imana and Aleksandra Korolova and John Heidemann",
title = "Auditing for Discrimination in Algorithms Delivering Job Ads",
institution = "arXiv",
year = 2021,
sortdate = "2021-04-09",
project = "ant",
jsubject = "network_observation",
number = "arXiv:2102.07433v2 [cs.NI]",
doi = "10.1145/3442381.3450077",
month = apr,
jlocation = "johnh: pafile",
keywords = "linkedin, facebook, ad delivery algorithm, bias, skew, discrimination",
url = "https://arxiv.org/abs/2104.04502v1",
otherurl = "https://ant.isi.edu/%7ejohnh/PAPERS/Imana21b.html",
pdfurl = "https://ant.isi.edu/%7ejohnh/PAPERS/Imana21b.pdf",
abstract = "
Ad platforms such as Facebook, Google and LinkedIn promise
value for advertisers through their targeted advertising. However, multiple studies have shown that ad delivery on such platforms can be skewed by gender or race due to hidden algorithmic optimization by the platforms, even when not requested
by the advertisers. Building on prior work measuring skew
in ad delivery, we develop a new methodology for black-box
auditing of algorithms for \emph{discrimination} in the delivery of job
advertisements. Our first contribution is to identify the distinction between skew in ad delivery due to protected categories
such as gender or race, from skew due to differences in qualification among people in the targeted audience. This distinction
is important in U.S. law, where ads may be targeted based
on qualifications, but not on protected categories. Second, we
develop an auditing methodology that distinguishes between
skew explainable by differences in qualifications from other
factors, such as the ad platform's optimization for engagement
or training its algorithms on biased data. Our method controls for job qualification by comparing ad delivery of two
concurrent ads for similar jobs, but for a pair of companies
with different de facto gender distributions of employees. We
describe the careful statistical tests that establish evidence
of non-qualification skew in the results. Third, we apply our
proposed methodology to two prominent targeted advertising
platforms for job ads: Facebook and LinkedIn. We confirm
skew by gender in ad delivery on Facebook, and show that
it cannot be justified by differences in qualifications. We fail
to find skew in ad delivery on LinkedIn. Finally, we suggest
improvements to ad platform practices that could make external auditing of their algorithms in the public interest more
feasible and accurate."
}