Evaluating stochastic rankings with expected exposure. Diaz, F., Mitra, B., Ekstrand, M. D., Biega, A. J., & Carterette, B. In Proceedings of the 29th ACM International Conference on Information and Knowledge Management (CIKM '20), October 2020. ACM.
We introduce the concept of expected exposure as the average attention ranked items receive from users over repeated samples of the same query. Furthermore, we advocate for the adoption of the principle of equal expected exposure: given a fixed information need, no item should receive more or less expected exposure compared to any other item of the same relevance grade. We argue that this principle is desirable for many retrieval objectives and scenarios, including topical diversity and fair ranking. Leveraging user models from existing retrieval metrics, we propose a general evaluation methodology based on expected exposure and draw connections to related metrics in information retrieval evaluation. Importantly, this methodology relaxes classic information retrieval assumptions, allowing a system, in response to a query, to produce a distribution over rankings instead of a single fixed ranking. We study the behavior of the expected exposure metric and stochastic rankers across a variety of information access conditions, including ad hoc retrieval and recommendation. We believe that measuring and optimizing expected exposure metrics using randomization opens a new area for retrieval algorithm development and progress.
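To make the abstract's notion of expected exposure and the equal expected exposure principle concrete, here is a minimal Python sketch (not taken from the paper). It estimates each item's expected exposure by averaging a position-based attention model (an RBP-style decay with a hypothetical patience parameter gamma) over rankings sampled from a stochastic ranker, and compares the result to a simplified target in which items of the same relevance grade share their block of attention evenly. The paper's actual methodology derives targets from user models of existing retrieval metrics and analyzes the deviation in more detail; the function names, gamma value, and sampling policy below are illustrative assumptions.

import numpy as np

def position_exposure(n, gamma=0.5):
    # RBP-style position attention: rank k (0-indexed) receives gamma**k.
    # gamma is a hypothetical "patience" parameter, not a value from the paper.
    return gamma ** np.arange(n)

def expected_exposure(sampled_rankings, n_items, gamma=0.5):
    # Average attention each item receives over rankings sampled for one query.
    # Each ranking is a permutation of item ids, ordered from top to bottom.
    attention = position_exposure(n_items, gamma)
    exposure = np.zeros(n_items)
    for ranking in sampled_rankings:
        exposure[ranking] += attention[: len(ranking)]
    return exposure / len(sampled_rankings)

def target_exposure(relevance_grades, gamma=0.5):
    # Simplified equal-expected-exposure target: an ideal stochastic policy
    # ranks higher grades first and permutes ties uniformly, so items within
    # a grade share the average attention of the positions that grade occupies.
    n = len(relevance_grades)
    attention = position_exposure(n, gamma)
    target = np.zeros(n)
    start = 0
    for grade in sorted(set(relevance_grades.tolist()), reverse=True):
        members = np.where(relevance_grades == grade)[0]
        target[members] = attention[start:start + len(members)].mean()
        start += len(members)
    return target

# Toy example: four items, two relevant (grade 1) and two non-relevant (grade 0).
rng = np.random.default_rng(0)
rels = np.array([1, 1, 0, 0])

# A stochastic ranker that shuffles items within each relevance grade per sample.
samples = []
for _ in range(1000):
    top = rng.permutation(np.where(rels == 1)[0])
    bottom = rng.permutation(np.where(rels == 0)[0])
    samples.append(np.concatenate([top, bottom]))

ee = expected_exposure(samples, n_items=4)
tgt = target_exposure(rels)
print("expected exposure:", ee.round(3))
print("target exposure:  ", tgt.round(3))
print("squared deviation:", round(float(((ee - tgt) ** 2).sum()), 6))

In this toy setup, shuffling within each relevance grade drives the squared deviation between realized and target exposure toward zero, which is the behavior the equal expected exposure principle rewards; a deterministic ranker that always ordered the two relevant items the same way would not.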
@inproceedings{diaz_evaluating_2020,
series = {{CIKM} '20},
title = {Evaluating stochastic rankings with expected exposure},
url = {http://arxiv.org/abs/2004.13157},
doi = {10.1145/3340531.3411962},
abstract = {We introduce the concept of expected exposure as the average attention
ranked items receive from users over repeated samples of the same query.
Furthermore, we advocate for the adoption of the principle of equal
expected exposure: given a fixed information need, no item should receive more or
less expected exposure compared to any other item of the same relevance
grade. We argue that this principle is desirable for many retrieval
objectives and scenarios, including topical diversity and fair ranking.
Leveraging user models from existing retrieval metrics, we propose a
general evaluation methodology based on expected exposure and draw
connections to related metrics in information retrieval evaluation.
Importantly, this methodology relaxes classic information retrieval
assumptions, allowing a system, in response to a query, to produce a
distribution over rankings instead of a single fixed ranking. We study the
behavior of the expected exposure metric and stochastic rankers across a
variety of information access conditions, including ad hoc retrieval and
recommendation. We believe that measuring and optimizing expected exposure
metrics using randomization opens a new area for retrieval algorithm
development and progress.},
booktitle = {Proceedings of the 29th {ACM} {International} {Conference} on {Information} and {Knowledge} {Management}},
publisher = {ACM},
author = {Diaz, Fernando and Mitra, Bhaskar and Ekstrand, Michael D. and Biega, Asia J. and Carterette, Ben},
month = oct,
year = {2020},
}
{"_id":"tvcpqgKDTpqaRgmxC","bibbaseid":"diaz-mitra-ekstrand-biega-carterette-evaluatingstochasticrankingswithexpectedexposure-2020","authorIDs":[],"author_short":["Diaz, F.","Mitra, B.","Ekstrand, M. D","Biega, A. J","Carterette, B."],"bibdata":{"bibtype":"inproceedings","type":"inproceedings","series":"CIKM '20","title":"Evaluating stochastic rankings with expected exposure","url":"http://arxiv.org/abs/2004.13157","doi":"10.1145/3340531.3411962","abstract":"We introduce the concept of expected exposure as the average attention ranked items receive from users over repeated samples of the same query. Furthermore, we advocate for the adoption of the principle of equal expected exposure: given a fixed information need, no item receive more or less expected exposure compared to any other item of the same relevance grade. We argue that this principle is desirable for many retrieval objectives and scenarios, including topical diversity and fair ranking. Leveraging user models from existing retrieval metrics, we propose a general evaluation methodology based on expected exposure and draw connections to related metrics in information retrieval evaluation. Importantly, this methodology relaxes classic information retrieval assumptions, allowing a system, in response to a query, to produce a distribution over rankings instead of a single fixed ranking. We study the behavior of the expected exposure metric and stochastic rankers across a variety of information access conditions, including ad hoc retrieval and recommendation. We believe that measuring and optimizing expected exposure metrics using randomization opens a new area for retrieval algorithm development and progress.","booktitle":"Proceedings of the 29th ACM International Conference on Information and Knowledge Management","publisher":"ACM","author":[{"propositions":[],"lastnames":["Diaz"],"firstnames":["Fernando"],"suffixes":[]},{"propositions":[],"lastnames":["Mitra"],"firstnames":["Bhaskar"],"suffixes":[]},{"propositions":[],"lastnames":["Ekstrand"],"firstnames":["Michael","D"],"suffixes":[]},{"propositions":[],"lastnames":["Biega"],"firstnames":["Asia","J"],"suffixes":[]},{"propositions":[],"lastnames":["Carterette"],"firstnames":["Ben"],"suffixes":[]}],"month":"October","year":"2020","bibtex":"@inproceedings{diaz_evaluating_2020,\n\tseries = {{CIKM} '20},\n\ttitle = {Evaluating stochastic rankings with expected exposure},\n\turl = {http://arxiv.org/abs/2004.13157},\n\tdoi = {10.1145/3340531.3411962},\n\tabstract = {We introduce the concept of expected exposure as the average attention\nranked items receive from users over repeated samples of the same query.\nFurthermore, we advocate for the adoption of the principle of equal\nexpected exposure: given a fixed information need, no item receive more or\nless expected exposure compared to any other item of the same relevance\ngrade. We argue that this principle is desirable for many retrieval\nobjectives and scenarios, including topical diversity and fair ranking.\nLeveraging user models from existing retrieval metrics, we propose a\ngeneral evaluation methodology based on expected exposure and draw\nconnections to related metrics in information retrieval evaluation.\nImportantly, this methodology relaxes classic information retrieval\nassumptions, allowing a system, in response to a query, to produce a\ndistribution over rankings instead of a single fixed ranking. 
We study the\nbehavior of the expected exposure metric and stochastic rankers across a\nvariety of information access conditions, including ad hoc retrieval and\nrecommendation. We believe that measuring and optimizing expected exposure\nmetrics using randomization opens a new area for retrieval algorithm\ndevelopment and progress.},\n\tbooktitle = {Proceedings of the 29th {ACM} {International} {Conference} on {Information} and {Knowledge} {Management}},\n\tpublisher = {ACM},\n\tauthor = {Diaz, Fernando and Mitra, Bhaskar and Ekstrand, Michael D and Biega, Asia J and Carterette, Ben},\n\tmonth = oct,\n\tyear = {2020},\n}\n\n","author_short":["Diaz, F.","Mitra, B.","Ekstrand, M. D","Biega, A. J","Carterette, B."],"key":"diaz_evaluating_2020","id":"diaz_evaluating_2020","bibbaseid":"diaz-mitra-ekstrand-biega-carterette-evaluatingstochasticrankingswithexpectedexposure-2020","role":"author","urls":{"Paper":"http://arxiv.org/abs/2004.13157"},"metadata":{"authorlinks":{}},"downloads":1},"bibtype":"inproceedings","biburl":"https://api.zotero.org/users/6655/collections/3TB3KT36/items?key=VFvZhZXIoHNBbzoLZ1IM2zgf&format=bibtex&limit=100","creationDate":"2020-07-18T16:41:51.219Z","downloads":1,"keywords":[],"search_terms":["evaluating","stochastic","rankings","expected","exposure","diaz","mitra","ekstrand","biega","carterette"],"title":"Evaluating stochastic rankings with expected exposure","year":2020,"dataSources":["HB6fr7bPytW2CAAzC","7KNAjxiv2tsagmbgY"]}