Saltelli, A., Bammer, G., Bruno, I., Charters, E., Di Fiore, M., Didier, E., Espeland, W. N., Kay, J., Lo Piano, S., Mayo, D., Pielke, R., Jr, Portaluri, T., Porter, T. M., Puy, A., Rafols, I., Ravetz, J. R., Reinert, E., Sarewitz, D., Stark, P. B., Stirling, A., van der Sluijs, J., & Vineis, P. (2020). Five Ways to Ensure That Models Serve Society: A Manifesto. Nature, 582(7813):482–484.
@article{saltelliFiveWaysEnsure2020,
  title = {Five Ways to Ensure That Models Serve Society: A Manifesto},
  shorttitle = {Five Ways to Ensure That Models Serve Society},
  author = {Saltelli, Andrea and Bammer, Gabriele and Bruno, Isabelle and Charters, Erica and Di Fiore, Monica and Didier, Emmanuel and Espeland, Wendy Nelson and Kay, John and Lo Piano, Samuele and Mayo, Deborah and Pielke, Jr., Roger and Portaluri, Tommaso and Porter, Theodore M. and Puy, Arnald and Rafols, Ismael and Ravetz, Jerome R. and Reinert, Erik and Sarewitz, Daniel and Stark, Philip B. and Stirling, Andrew and van der Sluijs, Jeroen and Vineis, Paolo},
  date = {2020-06},
  journaltitle = {Nature},
  volume = {582},
  pages = {482--484},
  doi = {10.1038/d41586-020-01812-9},
  url = {https://www.nature.com/articles/d41586-020-01812-9},
  urldate = {2020-06-26},
  abstract = {Pandemic politics highlight how predictions need to be transparent and humble to invite insight, not blame.

[Excerpt]
[...] Here we present a manifesto for best practices for responsible mathematical modelling. Many groups before us have described the best ways to apply modelling insights to policies, including for diseases (see also Supplementary information). We distil five simple principles to help society demand the quality it needs from modelling.

[::Mind the assumptions]

Assess uncertainty and sensitivity. Models are often imported from other applications, ignoring how assumptions that are reasonable in one situation can become nonsensical in another.  [...] Another lapse occurs when models require input values for which there is no reliable information. [...] One way to mitigate these issues is to perform global uncertainty and sensitivity analyses. In practice, that means allowing all that is uncertain — variables, mathematical relationships and boundary conditions — to vary simultaneously as runs of the model produce its range of predictions. [...]

[::Mind the hubris]

Complexity can be the enemy of relevance. Most modellers are aware that there is a trade-off between the usefulness of a model and the breadth it tries to capture. But many are seduced by the idea of adding complexity in an attempt to capture reality more accurately. As modellers incorporate more phenomena, a model might fit better to the training data, but at a cost. Its predictions typically become less accurate. As more parameters are added, the uncertainty builds up (the uncertainty cascade effect), and the error could increase to the point at which predictions become useless.
  [...]

[::Mind the framing]

Match purpose and context. Results from models will at least partly reflect the interests, disciplinary orientations and biases of the developers. No one model can serve all purposes.

Modellers know that the choice of tools will influence, and could even determine, the outcome of the analysis, so the technique is never neutral. 
  [...] Consider the value of a statistical life, loosely defined as the cost of averting a death. It is already controversial for setting compensation — for the victims of aeroplane crashes, for instance. Although it might have a place in choosing the best public-health policy, it can produce a questionable appearance of rigour and so disguise political decisions as technical ones. [...]

[::Mind the consequences]

Quantification can backfire. Excessive regard for producing numbers can push a discipline away from being roughly right towards being precisely wrong. Undiscriminating use of statistical tests can substitute for sound judgement.  [...]
Once a number takes centre-stage with a crisp narrative, other possible explanations and estimates can disappear from view. This might invite complacency, and the politicization of quantification, as other options are marginalized.  [...]
Spurious precision adds to a false sense of certainty. [...]
Opacity about uncertainty damages trust. A message from the field of sociology of quantification is that trust is essential for numbers to be useful. Full explanations are crucial.
  [...]

[::Mind the unknown]

Acknowledge ignorance. For most of the history of Western philosophy, self-awareness of ignorance was considered a virtue, the worthy object of intellectual pursuit — what the fifteenth-century philosopher Nicholas of Cusa called learned ignorance, or docta ignorantia. Even today, communicating what is not known is at least as important as communicating what is known. Yet models can hide ignorance.

Failure to acknowledge this can artificially limit the policy options and open the door to undesired surprises.
  [...]

[::Questions not answers]

Mathematical models are a great way to explore questions. They are also a dangerous way to assert answers. Asking models for certainty or consensus is more a sign of the difficulties in making controversial decisions than it is a solution, and can invite ritualistic use of quantification.

Models’ assumptions and limitations must be appraised openly and honestly. Process and ethics matter as much as intellectual prowess. It follows, in our view, that good modelling cannot be done by modellers alone. It is a social activity. [...]},
  keywords = {~INRMM-MiD:z-H6TNU5DJ,check-list,communicating-uncertainty,data-uncertainty,local-over-complication,manifesto,modelling,modelling-uncertainty,oversimplification,post-normal-science,science-policy-interface,science-society-interface,scientific-communication,uncertainty-propagation,unknown},
  langid = {english},
  number = {7813}
}
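The global uncertainty and sensitivity analysis that the manifesto's first principle calls for can be sketched with plain Monte Carlo sampling. The toy model, its parameter names (`rate`, `t_max`, `amplitude`), and the uniform ranges below are illustrative assumptions, not taken from the paper; the point is only that every uncertain input varies simultaneously across runs, and each input's influence is then scored against the spread of outputs.

```python
import random
import statistics

def toy_model(rate, t_max, amplitude):
    # Hypothetical stand-in for a real simulation model.
    return rate * t_max + 0.5 * amplitude

def global_analysis(n=10_000, seed=42):
    rng = random.Random(seed)
    inputs = {"rate": [], "t_max": [], "amplitude": []}
    outputs = []
    for _ in range(n):
        # Global analysis: ALL uncertain inputs vary at once,
        # never one-at-a-time with the others held at fixed values.
        rate = rng.uniform(0.5, 1.5)
        t_max = rng.uniform(8.0, 12.0)
        amplitude = rng.uniform(0.0, 2.0)
        inputs["rate"].append(rate)
        inputs["t_max"].append(t_max)
        inputs["amplitude"].append(amplitude)
        outputs.append(toy_model(rate, t_max, amplitude))

    # Range of predictions: the central 90% of model outputs.
    outputs_sorted = sorted(outputs)
    lo, hi = outputs_sorted[n // 20], outputs_sorted[-(n // 20)]

    def r_squared(xs):
        # Crude linear sensitivity score: squared correlation with output.
        mx, my = statistics.fmean(xs), statistics.fmean(outputs)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, outputs))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in outputs)
        return cov * cov / (vx * vy)

    sensitivity = {name: r_squared(xs) for name, xs in inputs.items()}
    return (lo, hi), sensitivity
```

With these made-up ranges, `rate` dominates the output variance, which a one-at-a-time scan around a single baseline would understate. Variance-based Sobol indices are the standard refinement of this crude correlation score.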

Downloads: 0
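The "uncertainty cascade" warned about under Mind the hubris can also be made concrete. Under the assumption, made up for this sketch, that each added parameter contributes an independent ±20% multiplicative uncertainty, the prediction interval keeps widening as parameters accumulate:

```python
import random

def interval_width(n_params, n_runs=20_000, seed=0):
    """90% interval width of a toy prediction built from n_params
    uncertain multiplicative factors (each an assumed +/-20% band)."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_runs):
        y = 1.0
        for _ in range(n_params):
            y *= rng.uniform(0.8, 1.2)  # every extra parameter adds spread
        outputs.append(y)
    outputs.sort()
    return outputs[int(0.95 * n_runs)] - outputs[int(0.05 * n_runs)]
```

Each parameter may improve the fit to training data, yet the interval around any single prediction keeps growing; past some point the range is too wide to discriminate between policy options, which is the sense in which predictions become useless.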