Raising the Bar? - The Challenges of Evaluating the Outcomes of Environmental Modelling and Software. Matthews, K. B., Rivington, M., Blackstock, K., McCrum, G., Buchan, K., & Miller, D. G. Environmental Modelling & Software, 26(3):247–257, March 2011.
The intention of this paper is to open up debate within the environmental modelling and software (EMS) community on how best to respond to the increasing desire to evaluate the success of EMS projects in terms of outcomes rather than outputs. Outcomes in this regard are changes beyond the walls of the research organisation (typically to values, attitudes and behaviour). The authors recognise that outcome evaluation is essential in ensuring the relevance and effectiveness of activities. To date, however, there is a limited appreciation within the EMS community of the nature of the challenge inherent in outcome evaluations. The paper presents an exploratory analysis of the challenges that outcome assessment raises for EMS. It does so using mutually reinforcing conceptual and practical perspectives. The paper presents a conceptual framework of three loosely coupled phases - research, development and operations. The nature of activities and their interactions within these phases is outlined, and the forms of evaluation associated with each stage are set out. The paper notes how existing forms of evaluation (e.g. peer review, validation and relevance) underpin the delivery of outcomes but do not of themselves evaluate outcomes. The paper proposes that outcomes need conceptually to be seen as an element of complex social processes mediated by government, regulation, markets and the media, rather than as simply another form of output from research and development projects. As such, outcomes of EMS are: less tangible than outputs; more likely to occur at a significant time lag after any intervention; more difficult to assign causality for; and more likely to be subject to significant contestation. Thus EMS activity, however well conducted technically, may have only a minor influence on outcomes, and EMS practitioners will have limited control over those outcomes that do occur. The paper uses a series of linked EMS projects to populate the conceptual framework, showing the role of evaluations in the research, development and operations phases. The paper then presents two forms (quantitative and qualitative) of outcome evaluation used as part of an operational-phase evaluation of a project communicating the consequences of climate change to remote-rural land managers in Scotland. The authors conclude that while the challenges of EMS evaluation can be met, the EMS community needs to take care not to raise expectations of outcomes that cannot be met.
@article{matthewsRaisingBarChallenges2011,
  title = {Raising the Bar? - {{The}} Challenges of Evaluating the Outcomes of Environmental Modelling and Software},
  author = {Matthews, K. B. and Rivington, M. and Blackstock, K. and McCrum, G. and Buchan, K. and Miller, D. G.},
  date = {2011-03},
  journaltitle = {Environmental Modelling \& Software},
  volume = {26},
  pages = {247--257},
  issn = {1364-8152},
  doi = {10.1016/j.envsoft.2010.03.031},
  url = {https://doi.org/10.1016/j.envsoft.2010.03.031},
  abstract = {The intention of this paper is to open up debate within the environmental modelling and software (EMS) community on how best to respond to the increasing desire to evaluate the success of EMS projects in terms of outcomes rather than outputs. Outcomes in this regard are changes beyond the walls of the research organisation (typically to values, attitudes and behaviour). The authors recognise that outcome evaluation is essential in ensuring the relevance and effectiveness of activities. To date, however, there is a limited appreciation within the EMS community of the nature of the challenge inherent in outcome evaluations. The paper presents an exploratory analysis of the challenges that outcome assessment raises for EMS. It does so using mutually reinforcing conceptual and practical perspectives. The paper presents a conceptual framework of three loosely coupled phases - research, development and operations. The nature of activities and their interactions within these phases is outlined, and the forms of evaluation associated with each stage are set out. The paper notes how existing forms of evaluation (e.g. peer review, validation and relevance) underpin the delivery of outcomes but do not of themselves evaluate outcomes. The paper proposes that outcomes need conceptually to be seen as an element of complex social processes mediated by government, regulation, markets and the media, rather than as simply another form of output from research and development projects. As such, outcomes of EMS are: less tangible than outputs; more likely to occur at a significant time lag after any intervention; more difficult to assign causality for; and more likely to be subject to significant contestation. Thus EMS activity, however well conducted technically, may have only a minor influence on outcomes, and EMS practitioners will have limited control over those outcomes that do occur. The paper uses a series of linked EMS projects to populate the conceptual framework, showing the role of evaluations in the research, development and operations phases. The paper then presents two forms (quantitative and qualitative) of outcome evaluation used as part of an operational-phase evaluation of a project communicating the consequences of climate change to remote-rural land managers in Scotland. The authors conclude that while the challenges of EMS evaluation can be met, the EMS community needs to take care not to raise expectations of outcomes that cannot be met.},
  keywords = {assessment,environmental-modelling,model-assessment,modelling,outputs-vs-outcomes},
  number = {3}
}
