Uncertainty in Simulating Wheat Yields under Climate Change. Asseng, S., Ewert, F., Rosenzweig, C., Jones, J. W., Hatfield, J. L., Ruane, A. C., Boote, K. J., Thorburn, P. J., Rötter, R. P., Cammarano, D., Brisson, N., Basso, B., Martre, P., Aggarwal, P. K., Angulo, C., Bertuzzi, P., Biernath, C., Challinor, A. J., Doltra, J., Gayler, S., Goldberg, R., Grant, R., Heng, L., Hooker, J., Hunt, L. A., Ingwersen, J., Izaurralde, R. C., Kersebaum, K. C., Müller, C., Naresh Kumar, S., Nendel, C., O'Leary, G., Olesen, J. E., Osborne, T. M., Palosuo, T., Priesack, E., Ripoche, D., Semenov, M. A., Shcherbak, I., Steduto, P., Stöckle, C., Stratonovitch, P., Streck, T., Supit, I., Tao, F., Travasso, M., Waha, K., Wallach, D., White, J. W., Williams, J. R., & Wolf, J. 3(9):827–832. Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models [1,3] are difficult [4]. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policy making.
@article{assengUncertaintySimulatingWheat2013,
title = {Uncertainty in Simulating Wheat Yields under Climate Change},
author = {Asseng, S. and Ewert, F. and Rosenzweig, C. and Jones, J. W. and Hatfield, J. L. and Ruane, A. C. and Boote, K. J. and Thorburn, P. J. and Rötter, R. P. and Cammarano, D. and Brisson, N. and Basso, B. and Martre, P. and Aggarwal, P. K. and Angulo, C. and Bertuzzi, P. and Biernath, C. and Challinor, A. J. and Doltra, J. and Gayler, S. and Goldberg, R. and Grant, R. and Heng, L. and Hooker, J. and Hunt, L. A. and Ingwersen, J. and Izaurralde, R. C. and Kersebaum, K. C. and Müller, C. and Naresh Kumar, S. and Nendel, C. and O'Leary, G. and Olesen, J. E. and Osborne, T. M. and Palosuo, T. and Priesack, E. and Ripoche, D. and Semenov, M. A. and Shcherbak, I. and Steduto, P. and Stöckle, C. and Stratonovitch, P. and Streck, T. and Supit, I. and Tao, F. and Travasso, M. and Waha, K. and Wallach, D. and White, J. W. and Williams, J. R. and Wolf, J.},
date = {2013-06},
journaltitle = {Nature Climate Change},
volume = {3},
pages = {827--832},
issn = {1758-678X},
doi = {10.1038/nclimate1916},
url = {https://doi.org/10.1038/nclimate1916},
abstract = {Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models [1,3] are difficult [4]. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policy making.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-12438735,agricultural-land,agricultural-resources,climate-change,crop-yield,integrated-natural-resources-modelling-and-management,integration-techniques,multiauthor,science-policy-interface,transdisciplinary-research,uncertainty,wheat},
number = {9}
}
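The abstract's central quantitative point is that more of the spread in projected impacts comes from differences among crop models than from differences among downscaled general circulation models (GCMs). A minimal sketch of how such a partition can be computed from an ensemble of simulated impacts, using a two-way sums-of-squares decomposition on a hypothetical crop-model × GCM matrix (illustrative values only; this is not the paper's dataset or its exact method):

```python
import numpy as np

# Hypothetical ensemble of simulated yield changes (%), shape: (n_crop_models, n_gcms).
# Values are illustrative placeholders, not data from the study.
rng = np.random.default_rng(0)
impacts = rng.normal(loc=-5.0, scale=4.0, size=(27, 16))

grand_mean = impacts.mean()
crop_model_means = impacts.mean(axis=1)   # average impact per crop model
gcm_means = impacts.mean(axis=0)          # average impact per GCM

# Two-way ANOVA-style sums of squares (no replication, so residual = interaction).
n_crop, n_gcm = impacts.shape
ss_crop = n_gcm * np.sum((crop_model_means - grand_mean) ** 2)
ss_gcm = n_crop * np.sum((gcm_means - grand_mean) ** 2)
ss_total = np.sum((impacts - grand_mean) ** 2)
ss_resid = ss_total - ss_crop - ss_gcm

for name, ss in [("crop models", ss_crop), ("GCMs", ss_gcm), ("interaction/residual", ss_resid)]:
    print(f"{name:22s} {100 * ss / ss_total:5.1f}% of total variance")
```

With real ensemble output in place of the random matrix, the two printed shares indicate whether crop-model structure or climate forcing dominates the projection uncertainty.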
Reframing Ecosystem Management in the Era of Climate Change: Issues and Knowledge from Forests. Mori, A. S., Spies, T. A., Sudmeier-Rieux, K., & Andrade, A. 165:115–127. We discuss "ecosystem management (EM)" as a means of facing contemporary climate change issues. EM focuses on sustaining ecosystems to meet both ecological and human needs. EM plans have been largely developed independently of concerns about climate change. However, EM is potentially effective for climate change mitigation and adaptation. We provide principal guidelines based on EM for adaptively tackling these issues. Climate change is one of the significant concerns in land and resource management, creating an urgent need to build social-ecological capacity to address widespread and uncertain environmental changes. Given the diversity and complexity of ecological responses to climate change, "ecosystem management" approaches are needed to provide solutions for meeting both ecological and human needs, while reducing anthropogenic warming and climate-related impacts on society. For instance, ecosystem management can contribute to a reduction in greenhouse gas emissions through improved land-use and reduced deforestation at a regional scale. Further, conserving and restoring naturally-functioning ecosystems, which is often one of the goals of ecosystem management, can significantly contribute to buffering ecological responses to climate extremes such as droughts and wildfires. Moreover, ecosystem management helps build capacity for learning and adaptation at multiple scales. As a result, societies will be better prepared to respond to surprises and uncertainties associated with climate change. In this regard, it is imperative to reframe climate change issues based on the ecosystem approach. Although climate change and ecosystem management plans have largely developed independently, it is now essential for all stakeholders to work together to achieve multiple goals. Ecosystem-based approaches can enable flexible and effective responses to the uncertainties associated with climate change. Reframing ecosystem management helps to face an urgent need for reconsideration and improvement of social-ecological resilience in order to mitigate and adapt to the changing climate. [Excerpt: Conclusion] Novel approaches underpinned by sociology, ecology and climate science are necessary to perform assessments that reflect the many roles that ecosystem management can play in mitigating and adapting to climate change. No single method and focal scale for addressing the effects or causes of climate change exists. Indeed, there are often trade-offs, such as those between the goals of building resilience (learning from failure) and reducing vulnerability (minimizing failure) (Adger et al., 2008), suggesting that some policies aimed at minimizing exposure to hazards at the regional scale can potentially conflict with the proactive implementation of adaptive management at the local scale. In this article, we have discussed reframing ecosystem management as an effective way to address the uncertainties of climate change. It is therefore necessary to adopt flexible and robust management strategies that consider various scenarios, rather than adopting a single measure. Similar to climate change, which is intricately connected to other issues beyond physical climatic change, ecological issues are deeply associated with global issues.
A number of environmental policies and plans have historically been developed with little consideration of climate instability. Our attempt to integrate different management considerations into a common context lends strong support to the objectives and approaches of ecosystem management as an effective tool for facing climate change uncertainties. [\n] It is important for all stakeholders to work together to identify multiple goals. Ecologists need to address ecosystem processes and functions in the context of possible future conditions; resource managers and policymakers need to build capacity for learning and adaptation; and all stakeholders need to share a recognition that social-ecological systems are interacting not only with each other (social-ecological interdependence) but also with the climate system. In keeping with the view of Moss et al. (2010) that the future climate largely depends on the behaviour of global society, the fates of ecosystems will strongly depend on how human society faces climate change. In particular, there are still important gaps in the combined study of climate and ecosystem science that need to be addressed. At the time of the UNFCCC meetings in Copenhagen in 2009, UNEP (2009a) stated that climate information, when coupled with other information such as ecology and socio-economics, should be central to policy formulation and decision-making processes for practical ecosystem management at local and regional scales, over timescales of the next several decades. Bringing different fields together is essential to tackle future complexity. The constructive improvements that come from an ecosystem management strategy, as summarized in Table 2, have the potential to effectively fill the gaps among disciplines and stakeholders.
@article{moriReframingEcosystemManagement2013,
title = {Reframing Ecosystem Management in the Era of Climate Change: Issues and Knowledge from Forests},
author = {Mori, Akira S. and Spies, Thomas A. and Sudmeier-Rieux, Karen and Andrade, Angela},
date = {2013-09},
journaltitle = {Biological Conservation},
volume = {165},
pages = {115--127},
issn = {0006-3207},
doi = {10.1016/j.biocon.2013.05.020},
url = {https://doi.org/10.1016/j.biocon.2013.05.020},
abstract = {We discuss "ecosystem management (EM)" as a means of facing contemporary climate change issues. EM focuses on sustaining ecosystems to meet both ecological and human needs. EM plans have been largely developed independently of concerns about climate change. However, EM is potentially effective for climate change mitigation and adaptation. We provide principal guidelines based on EM for adaptively tackling these issues. Climate change is one of the significant concerns in land and resource management, creating an urgent need to build social-ecological capacity to address widespread and uncertain environmental changes. Given the diversity and complexity of ecological responses to climate change, "ecosystem management" approaches are needed to provide solutions for meeting both ecological and human needs, while reducing anthropogenic warming and climate-related impacts on society. For instance, ecosystem management can contribute to a reduction in greenhouse gas emissions through improved land-use and reduced deforestation at a regional scale. Further, conserving and restoring naturally-functioning ecosystems, which is often one of the goals of ecosystem management, can significantly contribute to buffering ecological responses to climate extremes such as droughts and wildfires. Moreover, ecosystem management helps build capacity for learning and adaptation at multiple scales. As a result, societies will be better prepared to respond to surprises and uncertainties associated with climate change. In this regard, it is imperative to reframe climate change issues based on the ecosystem approach. Although climate change and ecosystem management plans have largely developed independently, it is now essential for all stakeholders to work together to achieve multiple goals. Ecosystem-based approaches can enable flexible and effective responses to the uncertainties associated with climate change. Reframing ecosystem management helps to face an urgent need for reconsideration and improvement of social-ecological resilience in order to mitigate and adapt to the changing climate.
[Excerpt: Conclusion]
Novel approaches underpinned by sociology, ecology and climate science are necessary to perform assessments that reflect the many roles that ecosystem management can play in mitigating and adapting to climate change. No single method and focal scale for addressing the effects or causes of climate change exists. Indeed, there are often trade-offs, such as those between the goals of building resilience (learning from failure) and reducing vulnerability (minimizing failure) (Adger et al., 2008), suggesting that some policies aimed at minimizing exposure to hazards at the regional scale can potentially conflict with the proactive implementation of adaptive management at the local scale. In this article, we have discussed reframing ecosystem management as an effective way to address the uncertainties of climate change. It is therefore necessary to adopt flexible and robust management strategies that consider various scenarios, rather than adopting a single measure. Similar to climate change, which is intricately connected to other issues beyond physical climatic change, ecological issues are deeply associated with global issues. A number of environmental policies and plans have historically been developed with little consideration of climate instability. Our attempt to integrate different management considerations into a common context lends strong support to the objectives and approaches of ecosystem management as an effective tool for facing climate change uncertainties.
[\textbackslash n] It is important for all stakeholders to work together to identify multiple goals. Ecologists need to address ecosystem processes and functions in the context of possible future conditions; resource managers and policymakers need to build capacity for learning and adaptation; and all stakeholders need to share a recognition that social-ecological systems are interacting not only with each other (social-ecological interdependence) but also with the climate system. In keeping with the view of Moss et al. (2010) that the future climate largely depends on the behaviour of global society, the fates of ecosystems will strongly depend on how human society faces climate change. In particular, there are still important gaps in the combined study of climate and ecosystem science that need to be addressed. At the time of the UNFCCC meetings in Copenhagen in 2009, UNEP (2009a) stated that climate information, when coupled with other information such as ecology and socio-economics, should be central to policy formulation and decision-making processes for practical ecosystem management at local and regional scales, over timescales of the next several decades. Bringing different fields together is essential to tackle future complexity. The constructive improvements that come from an ecosystem management strategy, as summarized in Table 2, have the potential to effectively fill the gaps among disciplines and stakeholders.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-13912160,~to-add-doi-URL,adaptation,climate-change,climate-extremes,droughts,ecology,ecosystem,forest-resources,incomplete-knowledge,knowledge-integration,mitigation,uncertainty,wildfires}
}
Interaction between Ash and Soil Microaggregates Reduces Runoff and Soil Loss. Thomaz, E. L. 625:1257–1263. [Highlights] [::] An exploratory experiment was carried out using ash from a prescribed fire. [::] Interaction between the ash and soil microaggregates reduced runoff by 78%. [::] The ash–soil microaggregates interaction also reduced soil loss by 26%. [::] The ash treatment increased soil loss by 47% compared to the case of bare soil. [::] Fine particles at the topsoil do not necessarily mean the development of surface sealing. [Abstract] Areas subjected to fire have a two-layer system (i.e., ash and soil), which brings enormous complexities to hydrogeomorphic processes. In addition, the combinations of variables from the ash and the soil characteristics result in several possible two-layer system contexts. Here, the interactions between ash and microaggregates (i.e., ash placed over fine soil microaggregates) and their effects on hydro-erosional processes are explored. The ash was produced by an experimental fire and collected from a field managed by a slash-and-burn agricultural system. The design of the experiment included a strategy for considering combinations in which each of the various factors of interest, i.e., ash and microaggregates, was present or absent. In addition, the study searched for interactions between the two factors when both were present. In total, 600 g m−2 of fine ash mixture (<0.250 mm), obtained from fire at different temperatures, and 90 g m−2 of microaggregates were placed over a small splash pan (0.135 m²). Next, a rainfall of 56 mm h−1 lasting for 30 min was applied in four replicates for each treatment: 1) bare soil, 2) bare soil + microaggregates, 3) ash, and 4) ash + microaggregates. The interaction between the ash and soil microaggregates changed the soil hydrology dynamics, reducing soil moisture by 28% and surface runoff by 78%. The ash–microaggregates combination reduced soil loss by sheetwash by 20% and by rainsplash by 25%. Overall, the ash treatment increased soil loss by 47% compared to the case of bare soil. In contrast, the ash–microaggregates interaction decreased soil loss by 26% compared to the ash treatment.
@article{thomazInteractionAshSoil2018,
title = {Interaction between Ash and Soil Microaggregates Reduces Runoff and Soil Loss},
author = {Thomaz, Edivaldo L.},
date = {2018-06-01},
journaltitle = {Science of The Total Environment},
shortjournal = {Science of The Total Environment},
volume = {625},
pages = {1257--1263},
issn = {0048-9697},
doi = {10.1016/j.scitotenv.2018.01.046},
url = {https://doi.org/10.1016/j.scitotenv.2018.01.046},
urldate = {2019-12-04},
abstract = {[Highlights]
[::] An exploratory experiment was carried out using ash from a prescribed fire.
[::] Interaction between the ash and soil microaggregates reduced runoff by 78\%.
[::] The ash–soil microaggregates interaction also reduced soil loss by 26\%.
[::] The ash treatment increased soil loss by 47\% compared to the case of bare soil.
[::] Fine particles at the topsoil do not necessarily mean the development of surface sealing.
[Abstract]
Areas subjected to fire have a two-layer system (i.e., ash and soil), which brings enormous complexities to hydrogeomorphic processes. In addition, the combinations of variables from the ash and the soil characteristics result in several possible two-layer system contexts. Here, the interactions between ash and microaggregates (i.e., ash placed over fine soil microaggregates) and their effects on hydro-erosional processes are explored. The ash was produced by an experimental fire and collected from a field managed by a slash-and-burn agricultural system. The design of the experiment included a strategy for considering combinations in which each of the various factors of interest, i.e., ash and microaggregates, was present or absent. In addition, the study searched for interactions between the two factors when both were present. In total, 600 g m−2 of fine ash mixture ({$<$}0.250 mm), obtained from fire at different temperatures, and 90 g m−2 of microaggregates were placed over a small splash pan (0.135 m²). Next, a rainfall of 56 mm h−1 lasting for 30 min was applied in four replicates for each treatment: 1) bare soil, 2) bare soil + microaggregates, 3) ash, and 4) ash + microaggregates. The interaction between the ash and soil microaggregates changed the soil hydrology dynamics, reducing soil moisture by 28\% and surface runoff by 78\%. The ash–microaggregates combination reduced soil loss by sheetwash by 20\% and by rainsplash by 25\%. Overall, the ash treatment increased soil loss by 47\% compared to the case of bare soil. In contrast, the ash–microaggregates interaction decreased soil loss by 26\% compared to the ash treatment.},
keywords = {~INRMM-MiD:z-XNMZJEXD,brazil,data-uncertainty,erodibility,modelling-uncertainty,post-fire-impacts,runoff,soil-erosion,soil-resources,uncertainty,wildfires},
langid = {english}
}
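The experiment is a 2 × 2 factorial (ash present/absent × microaggregates present/absent), and the abstract reports treatment effects as percent changes relative to bare soil or to the ash-only treatment. A minimal sketch of how those comparisons and the ash–microaggregate interaction can be computed from treatment means; the soil-loss values below are hypothetical placeholders chosen only so that the percent changes match the +47% and −26% quoted in the abstract:

```python
# Hypothetical mean soil loss per treatment (arbitrary units); not the measured values.
soil_loss = {
    ("no_ash", "no_micro"): 100.0,   # bare soil (reference)
    ("no_ash", "micro"):     95.0,   # bare soil + microaggregates
    ("ash",    "no_micro"): 147.0,   # ash
    ("ash",    "micro"):    109.0,   # ash + microaggregates
}

def pct_change(treated: float, reference: float) -> float:
    """Percent change of a treatment relative to a reference treatment."""
    return 100.0 * (treated - reference) / reference

bare = soil_loss[("no_ash", "no_micro")]
ash_only = soil_loss[("ash", "no_micro")]
ash_micro = soil_loss[("ash", "micro")]

print(f"ash vs bare soil:      {pct_change(ash_only, bare):+.0f}%")        # ~ +47%
print(f"ash+micro vs ash only: {pct_change(ash_micro, ash_only):+.0f}%")   # ~ -26%

# 2x2 interaction: does the microaggregate effect differ with and without ash?
micro_effect_with_ash = ash_micro - ash_only
micro_effect_without_ash = soil_loss[("no_ash", "micro")] - bare
print(f"interaction term: {micro_effect_with_ash - micro_effect_without_ash:+.1f}")
```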
Welcome to Postnormal Times. Sardar, Z. 42(5):435–444. All that was 'normal' has now evaporated; we have entered postnormal times, the in-between period where old orthodoxies are dying, new ones have not yet emerged, and nothing really makes sense. To have any notion of a viable future, we must grasp the significance of this period of transition which is characterised by three c's: complexity, chaos and contradictions. These forces propel and sustain postnormal times leading to uncertainty and different types of ignorance that make decision-making problematic and increase risks to individuals, society and the planet. Postnormal times demands, this paper argues, that we abandon the ideas of 'control and management', and rethink the cherished notions of progress, modernisation and efficiency. The way forward must be based on virtues of humility, modesty and accountability, the indispensable requirement of living with uncertainty, complexity and ignorance. We will have to imagine ourselves out of postnormal times and into a new age of normalcy – with an ethical compass and a broad spectrum of imaginations from the rich diversity of human cultures.
@article{sardarWelcomePostnormalTimes2010,
title = {Welcome to Postnormal Times},
author = {Sardar, Ziauddin},
date = {2010-06},
journaltitle = {Futures},
volume = {42},
pages = {435--444},
issn = {0016-3287},
doi = {10.1016/j.futures.2009.11.028},
url = {https://doi.org/10.1016/j.futures.2009.11.028},
abstract = {All that was 'normal' has now evaporated; we have entered postnormal times, the in-between period where old orthodoxies are dying, new ones have not yet emerged, and nothing really makes sense. To have any notion of a viable future, we must grasp the significance of this period of transition which is characterised by three c's: complexity, chaos and contradictions. These forces propel and sustain postnormal times leading to uncertainty and different types of ignorance that make decision-making problematic and increase risks to individuals, society and the planet. Postnormal times demands, this paper argues, that we abandon the ideas of 'control and management', and rethink the cherished notions of progress, modernisation and efficiency. The way forward must be based on virtues of humility, modesty and accountability, the indispensable requirement of living with uncertainty, complexity and ignorance. We will have to imagine ourselves out of postnormal times and into a new age of normalcy -- with an ethical compass and a broad spectrum of imaginations from the rich diversity of human cultures.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-6230082,~to-add-doi-URL,chaos,complexity,deep-uncertainty,ethics,feedback,mitigation,non-linearity,post-normal-science,science-ethics,science-society-interface,social-system,technocracy,trade-offs,uncertainty,unknown},
number = {5}
}
A Review of the Mechanical Effects of Plant Roots on Concentrated Flow Erosion Rates. Vannoppen, W., Vanmaercke, M., De Baets, S., & Poesen, J. 150:666–678. Living plant roots modify both mechanical and hydrological characteristics of the soil matrix (e.g. soil aggregate stability by root exudates, soil cohesion, infiltration rate, soil moisture content, soil organic matter) and negatively influence soil erodibility. During the last two decades, several studies have reported on the effects of plant roots in controlling concentrated flow erosion rates. However, a global analysis of the now-available data on root effects is still lacking. Yet such a meta-analysis would contribute to a better understanding of soil-root interactions, as assessing the effectiveness of roots in reducing soil erosion rates due to concentrated flow in different environments remains difficult. The objectives of this study are therefore: i) to provide a state of the art of studies quantifying the effectiveness of roots in reducing soil erosion rates due to concentrated flow; and ii) to explore the overall trends in erosion reduction as a function of the root (length) density, root architecture and soil texture, based on an integrated analysis of published data. We therefore compiled a dataset of measured soil detachment ratios (SDR) for the root density (RD; 822 observations) as well as for the root length density (RLD; 274 observations). A Hill curve model best describes the decrease in SDR as a function of R(L)D. An important finding of our meta-analysis is that RLD is a much more suitable variable to estimate SDR than RD, as it is linked to root architecture. However, a large proportion of the variability in SDR could not be attributed to RD or RLD, resulting in a low predictive accuracy of these Hill curve models, with model efficiencies of 0.11 and 0.17 for RD and RLD, respectively. Considering root architecture and soil texture did yield a better predictive model for RLD, with a model efficiency of 0.37 for fibrous roots in non-sandy soils, while no improvement was found for RD. The unexplained variance is attributed to differences in experimental set-ups and measurement errors which could not be explicitly accounted for due to a lack of additional data. Based on those results, it remains difficult to predict the effects of roots on soil erosion rates. However, by using a Monte Carlo simulation approach, we were able to establish relationships that allow assessing the likely erosion-reducing effects of plant roots, while taking these uncertainties into account. Overall, this study demonstrates that plant roots can be very effective in reducing soil erosion rates due to concentrated flow. [Excerpt: Conclusions] Vegetation can be used to reduce soil degradation by soil erosion processes. This study showed that plant roots can be very effective in controlling soil erosion rates due to concentrated flow. A combination of a well-established vegetation cover together with a dense root system in the topsoil is therefore most effective and recommended to protect the soil against soil erosion processes by water. The erosion-reducing potential of plant roots can be explained by their indirect negative effect on soil erodibility through affecting various soil properties (e.g. aggregate stability, cohesion, organic matter content, infiltration rate and moisture content).
However, both the environment and management practices have to be taken into account as they influence the effectiveness of plant roots in reducing soil erosion rates. Analysis of a global dataset based on published data showed that the decrease in SDR as a function of RD or RLD could be best described by a Hill curve model. Root architecture and soil texture were further considered in an attempt to improve the models. This resulted in better predictive models for RLD (for fibrous roots in non-sandy soils), while no improvement could be observed for RD. Consequently, it remains difficult to predict the erosion-reducing effects of plant roots on concentrated flow erosion rates as a large part of the variance still remains unexplained. Results of the Monte Carlo analyses (Fig. 5) present confidence intervals on estimated SDR values for the proposed models that should be used as an estimation of the uncertainty range. As such, the established relationships between root (length) density and the soil detachment ratio allow for meaningful estimations of the mechanical effects of plant roots on concentrated flow erosion rates. The advantage of this approach is that the results of this study can be extrapolated to different environments to examine the likely root effects on erosion rates, as we implicitly take into account the variability in root and soil characteristics. [\n] As tap root systems are less effective in controlling soil erosion compared to fibrous roots, we furthermore prefer the use of RLD as the root variable, as it indirectly takes root architecture into account. The influence of soil texture on erosion-reducing potential could not be demonstrated due to a lack of sufficient data on the erosion-reducing potential of plant roots in different soil textures. More empirical studies are needed to examine the role of soil texture on the erosion-reducing potential. Moreover, a more accurate global database is needed to unravel the influence of additional soil, root and environmental variables on the erosion-reducing potential of plant roots and to improve the predictive quality of the models.
@article{vannoppenReviewMechanicalEffects2015,
title = {A Review of the Mechanical Effects of Plant Roots on Concentrated Flow Erosion Rates},
author = {Vannoppen, W. and Vanmaercke, M. and De Baets, S. and Poesen, J.},
date = {2015-11},
journaltitle = {Earth-Science Reviews},
volume = {150},
pages = {666--678},
issn = {0012-8252},
doi = {10.1016/j.earscirev.2015.08.011},
url = {https://doi.org/10.1016/j.earscirev.2015.08.011},
abstract = {Living plant roots modify both mechanical and hydrological characteristics of the soil matrix (e.g. soil aggregate stability by root exudates, soil cohesion, infiltration rate, soil moisture content, soil organic matter) and negatively influence soil erodibility. During the last two decades, several studies have reported on the effects of plant roots in controlling concentrated flow erosion rates. However, a global analysis of the now-available data on root effects is still lacking. Yet such a meta-analysis would contribute to a better understanding of soil-root interactions, as assessing the effectiveness of roots in reducing soil erosion rates due to concentrated flow in different environments remains difficult. The objectives of this study are therefore: i) to provide a state of the art of studies quantifying the effectiveness of roots in reducing soil erosion rates due to concentrated flow; and ii) to explore the overall trends in erosion reduction as a function of the root (length) density, root architecture and soil texture, based on an integrated analysis of published data. We therefore compiled a dataset of measured soil detachment ratios (SDR) for the root density (RD; 822 observations) as well as for the root length density (RLD; 274 observations). A Hill curve model best describes the decrease in SDR as a function of R(L)D. An important finding of our meta-analysis is that RLD is a much more suitable variable to estimate SDR than RD, as it is linked to root architecture. However, a large proportion of the variability in SDR could not be attributed to RD or RLD, resulting in a low predictive accuracy of these Hill curve models, with model efficiencies of 0.11 and 0.17 for RD and RLD, respectively. Considering root architecture and soil texture did yield a better predictive model for RLD, with a model efficiency of 0.37 for fibrous roots in non-sandy soils, while no improvement was found for RD. The unexplained variance is attributed to differences in experimental set-ups and measurement errors which could not be explicitly accounted for due to a lack of additional data. Based on those results, it remains difficult to predict the effects of roots on soil erosion rates. However, by using a Monte Carlo simulation approach, we were able to establish relationships that allow assessing the likely erosion-reducing effects of plant roots, while taking these uncertainties into account. Overall, this study demonstrates that plant roots can be very effective in reducing soil erosion rates due to concentrated flow.
[Excerpt: Conclusions]
Vegetation can be used to reduce soil degradation by soil erosion processes. This study showed that plant roots can be very effective in controlling soil erosion rates due to concentrated flow. A combination of a well-established vegetation cover together with a dense root system in the topsoil is therefore most effective and recommended to protect the soil against soil erosion processes by water. The erosion-reducing potential of plant roots can be explained by their indirect negative effect on soil erodibility through affecting various soil properties (e.g. aggregate stability, cohesion, organic matter content, infiltration rate and moisture content). However, both the environment and management practices have to be taken into account as they influence the effectiveness of plant roots in reducing soil erosion rates. Analysis of a global dataset based on published data showed that the decrease in SDR as a function of RD or RLD could be best described by a Hill curve model. Root architecture and soil texture were further considered in an attempt to improve the models. This resulted in better predictive models for RLD (for fibrous roots in non-sandy soils), while no improvement could be observed for RD. Consequently, it remains difficult to predict the erosion-reducing effects of plant roots on concentrated flow erosion rates as a large part of the variance still remains unexplained. Results of the Monte Carlo analyses (Fig. 5) present confidence intervals on estimated SDR values for the proposed models that should be used as an estimation of the uncertainty range. As such, the established relationships between root (length) density and the soil detachment ratio allow for meaningful estimations of the mechanical effects of plant roots on concentrated flow erosion rates. The advantage of this approach is that the results of this study can be extrapolated to different environments to examine the likely root effects on erosion rates, as we implicitly take into account the variability in root and soil characteristics.
[\textbackslash n] As tap root systems are less effective in controlling soil erosion compared to fibrous roots, we furthermore prefer the use of RLD as the root variable, as it indirectly takes root architecture into account. The influence of soil texture on erosion-reducing potential could not be demonstrated due to a lack of sufficient data on the erosion-reducing potential of plant roots in different soil textures. More empirical studies are needed to examine the role of soil texture on the erosion-reducing potential. Moreover, a more accurate global database is needed to unravel the influence of additional soil, root and environmental variables on the erosion-reducing potential of plant roots and to improve the predictive quality of the models.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-13794980,~to-add-doi-URL,boreal-forests,comparison,eucalyptus-citriodora,forest-resources,macchia,modelling,pinus-tabulaeformis,review,robinia-pseudoacacia,sclerophyllous,soil-erosion,soil-resources,stabilization,uncertainty,vegetation}
}
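The review's core quantitative tools are a Hill-type decay of the soil detachment ratio (SDR) with root (length) density, a model efficiency score, and Monte Carlo simulation to express the remaining uncertainty. A minimal sketch with synthetic data, assuming the common Hill form SDR = 1 / (1 + (RLD/k)^n) and treating model efficiency as the Nash-Sutcliffe coefficient; the paper's actual dataset, parameterisation and fitted values are not reproduced here:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill_decay(rld, k, n):
    """Assumed Hill-type decrease of the soil detachment ratio with root length density."""
    return 1.0 / (1.0 + (rld / k) ** n)

# Synthetic observations standing in for the compiled RLD-SDR dataset (illustrative only).
rng = np.random.default_rng(1)
rld_obs = rng.uniform(0.1, 50.0, size=200)                      # root length density, arbitrary units
sdr_obs = np.clip(hill_decay(rld_obs, k=5.0, n=1.2)
                  + rng.normal(0.0, 0.15, size=200), 0.0, 1.0)  # noisy soil detachment ratio

# Fit the Hill curve and compute a Nash-Sutcliffe-style model efficiency.
popt, pcov = curve_fit(hill_decay, rld_obs, sdr_obs, p0=[5.0, 1.0])
sdr_fit = hill_decay(rld_obs, *popt)
efficiency = 1.0 - np.sum((sdr_obs - sdr_fit) ** 2) / np.sum((sdr_obs - sdr_obs.mean()) ** 2)
print("fitted k, n:", popt, "model efficiency:", round(efficiency, 2))

# Monte Carlo propagation of parameter uncertainty to a 95% band around predicted SDR.
draws = rng.multivariate_normal(popt, pcov, size=2000)
rld_grid = np.linspace(0.1, 50.0, 100)
pred = np.array([hill_decay(rld_grid, k, n) for k, n in draws])
lower, upper = np.percentile(pred, [2.5, 97.5], axis=0)
print("95% band at RLD ~ 10:", lower[20], "to", upper[20])
```

In the review itself the fit is applied to the compiled field and laboratory observations, and the Monte Carlo bands are what allow the relationships to be extrapolated to other environments with their uncertainty made explicit.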
Consistent Land- and Atmosphere-Based U.S. Carbon Sink Estimates. Pacala, S. W., Hurtt, G. C., Baker, D., Peylin, P., Houghton, R. A., Birdsey, R. A., Heath, L., Sundquist, E. T., Stallard, R. F., Ciais, P., Moorcroft, P., Caspersen, J. P., Shevliakova, E., Moore, B., Kohlmaier, G., Holland, E., Gloor, M., Harmon, M. E., Fan, S. M., Sarmiento, J. L., Goodale, C. L., Schimel, D., & Field, C. B. 292(5525):2316–2320. For the period 1980-89, we estimate a carbon sink in the coterminous United States between 0.30 and 0.58 petagrams of carbon per year (1 petagram of carbon = 10^15 grams of carbon). The net carbon flux from the atmosphere to the land was higher, 0.37 to 0.71 petagrams of carbon per year, because a net flux of 0.07 to 0.13 petagrams of carbon per year was exported by rivers and commerce and returned to the atmosphere elsewhere. These land-based estimates are larger than those from previous studies (0.08 to 0.35 petagrams of carbon per year) because of the inclusion of additional processes and revised estimates of some component fluxes. Although component estimates are uncertain, about one-half of the total is outside the forest sector. We also estimated the sink using atmospheric models and the atmospheric concentration of carbon dioxide (the tracer-transport inversion method). The range of results from the atmosphere-based inversions contains the land-based estimates. Atmosphere- and land-based estimates are thus consistent, within the large ranges of uncertainty for both methods. Atmosphere-based results for 1980-89 are similar to those for 1985-89 and 1990-94, indicating a relatively stable U.S. sink throughout the period.
@article{pacalaConsistentLandAtmospherebased2001,
title = {Consistent Land- and Atmosphere-Based {{U}}.{{S}}. Carbon Sink Estimates},
author = {Pacala, S. W. and Hurtt, G. C. and Baker, D. and Peylin, P. and Houghton, R. A. and Birdsey, R. A. and Heath, L. and Sundquist, E. T. and Stallard, R. F. and Ciais, P. and Moorcroft, P. and Caspersen, J. P. and Shevliakova, E. and Moore, B. and Kohlmaier, G. and Holland, E. and Gloor, M. and Harmon, M. E. and Fan, S. M. and Sarmiento, J. L. and Goodale, C. L. and Schimel, D. and Field, C. B.},
date = {2001},
journaltitle = {Science},
volume = {292},
pages = {2316--2320},
issn = {1095-9203},
doi = {10.1126/science.1057320},
url = {https://doi.org/10.1126/science.1057320},
abstract = {For the period 1980-89, we estimate a carbon sink in the coterminous United States between 0.30 and 0.58 petagrams of carbon per year (1 petagram of carbon = {$10^{15}$} grams of carbon). The net carbon flux from the atmosphere to the land was higher, 0.37 to 0.71 petagrams of carbon per year, because a net flux of 0.07 to 0.13 petagrams of carbon per year was exported by rivers and commerce and returned to the atmosphere elsewhere. These land-based estimates are larger than those from previous studies (0.08 to 0.35 petagrams of carbon per year) because of the inclusion of additional processes and revised estimates of some component fluxes. Although component estimates are uncertain, about one-half of the total is outside the forest sector. We also estimated the sink using atmospheric models and the atmospheric concentration of carbon dioxide (the tracer-transport inversion method). The range of results from the atmosphere-based inversions contains the land-based estimates. Atmosphere- and land-based estimates are thus consistent, within the large ranges of uncertainty for both methods. Atmosphere-based results for 1980-89 are similar to those for 1985-89 and 1990-94, indicating a relatively stable U.S. sink throughout the period.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-14007180,carbon-cycle,forest-resources,uncertainty,united-states},
number = {5525}
}
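The three ranges quoted in the abstract are tied together by a simple carbon budget: the in-country sink equals the net atmosphere-to-land flux minus the carbon exported by rivers and commerce and returned to the atmosphere elsewhere. A quick check of that arithmetic (endpoint-wise subtraction of the quoted ranges):

```python
# Ranges quoted in the abstract (Pg C per year), coterminous United States, 1980-89.
net_atmosphere_to_land = (0.37, 0.71)   # net flux from atmosphere to land
exported_elsewhere = (0.07, 0.13)       # carried off by rivers and commerce, respired elsewhere

# Endpoint-wise subtraction reproduces the reported in-country sink range of 0.30-0.58 Pg C/yr.
sink_low = net_atmosphere_to_land[0] - exported_elsewhere[0]
sink_high = net_atmosphere_to_land[1] - exported_elsewhere[1]
print(f"implied U.S. carbon sink: {sink_low:.2f} to {sink_high:.2f} Pg C per year")
```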
Some Pitfalls of an Overemphasis on Science in Environmental Risk Management Decisions. Gregory, R., Failing, L., Ohlson, D., & Mcdaniels, T. L. 9(7):717–735. This paper addresses the question whether calls for "more" and "better" science will have the intended effect of improving the quality of decisions about environmental risks. There are reasons to be skeptical: key judgment tasks that fundamentally shape many aspects of decisions about environmental risk management lie outside the domain of science. These tasks include making value judgments explicit, integrating facts and values to create innovative alternatives, and constructively addressing conflicts about uncertainty. To bring new specificity to an old debate, we highlight six pitfalls in environmental risk decisions that can occur as the result of an overemphasis on science as the basis for management choices.
@article{gregoryPitfallsOveremphasisScience2006,
title = {Some Pitfalls of an Overemphasis on Science in Environmental Risk Management Decisions},
author = {Gregory, Robin and Failing, Lee and Ohlson, Dan and Mcdaniels, Timothy L.},
date = {2006-10},
journaltitle = {Journal of Risk Research},
volume = {9},
pages = {717--735},
doi = {10.1080/13669870600799895},
url = {https://doi.org/10.1080/13669870600799895},
abstract = {This paper addresses the question whether calls for "more" and "better" science will have the intended effect of improving the quality of decisions about environmental risks. There are reasons to be skeptical: key judgment tasks that fundamentally shape many aspects of decisions about environmental risk management lie outside the domain of science. These tasks include making value judgments explicit, integrating facts and values to create innovative alternatives, and constructively addressing conflicts about uncertainty. To bring new specificity to an old debate, we highlight six pitfalls in environmental risk decisions that can occur as the result of an overemphasis on science as the basis for management choices.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-11695072,communicating-uncertainty,risk-assessment,science-based-decision-making,science-ethics,science-policy-interface,scientific-communication,technocracy,uncertainty},
number = {7}
}
Improving Generalized Regression Analysis for the Spatial Prediction of Forest Communities. Maggini, R., Lehmann, A., Zimmermann, N. E., & Guisan, A. 33(10):1729–1749. Aim: This study used data from temperate forest communities to assess: (1) five different stepwise selection methods with generalized additive models, (2) the effect of weighting absences to ensure a prevalence of 0.5, (3) the effect of limiting absences beyond the environmental envelope defined by presences, (4) four different methods for incorporating spatial autocorrelation, and (5) the effect of integrating an interaction factor defined by a regression tree on the residuals of an initial environmental model. Location: State of Vaud, western Switzerland. Methods: Generalized additive models (GAMs) were fitted using the grasp package (generalized regression analysis and spatial predictions, http://www.cscf.ch/grasp). Results: Model selection based on cross-validation appeared to be the best compromise between model stability and performance (parsimony) among the five methods tested. Weighting absences returned models that perform better than models fitted with the original sample prevalence. This appeared to be mainly due to the impact of very low prevalence values on evaluation statistics. Removing zeroes beyond the range of presences on main environmental gradients changed the set of selected predictors, and potentially their response curve shape. Moreover, removing zeroes slightly improved model performance and stability when compared with the baseline model on the same data set. Incorporating a spatial trend predictor improved model performance and stability significantly. Even better models were obtained when including local spatial autocorrelation. A novel approach to include interactions proved to be an efficient way to account for interactions between all predictors at once. Main conclusions: Models and spatial predictions of 18 forest communities were significantly improved by using either: (1) cross-validation as a model selection method, (2) weighted absences, (3) limited absences, (4) predictors accounting for spatial autocorrelation, or (5) a factor variable accounting for interactions between all predictors. The final choice of model strategy should depend on the nature of the available data and the specific study aims. Statistical evaluation is useful in searching for the best modelling practice. However, one should not neglect to consider the shapes and interpretability of response curves, as well as the resulting spatial predictions in the final assessment.
@article{magginiImprovingGeneralizedRegression2006,
title = {Improving Generalized Regression Analysis for the Spatial Prediction of Forest Communities},
author = {Maggini, Ramona and Lehmann, Anthony and Zimmermann, Niklaus E. and Guisan, Antoine},
date = {2006-10},
journaltitle = {Journal of Biogeography},
volume = {33},
pages = {1729--1749},
issn = {0305-0270},
doi = {10.1111/j.1365-2699.2006.01465.x},
url = {http://mfkp.org/INRMM/article/835166},
abstract = {Aim\hspace{0.6em} This study used data from temperate forest communities to assess: (1) five different stepwise selection methods with generalized additive models, (2) the effect of weighting absences to ensure a prevalence of 0.5, (3) the effect of limiting absences beyond the environmental envelope defined by presences, (4) four different methods for incorporating spatial autocorrelation, and (5) the effect of integrating an interaction factor defined by a regression tree on the residuals of an initial environmental model. Location\hspace{0.6em} State of Vaud, western Switzerland. Methods\hspace{0.6em} Generalized additive models (GAMs) were fitted using the grasp package (generalized regression analysis and spatial predictions, http://www.cscf.ch/grasp). Results\hspace{0.6em} Model selection based on cross-validation appeared to be the best compromise between model stability and performance (parsimony) among the five methods tested. Weighting absences returned models that perform better than models fitted with the original sample prevalence. This appeared to be mainly due to the impact of very low prevalence values on evaluation statistics. Removing zeroes beyond the range of presences on main environmental gradients changed the set of selected predictors, and potentially their response curve shape. Moreover, removing zeroes slightly improved model performance and stability when compared with the baseline model on the same data set. Incorporating a spatial trend predictor improved model performance and stability significantly. Even better models were obtained when including local spatial autocorrelation. A novel approach to include interactions proved to be an efficient way to account for interactions between all predictors at once. Main conclusions\hspace{0.6em} Models and spatial predictions of 18 forest communities were significantly improved by using either: (1) cross-validation as a model selection method, (2) weighted absences, (3) limited absences, (4) predictors accounting for spatial autocorrelation, or (5) a factor variable accounting for interactions between all predictors. The final choice of model strategy should depend on the nature of the available data and the specific study aims. Statistical evaluation is useful in searching for the best modelling practice. However, one should not neglect to consider the shapes and interpretability of response curves, as well as the resulting spatial predictions in the final assessment.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-835166,~to-add-doi-URL,bias-correction,bioclimatic-predictors,correlation-analysis,forest-resources,habitat-suitability,statistics,uncertainty,weighting},
number = {10}
}
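One of the adjustments tested is weighting absences so that the weighted prevalence of the presence/absence data equals 0.5. A minimal sketch of that reweighting, using a plain logistic regression as a stand-in for the GAMs fitted with the GRASP package (synthetic data; the variable names and the scikit-learn model are illustrative assumptions, not the study's setup):

```python
import numpy as np
from scipy.special import expit
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Synthetic presence/absence data with low prevalence and two environmental predictors.
n = 1000
X = rng.normal(size=(n, 2))                      # e.g. standardised temperature, precipitation
presence_prob = expit(-2.2 + 1.0 * X[:, 0])      # true prevalence roughly 0.1-0.15
y = (rng.random(n) < presence_prob).astype(int)

# Weight each absence by n_presences / n_absences so that the summed weight of absences
# equals that of presences, i.e. an effective (weighted) prevalence of 0.5.
n_pres, n_abs = y.sum(), (y == 0).sum()
weights = np.where(y == 1, 1.0, n_pres / n_abs)
print("weighted prevalence:", weights[y == 1].sum() / weights.sum())   # -> 0.5

# Stand-in model; the study fits generalized additive models, not a linear logit.
model = LogisticRegression().fit(X, y, sample_weight=weights)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
```

The same weight vector can be passed to any fitting routine that accepts observation weights; the abstract attributes the resulting improvement mainly to evaluation statistics being less distorted by very low sample prevalence.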
Assessing Crown Fire Potential in Coniferous Forests of Western North America: A Critique of Current Approaches and Recent Simulation Studies. Cruz, M. G. & Alexander, M. E. 19(4):377+. To control and use wildland fires safely and effectively depends on credible assessments of fire potential, including the propensity for crowning in conifer forests. Simulation studies that use certain fire modelling systems (i.e. NEXUS, FlamMap, FARSITE, FFE-FVS (Fire and Fuels Extension to the Forest Vegetation Simulator), Fuel Management Analyst (FMAPlus®), BehavePlus) based on separate implementations or direct integration of Rothermel's surface and crown rate of fire spread models with Van Wagner's crown fire transition and propagation models are shown to have a significant underprediction bias when used in assessing potential crown fire behaviour in conifer forests of western North America. The principal sources of this underprediction bias are shown to include: [::(i)] incompatible model linkages; [::(ii)] use of surface and crown fire rate of spread models that have an inherent underprediction bias; and [::(iii)] reduction in crown fire rate of spread based on the use of unsubstantiated crown fraction burned functions. [::The use of uncalibrated custom fuel models] to represent surface fuelbeds is a fourth potential source of bias. [\n] These sources are described and documented in detail based on comparisons with experimental fire and wildfire observations and on separate analyses of model components. The manner in which the two primary canopy fuel inputs influencing crown fire initiation (i.e. foliar moisture content and canopy base height) are handled in these simulation studies and the meaning of Scott and Reinhardt's two crown fire hazard indices are also critically examined. [Excerpt: Summary and concluding remarks] The ready availability of a multitude of fire modelling systems in the US in recent years has led to their widespread use in numerous simulation studies aimed at assessing various fire behaviour characteristics associated with specific fuel complex structures, including the propensity for crown fire initiation and spread (McHugh 2006). The results of these simulations, often aimed at evaluating fuel treatment effectiveness, are in turn utilised in a whole host of applications (e.g. Scott 2003; Fiedler et al. 2004; Skog et al. 2006; Johnson et al. 2007; Finkral and Evans 2008; Huggett et al. 2008; Johnson 2008; Reinhardt et al. 2010) and thus have significant implications for public and wildland firefighter safety, community fire protection, fire management policy-making, and forest management practices. As Cheney (1981) has noted, 'The reality of fire behaviour predictions is that overestimates can be easily readjusted without serious consequences; underestimates of behaviour can be disastrous both to the operations of the fire controller and the credibility of the person making the predictions'. [\n] A critical review of several of these simulation studies, as documented here, has found that the results are often unrealistic for a variety of reasons. It's recognised that the authors of these studies commonly point out the limitations of the models and modelling systems being used through a customary disclaimer concerning the unknowns regarding crown fire behaviour (e.g. Stephens et al. 2009).
Nevertheless, the fact that the fuel treatment evaluation studies referenced here are based on modelling systems that utilised model linkages for gauging potential crown fire behaviour that have not previously undergone any form of performance evaluation against independent datasets or any empirical observations should be of concern. There appears, however, to be an aversion within an element of the fire research community to do so (e.g. Scott and Reinhardt 2001; Scott 2006; Stephens et al. 2009). Nevertheless, such testing is now generally regarded as a basic tenet of modern-day model development and evaluation (Jakeman et al. 2006). [\n] Fire modelling systems like NEXUS (Scott and Reinhardt 2001), FFE-FVS (Reinhardt and Crookston 2003), FARSITE (Finney 2004), FMAPlus (Carlton 2005), FlamMap (Finney 2006), and BehavePlus (Andrews et al. 2008) that are based on separate implementations or linkages between Rothermel's (1972, 1991) rate of fire spread models and Van Wagner's (1977, 1993) crown fire transition and propagation models have been shown to have a marked underprediction bias when used to assess potential crown fire behaviour. What has been allowed to evolve is a family of modelling systems composed of independently developed, linked models that were never intended to work together, are sometimes based on very limited data, and may propagate errors beyond acceptable limits. [\n] We have documented here the sources of the bias based on empirical evidence in the form of published experimental fire and wildfire datasets. By analysing model linkages and components, we have described the primary sources of such bias, namely: (1) incompatible model linkages; (2) use of surface and crown fire rate of spread models that have an inherent underprediction bias; and (3) reduction in crown fire rate of spread based on use of unsubstantiated CFB functions. The use of uncalibrated, custom fuel models to represent surface fuelbeds is considered another potential source of bias. [\n] Our analysis has also shown that the crown fire initiation underprediction bias inherent in all of these fire modelling systems could possibly be rectified by modifying the method used to calculate the surface fireline intensity for the purposes of assessing crown fire initiation potential, namely using Nelson's (2003) model to estimate tr in place of Anderson's model (1969). Other modelling systems exist for predicting the likelihood of crown fire initiation and other aspects of crown fire behaviour (Alexander et al. 2006; Cruz et al. 2006b, 2008). Mitsopoulos and Dimitrakopoulos (2007) have, for example, made extensive use of this suite of models in their assessment of crown fire potential in Aleppo pine (Pinus halepensis) forests in Greece. These systems are based on models that have undergone performance evaluations against independent datasets and been shown to be reasonably reliable (Cruz et al. 2003b, 2004, 2006b; Cronan and Jandt 2008). Resolving the underprediction bias associated with predicting active crown fire rate of spread inherent in the Rothermel (1991) model would require substantial changes, including a reassessment of the use of a CFB function, if not complete replacement with a more robust empirically developed model (Cruz et al. 2005) that has been extensively tested (Alexander and Cruz 2006) or a physically based one that has undergone limited testing (Butler et al. 2004). 
[\n] Alexander (2007) has emphasised that assessments of wildland fire potential involving simulation modelling must be complemented with fire behaviour case study knowledge and by experienced judgment. This review has revealed an overwhelming need for the research users of fire modelling systems to be grounded in the theory and proper application of such tools, including a solid understanding of the assumptions, limitations and accuracy of the underlying models as well as practical knowledge of the subject phenomena (Brown and Davis 1973; Albini 1976; Alexander 2009a, 2009b).
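All of the systems under critique couple a surface fire intensity estimate to Van Wagner's crown fire transition and propagation criteria. A minimal sketch of those two standard thresholds, using the constants as commonly published for Van Wagner (1977); the exact form and constants are stated here as an assumption, since the paper's equations are not reproduced in this excerpt:

```python
def critical_surface_intensity(canopy_base_height_m: float, foliar_moisture_pct: float) -> float:
    """Critical surface fire intensity for crown fire initiation (kW/m).

    Assumed Van Wagner (1977) form: I0 = (0.010 * CBH * (460 + 25.9 * FMC)) ** 1.5,
    with CBH the canopy base height (m) and FMC the foliar moisture content (%).
    """
    return (0.010 * canopy_base_height_m * (460.0 + 25.9 * foliar_moisture_pct)) ** 1.5


def critical_active_spread_rate(canopy_bulk_density_kg_m3: float) -> float:
    """Minimum spread rate for sustained active crowning (m/min).

    Assumed Van Wagner (1977) criterion RAC = 3.0 / CBD, i.e. a critical mass flow
    rate of roughly 3 kg m-2 min-1 through the canopy layer.
    """
    return 3.0 / canopy_bulk_density_kg_m3


# Hypothetical stand: 4 m canopy base height, 110% foliar moisture, 0.12 kg/m3 canopy bulk density.
i0 = critical_surface_intensity(4.0, 110.0)
rac = critical_active_spread_rate(0.12)
print(f"surface intensity needed to initiate crowning: {i0:.0f} kW/m")
print(f"spread rate needed to sustain active crowning: {rac:.1f} m/min")
```

In the linked systems, a predicted surface fireline intensity above the first threshold flags crown fire initiation, and a predicted crown fire spread rate above the second flags active rather than passive crowning; the critique concerns the surface and crown spread inputs fed into these criteria, not the criteria alone.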
@article{cruzAssessingCrownFire2010,
title = {Assessing Crown Fire Potential in Coniferous Forests of Western {{North America}}: A Critique of Current Approaches and Recent Simulation Studies},
author = {Cruz, Miguel G. and Alexander, Martin E.},
date = {2010},
journaltitle = {International Journal of Wildland Fire},
volume = {19},
pages = {377+},
issn = {1049-8001},
doi = {10.1071/wf08132},
url = {https://doi.org/10.1071/wf08132},
abstract = {To control and use wildland fires safely and effectively depends on credible assessments of fire potential, including the propensity for crowning in conifer forests. Simulation studies that use certain fire modelling systems (i.e. NEXUS, FlamMap, FARSITE, FFE-FVS (Fire and Fuels Extension to the Forest Vegetation Simulator), Fuel Management Analyst (FMAPlus®), BehavePlus) based on separate implementations or direct integration of Rothermel's surface and crown rate of fire spread models with Van Wagner's crown fire transition and propagation models are shown to have a significant underprediction bias when used in assessing potential crown fire behaviour in conifer forests of western North America. The principal sources of this underprediction bias are shown to include:
[::(i)] incompatible model linkages;
[::(ii)] use of surface and crown fire rate of spread models that have an inherent underprediction bias; and
[::(iii)] reduction in crown fire rate of spread based on the use of unsubstantiated crown fraction burned functions.
[::The use of uncalibrated custom fuel models] to represent surface fuelbeds is a fourth potential source of bias.
[\textbackslash n] These sources are described and documented in detail based on comparisons with experimental fire and wildfire observations and on separate analyses of model components. The manner in which the two primary canopy fuel inputs influencing crown fire initiation (i.e. foliar moisture content and canopy base height) are handled in these simulation studies and the meaning of Scott and Reinhardt's two crown fire hazard indices are also critically examined.
[Excerpt: Summary and concluding remarks]
The ready availability of a multitude of fire modelling systems in the US in recent years has led to their widespread use in numerous simulation studies aimed at assessing various fire behaviour characteristics associated with specific fuel complex structures, including the propensity for crown fire initiation and spread (McHugh 2006). The results of these simulations, often aimed at evaluating fuel treatment effectiveness, are in turn utilised in a whole host of applications (e.g. Scott 2003; Fiedler et al. 2004; Skog et al. 2006; Johnson et al. 2007; Finkral and Evans 2008; Huggett et al. 2008; Johnson 2008; Reinhardt et al. 2010) and thus have significant implications for public and wildland firefighter safety, community fire protection, fire management policy-making, and forest management practices. As Cheney (1981) has noted, 'The reality of fire behaviour predictions is that overestimates can be easily readjusted without serious consequences; underestimates of behaviour can be disastrous both to the operations of the fire controller and the credibility of the person making the predictions'.
[\textbackslash n] A critical review of several of these simulation studies, as documented here, has found that the results are often unrealistic for a variety of reasons. It's recognised that the authors of these studies commonly point out the limitations of the models and modelling systems being used through a customary disclaimer concerning the unknowns regarding crown fire behaviour (e.g. Stephens et al. 2009). Nevertheless, the fact that the fuel treatment evaluation studies referenced here are based on modelling systems that utilised model linkages for gauging potential crown fire behaviour that have not previously undergone any form of performance evaluation against independent datasets or any empirical observations should be of concern. There appears, however, to be an aversion within an element of the fire research community to do so (e.g. Scott and Reinhardt 2001; Scott 2006; Stephens et al. 2009). Nevertheless, such testing is now generally regarded as a basic tenet of modern-day model development and evaluation (Jakeman et al. 2006).
[\textbackslash n] Fire modelling systems like NEXUS (Scott and Reinhardt 2001), FFE-FVS (Reinhardt and Crookston 2003), FARSITE (Finney 2004), FMAPlus (Carlton 2005), FlamMap (Finney 2006), and BehavePlus (Andrews et al. 2008) that are based on separate implementations or linkages between Rothermel's (1972, 1991) rate of fire spread models and Van Wagner's (1977, 1993) crown fire transition and propagation models have been shown to have a marked underprediction bias when used to assess potential crown fire behaviour. What has been allowed to evolve is a family of modelling systems composed of independently developed, linked models that were never intended to work together, are sometimes based on very limited data, and may propagate errors beyond acceptable limits.
[\textbackslash n] We have documented here the sources of the bias based on empirical evidence in the form of published experimental fire and wildfire datasets. By analysing model linkages and components, we have described the primary sources of such bias, namely: (1) incompatible model linkages; (2) use of surface and crown fire rate of spread models that have an inherent underprediction bias; and (3) reduction in crown fire rate of spread based on use of unsubstantiated CFB functions. The use of uncalibrated, custom fuel models to represent surface fuelbeds is considered another potential source of bias.
[\textbackslash n] Our analysis has also shown that the crown fire initiation underprediction bias inherent in all of these fire modelling systems could possibly be rectified by modifying the method used to calculate the surface fireline intensity for the purposes of assessing crown fire initiation potential, namely using Nelson's (2003) model to estimate tr in place of Anderson's model (1969). Other modelling systems exist for predicting the likelihood of crown fire initiation and other aspects of crown fire behaviour (Alexander et al. 2006; Cruz et al. 2006b, 2008). Mitsopoulos and Dimitrakopoulos (2007) have, for example, made extensive use of this suite of models in their assessment of crown fire potential in Aleppo pine (Pinus halepensis) forests in Greece. These systems are based on models that have undergone performance evaluations against independent datasets and been shown to be reasonably reliable (Cruz et al. 2003b, 2004, 2006b; Cronan and Jandt 2008). Resolving the underprediction bias associated with predicting active crown fire rate of spread inherent in the Rothermel (1991) model would require substantial changes, including a reassessment of the use of a CFB function, if not complete replacement with a more robust empirically developed model (Cruz et al. 2005) that has been extensively tested (Alexander and Cruz 2006) or a physically based one that has undergone limited testing (Butler et al. 2004).
[\textbackslash n] Alexander (2007) has emphasised that assessments of wildland fire potential involving simulation modelling must be complemented with fire behaviour case study knowledge and by experienced judgment. This review has revealed an overwhelming need for the research users of fire modelling systems to be grounded in the theory and proper application of such tools, including a solid understanding of the assumptions, limitations and accuracy of the underlying models as well as practical knowledge of the subject phenomena (Brown and Davis 1973; Albini 1976; Alexander 2009a, 2009b).},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-13706015,~to-add-doi-URL,canada,comparison,conifers,model-comparison,modelling-uncertainty,prediction-bias,rothermel,simulation,software-uncertainty,uncertainty,united-states,wildfires},
number = {4}
}
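[Illustrative code sketch (not from the paper)] To make the model linkage being critiqued concrete, the following minimal Python sketch implements the standard surface-to-crown transition test that linked systems such as NEXUS apply: Byram's (1959) surface fireline intensity is compared with Van Wagner's (1977) critical intensity for crown fire initiation. All input values are invented for illustration; the review's point is that how the surface intensity term is estimated (e.g. the choice of residence-time model noted above) strongly conditions the outcome of exactly this comparison.

def byram_fireline_intensity(heat_yield_kj_kg, fuel_consumed_kg_m2, ros_m_s):
    """Byram's surface fireline intensity I = H * w * r, in kW/m."""
    return heat_yield_kj_kg * fuel_consumed_kg_m2 * ros_m_s

def van_wagner_critical_intensity(canopy_base_height_m, foliar_moisture_pct):
    """Van Wagner's (1977) critical intensity for crown fire initiation, in kW/m:
    I0 = (0.010 * CBH * (460 + 25.9 * FMC)) ** 1.5"""
    return (0.010 * canopy_base_height_m * (460.0 + 25.9 * foliar_moisture_pct)) ** 1.5

# Invented inputs: 18 000 kJ/kg heat yield, 0.8 kg/m2 fuel consumed,
# 0.05 m/s surface spread rate, 3 m canopy base height, 100% foliar moisture.
i_surface = byram_fireline_intensity(18000.0, 0.8, 0.05)
i_critical = van_wagner_critical_intensity(3.0, 100.0)
print(f"surface fireline intensity: {i_surface:.0f} kW/m")
print(f"critical intensity for crowning: {i_critical:.0f} kW/m")
print("crown fire initiation predicted" if i_surface >= i_critical else "surface fire only")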
HISDAC-US, Historical Settlement Data Compilation for the Conterminous United States over 200 Years. Leyk, S. & Uhl, J. H. 5:180175+. Paper doi abstract bibtex Human settlement plays a key role in understanding social processes such as urbanization and interactions between human and environmental systems but not much is known about the landscape evolution before the era of operational remote sensing technology. In this study, housing and property databases are used to create new gridded settlement layers describing human settlement processes at fine spatial and temporal resolution in the conterminous United States between 1810 and 2015. The main products are a raster composite layer representing the year of first settlement, and a raster time series of built-up intensity representing the sum of building areas in a pixel. Several accompanying uncertainty surfaces are provided to ensure the user is informed about inherent spatial, temporal and thematic uncertainty in the data. A validation study using high quality reference data confirms high levels of accuracy of the resulting data products. These settlement data will be of great interest in disciplines in which the long-term evolution of human settlement represents crucial information to explore novel research questions.
@article{leykHISDACUSHistoricalSettlement2018,
title = {{{HISDAC}}-{{US}}, Historical Settlement Data Compilation for the Conterminous {{United States}} over 200 Years},
author = {Leyk, Stefan and Uhl, Johannes H.},
date = {2018-09},
journaltitle = {Scientific Data},
volume = {5},
pages = {180175+},
issn = {2052-4463},
doi = {10.1038/sdata.2018.175},
url = {https://doi.org/10.1038/sdata.2018.175},
abstract = {Human settlement plays a key role in understanding social processes such as urbanization and interactions between human and environmental systems but not much is known about the landscape evolution before the era of operational remote sensing technology. In this study, housing and property databases are used to create new gridded settlement layers describing human settlement processes at fine spatial and temporal resolution in the conterminous United States between 1810 and 2015. The main products are a raster composite layer representing the year of first settlement, and a raster time series of built-up intensity representing the sum of building areas in a pixel. Several accompanying uncertainty surfaces are provided to ensure the user is informed about inherent spatial, temporal and thematic uncertainty in the data. A validation study using high quality reference data confirms high levels of accuracy of the resulting data products. These settlement data will be of great interest in disciplines in which the long-term evolution of human settlement represents crucial information to explore novel research questions.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-14633453,~to-add-doi-URL,data-integration,historical-perspective,human-settlement,integrated-modelling,integration-techniques,open-data,time-series,uncertainty,united-states}
}
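[Illustrative code sketch (not from the paper)] A minimal example of the kind of summary the HISDAC-US layers support: counting newly settled area per decade from the first-settlement-year composite. The array, the nodata convention (0 = never settled) and the 250 m pixel size below are assumptions made for illustration only; in practice the composite would be read from the published raster (e.g. with rasterio) rather than simulated.

import numpy as np

# Hypothetical stand-in for the first-settlement-year composite (2-D integer array).
rng = np.random.default_rng(0)
first_settlement_year = rng.choice(
    [0, 1810, 1850, 1900, 1950, 2000],
    size=(400, 400),
    p=[0.5, 0.05, 0.1, 0.15, 0.15, 0.05],
)

pixel_area_km2 = 0.25 * 0.25                 # assumed 250 m grid cells
decades = np.arange(1810, 2020, 10)          # decade start years, 1810..2010
settled_years = first_settlement_year[first_settlement_year > 0]
counts, _ = np.histogram(settled_years, bins=np.append(decades, 2020))
for start, n in zip(decades, counts):
    print(f"{start}-{start + 9}: {n * pixel_area_km2:8.2f} km2 newly settled")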
ENSO as an Integrating Concept in Earth Science. McPhaden, M. J., Zebiak, S. E., & Glantz, M. H. 314(5806):1740–1745. Paper doi abstract bibtex The El Niño-Southern Oscillation (ENSO) cycle of alternating warm El Niño and cold La Niña events is the dominant year-to-year climate signal on Earth. ENSO originates in the tropical Pacific through interactions between the ocean and the atmosphere, but its environmental and socioeconomic impacts are felt worldwide. Spurred on by the powerful 1997-1998 El Niño, efforts to understand the causes and consequences of ENSO have greatly expanded in the past few years. These efforts reveal the breadth of ENSO's influence on the Earth system and the potential to exploit its predictability for societal benefit. However, many intertwined issues regarding ENSO dynamics, impacts, forecasting, and applications remain unresolved. Research to address these issues will not only lead to progress across a broad range of scientific disciplines but also provide an opportunity to educate the public and policy makers about the importance of climate variability and change in the modern world.
@article{mcphadenENSOIntegratingConcept2006,
title = {{{ENSO}} as an Integrating Concept in {{Earth}} Science},
author = {McPhaden, Michael J. and Zebiak, Stephen E. and Glantz, Michael H.},
date = {2006},
journaltitle = {Science},
volume = {314},
pages = {1740--1745},
issn = {1095-9203},
doi = {10.1126/science.1132588},
url = {https://doi.org/10.1126/science.1132588},
abstract = {The El Niño-Southern Oscillation (ENSO) cycle of alternating warm El Niño and cold La Niña events is the dominant year-to-year climate signal on Earth. ENSO originates in the tropical Pacific through interactions between the ocean and the atmosphere, but its environmental and socioeconomic impacts are felt worldwide. Spurred on by the powerful 1997-1998 El Niño, efforts to understand the causes and consequences of ENSO have greatly expanded in the past few years. These efforts reveal the breadth of ENSO's influence on the Earth system and the potential to exploit its predictability for societal benefit. However, many intertwined issues regarding ENSO dynamics, impacts, forecasting, and applications remain unresolved. Research to address these issues will not only lead to progress across a broad range of scientific disciplines but also provide an opportunity to educate the public and policy makers about the importance of climate variability and change in the modern world.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-14007188,climate,education,el-nino,enso,uncertainty},
number = {5806}
}
Italian Scientists Vilified in Wake of Olive-Tree Deaths. Abbott, A. Paper doi abstract bibtex [Excerpt] [...] plant scientists at various institutes in Bari, the capital of the Puglia region, [southern Italy, ...] have been subject to a police investigation about whether they are responsible for the introduction of the bacterium, Xylella fastidiosa, into Puglia, or for allowing its subsequent spread. Police have called in several researchers involved in Xylella research for questioning and confiscated computers and documents from scientific institutes. “We'd just like to be left to do our work without this suspicion and this stress,” says Donato Boscia, head of the Bari unit of the CNR Institute for Sustainable Plant Protection (IPSP), whom police questioned in April. [...] Xylella is endemic in parts of the Americas, including Costa Rica, Brazil and California, but was not previously found in Europe. That changed in October 2013, when scientists at the IPSP and the University of Bari identified the bacterium as the cause of an unusual disease outbreak in olive trees. The outbreak immediately became subject to European Union (EU) regulations to limit its spread, and regional scientists began a systematic effort to understand the disease and contain it: the scientists went on to show that the bacterium was being carried by the spittlebug insect. [::Ornamental plants] From the start, farmers and environmentalists in Italy objected to containment measures, which involved uprooting trees and spraying the groves with pesticides. But trouble for the Puglian scientists began in April 2014, when individuals told police that they suspected that the epidemic was caused by bacteria that scientists had brought in from California for a European training course on Xylella at the Mediterranean Agronomic Institute of Bari (IAMB) in 2010. [\n] Scientists say that this suggestion is ludicrous because the Puglia strain is different from the strains used at the workshop; the widely accepted theory is that the infection was imported with ornamental plants from Costa Rica, where the endemic Xylella strain matches the Puglia strain. [\n] However, the complaints spawned a much broader investigation by public prosecutors, including what role scientists may have had in the epidemic. [...] The prosecutors declined Nature's request for comment. But in March, one of them [...] implied in an interview [...] that they are looking into theories that the bacterium may have been deliberately introduced into the area, or became entrenched because agricultural scientists failed to monitor the region properly, either deliberately or through neglect. [...The prosecutor expressed concern] about the possible corrupting influence of businesses, such as solar-energy companies, which might stand to gain from the clearing of olive groves. [...] [\n] Puglian scientists have had to contend with public criticism too. Several popular blogs devoted to the Xylella emergency have cast doubt on scientists' ways of working and their results – saying, for example, that a remedy exists but is being suppressed [... and] that Xylella had not been proved to be the source of the outbreak and that the deaths were instead due to a fungus that could be eliminated without destroying trees. An expert panel of the European Food Safety Authority debunked these suggestions in a report published in April. [...]
[\n] On 27 May, the regional government announced a €2-million (US\$2.2-million) fund for projects that might aid the diagnosis, epidemiology and monitoring of the bacterium. It said that a 'containment area' in the province of Lecce – where the bacterium is now endemic, making complete eradication impossible – will be used as an open-air Xylella laboratory. National and European research agencies have also promised money, says Boscia. “The outdoor laboratory would be perfect for all of us – and also allow critics to put their own theories to the test.”
@article{abbottItalianScientistsVilified2015,
title = {Italian Scientists Vilified in Wake of Olive-Tree Deaths},
author = {Abbott, Alison},
date = {2015-06},
journaltitle = {Nature},
issn = {1476-4687},
doi = {10.1038/nature.2015.17651},
url = {https://doi.org/10.1038/nature.2015.17651},
abstract = {[Excerpt] [...] plant scientists at various institutes in Bari, the capital of the Puglia region, [southern Italy, ...] have been subject to a police investigation about whether they are responsible for the introduction of the bacterium, Xylella fastidiosa, into Puglia, or for allowing its subsequent spread. Police have called in several researchers involved in Xylella research for questioning and confiscated computers and documents from scientific institutes. “We'd just like to be left to do our work without this suspicion and this stress,” says Donato Boscia, head of the Bari unit of the CNR Institute for Sustainable Plant Protection (IPSP), whom police questioned in April. [...]
Xylella is endemic in parts of the Americas, including Costa Rica, Brazil and California, but was not previously found in Europe. That changed in October 2013, when scientists at the IPSP and the University of Bari identified the bacterium as the cause of an unusual disease outbreak in olive trees. The outbreak immediately became subject to European Union (EU) regulations to limit its spread, and regional scientists began a systematic effort to understand the disease and contain it: the scientists went on to show that the bacterium was being carried by the spittlebug insect.
[::Ornamental plants]
From the start, farmers and environmentalists in Italy objected to containment measures, which involved uprooting trees and spraying the groves with pesticides. But trouble for the Puglian scientists began in April 2014, when individuals told police that they suspected that the epidemic was caused by bacteria that scientists had brought in from California for a European training course on Xylella at the Mediterranean Agronomic Institute of Bari (IAMB) in 2010.
[\textbackslash n] Scientists say that this suggestion is ludicrous because the Puglia strain is different from the strains used at the workshop; the widely accepted theory is that the infection was imported with ornamental plants from Costa Rica, where the endemic Xylella strain matches the Puglia strain.
[\textbackslash n] However, the complaints spawned a much broader investigation by public prosecutors, including what role scientists may have had in the epidemic. [...] The prosecutors declined Nature's request for comment. But in March, one of them [...] implied in an interview [...] that they are looking into theories that the bacterium may have been deliberately introduced into the area, or became entrenched because agricultural scientists failed to monitor the region properly, either deliberately or through neglect. [...The prosecutor expressed concern] about the possible corrupting influence of businesses, such as solar-energy companies, which might stand to gain from the clearing of olive groves. [...]
[\textbackslash n] Puglian scientists have had to contend with public criticism too. Several popular blogs devoted to the Xylella emergency have cast doubt on scientists' ways of working and their results -- saying, for example, that a remedy exists but is being suppressed [... and] that Xylella had not been proved to be the source of the outbreak and that the deaths were instead due to a fungus that could be eliminated without destroying trees. An expert panel of the European Food Safety Authority debunked these suggestions in a report published in April. [...]
[\textbackslash n] On 27 May, the regional government announced a €2-million (US\$2.2-million) fund for projects that might aid the diagnosis, epidemiology and monitoring of the bacterium. It said that a 'containment area' in the province of Lecce -- where the bacterium is now endemic, making complete eradication impossible -- will be used as an open-air Xylella laboratory. National and European research agencies have also promised money, says Boscia. “The outdoor laboratory would be perfect for all of us -- and also allow critics to put their own theories to the test.”},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-13636189,~to-add-doi-URL,agricultural-resources,complexity,disturbances,forest-resources,multi-stakeholder-decision-making,olea-europaea,plant-pests,science-based-decision-making,science-ethics,science-policy-interface,science-society-interface,scientific-communication,solar-energy,uncertainty,xylella-fastidiosa}
}
The Limits of Cost/Benefit Analysis When Disasters Loom. Rose-Ackerman, S. 7:56–66. Paper doi abstract bibtex [Abstract] Advances in estimating the costs and benefits of climate change policies are a welcome development, but a full-scale cost/benefit analysis that seeks to reduce complex value trade-offs to a single metric of net benefit maximization hides many important public policy issues, especially for disasters and catastrophes that are large, discontinuous, irreversible and uncertain. States should obtain public input on such policies. These policies involve value trade-offs that can be informed by technocratic estimates of costs, benefits and risk. However, such analyses cannot, in principle, be reduced to a single recommendation that 'maximizes net benefits'. Politicians must make value trade-offs informed both by technocrats and by public input. [Policy Implications] [::] Policies that seek to limit the risks of potential disasters and catastrophes - that are large, discontinuous, irreversible, and far in the future - should be informed by technocratic estimates of costs, benefits and risk. [::] However, such analyses cannot, in principle, be reduced to a single recommendation that 'maximizes net benefits'. The applied utilitarianism of cost/benefit cannot resolve such problems. [::] Policy makers should obtain public input as they make the necessary value trade-offs. [::] Once the basic policy choice has been made, the next step is a detailed allocation of costs. Those decisions involve an interplay of technical details with choices about the distribution of costs between general taxpayers, producers and users of a service (e.g. electric power), citizens affected directly by the policy in their homes and businesses and other competing values (e.g., preservation of natural areas). Such choices are not limited to policies that concern disasters; they are pervasive in all public policy choice, but they too cannot be reduced to the logic of cost/benefit analysis (CBA) which treats all benefits and costs equally. [::] CBA is a valuable tool in many contexts, but its limits are particularly obvious in the case of policies that seek to prevent disasters and catastrophes. [Excerpt: Conclusions] States should obtain public input on policies that seek to limit the risks of disasters and catastrophes - some far in the future, others of low probability. Such policies involve value trade-offs that can be informed by technocratic estimates of costs, benefits and risk. However, such analyses cannot, in principle, be reduced to a single recommendation that 'maximizes net benefits'. The applied utilitarianism of cost/benefit cannot resolve such problems; politicians must make value trade-off informed both by technocrats and by public input. Once the basic policy decision has been made, choices that involve the detailed allocation of costs are on a different plane. They involve the interplay of technical details and the way costs are distributed between general taxpayers, producers and users of a service (e.g., electric power), citizens affected directly by the policy in their homes and businesses, other competing values (e.g., preservation of natural areas). 
Advances in estimating the costs and benefits of climate change policies are a welcome development, but a full-scale cost/benefit CBA that seeks to reduce complex value trade-offs to a single metric of net benefit maximization hides many important public policy issues, especially for disasters and catastrophes that are large, discontinuous, irreversible and uncertain. [...]
@article{rose-ackermanLimitsCostBenefit2016,
title = {The Limits of Cost/Benefit Analysis When Disasters Loom},
author = {Rose-Ackerman, Susan},
date = {2016-05},
journaltitle = {Global Policy},
volume = {7},
pages = {56--66},
issn = {1758-5880},
doi = {10.1111/1758-5899.12279},
url = {https://doi.org/10.1111/1758-5899.12279},
abstract = {[Abstract]
Advances in estimating the costs and benefits of climate change policies are a welcome development, but a full-scale cost/benefit analysis that seeks to reduce complex value trade-offs to a single metric of net benefit maximization hides many important public policy issues, especially for disasters and catastrophes that are large, discontinuous, irreversible and uncertain. States should obtain public input on such policies. These policies involve value trade-offs that can be informed by technocratic estimates of costs, benefits and risk. However, such analyses cannot, in principle, be reduced to a single recommendation that 'maximizes net benefits'. Politicians must make value trade-offs informed both by technocrats and by public input.
[Policy Implications]
[::] Policies that seek to limit the risks of potential disasters and catastrophes - that are large, discontinuous, irreversible, and far in the future - should be informed by technocratic estimates of costs, benefits and risk.
[::] However, such analyses cannot, in principle, be reduced to a single recommendation that 'maximizes net benefits'. The applied utilitarianism of cost/benefit cannot resolve such problems.
[::] Policy makers should obtain public input as they make the necessary value trade-offs.
[::] Once the basic policy choice has been made, the next step is a detailed allocation of costs. Those decisions involve an interplay of technical details with choices about the distribution of costs between general taxpayers, producers and users of a service (e.g. electric power), citizens affected directly by the policy in their homes and businesses and other competing values (e.g., preservation of natural areas). Such choices are not limited to policies that concern disasters; they are pervasive in all public policy choice, but they too cannot be reduced to the logic of cost/benefit analysis (CBA) which treats all benefits and costs equally.
[::] CBA is a valuable tool in many contexts, but its limits are particularly obvious in the case of policies that seek to prevent disasters and catastrophes.
[Excerpt: Conclusions]
States should obtain public input on policies that seek to limit the risks of disasters and catastrophes - some far in the future, others of low probability. Such policies involve value trade-offs that can be informed by technocratic estimates of costs, benefits and risk. However, such analyses cannot, in principle, be reduced to a single recommendation that 'maximizes net benefits'. The applied utilitarianism of cost/benefit cannot resolve such problems; politicians must make value trade-offs informed both by technocrats and by public input. Once the basic policy decision has been made, choices that involve the detailed allocation of costs are on a different plane. They involve the interplay of technical details and the way costs are distributed between general taxpayers, producers and users of a service (e.g., electric power), citizens affected directly by the policy in their homes and businesses, and other competing values (e.g., preservation of natural areas). Advances in estimating the costs and benefits of climate change policies are a welcome development, but a full-scale cost/benefit CBA that seeks to reduce complex value trade-offs to a single metric of net benefit maximization hides many important public policy issues, especially for disasters and catastrophes that are large, discontinuous, irreversible and uncertain. [...]},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-14534852,cognitive-biases,communicating-uncertainty,controversial-monetarisation,cost-benefit-analysis,disasters,ethics,natural-disasters,review,science-ethics,science-policy-interface,science-society-interface,scientific-communication,technocracy,trade-offs,uncertainty,unknown,values-vs-scientific-evidence}
}
Soil Erosion Assessment - Mind the Gap. Kim, J., Ivanov, V. Y., & Fatichi, S. 43(24):2016GL071480+. Paper doi abstract bibtex Accurate assessment of erosion rates remains an elusive problem because soil loss is strongly nonunique with respect to the main drivers. In addressing the mechanistic causes of erosion responses, we discriminate between macroscale effects of external factors – long studied and referred to as “geomorphic external variability”, and microscale effects, introduced as “geomorphic internal variability.” The latter source of erosion variations represents the knowledge gap, an overlooked but vital element of geomorphic response, significantly impacting the low predictability skill of deterministic models at field-catchment scales. This is corroborated with experiments using a comprehensive physical model that dynamically updates the soil mass and particle composition. As complete knowledge of microscale conditions for arbitrary location and time is infeasible, we propose that new predictive frameworks of soil erosion should embed stochastic components in deterministic assessments of external and internal types of geomorphic variability.
@article{kimSoilErosionAssessment2016,
title = {Soil Erosion Assessment - {{Mind}} the Gap},
author = {Kim, Jongho and Ivanov, Valeriy Y. and Fatichi, Simone},
date = {2016-12},
journaltitle = {Geophys. Res. Lett.},
volume = {43},
pages = {2016GL071480+},
issn = {0094-8276},
doi = {10.1002/2016gl071480},
url = {http://mfkp.org/INRMM/article/14257841},
abstract = {Accurate assessment of erosion rates remains an elusive problem because soil loss is strongly nonunique with respect to the main drivers. In addressing the mechanistic causes of erosion responses, we discriminate between macroscale effects of external factors -- long studied and referred to as “geomorphic external variability”, and microscale effects, introduced as “geomorphic internal variability.” The latter source of erosion variations represents the knowledge gap, an overlooked but vital element of geomorphic response, significantly impacting the low predictability skill of deterministic models at field-catchment scales. This is corroborated with experiments using a comprehensive physical model that dynamically updates the soil mass and particle composition. As complete knowledge of microscale conditions for arbitrary location and time is infeasible, we propose that new predictive frameworks of soil erosion should embed stochastic components in deterministic assessments of external and internal types of geomorphic variability.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-14257841,~to-add-doi-URL,erodibility,local-scale,modelling-uncertainty,soil-erosion,soil-resources,uncertainty},
number = {24}
}
Handling uncertainty in bioenergy policy design – A case study analysis of UK and German bioelectricity policy instruments. Purkus, A., Röder, M., Gawel, E., Thrän, D., & Thornley, P. Biomass and Bioenergy. Paper doi abstract bibtex In designing policies to promote bioenergy, policy makers face challenges concerning uncertainties about the sustainability of bioenergy pathways (including greenhouse gas balances), technology and resource costs, or future energy market framework conditions. New information becomes available with time, but policy adjustments can involve high levels of adaptation costs. To enable an effective steering of technology choices and innovation, policies have to strike a balance between creating a consistent institutional framework, which establishes planning security for investors, and sufficient flexibility to adapt to new information. This paper examines implications of economic theory for handling cost and benefit uncertainty in bioelectricity policy design, focussing on choices between price and quantity instruments, technology differentiation, and policy adjustment. Findings are applied to two case studies, the UK's Renewables Obligation and the German feed-in tariff/feed-in premium scheme. Case study results show the trade-offs that are involved in instrument choice and design – depending on political priorities and a country's specific context, different options can prove more adequate. Combining market-based remuneration with sustainability criteria results in strong incentives for bioenergy producers to search for low-cost solutions; whereas cost-based price instruments with centrally steered technology and feedstock choices offer higher planning security for investors and more direct control for policy makers over what pathways are implemented. Independent of the choice of instrument type and technology differentiation mechanism, findings emphasise the importance of a careful policy design, which determines the exact balance between performance criteria such as cost control, incentive intensity, planning security and adaptive efficiency.
@article{purkus_handling_????,
title = {Handling uncertainty in bioenergy policy design – {A} case study analysis of {UK} and {German} bioelectricity policy instruments},
issn = {0961-9534},
url = {http://www.sciencedirect.com/science/article/pii/S0961953415001154},
doi = {10.1016/j.biombioe.2015.03.029},
abstract = {In designing policies to promote bioenergy, policy makers face challenges concerning uncertainties about the sustainability of bioenergy pathways (including greenhouse gas balances), technology and resource costs, or future energy market framework conditions. New information becomes available with time, but policy adjustments can involve high levels of adaptation costs. To enable an effective steering of technology choices and innovation, policies have to strike a balance between creating a consistent institutional framework, which establishes planning security for investors, and sufficient flexibility to adapt to new information. This paper examines implications of economic theory for handling cost and benefit uncertainty in bioelectricity policy design, focussing on choices between price and quantity instruments, technology differentiation, and policy adjustment. Findings are applied to two case studies, the UK's Renewables Obligation and the German feed-in tariff/feed-in premium scheme. Case study results show the trade-offs that are involved in instrument choice and design – depending on political priorities and a country's specific context, different options can prove more adequate. Combining market-based remuneration with sustainability criteria results in strong incentives for bioenergy producers to search for low-cost solutions; whereas cost-based price instruments with centrally steered technology and feedstock choices offer higher planning security for investors and more direct control for policy makers over what pathways are implemented. Independent of the choice of instrument type and technology differentiation mechanism, findings emphasise the importance of a careful policy design, which determines the exact balance between performance criteria such as cost control, incentive intensity, planning security and adaptive efficiency.},
urldate = {2015-04-18},
journal = {Biomass and Bioenergy},
author = {Purkus, Alexandra and Röder, Mirjam and Gawel, Erik and Thrän, Daniela and Thornley, Patricia},
keywords = {Bioenergy policy, Electricity sector, Instruments, New institutional economics, Renewable energy policy, uncertainty},
}
The Boundary Effect: Perceived Post Hoc Accuracy of Prediction Intervals. Teigen, K. H., Løhre, E., & Hohle, S. M. 13(4):309–321. Paper abstract bibtex Predictions of magnitudes (costs, durations, environmental events) are often given as uncertainty intervals (ranges). When are such forecasts judged to be correct? We report results of four experiments showing that forecasted ranges of expected natural events (floods and volcanic eruptions) are perceived as accurate when an observed magnitude falls inside or at the boundary of the range, with little regard to its position relative to the “most likely” (central) estimate. All outcomes that fell inside a wide interval were perceived as equally well captured by the forecast, whereas identical outcomes falling outside a narrow range were deemed to be incorrectly predicted, in proportion to the magnitude of deviation. In these studies, ranges function as categories, with boundaries distinguishing between right or wrong predictions, even for outcome distributions that are acknowledged as continuous, and for boundaries that are arbitrarily defined (for instance, when the narrow prediction interval is defined as capturing 50 percent and the wide 90 percent of all potential outcomes). However, the boundary effect is affected by label. When the upper limit of a range is described as a value that “can” occur (Experiment 5), outcomes both below and beyond this value were regarded as consistent with the forecast.
@article{teigenBoundaryEffectPerceived2018,
title = {The Boundary Effect: Perceived Post Hoc Accuracy of Prediction Intervals},
author = {Teigen, Karl H. and Løhre, Erik and Hohle, Sigrid M.},
date = {2018-07},
journaltitle = {Judgment and Decision Making},
volume = {13},
pages = {309--321},
issn = {1930-2975},
url = {http://journal.sjdm.org/17/171211/jdm171211.pdf},
abstract = {Predictions of magnitudes (costs, durations, environmental events) are often given as uncertainty intervals (ranges). When are such forecasts judged to be correct? We report results of four experiments showing that forecasted ranges of expected natural events (floods and volcanic eruptions) are perceived as accurate when an observed magnitude falls inside or at the boundary of the range, with little regard to its position relative to the “most likely” (central) estimate. All outcomes that fell inside a wide interval were perceived as equally well captured by the forecast, whereas identical outcomes falling outside a narrow range were deemed to be incorrectly predicted, in proportion to the magnitude of deviation. In these studies, ranges function as categories, with boundaries distinguishing between right or wrong predictions, even for outcome distributions that are acknowledged as continuous, and for boundaries that are arbitrarily defined (for instance, when the narrow prediction interval is defined as capturing 50 percent and the wide 90 percent of all potential outcomes). However, the boundary effect is affected by label. When the upper limit of a range is described as a value that “can” occur (Experiment 5), outcomes both below and beyond this value were regarded as consistent with the forecast.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-14644967,classification-bias,cognitive-biases,communicating-uncertainty,psychology,scientific-communication,uncertainty},
number = {4}
}
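[Illustrative code sketch (not from the paper)] The boundary effect documented above is a binary hit-or-miss reading of a range; forecast verification instead scores interval forecasts continuously, so that an outcome just outside the range is penalised far less than a distant one. The sketch below uses the standard interval (Winkler) score for a central prediction interval, offered only as a contrast to the categorical judgments the paper describes; the flood-crest numbers are invented.

def interval_score(lower, upper, outcome, coverage=0.90):
    """Interval (Winkler) score for a central prediction interval with the given
    nominal coverage: interval width plus a penalty proportional to how far the
    outcome falls outside the interval. Lower scores are better."""
    alpha = 1.0 - coverage
    score = upper - lower
    if outcome < lower:
        score += (2.0 / alpha) * (lower - outcome)
    elif outcome > upper:
        score += (2.0 / alpha) * (outcome - upper)
    return score

# Invented 90% flood-crest forecast interval: 4.0-6.0 m.
for crest in (5.0, 6.0, 6.1, 7.0):
    verdict = "inside/boundary" if 4.0 <= crest <= 6.0 else "outside"
    print(f"observed {crest:.1f} m ({verdict}): interval score = "
          f"{interval_score(4.0, 6.0, crest):.1f}")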
Geostatistical Simulation and Error Propagation in Geomorphometry. Temme, A. J. A. M., Heuvelink, G. B. M., Schoorl, J. M., & Claessens, L. In Hengl, T. & Reuter, H. I., editors, Developments in Soil Science, volume 33, of Geomorphometry, pages 121–140. Elsevier. Paper doi abstract bibtex This chapter aims to demonstrate how uncertainty in digital elevation model (DEM) attributes can be quantified using geostatistical methods and to show how the propagation of errors to DEM derived products may be computed. In addition to attribute errors, DEMs may have positional errors such as a shift along one or both coordinate axes, rotational errors, scaling errors, projection errors, or a combination of these. In this chapter, only attribute errors are considered. It describes how the propagation of attribute errors in spatial modeling can be computed using the Monte-Carlo method. This method is the most often used error propagation method because it is generic, flexible, and intuitively appealing. In order of increasing complexity, the chapter considers the propagation of error from DEMs to three derivatives, namely slope (a local land-surface parameter), topographic wetness index (a regional land-surface parameter), and soil redistribution resulting from water erosion (a complex model). It also describes the uncertainty propagation analysis in detail.
@incollection{temmeGeostatisticalSimulationError2009,
title = {Geostatistical Simulation and Error Propagation in Geomorphometry},
booktitle = {Developments in {{Soil Science}}},
author = {Temme, A. J. A. M. and Heuvelink, G. B. M. and Schoorl, J. M. and Claessens, L.},
editor = {Hengl, Tomislav and Reuter, Hannes I.},
date = {2009-01-01},
volume = {33},
pages = {121--140},
publisher = {{Elsevier}},
doi = {10.1016/S0166-2481(08)00005-6},
url = {https://doi.org/10.1016/S0166-2481(08)00005-6},
urldate = {2019-11-08},
abstract = {This chapter aims to demonstrate how uncertainty in digital elevation model (DEM) attributes can be quantified using geostatistical methods and to show how the propagation of errors to DEM derived products may be computed. In addition to attribute errors, DEMs may have positional errors such as a shift along one or both coordinate axes, rotational errors, scaling errors, projection errors, or a combination of these. In this chapter, only attribute errors are considered. It describes how the propagation of attribute errors in spatial modeling can be computed using the Monte-Carlo method. This method is the most often used error propagation method because it is generic, flexible, and intuitively appealing. In order of increasing complexity, the chapter considers the propagation of error from DEMs to three derivatives, namely slope (a local land-surface parameter), topographic wetness index (a regional land-surface parameter), and soil redistribution resulting from water erosion (a complex model). It also describes the uncertainty propagation analysis in detail.},
keywords = {~INRMM-MiD:z-BGZ4NZY6,elevation,monte-carlo,randomised-ensemble-uncertainty,slope,soil-erosion,soil-resources,topographic-wetness-index,uncertainty,uncertainty-propagation},
langid = {english},
series = {Geomorphometry}
}
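[Illustrative code sketch (not from the chapter)] The Monte-Carlo workflow the chapter describes can be summarised in a few lines: draw realisations of the DEM attribute error, recompute the derivative (here slope) for each realisation, and summarise the per-pixel spread. For brevity the sketch draws spatially uncorrelated Gaussian errors on a synthetic DEM; the chapter's geostatistical simulation would instead generate spatially correlated error fields, which matters greatly for slope.

import numpy as np

def slope_degrees(dem, cell_size):
    """Slope from central-difference gradients, in degrees."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

rng = np.random.default_rng(42)
cell = 25.0                                            # assumed 25 m grid
x, y = np.meshgrid(np.arange(60), np.arange(60))
dem = 500.0 + 0.8 * x + 5.0 * np.sin(y / 6.0)          # synthetic hillslope, elevations in m

sigma = 1.5                                            # assumed DEM attribute error std dev, m
n_sim = 200
slopes = np.stack([
    slope_degrees(dem + rng.normal(0.0, sigma, dem.shape), cell)
    for _ in range(n_sim)
])

slope_mean = slopes.mean(axis=0)                       # per-pixel mean slope
slope_sd = slopes.std(axis=0)                          # per-pixel slope uncertainty
print(f"mean slope {slope_mean.mean():.2f} deg; mean per-pixel std dev {slope_sd.mean():.2f} deg")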
Quantifying the Effects of Wildfire on Changes in Soil Properties by Surface Burning of Soils from the Boulder Creek Critical Zone Observatory. Wieting, C., Ebel, B. A., & Singha, K. 13:43–57. Paper doi abstract bibtex [Highlights] [::] Lab experiments on wildfire impacts were conducted using intact soil cores collected in the field. [::] Fire severity was simulated using a heating gun directed at the soil surface. [::] Fire severity impacted total organic carbon, field-saturated hydraulic conductivity, and water-drop penetration times. [::] Fires did not impact bulk density or core water storage. [::] Reductions in surface soil water repellency in high severity fires may increase infiltration relative to low severity fire. [Abstract] [::Study region] This study used intact soil cores collected at the Boulder Creek Critical Zone Observatory near Boulder, Colorado, USA to explore fire impacts on soil properties. [::Study focus] Three soil scenarios were considered: unburned control soils, and low- and high-temperature burned soils. We explored simulated fire impacts on field-saturated hydraulic conductivity, dry bulk density, total organic carbon, and infiltration processes during rainfall simulations. [::New hydrological insights for the region] Soils burned to high temperatures became more homogeneous with depth with respect to total organic carbon and bulk density, suggesting reductions in near-surface porosity. Organic matter decreased significantly with increasing soil temperature. Tension infiltration experiments suggested a decrease in infiltration rates from unburned to low-temperature burned soils, and an increase in infiltration rates in high-temperature burned soils. Non-parametric statistical tests showed that field-saturated hydraulic conductivity similarly decreased from unburned to low-temperature burned soils, and then increased with high-temperature burned soils. We interpret these changes as resulting from the combustion of surface and near-surface organic materials, enabling water to infiltrate directly into soil instead of being stored in the litter and duff layer at the surface. Together, these results indicate that fire-induced changes in soil properties from low temperatures were not as drastic as those from high temperatures, but that reductions in surface soil water repellency in high temperatures may increase infiltration relative to low temperatures.
@article{wietingQuantifyingEffectsWildfire2017,
title = {Quantifying the Effects of Wildfire on Changes in Soil Properties by Surface Burning of Soils from the {{Boulder Creek Critical Zone Observatory}}},
author = {Wieting, Celeste and Ebel, Brian A. and Singha, Kamini},
date = {2017-10-01},
journaltitle = {Journal of Hydrology: Regional Studies},
shortjournal = {Journal of Hydrology: Regional Studies},
volume = {13},
pages = {43--57},
issn = {2214-5818},
doi = {10.1016/j.ejrh.2017.07.006},
url = {https://doi.org/10.1016/j.ejrh.2017.07.006},
urldate = {2019-12-04},
abstract = {[Highlights]
[::] Lab experiments on wildfire impacts were conducted using intact soil cores collected in the field.
[::] Fire severity was simulated using a heating gun directed at the soil surface.
[::] Fire severity impacted total organic carbon, field-saturated hydraulic conductivity, and water-drop penetration times.
[::] Fires did not impact bulk density or core water storage.
[::] Reductions in surface soil water repellency in high severity fires may increase infiltration relative to low severity fire.
[Abstract]
[::Study region]
This study used intact soil cores collected at the Boulder Creek Critical Zone Observatory near Boulder, Colorado, USA to explore fire impacts on soil properties.
[::Study focus]
Three soil scenarios were considered: unburned control soils, and low- and high-temperature burned soils. We explored simulated fire impacts on field-saturated hydraulic conductivity, dry bulk density, total organic carbon, and infiltration processes during rainfall simulations.
[::New hydrological insights for the region]
Soils burned to high temperatures became more homogeneous with depth with respect to total organic carbon and bulk density, suggesting reductions in near-surface porosity. Organic matter decreased significantly with increasing soil temperature. Tension infiltration experiments suggested a decrease in infiltration rates from unburned to low-temperature burned soils, and an increase in infiltration rates in high-temperature burned soils. Non-parametric statistical tests showed that field-saturated hydraulic conductivity similarly decreased from unburned to low-temperature burned soils, and then increased with high-temperature burned soils. We interpret these changes as resulting from the combustion of surface and near-surface organic materials, enabling water to infiltrate directly into soil instead of being stored in the litter and duff layer at the surface. Together, these results indicate that fire-induced changes in soil properties from low temperatures were not as drastic as those from high temperatures, but that reductions in surface soil water repellency in high temperatures may increase infiltration relative to low temperatures.},
keywords = {~INRMM-MiD:z-24NU7ZSB,data-uncertainty,erodibility,fire-severity,modelling-uncertainty,post-fire-impacts,soil-erosion,soil-resources,uncertainty,united-states,wildfires},
langid = {english}
}
More Accountability for Big-Data Algorithms. Nature 537(7621):449. Paper doi abstract bibtex To avoid bias and improve transparency, algorithm designers must make data sources and profiles public. [Excerpt] [...] Algorithms, from the simplest to the most complex, follow sets of instructions or learn to accomplish a goal. In principle, they could help to make impartial analyses and decisions by reducing human biases and prejudices. But there is growing concern that they risk doing the opposite, and will replicate and exacerbate human failings [...]. And in an era of powerful computers, machine learning and big data, these equations have taken on a life of their own. [Bias in, bias out] [...] There are many sources of bias in algorithms. One is the hard-coding of rules and use of data sets that already reflect common societal spin. Put bias in and get bias out. Spurious or dubious correlations are another pitfall. [...] [] [...] a strong movement for greater 'algorithmic accountability' is now under way in academia and, to their credit, parts of the tech industry such as Google and Microsoft. This has been spurred largely by the increasing pace and adoption of machine learning and other artificial-intelligence (AI) techniques. A sensible step in the direction of greater transparency would be for the designers of algorithms to make public the source of the data sets they use to train and feed them. Disclosure of the design of the algorithms themselves would open these up to scrutiny, but is almost certain to collide with companies' desire to protect their secrets (and prevent gaming). Researchers hope to find ways to audit for bias without revealing the algorithms. [..] As with the use of science metrics in research assessment, a simplistic over-reliance on algorithms is heavily flawed. It's clear that the (vastly more complex) algorithms that help to drive the rest of the world are here to stay. Indeed, ubiquitous and even more sophisticated AI algorithms are already in view. Society needs to discuss in earnest how to rid software and machines of human bugs.
@article{natureMoreAccountabilityBigdata2016,
title = {More Accountability for Big-Data Algorithms},
author = {{Nature}},
date = {2016-09},
journaltitle = {Nature},
volume = {537},
pages = {449},
issn = {0028-0836},
doi = {10.1038/537449a},
url = {https://doi.org/10.1038/537449a},
abstract = {To avoid bias and improve transparency, algorithm designers must make data sources and profiles public.
[Excerpt] [...] Algorithms, from the simplest to the most complex, follow sets of instructions or learn to accomplish a goal. In principle, they could help to make impartial analyses and decisions by reducing human biases and prejudices. But there is growing concern that they risk doing the opposite, and will replicate and exacerbate human failings [...]. And in an era of powerful computers, machine learning and big data, these equations have taken on a life of their own.
[Bias in, bias out]
[...] There are many sources of bias in algorithms. One is the hard-coding of rules and use of data sets that already reflect common societal spin. Put bias in and get bias out. Spurious or dubious correlations are another pitfall. [...]
[] [...] a strong movement for greater 'algorithmic accountability' is now under way in academia and, to their credit, parts of the tech industry such as Google and Microsoft. This has been spurred largely by the increasing pace and adoption of machine learning and other artificial-intelligence (AI) techniques. A sensible step in the direction of greater transparency would be for the designers of algorithms to make public the source of the data sets they use to train and feed them. Disclosure of the design of the algorithms themselves would open these up to scrutiny, but is almost certain to collide with companies' desire to protect their secrets (and prevent gaming). Researchers hope to find ways to audit for bias without revealing the algorithms. [..] As with the use of science metrics in research assessment, a simplistic over-reliance on algorithms is heavily flawed. It's clear that the (vastly more complex) algorithms that help to drive the rest of the world are here to stay. Indeed, ubiquitous and even more sophisticated AI algorithms are already in view. Society needs to discuss in earnest how to rid software and machines of human bugs.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-14143609,algorithmic-accountability,big-data,communicating-uncertainty,data-uncertainty,modelling-uncertainty,open-data,open-science,science-ethics,science-policy-interface,science-society-interface,scientific-communication,uncertainty},
number = {7621}
}
Electric Sector Capacity Planning under Uncertainty: Climate Policy and Natural Gas in the US. Bistline, J. E. Energy Economics. Paper doi abstract bibtex This research investigates the dynamics of capacity planning and dispatch in the US electric power sector under a range of technological, economic, and policy-related uncertainties. Using a two-stage stochastic programming approach, model results suggest that the two most critical risks in the near-term planning process of the uncertainties considered here are natural gas prices and the stringency of climate policy. Stochastic strategies indicate that some near-term hedging from lower-cost wind and nuclear may occur but robustly demonstrate that delaying investment and waiting for more information can be optimal to avoid stranding capital-intensive assets. Hedging strategies protect against downside losses while retaining the option value of deferring irreversible commitments until more information is available about potentially lucrative market opportunities. These results are explained in terms of the optionality of investments in the electric power sector, leading to more general insights about uncertainty, learning, and irreversibility. The stochastic solution is especially valuable if decision-makers do not sufficiently account for the potential of climate constraints in future decades or if fuel price projections are outdated.
@article{bistline_electric_????,
title = {Electric {Sector} {Capacity} {Planning} under {Uncertainty}: {Climate} {Policy} and {Natural} {Gas} in the {US}},
issn = {0140-9883},
shorttitle = {Electric {Sector} {Capacity} {Planning} under {Uncertainty}},
url = {http://www.sciencedirect.com/science/article/pii/S0140988315002157},
doi = {10.1016/j.eneco.2015.07.008},
abstract = {This research investigates the dynamics of capacity planning and dispatch in the US electric power sector under a range of technological, economic, and policy-related uncertainties. Using a two-stage stochastic programming approach, model results suggest that the two most critical risks in the near-term planning process of the uncertainties considered here are natural gas prices and the stringency of climate policy. Stochastic strategies indicate that some near-term hedging from lower-cost wind and nuclear may occur but robustly demonstrate that delaying investment and waiting for more information can be optimal to avoid stranding capital-intensive assets. Hedging strategies protect against downside losses while retaining the option value of deferring irreversible commitments until more information is available about potentially lucrative market opportunities. These results are explained in terms of the optionality of investments in the electric power sector, leading to more general insights about uncertainty, learning, and irreversibility. The stochastic solution is especially valuable if decision-makers do not sufficiently account for the potential of climate constraints in future decades or if fuel price projections are outdated.},
urldate = {2015-07-31},
journal = {Energy Economics},
author = {Bistline, John E.},
keywords = {climate policy, Electricity, risk management, stochastic programming, uncertainty}
}
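[Illustrative code sketch (not from the paper)] The two-stage structure described in the abstract can be illustrated with a toy capacity-planning problem: the wind build is the here-and-now decision, dispatch under each gas-price/climate-policy scenario is the recourse, and the stochastic plan is compared with the plan optimised against the mean scenario (the difference being the value of the stochastic solution). Every number below (costs, scenarios, demand, penalty) is invented and unrelated to the paper's model.

import numpy as np

hours = 8760.0
demand_gw = 10.0
demand_mwh = demand_gw * hours * 1e3          # flat load, MWh per year
wind_cf = 0.35                                # assumed wind capacity factor
wind_capex = 200.0                            # assumed annualised wind cost, $m per GW-year
voll = 3000.0                                 # assumed value of lost load, $/MWh

# Scenarios: (probability, gas fuel + carbon cost $/MWh, cap on gas share of energy)
scenarios = [(0.4, 30.0, 1.0),                # lax policy, cheap gas
             (0.4, 60.0, 1.0),                # moderate
             (0.2, 100.0, 0.4)]               # stringent cap on fossil generation

def scenario_cost(wind_gw, gas_price, gas_share_cap):
    """First-stage capex plus second-stage (recourse) dispatch cost, $m per year."""
    wind_mwh = min(wind_gw * wind_cf * hours * 1e3, demand_mwh)
    residual = demand_mwh - wind_mwh
    gas_mwh = min(residual, gas_share_cap * demand_mwh)
    unserved = residual - gas_mwh
    return wind_capex * wind_gw + (gas_price * gas_mwh + voll * unserved) * 1e-6

def expected_cost(wind_gw):
    return sum(p * scenario_cost(wind_gw, price, cap) for p, price, cap in scenarios)

grid = np.arange(0.0, 30.01, 0.1)             # candidate wind builds, GW

# Stochastic plan: minimise expected cost across scenarios.
w_stoch = grid[np.argmin([expected_cost(w) for w in grid])]

# Expected-value plan: optimise against the mean scenario, then face the real ones.
p_mean = sum(p * price for p, price, _ in scenarios)
cap_mean = sum(p * cap for p, _, cap in scenarios)
w_ev = grid[np.argmin([scenario_cost(w, p_mean, cap_mean) for w in grid])]

print(f"stochastic plan: {w_stoch:.1f} GW wind, expected cost {expected_cost(w_stoch):.0f} $m/yr")
print(f"mean-scenario plan: {w_ev:.1f} GW wind, expected cost {expected_cost(w_ev):.0f} $m/yr")
print(f"value of the stochastic solution: {expected_cost(w_ev) - expected_cost(w_stoch):.0f} $m/yr")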
Are More Complex Physiological Models of Forest Ecosystems Better Choices for Plot and Regional Predictions?. Jin, W., He, H. S., & Thompson, F. R. 75:1–14. Paper doi abstract bibtex [Highlights] [::] We evaluated performance of process-based forest ecosystem models. [::] A complex physiological model performed best at the plot scale. [::] A hybrid empirical-physiological model performed best at the regional scale. [Abstract] We evaluated performance of process-based forest ecosystem models. A complex physiological model performed best at the plot scale. A hybrid empirical-physiological model performed best at the regional scale. Process-based forest ecosystem models vary from simple physiological, complex physiological, to hybrid empirical-physiological models. Previous studies indicate that complex models provide the best prediction at plot scale with a temporal extent of less than 10 years, however, it is largely untested as to whether complex models outperform the other two types of models at plot and regional scale in longer timeframe (i.e. decades). We compared model predictions of aboveground carbon by one representative model of each model type (PnET-II, ED2 and LINKAGES v2.2, respectively) with field data (19-77 years) at both scales in the Central Hardwood Forests of the United States. At plot scale, predictions by complex physiological model were the most concordant with field data, suggesting that physiological processes are more influential than forest composition and structure. Hybrid model provided the best predictions at regional scale, suggesting that forest composition and structure may be more influential than physiological processes. [Excerpt: Regional scale comparisons] The percent bias of all models was larger at the regional scale than the plot scale. Abiotic environmental heterogeneity at the regional scale could be one of the factors contributing to the larger percent bias at the regional scale. Even though we used ecological subsections as our regional scale study areas-areas where the vegetation and environmental factors are considered relatively homogenous (McNab et al., 2007), the environmental heterogeneity within each subsection would still be higher than that at each plot-scale site. Small-scale abiotic environmental variations, such as difference in water availability at different slope positions of the same soil type, have been ignored in this regional scale study, and summarized environmental factors were used to represent the average physical situations across the entire subsection. [...] [\n] Predictions based on empirical relationships like those in the hybrid model may not hold true under changing environments in the future, since those relationships were established based on observations in the past (Gustafson, 2013 and Cuddington et al., 2013). However, predictions based on empirical life history attributes might retain validity in the future due to niche conservatism (Crisp et al., 2009 and Wiens et al., 2010). For example, the tolerance range of growing degree days of a given plant species may remain largely constant despite climate change. [\n] None of the models we examined simulate forest landscape processes, which are spatially continuous and temporally dynamic processes (e.g., fire disturbance). Forest landscape processes are likely to have greater contribution to forest ecosystem responses than climate variables alone (Gustafson et al., 2010, Kurz et al., 2008, Girardin and Mudelsee, 2008 and Li et al., 2013). 
Therefore, greater bias could occur if forest landscape processes are not included in the prediction of forest ecosystem dynamics (Reynolds et al., 2001). [...] [\n] Superiority of complex physiological model at the plot scale was achieved at the price of more detailed input data, longer time of simulation, and more simplified representation of forest composition. While the hybrid model was the best model at the regional scale, it cannot provide carbon dynamics at a fine temporal scale (e.g., daily carbon sequestration) like the complex physiological model can. The simple physiological model provided the worst prediction. Although we primarily focused on density of aboveground woody biomass, other traits associated with forest ecosystems (e.g., forest composition, basal area) are often of interest in ecosystem and landscape modeling. Data preparation and simulation time are also often, if not always, of concern.
@article{jinAreMoreComplex2016,
title = {Are More Complex Physiological Models of Forest Ecosystems Better Choices for Plot and Regional Predictions?},
author = {Jin, Wenchi and He, Hong S. and Thompson, Frank R.},
date = {2016-01},
journaltitle = {Environmental Modelling \& Software},
volume = {75},
pages = {1--14},
issn = {1364-8152},
doi = {10.1016/j.envsoft.2015.10.004},
url = {https://doi.org/10.1016/j.envsoft.2015.10.004},
abstract = {[Highlights]
[::] We evaluated performance of process-based forest ecosystem models. [::] A complex physiological model performed best at the plot scale. [::] A hybrid empirical-physiological model performed best at the regional scale.
[Abstract]
We evaluated performance of process-based forest ecosystem models. A complex physiological model performed best at the plot scale. A hybrid empirical-physiological model performed best at the regional scale. Process-based forest ecosystem models vary from simple physiological, complex physiological, to hybrid empirical-physiological models. Previous studies indicate that complex models provide the best prediction at plot scale with a temporal extent of less than 10 years, however, it is largely untested as to whether complex models outperform the other two types of models at plot and regional scale in longer timeframe (i.e. decades). We compared model predictions of aboveground carbon by one representative model of each model type (PnET-II, ED2 and LINKAGES v2.2, respectively) with field data (19-77 years) at both scales in the Central Hardwood Forests of the United States. At plot scale, predictions by complex physiological model were the most concordant with field data, suggesting that physiological processes are more influential than forest composition and structure. Hybrid model provided the best predictions at regional scale, suggesting that forest composition and structure may be more influential than physiological processes.
[Excerpt: Regional scale comparisons]
The percent bias of all models was larger at the regional scale than the plot scale. Abiotic environmental heterogeneity at the regional scale could be one of the factors contributing to the larger percent bias at the regional scale. Even though we used ecological subsections as our regional scale study areas-areas where the vegetation and environmental factors are considered relatively homogenous (McNab et al., 2007), the environmental heterogeneity within each subsection would still be higher than that at each plot-scale site. Small-scale abiotic environmental variations, such as difference in water availability at different slope positions of the same soil type, have been ignored in this regional scale study, and summarized environmental factors were used to represent the average physical situations across the entire subsection. [...]
[\textbackslash n] Predictions based on empirical relationships like those in the hybrid model may not hold true under changing environments in the future, since those relationships were established based on observations in the past (Gustafson, 2013 and Cuddington et al., 2013). However, predictions based on empirical life history attributes might retain validity in the future due to niche conservatism (Crisp et al., 2009 and Wiens et al., 2010). For example, the tolerance range of growing degree days of a given plant species may remain largely constant despite climate change.
[\textbackslash n] None of the models we examined simulate forest landscape processes, which are spatially continuous and temporally dynamic processes (e.g., fire disturbance). Forest landscape processes are likely to have greater contribution to forest ecosystem responses than climate variables alone (Gustafson et al., 2010, Kurz et al., 2008, Girardin and Mudelsee, 2008 and Li et al., 2013). Therefore, greater bias could occur if forest landscape processes are not included in the prediction of forest ecosystem dynamics (Reynolds et al., 2001). [...]
[\textbackslash n] Superiority of complex physiological model at the plot scale was achieved at the price of more detailed input data, longer time of simulation, and more simplified representation of forest composition. While the hybrid model was the best model at the regional scale, it cannot provide carbon dynamics at a fine temporal scale (e.g., daily carbon sequestration) like the complex physiological model can. The simple physiological model provided the worst prediction. Although we primarily focused on density of aboveground woody biomass, other traits associated with forest ecosystems (e.g., forest composition, basal area) are often of interest in ecosystem and landscape modeling. Data preparation and simulation time are also often, if not always, of concern.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-13886261,~to-add-doi-URL,bias-toward-primacy-of-theory-over-reality,comparison,complexity,complexity-vs-uncertainty,ecosystem,environmental-modelling,forest-resources,local-over-complication,local-scale,model-comparison,modelling,regional-scale,system-of-systems,uncertainty}
}
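As a minimal illustration of the percent-bias metric discussed in the excerpt above, the following sketch computes PBIAS for a set of model predictions against field observations; the values are purely illustrative and not taken from Jin et al. (2016).
# Minimal sketch of the percent-bias (PBIAS) metric used to compare model
# predictions with field observations; the data values are illustrative,
# not from the paper.
def percent_bias(simulated, observed):
    """PBIAS = 100 * sum(sim - obs) / sum(obs); positive values indicate
    overestimation, negative values underestimation."""
    total_obs = sum(observed)
    if total_obs == 0:
        raise ValueError("Sum of observations is zero; PBIAS is undefined.")
    return 100.0 * sum(s - o for s, o in zip(simulated, observed)) / total_obs

# Hypothetical aboveground-carbon values (Mg C / ha) for one site.
observed = [55.0, 60.2, 71.5, 80.3]
predicted = [58.1, 63.0, 69.8, 85.4]
print(f"PBIAS = {percent_bias(predicted, observed):+.1f} %")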
Research to Integrate Productivity Enhancement, Environmental Protection, and Human Development. Sayer, J. A. & Campbell, B. 5(2):32++. Paper abstract bibtex To meet the challenges of poverty and environmental sustainability, a different kind of research will be needed. This research will need to embrace the complexity of these systems by redirecting the objectives of research toward enhancing adaptive capacity, by incorporating more participatory approaches, by embracing key principles such as multi-scale analysis and intervention, and by the use of a variety of tools (e.g., systems analysis, information management tools, and impact assessment tools). Integration will be the key concept in the new approach; integration across scales, components, stakeholders, and disciplines. Integrated approaches, as described in this Special Feature, will require changes in the culture and organization of research.
@article{sayerResearchIntegrateProductivity2001,
title = {Research to {{Integrate Productivity Enhancement}}, {{Environmental Protection}}, and {{Human Development}}},
author = {Sayer, Jeffrey A. and Campbell, Bruce},
date = {2001},
journaltitle = {Ecology and Society},
volume = {5},
pages = {32++},
issn = {1708-3087},
url = {http://mfkp.org/INRMM/article/12603983},
abstract = {To meet the challenges of poverty and environmental sustainability, a different kind of research will be needed. This research will need to embrace the complexity of these systems by redirecting the objectives of research toward enhancing adaptive capacity, by incorporating more participatory approaches, by embracing key principles such as multi-scale analysis and intervention, and by the use of a variety of tools (e.g., systems analysis, information management tools, and impact assessment tools). Integration will be the key concept in the new approach; integration across scales, components, stakeholders, and disciplines. Integrated approaches, as described in this Special Feature, will require changes in the culture and organization of research.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-12603983,complexity,conservation,ecology,indicators,integration-techniques,multi-objective-planning,multi-stakeholder-decision-making,non-linearity,scientific-communication,uncertainty},
number = {2}
}
Communicating Thematic Data Quality with Web Map Services. Blower, J., Masó, J., Díaz, D., Roberts, C., Griffiths, G., Lewis, J., Yang, X., & Pons, X. 4(4):1965–1981. Paper doi abstract bibtex Geospatial information of many kinds, from topographic maps to scientific data, is increasingly being made available through web mapping services. These allow georeferenced map images to be served from data stores and displayed in websites and geographic information systems, where they can be integrated with other geographic information. The Open Geospatial Consortium's Web Map Service (WMS) standard has been widely adopted in diverse communities for sharing data in this way. However, current services typically provide little or no information about the quality or accuracy of the data they serve. In this paper we will describe the design and implementation of a new "quality-enabled" profile of WMS, which we call "WMS-Q". This describes how information about data quality can be transmitted to the user through WMS. Such information can exist at many levels, from entire datasets to individual measurements, and includes the many different ways in which data uncertainty can be expressed. We also describe proposed extensions to the Symbology Encoding specification, which include provision for visualizing uncertainty in raster data in a number of different ways, including contours, shading and bivariate colour maps. We shall also describe new open-source implementations of the new specifications, which include both clients and servers.
@article{blowerCommunicatingThematicData2015,
title = {Communicating Thematic Data Quality with Web Map Services},
author = {Blower, Jon and Masó, Joan and Díaz, Daniel and Roberts, Charles and Griffiths, Guy and Lewis, Jane and Yang, Xiaoyu and Pons, Xavier},
date = {2015-10},
journaltitle = {ISPRS International Journal of Geo-Information},
volume = {4},
pages = {1965--1981},
issn = {2220-9964},
doi = {10.3390/ijgi4041965},
url = {https://doi.org/10.3390/ijgi4041965},
abstract = {Geospatial information of many kinds, from topographic maps to scientific data, is increasingly being made available through web mapping services. These allow georeferenced map images to be served from data stores and displayed in websites and geographic information systems, where they can be integrated with other geographic information. The Open Geospatial Consortium's Web Map Service (WMS) standard has been widely adopted in diverse communities for sharing data in this way. However, current services typically provide little or no information about the quality or accuracy of the data they serve. In this paper we will describe the design and implementation of a new "quality-enabled" profile of WMS, which we call "WMS-Q". This describes how information about data quality can be transmitted to the user through WMS. Such information can exist at many levels, from entire datasets to individual measurements, and includes the many different ways in which data uncertainty can be expressed. We also describe proposed extensions to the Symbology Encoding specification, which include provision for visualizing uncertainty in raster data in a number of different ways, including contours, shading and bivariate colour maps. We shall also describe new open-source implementations of the new specifications, which include both clients and servers.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-13815787,~to-add-doi-URL,communicating-uncertainty,geospatial,scientific-communication,uncertainty,visualization,web-map-services},
number = {4}
}
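For context, the sketch below assembles a plain OGC WMS 1.3.0 GetMap request, the baseline protocol that the proposed WMS-Q profile extends; the server URL and layer name are hypothetical placeholders, and the quality-specific WMS-Q extensions are not reproduced here.
# Sketch of a standard OGC WMS 1.3.0 GetMap request URL. The server URL and
# layer name are hypothetical placeholders, not endpoints from the paper.
from urllib.parse import urlencode

base_url = "https://example.org/wms"   # hypothetical server
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "sea_surface_temperature",   # hypothetical data layer
    "STYLES": "",
    "CRS": "CRS:84",
    "BBOX": "-180,-90,180,90",
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}
print(base_url + "?" + urlencode(params))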
The Himalayas Must Be Protected. Pandit, M. K. 501(7467):283. Paper doi abstract bibtex Climate change and human activities are pushing the fragile ecosystem ever closer to instability, warns Maharaj K. Pandit.
@article{panditHimalayasMustBe2013,
title = {The {{Himalayas}} Must Be Protected},
author = {Pandit, Maharaj K.},
date = {2013-09},
journaltitle = {Nature},
volume = {501},
pages = {283},
issn = {0028-0836},
doi = {10.1038/501283a},
url = {https://doi.org/10.1038/501283a},
abstract = {Climate change and human activities are pushing the fragile ecosystem ever closer to instability, warns Maharaj K. Pandit.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-12634465,climate-change,disasters,ecosystem-change,ecosystem-resilience,himalayan-region,science-policy-interface,uncertainty},
number = {7467}
}
Defining Extreme Wildfire Events: Difficulties, Challenges, and Impacts. Tedim, F., Leone, V., Amraoui, M., Bouillon, C., Coughlan, M., Delogu, G., Fernandes, P., Ferreira, C., McCaffrey, S., McGee, T., Parente, J., Paton, D., Pereira, M., Ribeiro, L., Viegas, D., & Xanthopoulos, G. 1(1):9+. Paper doi abstract bibtex Every year worldwide some extraordinary wildfires occur, overwhelming suppression capabilities, causing substantial damages, and often resulting in fatalities. Given their increasing frequency, there is a debate about how to address these wildfires with significant social impacts, but there is no agreement upon terminology to describe them. The concept of extreme wildfire event (EWE) has emerged to bring some coherence on this kind of events. It is increasingly used, often as a synonym of other terms related to wildfires of high intensity and size, but its definition remains elusive. The goal of this paper is to go beyond drawing on distinct disciplinary perspectives to develop a holistic view of EWE as a social-ecological phenomenon. Based on literature review and using a transdisciplinary approach, this paper proposes a definition of EWE as a process and an outcome. Considering the lack of a consistent "scale of gravity" to leverage extreme wildfire events such as in natural hazards (e.g., tornados, hurricanes and earthquakes) we present a proposal of wildfire classification with seven categories based on measurable fire spread and behavior parameters and suppression difficulty. The categories 5 to 7 are labeled as EWE.
@article{tedimDefiningExtremeWildfire2018,
title = {Defining Extreme Wildfire Events: Difficulties, Challenges, and Impacts},
author = {Tedim, Fantina and Leone, Vittorio and Amraoui, Malik and Bouillon, Christophe and Coughlan, Michael and Delogu, Giuseppe and Fernandes, Paulo and Ferreira, Carmen and McCaffrey, Sarah and McGee, Tara and Parente, Joana and Paton, Douglas and Pereira, Mário and Ribeiro, Luís and Viegas, Domingos and Xanthopoulos, Gavriil},
date = {2018-02},
journaltitle = {Fire},
volume = {1},
pages = {9+},
issn = {2571-6255},
doi = {10.3390/fire1010009},
url = {https://doi.org/10.3390/fire1010009},
abstract = {Every year worldwide some extraordinary wildfires occur, overwhelming suppression capabilities, causing substantial damages, and often resulting in fatalities. Given their increasing frequency, there is a debate about how to address these wildfires with significant social impacts, but there is no agreement upon terminology to describe them. The concept of extreme wildfire event (EWE) has emerged to bring some coherence on this kind of events. It is increasingly used, often as a synonym of other terms related to wildfires of high intensity and size, but its definition remains elusive. The goal of this paper is to go beyond drawing on distinct disciplinary perspectives to develop a holistic view of EWE as a social-ecological phenomenon. Based on literature review and using a transdisciplinary approach, this paper proposes a definition of EWE as a process and an outcome. Considering the lack of a consistent "scale of gravity" to leverage extreme wildfire events such as in natural hazards (e.g., tornados, hurricanes and earthquakes) we present a proposal of wildfire classification with seven categories based on measurable fire spread and behavior parameters and suppression difficulty. The categories 5 to 7 are labeled as EWE.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-14686423,ambiguity,classification,climate-extremes,definition,extreme-events,extreme-weather,fire-spotting-distance,forest-fires,forest-resources,pyroconvection,spatial-spread,terminology,uncertainty,vegetation,wildfires,wind},
number = {1}
}
Biomass and Stem Volume Equations for Tree Species in Europe. Zianis, D., Muukkonen, P., Mäkipää, R., & Mencuccini, M. Volume 4 of Silva Fennica, Finnish Society of Forest Science, Finnish Forest Research Institute. Paper abstract bibtex Review of stem volume and biomass equations for tree species growing in Europe is presented. The mathematical forms of the empirical models, the associated statistical parameters and information about the size of the trees and the country of origin were collated from scientific articles and from technical reports. The collected information provides a basic tool for estimation of carbon stocks and nutrient balance of forest ecosystems across Europe as well as for validation of theoretical models of biomass allocation.
@book{zianisBiomassStemVolume2005,
title = {Biomass and Stem Volume Equations for Tree Species in {{Europe}}},
author = {Zianis, Dimitris and Muukkonen, Petteri and Mäkipää, Raisa and Mencuccini, Maurizio},
date = {2005},
volume = {4},
publisher = {{Finnish Society of Forest Science, Finnish Forest Research Institute}},
issn = {1457-7356},
url = {http://www.metla.fi/silvafennica/abs/sma/sma004.htm},
abstract = {Review of stem volume and biomass equations for tree species growing in Europe is presented. The mathematical forms of the empirical models, the associated statistical parameters and information about the size of the trees and the country of origin were collated from scientific articles and from technical reports. The collected information provides a basic tool for estimation of carbon stocks and nutrient balance of forest ecosystems across Europe as well as for validation of theoretical models of biomass allocation.},
isbn = {951-40-1984-9},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-11858948,biomass,empirical-equation,forest-biomass,forest-resources,modelling-uncertainty,regression,review,uncertainty},
series = {Silva {{Fennica}}}
}
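The equations compiled in this review are typically power-law allometries of the form M = a·D^b; the sketch below fits such a relationship by least squares on log-transformed values, using synthetic data points that are purely illustrative.
# Minimal sketch of fitting a power-law allometric biomass equation
# M = a * D**b (M: aboveground biomass, D: diameter at breast height)
# by ordinary least squares on log-transformed values. The data points are
# synthetic and purely illustrative, not taken from the compilation.
import numpy as np

dbh = np.array([8.0, 12.0, 17.0, 23.0, 30.0, 38.0])           # cm
biomass = np.array([14.0, 42.0, 105.0, 230.0, 470.0, 860.0])  # kg

slope, intercept = np.polyfit(np.log(dbh), np.log(biomass), 1)
a, b = np.exp(intercept), slope
print(f"M ≈ {a:.3f} * D^{b:.3f}")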
The Evolution of Error: Error Management, Cognitive Constraints, and Adaptive Decision-Making Biases. Johnson, D. D. P., Blumstein, D. T., Fowler, J. H., & Haselton, M. G. 28(8):474–481. Paper doi abstract bibtex Counterintuitively, biases can improve decision making. Numerous studies have identified biases as an effective way to manage errors. Given cognitive and evolutionary constraints, psychological biases can be adaptive. EMT has a wide scope of application for modern challenges. Counterintuitively, biases in behavior or cognition can improve decision making. Under conditions of uncertainty and asymmetric costs of 'false-positive' and 'false-negative' errors, biases can lead to mistakes in one direction but - in so doing - steer us away from more costly mistakes in the other direction. For example, we sometimes think sticks are snakes (which is harmless), but rarely that snakes are sticks (which can be deadly). We suggest that 'error management' biases: (i) have been independently identified by multiple interdisciplinary studies, suggesting the phenomenon is robust across domains, disciplines, and methodologies; (ii) represent a general feature of life, with common sources of variation; and (iii) offer an explanation, in error management theory (EMT), for the evolution of cognitive biases as the best way to manage errors under cognitive and evolutionary constraints.
@article{johnsonEvolutionErrorError2013,
title = {The Evolution of Error: Error Management, Cognitive Constraints, and Adaptive Decision-Making Biases},
author = {Johnson, Dominic D. P. and Blumstein, Daniel T. and Fowler, James H. and Haselton, Martie G.},
date = {2013-08},
journaltitle = {Trends in Ecology \& Evolution},
volume = {28},
pages = {474--481},
issn = {0169-5347},
doi = {10.1016/j.tree.2013.05.014},
url = {https://doi.org/10.1016/j.tree.2013.05.014},
abstract = {Counterintuitively, biases can improve decision making. Numerous studies have identified biases as an effective way to manage errors. Given cognitive and evolutionary constraints, psychological biases can be adaptive. EMT has a wide scope of application for modern challenges. Counterintuitively, biases in behavior or cognition can improve decision making. Under conditions of uncertainty and asymmetric costs of 'false-positive' and 'false-negative' errors, biases can lead to mistakes in one direction but - in so doing - steer us away from more costly mistakes in the other direction. For example, we sometimes think sticks are snakes (which is harmless), but rarely that snakes are sticks (which can be deadly). We suggest that 'error management' biases: (i) have been independently identified by multiple interdisciplinary studies, suggesting the phenomenon is robust across domains, disciplines, and methodologies; (ii) represent a general feature of life, with common sources of variation; and (iii) offer an explanation, in error management theory (EMT), for the evolution of cognitive biases as the best way to manage errors under cognitive and evolutionary constraints.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-12447186,cognitive-biases,communicating-uncertainty,errors,evolution,science-based-decision-making,scientific-communication,transdisciplinary-research,uncertainty},
number = {8}
}
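The core error-management argument, that asymmetric costs of false positives and false negatives favour a biased decision rule, can be illustrated with a toy expected-cost comparison; all probabilities and costs below are hypothetical.
# Toy illustration of error management under asymmetric costs: react as if the
# threat is real whenever the expected cost of ignoring it exceeds the expected
# cost of a false alarm. All probabilities and costs are hypothetical.
def act_on_threat(p_threat, cost_miss, cost_false_alarm):
    """Return True if reacting minimises expected cost."""
    expected_cost_ignore = p_threat * cost_miss
    expected_cost_react = (1.0 - p_threat) * cost_false_alarm
    return expected_cost_ignore > expected_cost_react

# "Stick or snake": a miss is far costlier than a false alarm, so even a small
# probability of a snake justifies reacting.
print(act_on_threat(p_threat=0.05, cost_miss=1000.0, cost_false_alarm=1.0))  # True
print(act_on_threat(p_threat=0.05, cost_miss=1.0, cost_false_alarm=1.0))     # False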
The Value of Coordinated Management of Interacting Ecosystem Services. White, C., Costello, C., Kendall, B. E., & Brown, C. J. 15(6):509–519. Paper doi abstract bibtex Coordinating decisions and actions among interacting sectors is a critical component of ecosystem-based management, but uncertainty about coordinated management's effects is compromising its perceived value and use. We constructed an analytical framework for explicitly calculating how coordination affects management decisions, ecosystem state and the provision of ecosystem services in relation to ecosystem dynamics and socio-economic objectives. The central insight is that the appropriate comparison strategy to optimal coordinated management is optimal uncoordinated management, which can be identified at the game theoretic Nash equilibrium. Using this insight we can calculate coordination's effects in relation to uncoordinated management and other reference scenarios. To illustrate how this framework can help identify ecosystem and socio-economic conditions under which coordination is most influential and valuable, we applied it to a heuristic case study and a simulation model for the California Current Marine Ecosystem. Results indicate that coordinated management can more than double an ecosystem's societal value, especially when sectors can effectively manipulate resources that interact strongly. However, societal gains from coordination will need to be reconciled with observations that it also leads to strategic simplification of the ecological food web, and generates both positive and negative impacts on individual sectors and non-target species.
@article{whiteValueCoordinatedManagement2012,
title = {The Value of Coordinated Management of Interacting Ecosystem Services},
author = {White, Crow and Costello, Christopher and Kendall, Bruce E. and Brown, Christopher J.},
date = {2012-06},
journaltitle = {Ecology Letters},
volume = {15},
pages = {509--519},
issn = {1461-0248},
doi = {10.1111/j.1461-0248.2012.01773.x},
url = {https://doi.org/10.1111/j.1461-0248.2012.01773.x},
abstract = {Coordinating decisions and actions among interacting sectors is a critical component of ecosystem-based management, but uncertainty about coordinated management's effects is compromising its perceived value and use. We constructed an analytical framework for explicitly calculating how coordination affects management decisions, ecosystem state and the provision of ecosystem services in relation to ecosystem dynamics and socio-economic objectives. The central insight is that the appropriate comparison strategy to optimal coordinated management is optimal uncoordinated management, which can be identified at the game theoretic Nash equilibrium. Using this insight we can calculate coordination's effects in relation to uncoordinated management and other reference scenarios. To illustrate how this framework can help identify ecosystem and socio-economic conditions under which coordination is most influential and valuable, we applied it to a heuristic case study and a simulation model for the California Current Marine Ecosystem. Results indicate that coordinated management can more than double an ecosystem's societal value, especially when sectors can effectively manipulate resources that interact strongly. However, societal gains from coordination will need to be reconciled with observations that it also leads to strategic simplification of the ecological food web, and generates both positive and negative impacts on individual sectors and non-target species.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-10562193,cross-disciplinary-perspective,ecology,ecosystem-services,integrated-natural-resources-modelling-and-management,integration-techniques,multi-objective-planning,uncertainty},
number = {6}
}
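The paper's central insight, that coordinated management should be benchmarked against the uncoordinated game-theoretic Nash equilibrium, can be illustrated with a toy two-sector common-pool game; the payoff function and parameters below are hypothetical and far simpler than the ecosystem models used in the study.
# Toy two-sector common-pool game comparing the uncoordinated Nash equilibrium
# with the coordinated (joint) optimum. Payoff_i = effort_i * (B - c * total
# effort); the functional form and parameters are hypothetical.
B, c = 10.0, 1.0

def best_response(other_effort):
    # argmax over e of e * (B - c * (e + other_effort))  ->  (B - c*other) / (2c)
    return max(0.0, (B - c * other_effort) / (2.0 * c))

# Iterate best responses to approximate the Nash equilibrium.
e1 = e2 = 1.0
for _ in range(100):
    e1, e2 = best_response(e2), best_response(e1)
nash_total = (e1 + e2) * (B - c * (e1 + e2))

# Coordinated optimum: maximise joint payoff E * (B - c*E)  ->  E = B / (2c).
E = B / (2.0 * c)
coordinated_total = E * (B - c * E)

print(f"Nash effort per sector: {e1:.2f}, joint payoff: {nash_total:.2f}")
print(f"Coordinated joint payoff: {coordinated_total:.2f}")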
Rise of the Citizen Scientist. Nature 524(7565):265. Paper doi abstract bibtex From the oceans to the soil, technology is changing the part that amateurs can play in research. But this greater involvement raises concerns that must be addressed. [Excerpt] [...] Citizen science has come a long way from the first distributed-computing projects that hoovered up spare processing power on home computers to perform calculations or search for alien signals. And it has progressed further still since the earliest public surveys of wildlife: it was way back in 1900 that the Audubon Society persuaded Americans to exchange their Christmas tradition of shooting birds for a more productive effort to count them instead. [] Some professional scientists are sniffy about the role of amateurs, but as an increasing number of academic papers makes clear, the results can be valuable and can help both to generate data and to inform policy. [...] [] Technology can make scientists of us all. Data churned out by the rapid spread of consumer gadgets equipped with satellite navigation, cameras and a suite of other sensors, and the ease of sharing the results digitally, are driving the boom in citizen science. Volunteers can already identify whale songs from recordings, report litter and invasive species, and send in the skeletons of fish they have caught and consumed. But there is more to being a scientist, of course, than collecting and sharing data – especially if the results are to be used to help determine policy. [] Critics have raised concerns about data quality, and some studies do find that volunteers are less able to identify plant species than are academics and land managers. And there are issues around how to reward and recognize the contribution of volunteers, and around ensuring that data are shared or kept confidential as appropriate. But these problems seem relatively simple to address – not least because they reflect points – from authorship to data quality and access – that the professional scientific community is already wrestling with. [] More troubling, perhaps, is the potential for conflicts of interest. One reason that some citizen scientists volunteer is to advance their political objectives. [...] Scientists and funders are right to encourage the shift from passive citizen science – number crunching – to more-active roles, including sample collection. But as increased scrutiny falls on the reliability of the work of professional scientists, full transparency about the motives and ambitions of amateurs is essential.
@article{natureRiseCitizenScientist2015,
title = {Rise of the Citizen Scientist},
author = {{Nature}},
date = {2015-08},
journaltitle = {Nature},
volume = {524},
pages = {265},
issn = {0028-0836},
doi = {10.1038/524265a},
url = {https://doi.org/10.1038/524265a},
abstract = {From the oceans to the soil, technology is changing the part that amateurs can play in research. But this greater involvement raises concerns that must be addressed.
[Excerpt] [...] Citizen science has come a long way from the first distributed-computing projects that hoovered up spare processing power on home computers to perform calculations or search for alien signals. And it has progressed further still since the earliest public surveys of wildlife: it was way back in 1900 that the Audubon Society persuaded Americans to exchange their Christmas tradition of shooting birds for a more productive effort to count them instead.
[] Some professional scientists are sniffy about the role of amateurs, but as an increasing number of academic papers makes clear, the results can be valuable and can help both to generate data and to inform policy. [...]
[] Technology can make scientists of us all. Data churned out by the rapid spread of consumer gadgets equipped with satellite navigation, cameras and a suite of other sensors, and the ease of sharing the results digitally, are driving the boom in citizen science. Volunteers can already identify whale songs from recordings, report litter and invasive species, and send in the skeletons of fish they have caught and consumed. But there is more to being a scientist, of course, than collecting and sharing data -- especially if the results are to be used to help determine policy.
[] Critics have raised concerns about data quality, and some studies do find that volunteers are less able to identify plant species than are academics and land managers. And there are issues around how to reward and recognize the contribution of volunteers, and around ensuring that data are shared or kept confidential as appropriate. But these problems seem relatively simple to address -- not least because they reflect points -- from authorship to data quality and access -- that the professional scientific community is already wrestling with.
[] More troubling, perhaps, is the potential for conflicts of interest. One reason that some citizen scientists volunteer is to advance their political objectives. [...] Scientists and funders are right to encourage the shift from passive citizen science -- number crunching -- to more-active roles, including sample collection. But as increased scrutiny falls on the reliability of the work of professional scientists, full transparency about the motives and ambitions of amateurs is essential.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-13706673,~to-add-doi-URL,citizen-science,citizen-sensor,data-collection-bias,data-uncertainty,integration-techniques,science-ethics,uncertainty},
number = {7565}
}
Uncertainty in the Environmental Modelling Process - A Framework and Guidance. Refsgaard, J. C., van der Sluijs, J. P., Højberg, A. L., & Vanrolleghem, P. A. 22(11):1543–1556. Paper doi abstract bibtex A terminology and typology of uncertainty is presented together with a framework for the modelling process, its interaction with the broader water management process and the role of uncertainty at different stages in the modelling processes. Brief reviews have been made of 14 different (partly complementary) methods commonly used in uncertainty assessment and characterisation: data uncertainty engine (DUE), error propagation equations, expert elicitation, extended peer review, inverse modelling (parameter estimation), inverse modelling (predictive uncertainty), Monte Carlo analysis, multiple model simulation, NUSAP, quality assurance, scenario analysis, sensitivity analysis, stakeholder involvement and uncertainty matrix. The applicability of these methods has been mapped according to purpose of application, stage of the modelling process and source and type of uncertainty addressed. It is concluded that uncertainty assessment is not just something to be added after the completion of the modelling work. Instead uncertainty should be seen as a red thread throughout the modelling study starting from the very beginning, where the identification and characterisation of all uncertainty sources should be performed jointly by the modeller, the water manager and the stakeholders.
@article{refsgaardUncertaintyEnvironmentalModelling2007,
title = {Uncertainty in the Environmental Modelling Process - {{A}} Framework and Guidance},
author = {Refsgaard, Jens C. and van der Sluijs, Jeroen P. and Højberg, Anker L. and Vanrolleghem, Peter A.},
date = {2007-11},
journaltitle = {Environmental Modelling \& Software},
volume = {22},
pages = {1543--1556},
issn = {1364-8152},
doi = {10.1016/j.envsoft.2007.02.004},
url = {https://doi.org/10.1016/j.envsoft.2007.02.004},
abstract = {A terminology and typology of uncertainty is presented together with a framework for the modelling process, its interaction with the broader water management process and the role of uncertainty at different stages in the modelling processes. Brief reviews have been made of 14 different (partly complementary) methods commonly used in uncertainty assessment and characterisation: data uncertainty engine (DUE), error propagation equations, expert elicitation, extended peer review, inverse modelling (parameter estimation), inverse modelling (predictive uncertainty), Monte Carlo analysis, multiple model simulation, NUSAP, quality assurance, scenario analysis, sensitivity analysis, stakeholder involvement and uncertainty matrix. The applicability of these methods has been mapped according to purpose of application, stage of the modelling process and source and type of uncertainty addressed. It is concluded that uncertainty assessment is not just something to be added after the completion of the modelling work. Instead uncertainty should be seen as a red thread throughout the modelling study starting from the very beginning, where the identification and characterisation of all uncertainty sources should be performed jointly by the modeller, the water manager and the stakeholders.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-1507111,environmental-modelling,monte-carlo,multi-stakeholder-decision-making,participatory-modelling,scenario-analysis,uncertainty},
number = {11},
options = {useprefix=true}
}
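Among the methods reviewed, Monte Carlo analysis is the most straightforward to sketch: uncertain inputs are sampled repeatedly and propagated through the model to obtain a predictive distribution; the toy model and distributions below are placeholders chosen only for illustration.
# Minimal sketch of Monte Carlo uncertainty propagation, one of the methods
# reviewed by Refsgaard et al. (2007). The "model" and the parameter
# distributions are placeholders chosen only for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Uncertain inputs (hypothetical): annual rainfall [mm] and a runoff coefficient [-].
rainfall = rng.normal(loc=800.0, scale=120.0, size=n)
runoff_coefficient = rng.uniform(low=0.25, high=0.45, size=n)

# Placeholder model: annual runoff = coefficient * rainfall.
runoff = runoff_coefficient * rainfall

low, median, high = np.percentile(runoff, [5, 50, 95])
print(f"Runoff 5-95 % interval: {low:.0f}-{high:.0f} mm (median {median:.0f} mm)")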
Free and Open Source Software Underpinning the European Forest Data Centre. Rodriguez-Aseretto, D., Di Leo, M., de Rigo, D., Corti, P., McInerney, D., Camia, A., & San-Miguel-Ayanz, J. 15:12101+. Paper doi abstract bibtex Worldwide, governments are growingly focusing on free and open source software (FOSS) as a move toward transparency and the freedom to run, copy, study, change and improve the software. The European Commission (EC) is also supporting the development of FOSS [...]. In addition to the financial savings, FOSS contributes to scientific knowledge freedom in computational science (CS) and is increasingly rewarded in the science-policy interface within the emerging paradigm of open science. Since complex computational science applications may be affected by software uncertainty, FOSS may help to mitigate part of the impact of software errors by CS community-driven open review, correction and evolution of scientific code. The continental scale of EC science-based policy support implies wide networks of scientific collaboration. Thematic information systems also may benefit from this approach within reproducible integrated modelling. This is supported by the EC strategy on FOSS: "for the development of new information systems, where deployment is foreseen by parties outside of the EC infrastructure, FOSS will be the preferred choice and in any case used whenever possible". The aim of this contribution is to highlight how a continental scale information system may exploit and integrate FOSS technologies within the transdisciplinary research underpinning such a complex system. A European example is discussed where FOSS innervates both the structure of the information system itself and the inherent transdisciplinary research for modelling the data and information which constitute the system content. [...]
@article{rodriguez-aserettoFreeOpenSource2013,
title = {Free and {{Open Source Software}} Underpinning the {{European Forest Data Centre}}},
author = {Rodriguez-Aseretto, Dario and Di Leo, Margherita and de Rigo, Daniele and Corti, Paolo and McInerney, Daniel and Camia, Andrea and San-Miguel-Ayanz, Jesús},
date = {2013},
journaltitle = {Geophysical Research Abstracts},
volume = {15},
pages = {12101+},
issn = {1607-7962},
doi = {10.6084/m9.figshare.155700},
url = {https://doi.org/10.6084/m9.figshare.155700},
abstract = {Worldwide, governments are growingly focusing on free and open source software (FOSS) as a move toward transparency and the freedom to run, copy, study, change and improve the software. The European Commission (EC) is also supporting the development of FOSS [...]. In addition to the financial savings, FOSS contributes to scientific knowledge freedom in computational science (CS) and is increasingly rewarded in the science-policy interface within the emerging paradigm of open science. Since complex computational science applications may be affected by software uncertainty, FOSS may help to mitigate part of the impact of software errors by CS community-driven open review, correction and evolution of scientific code. The continental scale of EC science-based policy support implies wide networks of scientific collaboration. Thematic information systems also may benefit from this approach within reproducible integrated modelling. This is supported by the EC strategy on FOSS: "for the development of new information systems, where deployment is foreseen by parties outside of the EC infrastructure, FOSS will be the preferred choice and in any case used whenever possible". The aim of this contribution is to highlight how a continental scale information system may exploit and integrate FOSS technologies within the transdisciplinary research underpinning such a complex system. A European example is discussed where FOSS innervates both the structure of the information system itself and the inherent transdisciplinary research for modelling the data and information which constitute the system content. [...]},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-11988844,computational-science,data-transformation-modelling,environmental-modelling,europe,free-scientific-knowledge,free-scientific-software,free-software,gdal,geospatial,geospatial-semantic-array-programming,gis,gnu-octave,gnu-r,guidos-mspa,integrated-modelling,integrated-natural-resources-modelling-and-management,mastrave-modelling-library,modelling,modelling-uncertainty,numpy,open-science,pktools,python,robust-modelling,science-policy-interface,scipy,semantic-array-programming,semantics,semap,software-engineering,software-errors,software-uncertainty,system-engineering,uncertainty},
options = {useprefix=true},
series = {Geophysical {{Research Abstracts}}}
}
Rising Policy Conflicts in Europe over Bioenergy and Forestry. Söderberg, C. & Eckerberg, K. 33:112–119. Paper doi abstract bibtex [Highlights] [::] EU Bioenergy policy cuts across forest, agriculture, energy and transport sectors. [::] Increased pressure on forest biomass risks putting EU in a wood-deficit situation. [::] Bioenergy conflicts regard land use, biodiversity, climate and sustainability. [::] Conflicts on environmental consequences from bioenergy policy are reconcilable. [::] Conflicts on globally shared rights and responsibilities are not easily reconciled. [Abstract] Growing concerns over emissions of green-house gases causing climate change as well as energy security concerns have spurred the interest in bioenergy production pushed by EU targets to fulfil the goal of 20 per cent renewable energy in 2020, as well as the goal of 10 per cent renewable fuels in transport by 2020. Increased bioenergy production is also seen to have political and economic benefits for rural areas and farming regions in Europe and in the developing world. There are, however, conflicting views on the potential benefits of large scale bioenergy production, and recent debates have also drawn attention to a range of environmental and socio-economic issues that may arise in this respect. One of these challenges will be that of accommodating forest uses - including wood for energy, and resulting intensification of forest management - with biodiversity protection in order to meet EU policy goals. We note that the use of biomass and biofuels spans over several economic sector policy areas, which calls for assessing and integrating environmental concerns across forest, agriculture, energy and transport sectors. In this paper, we employ frame analysis to identify the arguments for promoting bioenergy and assess the potential policy conflicts in the relevant sectors, through the analytical lens of environmental policy integration. We conclude that while there is considerable leverage of environmental arguments in favour of bioenergy in the studied economic sectors, and potential synergies with other policy goals, environmental interest groups remain sceptical to just how bioenergy is currently being promoted. There is a highly polarised debate particularly relating to biofuel production. Based on our analysis, we discuss the potential for how those issues could be reconciled drawing on the frame conflict theory, distinguishing between policy disagreements and policy controversies.
@article{soderbergRisingPolicyConflicts2013,
title = {Rising Policy Conflicts in {{Europe}} over Bioenergy and Forestry},
author = {Söderberg, Charlotta and Eckerberg, Katarina},
date = {2013-08},
journaltitle = {Forest Policy and Economics},
volume = {33},
pages = {112--119},
issn = {1389-9341},
doi = {10.1016/j.forpol.2012.09.015},
url = {https://doi.org/10.1016/j.forpol.2012.09.015},
abstract = {[Highlights]
[::] EU Bioenergy policy cuts across forest, agriculture, energy and transport sectors. [::] Increased pressure on forest biomass risks putting EU in a wood-deficit situation. [::] Bioenergy conflicts regard land use, biodiversity, climate and sustainability. [::] Conflicts on environmental consequences from bioenergy policy are reconcilable. [::] Conflicts on globally shared rights and responsibilities are not easily reconciled.
[Abstract] Growing concerns over emissions of green-house gases causing climate change as well as energy security concerns have spurred the interest in bioenergy production pushed by EU targets to fulfil the goal of 20~per cent renewable energy in 2020, as well as the goal of 10~per cent renewable fuels in transport by 2020. Increased bioenergy production is also seen to have political and economic benefits for rural areas and farming regions in Europe and in the developing world. There are, however, conflicting views on the potential benefits of large scale bioenergy production, and recent debates have also drawn attention to a range of environmental and socio-economic issues that may arise in this respect. One of these challenges will be that of accommodating forest uses - including wood for energy, and resulting intensification of forest management - with biodiversity protection in order to meet EU policy goals. We note that the use of biomass and biofuels spans over several economic sector policy areas, which calls for assessing and integrating environmental concerns across forest, agriculture, energy and transport sectors. In this paper, we employ frame analysis to identify the arguments for promoting bioenergy and assess the potential policy conflicts in the relevant sectors, through the analytical lens of environmental policy integration. We conclude that while there is considerable leverage of environmental arguments in favour of bioenergy in the studied economic sectors, and potential synergies with other policy goals, environmental interest groups remain sceptical to just how bioenergy is currently being promoted. There is a highly polarised debate particularly relating to biofuel production. Based on our analysis, we discuss the potential for how those issues could be reconciled drawing on the frame conflict theory, distinguishing between policy disagreements and policy controversies.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-11738844,~to-add-doi-URL,bioenergy,biomass,europe,forest-resources,ghg,science-policy-interface,uncertainty}
}
Probabilistic Population Projections with Migration Uncertainty. Azose, J. J., Ševčíková, H., & Raftery, A. E. 113(23):6460–6465. Paper doi abstract bibtex [Significance] Projected populations to the end of this century are an important factor in many policy decisions. Population forecasts become less reliable as we look farther into the future, suggesting a probabilistic approach to convey uncertainty. Migration projections have been largely deterministic until now, even in probabilistic population projections. Deterministic migration projections neglect a substantial source of population uncertainty. We incorporate a probabilistic migration model with probabilistic models of fertility and mortality to produce probabilistic population projections for all countries until 2100. The result is a substantial increase in uncertainty about the populations of Europe and Northern America, with very little change to uncertainty about the population of Africa, Asia, and the world as a whole. [Abstract] We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations' Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated.
@article{azoseProbabilisticPopulationProjections2016,
title = {Probabilistic Population Projections with Migration Uncertainty},
author = {Azose, Jonathan J. and Ševčíková, Hana and Raftery, Adrian E.},
date = {2016-06},
journaltitle = {Proceedings of the National Academy of Sciences},
volume = {113},
pages = {6460--6465},
issn = {1091-6490},
doi = {10.1073/pnas.1606119113},
url = {http://mfkp.org/INRMM/article/14062239},
abstract = {[Significance]
Projected populations to the end of this century are an important factor in many policy decisions. Population forecasts become less reliable as we look farther into the future, suggesting a probabilistic approach to convey uncertainty. Migration projections have been largely deterministic until now, even in probabilistic population projections. Deterministic migration projections neglect a substantial source of population uncertainty. We incorporate a probabilistic migration model with probabilistic models of fertility and mortality to produce probabilistic population projections for all countries until 2100. The result is a substantial increase in uncertainty about the populations of Europe and Northern America, with very little change to uncertainty about the population of Africa, Asia, and the world as a whole.
[Abstract]
We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations' Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70\,\% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-14062239,~to-add-doi-URL,migration-pattern,population-growth,statistics,uncertainty},
number = {23}
}
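The paper's main point, that treating migration stochastically widens population prediction intervals, can be illustrated with a toy trajectory simulation; all growth rates and variances below are synthetic, and the sketch is not the Bayesian hierarchical model used by the authors.
# Toy illustration of how adding migration uncertainty widens a population
# prediction interval. All rates and variances are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_traj, years, pop0 = 5000, 85, 100.0  # population in millions, to ~2100

def project(include_migration):
    pop = np.full(n_traj, pop0)
    for _ in range(years):
        natural_growth = rng.normal(0.002, 0.003, n_traj)            # fertility + mortality
        migration = rng.normal(0.0, 0.004, n_traj) if include_migration else 0.0
        pop *= 1.0 + natural_growth + migration
    return np.percentile(pop, [10, 90])

for label, flag in [("deterministic migration", False), ("stochastic migration", True)]:
    lo, hi = project(flag)
    print(f"{label}: 80 % interval {lo:.0f}-{hi:.0f} million (width {hi - lo:.0f})")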
Beyond Competition: Incorporating Positive Interactions between Species to Predict Ecosystem Invasibility. Bulleri, F., Bruno, J. F., & Benedetti-Cecchi, L. 6(6):e162+. Paper doi abstract bibtex Incorporating positive species interactions into models relating native species richness to community invasibility will increase our ability to forecast, prevent, and manage future invasions.
@article{bulleriCompetitionIncorporatingPositive2008,
title = {Beyond Competition: Incorporating Positive Interactions between Species to Predict Ecosystem Invasibility},
author = {Bulleri, Fabio and Bruno, John F. and Benedetti-Cecchi, Lisandro},
date = {2008-06},
journaltitle = {PLoS Biol},
volume = {6},
pages = {e162+},
doi = {10.1371/journal.pbio.0060162},
url = {https://doi.org/10.1371/journal.pbio.0060162},
abstract = {Incorporating positive species interactions into models relating native species richness to community invasibility will increase our ability to forecast, prevent, and manage future invasions.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-2931459,competition,ecosystem-invasibility,ecosystem-resilience,invasive-species,non-linearity,species-positive-interaction,uncertainty},
number = {6}
}
Trick of the Light. Nature 506(7486):6. Paper doi abstract bibtex The Amazon doesn't absorb extra carbon in the dry season after all. It can become a carbon source.
@article{natureTrickLight2014,
title = {Trick of the Light},
author = {{Nature}},
date = {2014-02},
journaltitle = {Nature},
volume = {506},
pages = {6},
issn = {0028-0836},
doi = {10.1038/506006b},
url = {https://doi.org/10.1038/506006b},
abstract = {The Amazon doesn't absorb extra carbon in the dry season after all. It can become a carbon source.},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-12972834,amazonia,carbon-cycle,forest-resources,modelling,modelling-uncertainty,organic-carbon,precipitation,remote-sensing,solar-radiation,uncertainty},
number = {7486}
}
Reproducibility: A Tragedy of Errors. Allison, D. B., Brown, A. W., George, B. J., & Kaiser, K. A. 530(7588):27–29. Paper doi abstract bibtex Mistakes in peer-reviewed papers are easy to find but hard to fix, report David B. Allison and colleagues. [Excerpt: Three common errors] As the influential twentieth-century statistician Ronald Fisher (pictured) said: "To consult the statistician after an experiment is finished is often merely to ask him to conduct a post mortem examination. He can perhaps say what the experiment died of." [] [...] Frequent errors, once recognized, can be kept out of the literature with targeted education and policies. Three of the most common are outlined below. These and others are described in depth in an upcoming publication7. [::1. Mistaken design or analysis of cluster-randomized trials] In these studies, all participants in a cluster (for example, a cage, school or hospital) are given the same treatment. The number of clusters (not just the number of individuals) must be incorporated into the analysis. Otherwise, results often seem, falsely, to be statistically significant8, 9. Increasing the number of individuals within clusters can increase power, but the gains are minute compared with increasing clusters. Designs with only one cluster per treatment are not valid as randomized experiments, regardless of how many individuals are included. [::2. Miscalculation in meta-analyses] Effect sizes are often miscalculated when meta-analysts are confronted with incomplete information and do not adapt appropriately. Another problem is confusion about how to calculate the variance of effects. Different study designs and meta-analyses require different approaches. Incorrect or inconsistent choices can change effect sizes, study weighting or the overall conclusions4. [::3. Inappropriate baseline comparisons] In at least six articles, authors tested for changes from the baseline in separate groups; if one was significant and one not, the authors (wrongly) proposed a difference between groups. Rather than comparing 'differences in nominal significance' (the DINS error) differences between groups must be compared directly. For studies comparing two equal-sized groups, the DINS error can inflate the false-positive rate from 5% to as much as 50% (ref. 10). [] [...]
@article{allisonReproducibilityTragedyErrors2016,
title = {Reproducibility: A Tragedy of Errors},
author = {Allison, David B. and Brown, Andrew W. and George, Brandon J. and Kaiser, Kathryn A.},
date = {2016-02},
journaltitle = {Nature},
volume = {530},
pages = {27--29},
issn = {0028-0836},
doi = {10.1038/530027a},
url = {https://doi.org/10.1038/530027a},
abstract = {Mistakes in peer-reviewed papers are easy to find but hard to fix, report David B. Allison and colleagues.
[Excerpt: Three common errors]
As the influential twentieth-century statistician Ronald Fisher (pictured) said: "To consult the statistician after an experiment is finished is often merely to ask him to conduct a post mortem examination. He can perhaps say what the experiment died of."
[] [...]
Frequent errors, once recognized, can be kept out of the literature with targeted education and policies. Three of the most common are outlined below. These and others are described in depth in an upcoming publication7.
[::1. Mistaken design or analysis of cluster-randomized trials] In these studies, all participants in a cluster (for example, a cage, school or hospital) are given the same treatment. The number of clusters (not just the number of individuals) must be incorporated into the analysis. Otherwise, results often seem, falsely, to be statistically significant8, 9. Increasing the number of individuals within clusters can increase power, but the gains are minute compared with increasing clusters. Designs with only one cluster per treatment are not valid as randomized experiments, regardless of how many individuals are included.
[::2. Miscalculation in meta-analyses] Effect sizes are often miscalculated when meta-analysts are confronted with incomplete information and do not adapt appropriately. Another problem is confusion about how to calculate the variance of effects. Different study designs and meta-analyses require different approaches. Incorrect or inconsistent choices can change effect sizes, study weighting or the overall conclusions4.
[::3. Inappropriate baseline comparisons] In at least six articles, authors tested for changes from the baseline in separate groups; if one was significant and one not, the authors (wrongly) proposed a difference between groups. Rather than comparing 'differences in nominal significance' (the DINS error) differences between groups must be compared directly. For studies comparing two equal-sized groups, the DINS error can inflate the false-positive rate from 5\,\% to as much as 50\,\% (ref. 10).
[] [...]},
keywords = {*imported-from-citeulike-INRMM,~INRMM-MiD:c-13924997,~to-add-doi-URL,bias-correction,cognitive-biases,data-collection-bias,peer-review,post-publication-peer-review,publication-errors,reproducible-research,research-management,science-ethics,statistics,uncertainty,uncertainty-propagation},
number = {7588}
}
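The DINS error described in the excerpt can be reproduced by simulation: when both groups share a genuine change from baseline but there is no between-group difference, testing each group separately and declaring a difference whenever only one test is significant inflates the false-positive rate far above the nominal 5%, while a direct between-group test does not; the sample size, correlation and shared time effect below are illustrative choices, not values from the paper.
# Simulation sketch of the DINS error: two groups, a shared (non-treatment)
# change from baseline, and no true between-group difference. Parameters are
# illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_sims, n_per_group, rho, time_effect = 2000, 20, 0.5, 0.5

def change_scores():
    """Paired change from baseline with a shared (non-treatment) time effect."""
    baseline = rng.normal(0.0, 1.0, n_per_group)
    followup = time_effect + rho * baseline + np.sqrt(1 - rho**2) * rng.normal(0.0, 1.0, n_per_group)
    return followup - baseline

dins = direct = 0
for _ in range(n_sims):
    change_a, change_b = change_scores(), change_scores()
    p_a = stats.ttest_1samp(change_a, 0.0).pvalue
    p_b = stats.ttest_1samp(change_b, 0.0).pvalue
    dins += (p_a < 0.05) != (p_b < 0.05)                      # DINS: one significant, one not
    direct += stats.ttest_ind(change_a, change_b).pvalue < 0.05  # correct direct comparison

print(f"DINS 'difference' rate: {dins / n_sims:.1%}")    # far above 5 %
print(f"Direct-test error rate: {direct / n_sims:.1%}")  # close to the nominal 5 %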