Sample records for probabilistic eruption forecasting

  1. Probabilistic eruption forecasting at short and long time scales

    NASA Astrophysics Data System (ADS)

    Marzocchi, Warner; Bebbington, Mark S.

    2012-10-01

    Any effective volcanic risk mitigation strategy requires a scientific assessment of the future evolution of a volcanic system and its eruptive behavior. Some consider that the onus is on volcanologists to provide simple but emphatic deterministic forecasts. This traditional way of thinking, however, does not deal with the implications of inherent uncertainties, both aleatoric and epistemic, that are inevitably present in observations, monitoring data, and interpretation of any natural system. In contrast to deterministic predictions, probabilistic eruption forecasting attempts to quantify these inherent uncertainties, utilizing all available information to the extent that it can be relied upon and is informative. As with many other natural hazards, probabilistic eruption forecasting is becoming established as the primary scientific basis for planning rational risk mitigation actions: at short time scales (hours to weeks or months), it allows decision-makers to prioritize actions in a crisis; at long time scales (years to decades), it is the basic component of land-use and emergency planning. Probabilistic eruption forecasting consists of estimating the probability of an eruption event within a complex, multidimensional time-space-magnitude framework. In this review, we discuss the key developments and features of models that have been used to address the problem.

  2. Monitoring and Modeling: The Future of Volcanic Eruption Forecasting

    NASA Astrophysics Data System (ADS)

    Poland, M. P.; Pritchard, M. E.; Anderson, K. R.; Furtney, M.; Carn, S. A.

    2016-12-01

    Eruption forecasting typically uses monitoring data from geology, gas geochemistry, geodesy, and seismology to assess the likelihood of future eruptive activity. Occasionally, months to years of warning are possible from specific indicators (e.g., deep LP earthquakes, elevated CO2 emissions, and aseismic deformation) or a buildup in one or more monitoring parameters. More often, observable changes in unrest occur immediately before eruption, as magma is rising toward the surface. In some cases, little or no detectable unrest precedes eruptive activity. Eruption forecasts are usually based on the experience of volcanologists studying the activity, but two developing fields offer a potential leap beyond this practice. First, remote sensing data, which can track thermal, gas, and ash emissions, as well as surface deformation, are increasingly available, allowing statistically significant research into the characteristics of unrest. For example, analysis of hundreds of volcanoes indicates that deformation is a more common pre-eruptive phenomenon than thermal anomalies, and that most episodes of satellite-detected unrest are not immediately followed by eruption. Such robust datasets inform the second development: probabilistic models of eruption potential, especially those that are based on physical-chemical models of the dynamics of magma accumulation and ascent. Both developments are essential for refining forecasts and reducing false positives. For example, many caldera systems have not erupted but are characterized by unrest that, in another context, would elicit strong concern from volcanologists. More observations of this behavior and better understanding of the underlying physics of unrest will improve forecasts of such activity. While still many years from implementation as a forecasting tool, probabilistic physical-chemical models incorporating satellite data offer a complement to expert assessments that, together, can form a powerful forecasting approach.

  3. Probabilistic short-term forecasting of eruption rate at Kīlauea Volcano using a physics-based model

    NASA Astrophysics Data System (ADS)

    Anderson, K. R.

    2016-12-01

    Deterministic models of volcanic eruptions yield predictions of future activity conditioned on uncertainty in the current state of the system. Physics-based eruption models are well-suited for deterministic forecasting as they can relate magma physics with a wide range of observations. Yet, physics-based eruption forecasting is strongly limited by an inadequate understanding of volcanic systems, and the need for eruption models to be computationally tractable. At Kīlauea Volcano, Hawaii, episodic depressurization-pressurization cycles of the magma system generate correlated, quasi-exponential variations in ground deformation and surface height of the active summit lava lake. Deflations are associated with reductions in eruption rate, or even brief eruptive pauses, and thus partly control lava flow advance rates and associated hazard. Because of the relatively well-understood nature of Kīlauea's shallow magma plumbing system, and because more than 600 of these events have been recorded to date, they offer a unique opportunity to refine a physics-based effusive eruption forecasting approach and apply it to lava eruption rates over short (hours to days) time periods. A simple physical model of the volcano ascribes observed data to temporary reductions in magma supply to an elastic reservoir filled with compressible magma. This model can be used to predict the evolution of an ongoing event, but because the mechanism that triggers events is unknown, event durations are modeled stochastically from previous observations. A Bayesian approach incorporates diverse data sets and prior information to simultaneously estimate uncertain model parameters and future states of the system. Forecasts take the form of probability distributions for eruption rate or cumulative erupted volume at some future time. Results demonstrate the significant uncertainties that still remain even for short-term eruption forecasting at a well-monitored volcano - but also the value of a physics-based, mixed deterministic-probabilistic eruption forecasting approach in reducing and quantifying these uncertainties.
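
    For illustration only, the mixed deterministic-probabilistic idea described in this abstract can be sketched as a small Monte Carlo experiment: a deterministic (here, exponential-recovery) eruption-rate history whose event duration is drawn from an empirical sample of past durations, yielding a probability distribution for cumulative erupted volume at a forecast horizon. This is not the author's reservoir model; all rates, time scales, and the `past_durations_hr` sample below are invented placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical empirical sample of past deflation-event durations (hours)
    past_durations_hr = np.array([12., 18., 22., 30., 36., 45., 60., 72.])

    q_normal = 4.0     # m^3/s, assumed background eruption rate
    q_min    = 1.0     # m^3/s, assumed reduced rate during a deflation event
    tau      = 10.0    # hours, assumed recovery time scale after the event ends

    def eruption_rate(t_hr, duration_hr):
        """Toy eruption-rate history: suppressed rate during the event,
        exponential recovery back to the background rate afterwards."""
        if t_hr < duration_hr:
            return q_min
        return q_normal - (q_normal - q_min) * np.exp(-(t_hr - duration_hr) / tau)

    def cumulative_volume(horizon_hr, duration_hr, dt_hr=0.1):
        """Integrate the toy rate history up to the forecast horizon (m^3)."""
        times = np.arange(0.0, horizon_hr, dt_hr)
        rates = np.array([eruption_rate(t, duration_hr) for t in times])
        return np.trapz(rates, dx=dt_hr * 3600.0)  # hours -> seconds

    # Monte Carlo forecast: the event duration is stochastic, the recovery is deterministic
    horizon_hr = 48.0
    samples = [cumulative_volume(horizon_hr, d)
               for d in rng.choice(past_durations_hr, size=2000, replace=True)]
    p05, p50, p95 = np.percentile(samples, [5, 50, 95])
    print(f"Cumulative volume at {horizon_hr:.0f} h: "
          f"median {p50:.2e} m^3 (90% interval {p05:.2e}-{p95:.2e} m^3)")
    ```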

  4. The Eruption Forecasting Information System (EFIS) database project

    NASA Astrophysics Data System (ADS)

    Ogburn, Sarah; Harpel, Chris; Pesicek, Jeremy; Wellik, Jay; Pallister, John; Wright, Heather

    2016-04-01

    The Eruption Forecasting Information System (EFIS) project is a new initiative of the U.S. Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) with the goal of enhancing VDAP's ability to forecast the outcome of volcanic unrest. The EFIS project seeks to: (1) move away from probability estimation that relies on collective memory toward estimation based on databases; (2) create databases useful for pattern recognition and for answering common VDAP questions, e.g., how commonly does unrest lead to eruption? How commonly do phreatic eruptions portend magmatic eruptions, and what is the range of antecedence times?; (3) create generic probabilistic event trees using global data for different volcano 'types'; (4) create background, volcano-specific probabilistic event trees for frequently active or particularly hazardous volcanoes in advance of a crisis; and (5) quantify and communicate uncertainty in probabilities. A major component of the project is the global EFIS relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest, including the timing of phreatic eruptions, column heights, eruptive products, etc., and will be initially populated using chronicles of eruptive activity from Alaskan volcanic eruptions in the GeoDIVA database (Cameron et al. 2013). This database module allows us to query across other global databases such as the WOVOdat database of monitoring data and the Smithsonian Institution's Global Volcanism Program (GVP) database of eruptive histories and volcano information. The EFIS database is in the early stages of development and population; thus, this contribution also serves as a request for feedback from the community.

  5. A New Statistical Model for Eruption Forecasting at Open Conduit Volcanoes: an Application to Mt Etna and Kilauea Volcanoes

    NASA Astrophysics Data System (ADS)

    Passarelli, Luigi; Sanso, Bruno; Sandri, Laura; Marzocchi, Warner

    2010-05-01

    One of the main goals in volcanology is to forecast volcanic eruptions. A meaningful forecast must be made before the onset of a volcanic eruption, using the data available at that time, with the aim of mitigating the risk associated with the volcanic event. In other words, models implemented for forecasting purposes have to be able to provide "forward" forecasts and should avoid a merely "retrospective" fitting of the available data. In this perspective, the main idea of the present model is to forecast the next volcanic eruption after the end of the last one, using only the data available at that time. We focus our attention on volcanoes with an open conduit regime and high eruption frequency. We assume a generalization of the classical time-predictable model to describe the eruptive behavior of open conduit volcanoes, and we use a Bayesian hierarchical model to make probabilistic forecasts. We apply the model to Kilauea volcano eruptive data and Mt. Etna volcano flank eruption data. The aims of this model are: 1) to test whether or not the Kilauea and Mt Etna volcanoes follow a time-predictable behavior; 2) to discuss the volcanological implications of the inferred time-predictable model parameters; 3) to compare the forecast capabilities of this model with other models present in the literature. The results obtained using the MCMC sampling algorithm show that both volcanoes follow a time-predictable behavior. The numerical values of the inferred time-predictable model parameters suggest that the amount of erupted volume could change the dynamics of the magma chamber refilling process during the repose period. The probability gain of this model compared with other models already present in the literature is appreciably greater than zero. This means that our model produces better forecasts than previous models and could be used in a probabilistic volcanic hazard assessment scheme. In this perspective, the probabilities of eruption given by our model for Mt Etna flank eruptions are published on an internet website and are updated after any change in the eruptive activity.
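
    A minimal numerical illustration of a generalized time-predictable relation (the repose after an eruption scaling with the volume just erupted, T = a V^b) is sketched below; it deliberately replaces the authors' Bayesian hierarchical model and MCMC sampling with a simple log-log least-squares fit, and the volumes and reposes are invented.

    ```python
    import numpy as np

    # Hypothetical eruption record: erupted volume (10^6 m^3) and following repose (days)
    volumes = np.array([ 5., 12., 30.,  8., 50., 20.])
    reposes = np.array([150., 400., 900., 230., 1500., 600.])

    # Generalized time-predictable model: T_i = a * V_i^b  (linear in log-log space)
    b, log_a = np.polyfit(np.log(volumes), np.log(reposes), deg=1)
    a = np.exp(log_a)
    print(f"fit: T = {a:.1f} * V^{b:.2f}")

    # Forecast the repose following the most recent eruption, with a crude lognormal
    # scatter taken from the fit residuals (a stand-in for a proper posterior).
    resid = np.log(reposes) - (log_a + b * np.log(volumes))
    sigma = resid.std(ddof=2)
    v_last = 25.0                                  # hypothetical volume of the last eruption
    rng = np.random.default_rng(0)
    t_next = a * v_last**b * np.exp(rng.normal(0.0, sigma, size=10000))
    print("median forecast repose: %.0f days, 90%% interval %.0f-%.0f days"
          % tuple(np.percentile(t_next, [50, 5, 95])))
    ```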

  6. Forecasting the duration of volcanic eruptions: an empirical probabilistic model

    NASA Astrophysics Data System (ADS)

    Gunn, L. S.; Blake, S.; Jones, M. C.; Rymer, H.

    2014-01-01

    The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and ongoing eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of existing literature along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010. Data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption duration between the years 1600 and 1669 is found to be statistically different from that following it, and the forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model result, especially where short durations are involved. By assigning the terms 'likely' and 'unlikely' to probabilities of 66 % or more and 33 % or less, respectively, the forecasting model based on the 1600-2010 dataset indicates that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 86 days (± 29 days). This approach can easily be adapted for use on other highly active, well-documented volcanoes or for different duration data such as the duration of explosive episodes or the duration of repose periods between eruptions.
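
    The three survivor-function forecasts (a)-(c) described above can be illustrated with the log-logistic (Fisk) distribution available in SciPy; the shape and scale values below are placeholders rather than the paper's fitted Mt. Etna parameters, which would be obtained by maximum likelihood (e.g., `fisk.fit(durations, floc=0)`).

    ```python
    from scipy.stats import fisk

    # Placeholder log-logistic parameters (shape c, scale); not the fitted Etna values.
    c, scale = 1.2, 25.0
    dist = fisk(c, loc=0.0, scale=scale)

    # (a) probability that an eruption exceeds a given duration
    d = 20.0                      # days
    p_exceed = dist.sf(d)

    # (b) probability of exceeding a total duration, given it has already lasted d0 days
    d0, d_total = 10.0, 60.0
    p_conditional = dist.sf(d_total) / dist.sf(d0)

    # (c) duration with a given probability of being exceeded (inverse survivor function)
    d_33 = dist.isf(0.33)         # duration exceeded with probability 0.33

    print(f"P(duration > {d:.0f} d)               = {p_exceed:.2f}")
    print(f"P(duration > {d_total:.0f} d | > {d0:.0f} d)   = {p_conditional:.2f}")
    print(f"duration exceeded with p = 0.33     = {d_33:.0f} d")
    ```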

  7. From multi-disciplinary monitoring observation to probabilistic eruption forecasting: a Bayesian view

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.

    2011-12-01

    Eruption forecasting is the assessment of the probability of eruption in a specific time-space-magnitude window. The use of probabilities to track the evolution of a phase of unrest is unavoidable for two main reasons: first, eruptions are intrinsically unpredictable in a deterministic sense, and, second, probabilities represent a quantitative tool that can be rationally used by decision-makers (as is usually done in many other fields). The primary information for the probability assessment during a phase of unrest comes from monitoring data of different quantities, such as seismic activity, ground deformation, geochemical signatures, and so on. Nevertheless, probabilistic forecasting based on monitoring data presents two main difficulties. First, many high-risk volcanoes do not have pre-eruptive and unrest monitoring databases, making a probabilistic assessment based on the frequency of past observations impossible. The ongoing WOVOdat project (led by Christopher Newhall) is trying to tackle this limitation by creating a sort of worldwide epidemiological database that may cope with the lack of pre-eruptive and unrest monitoring databases for a specific volcano using observations of 'analog' volcanoes. Second, the quantity and quality of monitoring data are rapidly increasing at many volcanoes, creating strongly inhomogeneous datasets. In these cases, classical statistical analysis can be performed on high-quality monitoring observations only for (usually too) short periods of time, or alternatively using only a few specific monitoring data streams that are available for longer times (such as the number of earthquakes), therefore neglecting much of the information carried by the most recent kinds of monitoring. Here, we explore a possible strategy to cope with these limitations. In particular, we present a Bayesian strategy that merges different kinds of information. In this approach, all relevant monitoring observations are embedded into a probabilistic scheme through expert opinion, conceptual models, and, possibly, real past data. After discussing the scientific and philosophical aspects of such an approach, we present some applications for Campi Flegrei and Vesuvius.

  8. Solar Eruptive Flares: from Physical Understanding to Probabilistic Forecasting

    NASA Astrophysics Data System (ADS)

    Georgoulis, M. K.

    2013-12-01

    We describe a new, emerging physical picture of the triggering of major solar eruptions. First, we discuss and aim to interpret the single distinguishing feature of tight, shear-ridden magnetic polarity inversion lines (PILs) in solar active regions, where most of these eruptions occur. Then we analyze the repercussions of this feature, that acts to form increasingly helical pre-eruption structures. Eruptions, with the CME progenitor preceding the flare, tend to release parts of the accumulated magnetic free energy and helicity that are always much smaller than the respective budgets of the source active region. These eruption-related decreases, however, are not optimal for eruption forecasting - this role is claimed by physically intuitive proxy parameters that could show increased pre-eruption sensitivity at time scales practical for prediction. Concluding, we show how reconciling this new information - jointly enabled by the exceptional resolution and quality of Hinode and cadence of SDO data - can lead to advances in understanding that outline the current state-of-the-art of our eruption-forecasting capability.

  9. The Eruption Forecasting Information System: Volcanic Eruption Forecasting Using Databases

    NASA Astrophysics Data System (ADS)

    Ogburn, S. E.; Harpel, C. J.; Pesicek, J. D.; Wellik, J.

    2016-12-01

    Forecasting eruptions, including the onset, size, duration, location, and impacts, is vital for hazard assessment and risk mitigation. The Eruption Forecasting Information System (EFIS) project is a new initiative of the US Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) and will advance VDAP's ability to forecast the outcome of volcanic unrest. The project supports probability estimation for eruption forecasting by creating databases useful for pattern recognition, identifying monitoring-data thresholds beyond which eruptive probabilities increase, and answering common forecasting questions. A major component of the project is a global relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest. This module allows us to query eruption chronologies, monitoring data, descriptive information, operational data, and eruptive phases alongside other global databases, such as WOVOdat and the Global Volcanism Program. The EFIS database is in the early stages of development and population; thus, this contribution is also a request for feedback from the community. Preliminary data are already benefitting several research areas. For example, VDAP provided a forecast of the likely remaining eruption duration for Sinabung volcano, Indonesia, using global data taken from similar volcanoes in the DomeHaz database module, in combination with local monitoring time-series data. In addition, EFIS seismologists used a beta-statistic test and empirically derived thresholds to identify distal volcano-tectonic earthquake anomalies preceding Alaska volcanic eruptions during 1990-2015, in order to retrospectively evaluate Alaska Volcano Observatory eruption precursors. This has identified important considerations for selecting analog volcanoes for global data analysis, such as differences between closed- and open-system volcanoes.

  10. Probing magma reservoirs to improve volcano forecasts

    USGS Publications Warehouse

    Lowenstern, Jacob B.; Sisson, Thomas W.; Hurwitz, Shaul

    2017-01-01

    When it comes to forecasting eruptions, volcano observatories rely mostly on real-time signals from earthquakes, ground deformation, and gas discharge, combined with probabilistic assessments based on past behavior [Sparks and Cashman, 2017]. There is comparatively less reliance on geophysical and petrological understanding of subsurface magma reservoirs.

  11. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before and during eruptions: BET_VHst for Mt. Etna

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Scollo, Simona; Costa, Antonio; Brancato, Alfonso; Prestifilippo, Michele

    2015-04-01

    Tephra dispersal, even in small amounts, may heavily affect public health and critical infrastructure, such as airports, train and road networks, and electric power supply systems. Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at managing and mitigating the risk posed by volcanic activity during crises and eruptions. Short-term PVHA (over time intervals on the order of hours to a few days) must account for rapidly changing information coming from the monitoring system, as well as updated wind forecasts, and it must be accomplished in near-real-time. In addition, while during unrest the primary goal is to forecast potential eruptions, during eruptions it is also fundamental to correctly account for the real-time status of the eruption and of tephra dispersal, as well as its potential evolution in the short term. Here, we present a preliminary application of the BET_VHst model (Selva et al. 2014) for Mt. Etna. The model has its roots in present-day deterministic procedures, and it deals with the large uncertainty that such procedures typically ignore, such as uncertainty in the potential vent position and eruptive size, in the possible evolution of volcanological inputs during ongoing eruptions, and in the wind field. Uncertainty is treated by making use of Bayesian inference, alternative modeling procedures for tephra dispersal, and statistical mixing of long- and short-term analyses. References: Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252

  12. A new Bayesian Event Tree tool to track and quantify volcanic unrest and its application to Kawah Ijen volcano

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Rouwet, Dmitri; Caudron, Corentin; Marzocchi, Warner; Suparjan

    2016-07-01

    Although most volcanic hazard studies focus on magmatic eruptions, hazardous volcanic events can also occur when no migration of magma can be recognized. Examples are tectonic and hydrothermal unrest that may lead to phreatic eruptions. Recent events (e.g., the Ontake eruption in September 2014) have demonstrated that phreatic eruptions are still hard to forecast, despite being potentially very hazardous. For these reasons, it is of paramount importance to identify indicators that define the condition of nonmagmatic unrest, in particular for hydrothermal systems. Often, this type of unrest is driven by the movement of fluids, requiring alternative monitoring setups beyond the classical seismic-geodetic-geochemical architectures. Here we present a new version of the probabilistic BET (Bayesian Event Tree) model, specifically developed to include the forecasting of nonmagmatic unrest and related hazards. The structure of the new event tree differs from the previous schemes by adding a specific branch to detail nonmagmatic unrest outcomes. A further goal of this work is to provide a user-friendly, open-access, and straightforward tool to handle the probabilistic forecast and visualize the results as possible support during a volcanic crisis. The new event tree and tool are applied to Kawah Ijen stratovolcano, Indonesia, as an illustrative application. In particular, the tool is set up on the basis of monitoring data for the learning period 2000-2010, and is then blindly applied to the test period 2010-2012, during which significant unrest phases occurred.

  13. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madankan, R.; Pouget, S.; Singla, P., E-mail: psingla@buffalo.edu

    Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes, for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However, initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system, such as the windfield, are stochastic. These uncertainties make forecasting plume motion difficult. As a result of these uncertainties, ash advisories based on a deterministic approach tend to be conservative, and many times over- or underestimate the extent of a plume. This paper presents an end-to-end framework for generating a probabilistic approach to ash plume forecasting. This framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply, to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition, to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14–16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence, are computed.
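
    The probabilistic end product described here, a spatial estimate of ash presence derived from an ensemble of runs over uncertain source parameters, can be illustrated with a plain Monte Carlo sketch. The real framework uses CUT quadrature points and a polynomial chaos surrogate of the puff model rather than the random sampling and Gaussian-plume stand-in used below, and every parameter range here is invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    nx, ny, n_members = 80, 60, 200
    x = np.linspace(0, 400e3, nx)           # downwind distance (m), toy grid
    y = np.linspace(-150e3, 150e3, ny)      # crosswind distance (m)
    X, Y = np.meshgrid(x, y, indexing="ij")

    def toy_ash_column_load(height_km, wind_ms):
        """Toy Gaussian-plume stand-in for an advection-dispersion run (arbitrary units)."""
        source = 1e-2 * height_km**4        # crude mass scaling with column height
        sigma_y = 20e3 + 0.1 * X            # crosswind spread grows downwind
        decay = np.exp(-X / (wind_ms * 3600.0 * 24 * 2))   # crude downwind depletion
        return source * decay * np.exp(-0.5 * (Y / sigma_y) ** 2) / sigma_y

    # Uncertain source parameters sampled from assumed ranges
    heights = rng.uniform(5.0, 12.0, n_members)    # plume height, km
    winds   = rng.uniform(5.0, 25.0, n_members)    # wind speed, m/s

    threshold = 1e-4                               # assumed "ash present" level
    exceed_count = np.zeros_like(X)
    for h, w in zip(heights, winds):
        exceed_count += toy_ash_column_load(h, w) > threshold

    prob_ash_present = exceed_count / n_members    # probabilistic ash-presence map
    print("max probability of ash presence:", prob_ash_present.max())
    ```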

  14. WOVOdat, A Worldwide Volcano Unrest Database, to Improve Eruption Forecasts

    NASA Astrophysics Data System (ADS)

    Widiwijayanti, C.; Costa, F.; Win, N. T. Z.; Tan, K.; Newhall, C. G.; Ratdomopurbo, A.

    2015-12-01

    WOVOdat is the World Organization of Volcano Observatories' Database of Volcanic Unrest. It is an international effort to develop common standards for compiling and storing data on volcanic unrest in a centralized database that is freely web-accessible for reference during volcanic crises, comparative studies, and basic research on pre-eruption processes. WOVOdat will be to volcanology as an epidemiological database is to medicine. Despite the large spectrum of monitoring techniques, the interpretation of monitoring data throughout the evolution of unrest, and the making of timely forecasts, remain the most challenging tasks for volcanologists. The field of eruption forecasting is becoming more quantitative, based on the understanding of pre-eruptive magmatic processes and the dynamic interaction between variables that are at play in a volcanic system. Such forecasts must also acknowledge and express the uncertainties; therefore, most current research in this field focuses on the application of event tree analysis to reflect multiple possible scenarios and the probability of each scenario. Such forecasts are critically dependent on comprehensive and authoritative global volcano unrest data sets - the very information currently collected in WOVOdat. As the database becomes more complete, Boolean searches, side-by-side digital (and thus scalable) comparisons of unrest, and pattern recognition will generate reliable results. Statistical distributions obtained from WOVOdat can then be used to estimate the probabilities of each scenario after specific patterns of unrest. We have established the main web interface for data submission and visualization, and have now incorporated ~20% of worldwide unrest data into the database, covering more than 100 eruptive episodes. In the upcoming years we will concentrate on acquiring data from volcano observatories, developing a robust data query interface, optimizing data mining, and creating tools by which WOVOdat can be used for probabilistic eruption forecasting. The more data in WOVOdat, the more useful it will be.

  15. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather, often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision-making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as being less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information, as any skew in perceived event likelihood towards the end of a forecast time window may result in an underestimate of the likelihood of an event occurring ‘today’, leading to potentially inappropriate action choices. We thus present some initial guidelines for communicating such eruption forecasts.

  16. WOVOdat as a worldwide resource to improve eruption forecasts

    NASA Astrophysics Data System (ADS)

    Widiwijayanti, Christina; Costa, Fidel; Zar Win Nang, Thin; Tan, Karine; Newhall, Chris; Ratdomopurbo, Antonius

    2015-04-01

    During periods of volcanic unrest, volcanologists need to interpret signs of unrest to be able to forecast whether an eruption is likely to occur. Some volcanic eruptions display signs of impending eruption such as seismic activity, surface deformation, or gas emissions; but not all will give signs, and not all signs are necessarily followed by an eruption. Volcanoes behave differently. Precursory signs of an eruption sometimes span a very short period, less than an hour, but can also extend over weeks, months, or even years. Some volcanoes are regularly active and closely monitored, while others are not. Often, the record of precursors to historical eruptions of a volcano is not enough to allow a forecast of its future activity. Therefore, volcanologists must refer to monitoring data of unrest and eruptions at similar volcanoes. WOVOdat is the World Organization of Volcano Observatories' Database of volcanic unrest - an international effort to develop common standards for compiling and storing data on volcanic unrest in a centralized database that is freely web-accessible for reference during volcanic crises, comparative studies, and basic research on pre-eruption processes. WOVOdat will be to volcanology as an epidemiological database is to medicine. We have up to now incorporated about 15% of worldwide unrest data into WOVOdat, covering more than 100 eruption episodes, which includes: volcanic background data, eruptive histories, monitoring data (seismic, deformation, gas, hydrology, thermal, fields, and meteorology), monitoring metadata, and supporting data such as reports, images, maps and videos. Nearly all data in WOVOdat are time-stamped and geo-referenced. Along with creating a database on volcanic unrest, WOVOdat is also developing web tools to help users query, visualize, and compare data, which can further be used for probabilistic eruption forecasting. Reference to WOVOdat will be especially helpful at volcanoes that have not erupted in historical or 'instrumental' time and thus for which no previous data exist. The more data in WOVOdat, the more useful it will be. We actively solicit relevant data contributions from volcano observatories, other institutions, and individual researchers. Detailed information and documentation about the database and how to use it can be found at www.wovodat.org.

  17. Dynamic Statistical Models for Pyroclastic Density Current Generation at Soufrière Hills Volcano

    NASA Astrophysics Data System (ADS)

    Wolpert, Robert L.; Spiller, Elaine T.; Calder, Eliza S.

    2018-05-01

    To mitigate volcanic hazards from pyroclastic density currents, volcanologists generate hazard maps that provide long-term forecasts of areas of potential impact. Several recent efforts in the field develop new statistical methods for application of flow models to generate fully probabilistic hazard maps that both account for, and quantify, uncertainty. However, a limitation to the use of most statistical hazard models, and a key source of uncertainty within them, is the time-averaged nature of the datasets by which the volcanic activity is statistically characterized. Where the level, or directionality, of volcanic activity frequently changes, e.g. during protracted eruptive episodes, or at volcanoes that are classified as persistently active, it is not appropriate to make short-term forecasts based on longer time-averaged metrics of the activity. Thus, here we build, fit and explore dynamic statistical models for the generation of pyroclastic density currents from Soufrière Hills Volcano (SHV) on Montserrat, including their respective collapse directions and flow volumes, based on 1996-2008 flow datasets. The development of this approach allows short-term behavioral changes to be taken into account in probabilistic volcanic hazard assessments. We show that collapses from the SHV lava dome follow a clear pattern, and that a series of smaller flows in a given direction often culminates in a larger collapse, after which the directionality of the flows changes. Such models enable short-term forecasting (weeks to months) that can reflect evolving conditions such as dome and crater morphology changes and non-stationary eruptive behavior such as extrusion rate variations. For example, the probability of inundation of the Belham Valley in the first 180 days of a forecast period is about twice as high for lava domes facing northwest toward that valley as it is for domes pointing east toward the Tar River Valley. As rich multi-parametric volcano monitoring datasets become increasingly available, eruption forecasting is becoming an increasingly viable and important research field. We demonstrate an approach to utilize such data in order to appropriately 'tune' probabilistic hazard assessments for pyroclastic flows. Our broader objective with the development of this method is to help advance time-dependent volcanic hazard assessment, by bridging the

  18. Quantifying probabilities of eruptions at Mount Etna (Sicily, Italy).

    NASA Astrophysics Data System (ADS)

    Brancato, Alfonso

    2010-05-01

    One of the major goals of modern volcanology is to set up sound risk-based decision-making in land-use planning and emergency management. Volcanic hazard must be managed with reliable estimates of quantitative long- and short-term eruption forecasting, but the large number of observables involved in a volcanic process suggests that a probabilistic approach could be a suitable tool for forecasting. The aim of this work is to quantify probabilistic estimates of the vent location for a suitable lava flow hazard assessment at Mt. Etna volcano, through the application of the code named BET (Marzocchi et al., 2004, 2008). The BET_EF model is based on the event tree philosophy assessed by Newhall and Hoblitt (2002), further developing the concepts of vent location, epistemic uncertainties, and a fuzzy approach for monitoring measurements. A Bayesian event tree is a specialized branching graphical representation of events in which individual branches are alternative steps from a general prior event, evolving into increasingly specific subsequent states. The event tree thus attempts to graphically display all relevant possible outcomes of volcanic unrest in progressively higher levels of detail. The procedure is set up to estimate an a priori probability distribution based upon theoretical knowledge, to accommodate it by using past data, and to modify it further by using current monitoring data. For long-term forecasting, an a priori model, dealing with the present tectonic and volcanic structure of Mt. Etna, is considered. The model is mainly based on past vent location and fracture location datasets (the 20th-century eruptive history of the volcano). Considering the variation of the information through time, and its relationship with the structural setting of the volcano, we are also able to define an a posteriori probability map for the next vent opening. For short-term vent-opening hazard assessment, monitoring has a leading role, primarily based on seismological and volcanological data, integrated with strain, geochemical, gravimetric and magnetic parameters. In the code, it is necessary to fix an appropriate forecasting time window. At open-conduit volcanoes such as Mt. Etna, a forecast time window of a month (as fixed in other applications worldwide) seems unduly long, because variations of the state of the volcano are expected on shorter time scales (hours, days or weeks); a significant variation of a specific monitoring parameter could occur on a time scale shorter than the forecasting time window. This leads us to set a week as the forecasting time window, consistent with the number of weeks during which unrest has been experienced. The short-term vent-opening hazard assessment will be estimated during an unrest phase; the testing case (the July 2001 eruption) includes all the monitoring parameters collected at Mt. Etna during the six months preceding the eruption. The monitoring role has been assessed by eliciting more than 50 parameters, including seismic activity, ground deformation, geochemistry, gravity and magnetism, distributed across the first three nodes of the procedure. Parameter values describe the activity of Mt. Etna and become progressively more detailed through the code, particularly in terms of time units. The methodology allows all assumptions and thresholds to be clearly identified and provides a rational means for their revision if new data or information become available. References Newhall C.G. and Hoblitt R.P.; 2002: Constructing event trees for volcanic crises, Bull.
Volcanol., 64, 3-20, doi: 10.1007/s004450100173. Marzocchi W., Sandri L., Gasparini P., Newhall C. and Boschi E.; 2004: Quantifying probabilities of volcanic events: The example of volcanic hazard at Mount Vesuvius, J. Geophys. Res., 109, B11201, doi: 10.1029/2004JB003155. Marzocchi W., Sandri L. and Selva J.; 2008: BET_EF: a probabilistic tool for long- and short-term eruption forecasting, Bull. Volcanol., 70, 623-632, doi: 10.1007/s00445-007-0157-y.

  19. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before eruptions: BET_VHst for Vesuvius and Campi Flegrei during recent exercises

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Rouwet, Dmtri; Tonini, Roberto; Macedonio, Giovanni; Marzocchi, Warner

    2015-04-01

    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions that are under consideration. Short temporal intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecasts, and to update them frequently to account for the rapidly evolving situation. This information is obviously crucial for crisis management, since tephra may heavily affect building stability, public health, transportation and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology named BET_VHst (Selva et al. 2014) for short-term PVHA of volcanic tephra dispersal, based on automatic interpretation of measurements from the monitoring system and on physical models of tephra dispersal from all possible vent positions and eruptive sizes, driven by frequently updated meteorological forecasts. The large uncertainty at all the steps required for the analysis, both aleatory and epistemic, is treated by means of Bayesian inference and statistical mixing of long- and short-term analyses. The BET_VHst model is here presented through its implementation during two exercises organized for volcanoes in the Neapolitan area: MESIMEX for Mt. Vesuvius, and VUELCO for Campi Flegrei. References: Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252

  20. Operational short-term Probabilistic Volcanic Hazard Assessment of tephra fallout: an example from the 1982-1984 unrest at Campi Flegrei

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Selva, Jacopo; Costa, Antonio; Macedonio, Giovanni; Marzocchi, Warner

    2014-05-01

    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions that are under consideration. Short intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecasts, and to update them frequently to account for the rapidly evolving situation. This information is obviously crucial for crisis management, since tephra may heavily affect building stability, public health, transportation and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology for short-term PVHA and its operational implementation, based on the BET_EF model, in which measurements from the monitoring system are used to routinely update the forecasts of some parameters related to the eruption dynamics, that is, the probability of eruption and the probabilities of every possible vent position and every possible eruption size. Then, considering all possible vent positions and eruptive sizes, tephra dispersal models are coupled with frequently updated meteorological forecasts. Finally, these results are merged through a Bayesian procedure, accounting for epistemic uncertainties at all the considered steps. As a case study, we retrospectively analyze some stages of the volcanic unrest that took place in Campi Flegrei (CF) in 1982-1984. In particular, we aim at presenting a practical example of possible operational tephra fall PVHA on a daily basis, in the surroundings of CF, at different stages of the 1982-84 unrest. Tephra dispersal is simulated using the analytical HAZMAP code. We consider three possible eruptive sizes (a low, a medium and a high eruption "scenario", respectively) and 700 possible vent positions within the CF Neapolitan Yellow Tuff caldera. The probabilities related to eruption dynamics, as estimated by BET_EF, are based on the setup of the code obtained specifically for CF during a 6-year-long elicitation project, and on the actual monitoring parameters measured during the unrest and published in the literature. We take advantage here of two novel improvements: (i) a time function to describe how the probability of eruption evolves within the time window defined for the forecast, and (ii) the production of hazard curves and their confidence levels, a tool that allows a complete description of PVHA and its uncertainties. The general goal of this study is to show what pieces of scientific knowledge can be operationally transferred to decision makers, and how, and specifically how this could have been translated into practice during the 1982-84 Campi Flegrei crisis, if scientists had known then what we know today about this volcano.

  1. Using multiple data sets to populate probabilistic volcanic event trees

    USGS Publications Warehouse

    Newhall, C.G.; Pallister, John S.

    2014-01-01

    The key parameters one needs to forecast outcomes of volcanic unrest are hidden kilometers beneath the Earth's surface, and volcanic systems are so complex that there will invariably be stochastic elements in the evolution of any unrest. Fortunately, there is sufficient regularity in behaviour that some, perhaps many, eruptions can be forecast with enough certainty for populations to be evacuated and kept safe. Volcanologists charged with forecasting eruptions must try to understand each volcanic system well enough that unrest can be interpreted in terms of pre-eruptive process, but must simultaneously recognize and convey uncertainties in their assessment. We have found that use of event trees helps to focus discussion, integrate data from multiple sources, reach consensus among scientists about both pre-eruptive process and uncertainties and, in some cases, to explain all of this to officials. Figure 1 shows a generic volcanic event tree from Newhall and Hoblitt (2002) that can be modified as needed for each specific volcano. This paper reviews how we and our colleagues have used such trees during a number of volcanic crises worldwide, for rapid hazard assessments in situations in which more formal expert elicitations could not be conducted. We describe how multiple data sets can be used to estimate probabilities at each node and branch. We also present case histories of probability estimation during crises, how the estimates were used by public officials, and some suggestions for future improvements.
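
    As a minimal illustration of how an event tree of the Newhall and Hoblitt (2002) type turns node-by-node assessments into outcome probabilities, the sketch below multiplies conditional probabilities along a branch; the node values are invented placeholders, and a real assessment would carry an uncertainty distribution at each node rather than a single number.

    ```python
    # Minimal event-tree sketch: the probability of an outcome is the product of
    # conditional probabilities along the path from the root. Values are placeholders.
    nodes = {
        "unrest":              0.9,   # P(unrest in the time window)
        "magmatic | unrest":   0.6,   # P(unrest is magmatic | unrest)
        "eruption | magmatic": 0.4,   # P(eruption | magmatic unrest)
        "VEI>=3 | eruption":   0.2,   # P(size at least VEI 3 | eruption)
    }

    def path_probability(*branch_names):
        p = 1.0
        for name in branch_names:
            p *= nodes[name]
        return p

    p_eruption = path_probability("unrest", "magmatic | unrest", "eruption | magmatic")
    p_large    = p_eruption * nodes["VEI>=3 | eruption"]
    print(f"P(eruption)     = {p_eruption:.3f}")
    print(f"P(VEI>=3 event) = {p_large:.3f}")
    ```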

  2. Scientists vs. Vesuvius: limits of volcanology

    NASA Astrophysics Data System (ADS)

    Carlino, Stefano; Somma, Renato

    2014-05-01

    Recently, Italian newspapers reported the statements of Japanese and American volcanologists who declared the high hazard related to the future occurrence of a catastrophic eruption at Vesuvius. Is this a reliable picture from a scientific point of view? The evaluation of volcanic hazard is based on a general statistical law according to which the chances of an eruptive event increase as its energy decreases. This law is constructed on the basis of empirical data. Thus, the possibility that a plinian-like eruption occurs is, for each volcano, rare, and is further reduced for the worst-case scenario. However, the empirical data are not supported by a robust scientific theory, experimentally verifiable through an exact long-term forecast of an eruption, both in its timing and in its energy. Today, the lack of paradigms able to predict such complex phenomena in a deterministic way limits the field: scientists cannot go beyond evaluations of a purely probabilistic nature. From this point of view, volcanology cannot be considered a hard quantitative science. The declaration that Vesuvius, sooner or later, will produce a catastrophic eruption, although apparently obvious if we consider the very high degree of urbanization, is not supported by any experimentally verifiable theory. Therefore, the statement that the next eruptive event of Vesuvius will be catastrophic is false; in probabilistic terms, it is actually the least likely scenario. Recognizing the cognitive limits in this research field means encouraging research itself towards the determination of more solid paradigms, in order to obtain more exact forecasts of such complex phenomena. The scientific compromise of defining risk scenarios, rather than making deterministic evaluations about future eruptive events, precisely reflects the limits of research that have to be accepted even by Civil Protection. Having considered these limits, every risk scenario, even the most conservative, will be ineffective in the absence of an adequate political program for the reduction of the exposed value of the area and of the systemic risk. In such a context, in the Vesuvius area, the recent enlargement of the red zone may not represent an effective method of defence from natural disasters.

  3. Natural disasters: forecasting economic and life losses

    USGS Publications Warehouse

    Nishenko, Stuart P.; Barton, Christopher C.

    1997-01-01

    Events such as hurricanes, earthquakes, floods, tsunamis, volcanic eruptions, and tornadoes are natural disasters because they negatively impact society, and so they must be measured and understood in human-related terms. At the U.S. Geological Survey, we have developed a new method to examine fatality and dollar-loss data, and to make probabilistic estimates of the frequency and magnitude of future events. This information is vital to large sectors of society including disaster relief agencies and insurance companies.

  4. Learning to recognize volcanic non-eruptions

    USGS Publications Warehouse

    Poland, Michael P.

    2010-01-01

    An important goal of volcanology is to answer the questions of when, where, and how a volcano will erupt—in other words, eruption prediction. Generally, eruption predictions are based on insights from monitoring data combined with the history of the volcano. An outstanding example is the A.D. 1980–1986 lava dome growth at Mount St. Helens, Washington (United States). Recognition of a consistent pattern of precursors revealed by geophysical, geological, and geochemical monitoring enabled successful predictions of more than 12 dome-building episodes (Swanson et al., 1983). At volcanic systems that are more complex or poorly understood, probabilistic forecasts can be useful (e.g., Newhall and Hoblitt, 2002; Marzocchi and Woo, 2009). In such cases, the probabilities of different types of volcanic events are quantified, using historical accounts and geological studies of a volcano's past activity, supplemented by information from similar volcanoes elsewhere, combined with contemporary monitoring information.

  5. A review of tephra transport and dispersal models: Evolution, current status, and future perspectives

    NASA Astrophysics Data System (ADS)

    Folch, A.

    2012-08-01

    Tephra transport models try to predict the atmospheric dispersion and sedimentation of tephra depending on meteorology, particle properties, and eruption characteristics, defined by eruption column height, mass eruption rate, and vertical distribution of mass. Models are used for different purposes, from operational forecasting of volcanic ash clouds to hazard assessment of tephra dispersion and fallout. The size of the erupted particles, a key parameter controlling the dynamics of particle sedimentation in the atmosphere, varies within a wide range. The largest, centimetric to millimetric, particles fall out at proximal to medial distances from the volcano and sediment by gravitational settling. At the other extreme, the smallest, micrometric to sub-micrometric, particles can be transported at continental or even global scales and are affected by other deposition and aggregation mechanisms. Different scientific communities have traditionally modeled the dispersion of these two end members. Volcanologists developed families of models suitable for lapilli and coarse ash, aimed at computing fallout deposits and at hazard assessment. In contrast, meteorologists and atmospheric scientists have traditionally used other atmospheric transport models, dealing with finer particles, for tracking the motion of volcanic ash clouds and, eventually, for computing airborne ash concentrations. During the last decade, the increasing demand for model accuracy and forecast reliability has pushed on two fronts. First, the original gap between these different families of models has been filled with the emergence of multi-scale and multi-purpose models. Second, new modeling strategies, including, for example, ensemble and probabilistic forecasting and model data assimilation, are being investigated for future operational implementation. This paper reviews the evolution of tephra transport and dispersal models during the last two decades, presents the status and limitations of the current modeling strategies, and discusses some emerging perspectives expected to be implemented at the operational level during the next few years. Improvements in both real-time forecasting and long-term hazard assessment are necessary for loss prevention programs on local, regional, national and international levels.

  6. Forecasting eruptions of Mauna Loa Volcano, Hawaii

    NASA Astrophysics Data System (ADS)

    Decker, Robert W.; Klein, Fred W.; Okamura, Arnold T.; Okubo, Paul G.

    Past eruption patterns and various kinds of precursors are the two basic ingredients of eruption forecasts. The 39 historical eruptions of Mauna Loa from 1832 to 1984 have intervals as short as 104 days and as long as 9,165 days between the beginning of an eruption and the beginning of the next one. These recurrence times roughly fit a Poisson distribution pattern with a mean recurrence time of 1,459 days, yielding a probability of 22% (P = 0.22) for an eruption of Mauna Loa during any coming year. The long recurrence times since 1950, however, suggest that the process is not purely random, and that the current probability of an eruption during the next year may be as low as 6%. Seismicity beneath Mauna Loa increased for about two years prior to the 1975 and 1984 eruptions. Inflation of the summit area took place between eruptions, with the highest rates occurring for a year or two before and after the 1975 and 1984 eruptions. Volcanic tremor beneath Mauna Loa began 51 minutes prior to the 1975 eruption and 115 minutes prior to the 1984 eruption. Eruption forecasts were published in 1975, 1976, and 1983. The 1975 and 1983 forecasts, though vaguely worded, were qualitatively correct regarding the timing of the next eruption. The 1976 forecast was more quantitative; it was wrong on timing but accurate in forecasting the location of the 1984 eruption. This paper urges that future forecasts be specific so they can be evaluated quantitatively.
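
    The 22% annual probability quoted above follows directly from a Poisson (exponential inter-event) model with the stated 1,459-day mean recurrence time, as the short check below shows; in the same model, the lower 6% figure corresponds to adopting a longer effective recurrence time for the post-1950 regime.

    ```python
    import math

    mean_recurrence_days = 1459.0   # mean repose from the 1832-1984 record
    window_days = 365.0

    # Poisson model: P(at least one eruption in the window) = 1 - exp(-window / mean)
    p_one_year = 1.0 - math.exp(-window_days / mean_recurrence_days)
    print(f"P(eruption within one year) = {p_one_year:.2f}")   # ~0.22, matching the abstract
    ```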

  7. Volcanic risk metrics at Mt Ruapehu, New Zealand: some background to a probabilistic eruption forecasting scheme and a cost/benefit analysis at an open conduit volcano

    NASA Astrophysics Data System (ADS)

    Jolly, Gill; Sandri, Laura; Lindsay, Jan; Scott, Brad; Sherburn, Steve; Jolly, Art; Fournier, Nico; Keys, Harry; Marzocchi, Warner

    2010-05-01

    The Bayesian Event Tree for Eruption Forecasting software (BET_EF) is a probabilistic model based on an event tree scheme that was created specifically to compute long- and short-term probabilities of different outcomes (volcanic unrest, magmatic unrest, eruption, vent location and eruption size) at long-time dormant and routinely monitored volcanoes. It is based on the assumption that upward movements of magma in a closed conduit volcano will produce detectable changes in the monitored parameters at the surface. In this perspective, the goal of BET_EF is to compute probabilities by merging information from geology, models, past data and present monitoring measurements, through a Bayesian inferential method. In the present study, we attempt to apply BET_EF to Mt Ruapehu, a very active and well-monitored volcano exhibiting the typical features of open conduit volcanoes. In such conditions, current monitoring at the surface is not necessarily able to detect short term changes at depth that may occur only seconds to minutes before an eruption. This results in so-called "blue sky eruptions" of Mt Ruapehu (for example in September 2007), that are volcanic eruptions apparently not preceded by any presently detectable signal in the current monitoring. A further complication at Mt Ruapehu arises from the well-developed hydrothermal system and the permanent crater lake sitting on top of the magmatic conduit. Both the hydrothermal system and crater lake may act to mask or change monitoring signals (if present) that magma produces deeper in the edifice. Notwithstanding these potential drawbacks, we think that an attempt to apply BET_EF at Ruapehu is worthwhile, for several reasons. First, with the exception of a few "blue sky" events, monitoring data at Mt Ruapehu can be helpful in forecasting major events, especially if a large amount of magma is intruded into the edifice and becomes available for phreatomagmatic or magmatic eruptions, as for example in 1995-96. Secondly, in setting up BET_EF for Mt Ruapehu we are forced to define quantitatively what the background activity is. This will result in a quantitative evaluation of what changes in long time monitored parameters may influence the probability of future eruptions. The slopes of Mt Ruapehu host the largest ski area in North Island, New Zealand. Lahars have been generated as a result of several eruptions in the last 50 years, and some of these have reached the ski runs in a very short time frame (around 90 seconds from the beginning of the eruption). In the light of these potentially hazardous lahars, we use the output probabilities provided by BET_EF in a practical and rational decision scheme recently proposed by Marzocchi and Woo (2009) based on a cost/benefit analysis (CBA). In such scheme, a C/L ratio is computed, based on the costs (C) of practical mitigation actions to reduce risk (e.g., a public warning scheme and other means of raising awareness, and a call for a temporary and/or partial closure of the ski area) and on the potential loss (L) if no mitigation action is taken and an eruption occurs causing lahars down the ski fields. By comparing the probability of eruption-driven lahars and the C/L ratio, it is possible to define the most rational mitigation actions that can be taken to reduce the risk to skiers, snowboarders and staff on skifield. 
As the BET_EF eruption probability changes dynamically when updated monitoring data are received, the authorities can decide, at any specific point in time, on the best action according to the current state of the volcano. In this respect, CBA represents a bridge linking scientific output (probabilities) and decision makers (practical mitigation actions).
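
    As a concrete illustration of the cost/benefit rule sketched above, the short Python example below compares a forecast probability against the C/L ratio and recommends mitigation when the probability exceeds it. All numbers are hypothetical and purely illustrative; this is a minimal sketch, not the BET_EF or Marzocchi-Woo implementation.

      # Minimal sketch of a cost/benefit (C/L) decision rule; all numbers hypothetical.
      def recommend_mitigation(p_event: float, cost: float, loss: float) -> bool:
          """Recommend action when the expected avoided loss exceeds the cost,
          i.e. when p_event > C/L."""
          return p_event > cost / loss

      # Hypothetical inputs: probability of an eruption-driven lahar reaching the
      # ski field in the next 24 h, cost of a temporary closure/warning, and the
      # potential loss if no action is taken and the lahar occurs.
      p_lahar = 0.02
      C = 150_000.0
      L = 20_000_000.0

      print(f"C/L threshold = {C / L:.4f}")
      print("Mitigate" if recommend_mitigation(p_lahar, C, L) else "No action")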

  8. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few pre-established scenarios. Although such an approach may provide important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take into consideration all uncertainties, and it cannot be used to assess risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard) for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
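
    A probability of exceedance of the kind BET_VH_ST produces can be illustrated by weighting scenario-conditional exceedance probabilities by the scenario probabilities. The Python sketch below is not the BET_VH_ST code; it is a minimal, hypothetical example of how the 24-hour probability of exceeding a tephra-load threshold at one site might be assembled from a few eruption-size scenarios.

      # Hypothetical 24-h scenario probabilities and hypothetical conditional
      # probabilities of exceeding 300 kg/m^2 of tephra load at one site.
      p_scenario = {"small": 0.010, "medium": 0.004, "large": 0.001}
      p_exceed_given_scenario = {"small": 0.02, "medium": 0.25, "large": 0.70}

      # Total probability of exceedance at the site:
      # P(load > z) = sum over scenarios s of P(s) * P(load > z | s)
      p_exceed = sum(p_scenario[s] * p_exceed_given_scenario[s] for s in p_scenario)
      print(f"P(tephra load > 300 kg/m^2 in 24 h) = {p_exceed:.5f}")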

  9. Eruption Forecasting in Alaska: A Retrospective and Test of the Distal VT Model

    NASA Astrophysics Data System (ADS)

    Prejean, S. G.; Pesicek, J. D.; Wellik, J.; Cameron, C.; White, R. A.; McCausland, W. A.; Buurman, H.

    2015-12-01

    United States volcano observatories have successfully forecast most significant US eruptions in the past decade. However, eruptions of some volcanoes remain stubbornly difficult to forecast effectively using seismic data alone. The Alaska Volcano Observatory (AVO) has responded to 28 eruptions from 10 volcanoes since 2005. Eruptions that were not forecast include those of frequently active volcanoes with basaltic-andesite magmas, like Pavlof, Veniaminof, and Okmok volcanoes. In this study we quantify the success rate of eruption forecasting in Alaska and explore common characteristics of eruptions not forecast. In an effort to improve future forecasts, we re-examine seismic data from eruptions and known intrusive episodes in Alaska to test the effectiveness of the distal VT model commonly employed by the USGS-USAID Volcano Disaster Assistance Program (VDAP). In the distal VT model, anomalous brittle failure or volcano-tectonic (VT) earthquake swarms in the shallow crust surrounding the volcano occur as a secondary response to crustal strain induced by magma intrusion. Because the Aleutian volcanic arc is among the most seismically active regions on Earth, distinguishing distal VT earthquake swarms for eruption forecasting purposes from tectonic seismicity unrelated to volcanic processes poses a distinct challenge. In this study, we use a modified beta-statistic to identify pre-eruptive distal VT swarms and establish their statistical significance with respect to long-term background seismicity. This analysis allows us to explore the general applicability of the distal VT model and quantify the likelihood of encountering false positives in eruption forecasting using this model alone.
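
    One common form of the beta statistic compares the number of earthquakes observed in a candidate swarm window with the number expected from the long-term background rate, normalized by the binomial standard deviation. The Python sketch below is a hypothetical illustration of that idea with made-up numbers, not the AVO/VDAP implementation described in the abstract.

      import math

      def beta_statistic(n_window: int, n_total: int, t_window: float, t_total: float) -> float:
          """Beta statistic for an earthquake-rate anomaly (one common form):
          events observed in a window versus the count expected from the
          long-term rate, scaled by the binomial standard deviation."""
          p = t_window / t_total                 # fraction of catalogue time in the window
          expected = n_total * p
          variance = n_total * p * (1.0 - p)
          return (n_window - expected) / math.sqrt(variance)

      # Hypothetical example: 40 VT earthquakes in a 10-day window, drawn from a
      # catalogue of 600 events spanning 3000 days near the volcano.
      beta = beta_statistic(n_window=40, n_total=600, t_window=10.0, t_total=3000.0)
      # As a rule of thumb, values well above ~2 are often treated as significant.
      print(f"beta = {beta:.1f}")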

  10. The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments

    NASA Astrophysics Data System (ADS)

    Chen, Fajing; Jiao, Meiyan; Chen, Jing

    2013-04-01

    Public weather services are trending toward providing users with probabilistic weather forecasts in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecasting, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. Bayes' theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
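
    The normal-linear BPF mentioned above can be illustrated with a conjugate Gaussian update: the climatological (prior) distribution of the predictand is combined with a Gaussian likelihood relating the deterministic forecast to the observation through a linear regression fitted on historical pairs. The Python sketch below is a generic illustration of that idea with hypothetical numbers, not the authors' code; the meta-Gaussian variant additionally works with transformed variables before this step.

      import numpy as np

      def normal_linear_bpf(x_forecast, a, b, sigma, prior_mean, prior_sd):
          """Posterior N(mu, sd) for predictand y given a deterministic forecast x,
          assuming x | y ~ N(a + b*y, sigma^2) (fitted on historical pairs) and
          a climatological prior y ~ N(prior_mean, prior_sd^2)."""
          precision = 1.0 / prior_sd**2 + b**2 / sigma**2
          mu = (prior_mean / prior_sd**2 + b * (x_forecast - a) / sigma**2) / precision
          return mu, np.sqrt(1.0 / precision)

      # Hypothetical regression of forecasts on observed temperature and a
      # January climatology for one station.
      mu, sd = normal_linear_bpf(x_forecast=5.0, a=0.3, b=0.95, sigma=2.0,
                                 prior_mean=4.0, prior_sd=3.5)
      print(f"probabilistic forecast: N({mu:.2f}, {sd:.2f}^2) degC")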

  11. A volcanic event forecasting model for multiple tephra records, demonstrated on Mt. Taranaki, New Zealand

    NASA Astrophysics Data System (ADS)

    Damaschke, Magret; Cronin, Shane J.; Bebbington, Mark S.

    2018-01-01

    Robust time-varying volcanic hazard assessments are difficult to develop, because they depend upon having a complete and extensive eruptive activity record. Missing events in eruption records are endemic, due to poor preservation or erosion of tephra and other volcanic deposits. Even with many stratigraphic studies, underestimation or overestimation of eruption numbers is possible due to mis-correlation of tephras with similar chemical compositions or problematic age models. It is also common to have gaps in event coverage because sedimentary records are not available in all directions from the volcano, especially downwind. Here, we examine the sensitivity of probabilistic hazard estimates using a suite of four new and two existing high-resolution tephra records located around Mt. Taranaki, New Zealand. Previous estimates were made using only single, or two correlated, tephra records. In this study, tephra data from six individual sites in lakes and peat bogs covering an arc of 120° downwind of the volcano provided an excellent, temporally high-resolution event record. The new data confirm a previously identified semi-regular pattern of variable eruption frequency at Mt. Taranaki. Eruption intervals exhibit a bimodal distribution: eruptions are on average 65 years apart, but in 2% of cases centuries separate eruptions. The long intervals are less common than seen in earlier studies, but they have not disappeared with the inclusion of our comprehensive new dataset. Hence, the latest long interval of quiescence, since AD 1800, is unusual, but not out of character with the volcano. The new data also suggest that one of the tephra records (Lake Rotokare) used in earlier work had an old-carbon effect on age determinations. This shifted the ages of the affected tephras so that they were not correlated to other sites, leading to an artificially high eruption frequency in the previous combined record. New modelled time-varying frequency estimates suggest a 33-42% probability of an explosive eruption from Mt. Taranaki in the next 50 years, which is significantly lower than suggested by previous studies. This work also demonstrates some of the pitfalls to be avoided in combining stratigraphic records for eruption forecasting.
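
    A time-varying forecast of this kind can be illustrated with a simple renewal calculation: given a distribution of inter-event times and the time elapsed since the last eruption, the conditional probability of an eruption in the next 50 years is [F(t0+50) - F(t0)] / [1 - F(t0)]. The Python sketch below uses a hypothetical two-component lognormal mixture to mimic a bimodal interval distribution; the parameters are invented for illustration and are not the model fitted in the study.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical bimodal inter-event time distribution (years): a common short
      # mode near ~65 yr and a rare long mode of several centuries.
      n = 1_000_000
      is_long = rng.random(n) < 0.02
      intervals = np.where(is_long,
                           rng.lognormal(mean=np.log(400.0), sigma=0.3, size=n),
                           rng.lognormal(mean=np.log(65.0), sigma=0.6, size=n))

      t0 = 220.0     # years elapsed since the last eruption (e.g. since ~AD 1800)
      window = 50.0

      # Conditional probability of an eruption within the next 50 years, given that
      # t0 years have already passed without one (renewal assumption).
      survivors = intervals > t0
      p_next_50 = np.mean(intervals[survivors] <= t0 + window)
      print(f"P(eruption in next {window:.0f} yr | {t0:.0f} yr elapsed) = {p_next_50:.2f}")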

  12. Combining probabilistic hazard assessment with cost-benefit analysis to support decision making in a volcanic crisis from the Auckland Volcanic Field, New Zealand

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Jolly, Gill; Lindsay, Jan; Howe, Tracy; Marzocchi, Warner

    2010-05-01

    One of the main challenges of modern volcanology is to provide the public with robust and useful information for decision-making in land-use planning and in emergency management. From the scientific point of view, this translates into reliable and quantitative long- and short-term volcanic hazard assessment and eruption forecasting. Because of the complexity in characterizing volcanic events, and the natural variability of volcanic processes, a probabilistic approach is more suitable than deterministic modeling. In recent years, two probabilistic codes have been developed for quantitative short- and long-term eruption forecasting (BET_EF) and volcanic hazard assessment (BET_VH). Both are based on a Bayesian event tree, in which volcanic events are seen as a chain of logical steps of increasing detail. At each node of the tree, the probability is computed by taking into account different sources of information, such as geological and volcanological models, past occurrences, expert opinion and numerical modeling of volcanic phenomena. Since it is a Bayesian tool, the output probability is not a single number, but a probability distribution accounting for aleatory and epistemic uncertainty. In this study, we apply BET_VH in order to quantify the long-term volcanic hazard due to base-surge invasion in the region around Auckland, New Zealand's most populous city. Here, small basaltic eruptions from monogenetic cones pose a considerable risk to the city in case of phreatomagmatic activity: evidence for base surges is not uncommon in deposits from past events. Currently, we are particularly focussing on the scenario simulated during Exercise Ruaumoko, a national disaster exercise based on the build-up to an eruption in the Auckland Volcanic Field. Based on recent papers by Marzocchi and Woo, we suggest a possible quantitative strategy to link probabilistic scientific output and Boolean decision making. It is based on cost-benefit analysis, in which all costs and benefits of mitigation actions have to be evaluated and compared, weighting them with the probability of occurrence of a specific threatening volcanic event. An action should be taken when the benefit of that action outweighs the costs. It is worth remarking that this strategy does not guarantee that the recommended decision will match the one we would have taken with the benefit of hindsight. However, the strategy will be successful over the long term. Furthermore, it has the overwhelming advantage of providing a quantitative decision rule that is set before any emergency, and thus will be justifiable at any stage of the process. In our present application, we are trying to set up a cost-benefit scheme for calling an evacuation to protect people in the Auckland Volcanic Field against base-surge invasion. Considering the heterogeneity of the urban environment and the size of the region at risk, we propose a cost-benefit scheme that is space dependent, to take into account higher costs when an eruption threatens sites that are sensitive for the city and/or the nation, such as the international airport or the harbour. Finally, we compare our findings with the present Contingency Plan for Auckland.

  13. Developing International Guidelines on Volcanic Hazard Assessments for Nuclear Facilities

    NASA Astrophysics Data System (ADS)

    Connor, Charles

    2014-05-01

    Worldwide, tremendous progress has been made in recent decades in forecasting volcanic events, such as episodes of volcanic unrest, eruptions, and the potential impacts of eruptions. Generally these forecasts are divided into two categories. Short-term forecasts are prepared in response to unrest at volcanoes, rely on geophysical monitoring and related observations, and have the goal of forecasting events on timescales of hours to weeks to provide time for evacuation of people, shutdown of facilities, and implementation of related safety measures. Long-term forecasts are prepared to better understand the potential impacts of volcanism in the future and to plan for potential volcanic activity. Long-term forecasts are particularly useful for understanding and communicating the potential consequences of volcanic events for populated areas around volcanoes and for siting critical infrastructure, such as nuclear facilities. Recent work by an international team, under the auspices of the International Atomic Energy Agency, has focused on developing guidelines for long-term volcanic hazard assessments. These guidelines have now been implemented for hazard assessment for nuclear facilities in nations including Indonesia, the Philippines, Armenia, Chile, and the United States. On any time scale, all volcanic hazard assessments rely on a geologically reasonable conceptual model of volcanism. Such conceptual models are usually built upon years or decades of geological studies of specific volcanic systems, analogous systems, and development of a process-level understanding of volcanic activity. Conceptual models are used to bound potential rates of volcanic activity and potential magnitudes of eruptions, and to understand temporal and spatial trends in volcanic activity. It is these conceptual models that provide essential justification for assumptions made in statistical model development and the application of numerical models to generate quantitative forecasts. It is a tremendous challenge in quantitative volcanic hazard assessments to encompass alternative conceptual models, and to create models that are robust to the scientific community's evolving understanding of specific volcanic systems. A central question in volcanic hazard forecasts is quantifying rates of volcanic activity. Especially for long-dormant volcanic systems, data from the geologic record may be sparse, individual events may be missing or unrecognized in the geologic record, and patterns of activity may be episodic or otherwise nonstationary. This leads to uncertainty in forecasting long-term rates of activity. Hazard assessments strive to quantify such uncertainty, for example by comparing observed rates of activity with alternative parametric and nonparametric models. Numerical models are presented that characterize the spatial distribution of potential volcanic events. These spatial density models serve as the basis for application of numerical models of specific phenomena such as lava flows, tephra fallout, and a host of other volcanic phenomena. Monte Carlo techniques (random sampling, stratified sampling, importance sampling) are used to sample vent location and other key eruption parameters, such as eruption volume, magma rheology, and eruption column height, for probabilistic models. The development of coupled scenarios (e.g., the probability of tephra accumulation on a slope resulting in subsequent debris flows) is also assessed through these methods, usually with the aid of event trees. 
The primary products of long-term forecasts are a statistical model of the conditional probability of the potential effects of volcanism, should an eruption occur, and the probability of such activity occurring. It is emphasized that hazard forecasting is an iterative process, and broad consideration must be given to alternative conceptual models of volcanism, weighting of volcanological data in the analyses, and alternative statistical and numerical models. This structure is amenable to expert elicitation in order to weight alternative models and to explore alternative scenarios.
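
    One simple way to illustrate the spatial density models mentioned above is a two-dimensional kernel density estimate over mapped vent locations, which can then be sampled in a Monte Carlo hazard calculation. The Python sketch below uses scipy's Gaussian KDE on hypothetical vent coordinates; real assessments typically use more careful bandwidth choices and geologic weighting.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(1)

      # Hypothetical vent coordinates (km, local projection) for a volcanic field.
      vents = np.vstack([rng.normal(0.0, 5.0, 60), rng.normal(0.0, 3.0, 60)])  # shape (2, 60)

      # Fit a 2-D Gaussian kernel density estimate of vent spatial density.
      kde = gaussian_kde(vents)

      # Evaluate the density on a grid.
      x = np.linspace(-20, 20, 81)
      y = np.linspace(-15, 15, 61)
      xx, yy = np.meshgrid(x, y)
      density = kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)

      # Approximate probability that the next vent opens within 2 km of a
      # hypothetical site at (3, 1), conditional on a new vent forming.
      dist = np.hypot(xx - 3.0, yy - 1.0)
      cell_area = (x[1] - x[0]) * (y[1] - y[0])
      p_near_site = density[dist <= 2.0].sum() * cell_area
      print(f"P(next vent within 2 km of site) ~ {p_near_site:.3f}")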

  14. Scientific assessment of accuracy, skill and reliability of ocean probabilistic forecast products.

    NASA Astrophysics Data System (ADS)

    Wei, M.; Rowley, C. D.; Barron, C. N.; Hogan, P. J.

    2016-02-01

    As operational ocean centers increasingly adopt and generate probabilistic forecast products that provide their customers with valuable forecast uncertainties, objectively assessing and measuring these complicated probabilistic products is challenging. The first challenge is how to deal with the huge amount of data from the ensemble forecasts. The second is how to describe the scientific quality of probabilistic products; in fact, probabilistic forecast accuracy, skill, reliability, and resolution are different attributes of a forecast system. We briefly introduce some of the fundamental metrics, such as the reliability diagram, reliability, resolution, Brier Score (BS), Brier Skill Score (BSS), Ranked Probability Score (RPS), Ranked Probability Skill Score (RPSS), Continuous Ranked Probability Score (CRPS), and Continuous Ranked Probability Skill Score (CRPSS). The values and significance of these metrics are demonstrated for forecasts from the US Navy's regional ensemble system with different ensemble members. The advantages and differences of these metrics are studied and clarified.
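
    Two of the metrics listed above, the Brier score and Brier skill score for a binary event, can be computed in a few lines. The Python sketch below uses synthetic forecast-observation pairs and a climatological reference; it only illustrates the definitions and is not tied to the Navy ensemble system.

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic probabilistic forecasts and binary outcomes for one event type
      # (e.g. "significant wave height exceeds a threshold"). The outcomes are
      # generated so that the forecasts are reliable by construction.
      p_forecast = rng.uniform(0.0, 1.0, 500)
      outcomes = (rng.uniform(0.0, 1.0, 500) < p_forecast).astype(float)

      def brier_score(p, o):
          return np.mean((p - o) ** 2)

      bs = brier_score(p_forecast, outcomes)
      climatology = outcomes.mean()                        # reference forecast: base rate
      bs_ref = brier_score(np.full_like(p_forecast, climatology), outcomes)
      bss = 1.0 - bs / bs_ref                              # > 0 means skill over climatology
      print(f"BS = {bs:.3f}, BS_ref = {bs_ref:.3f}, BSS = {bss:.3f}")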

  15. The case for probabilistic forecasting in hydrology

    NASA Astrophysics Data System (ADS)

    Krzysztofowicz, Roman

    2001-08-01

    That forecasts should be stated in probabilistic, rather than deterministic, terms has been argued from common sense and decision-theoretic perspectives for almost a century. Yet most operational hydrological forecasting systems produce deterministic forecasts and most research in operational hydrology has been devoted to finding the 'best' estimates rather than quantifying the predictive uncertainty. This essay presents a compendium of reasons for probabilistic forecasting of hydrological variates. Probabilistic forecasts are scientifically more honest, enable risk-based warnings of floods, enable rational decision making, and offer additional economic benefits. The growing demand for information about risk and the rising capability to quantify predictive uncertainties create an unparalleled opportunity for the hydrological profession to dramatically enhance the forecasting paradigm.

  16. Constructing event trees for volcanic crises

    USGS Publications Warehouse

    Newhall, C.; Hoblitt, R.

    2002-01-01

    Event trees are useful frameworks for discussing probabilities of possible outcomes of volcanic unrest. Each branch of the tree leads from a necessary prior event to a more specific outcome, e.g., from an eruption to a pyroclastic flow. Where volcanic processes are poorly understood, probability estimates might be purely empirical - utilizing observations of past and current activity and an assumption that the future will mimic the past or follow a present trend. If processes are better understood, probabilities might be estimated from a theoretical model, either subjectively or by numerical simulations. Use of Bayes' theorem aids in estimating how fresh unrest raises (or lowers) the probabilities of eruptions. Use of event trees during volcanic crises can help volcanologists to critically review their analysis of hazard, and help officials and individuals to compare volcanic risks with more familiar risks. Trees also emphasize the inherently probabilistic nature of volcano forecasts, with multiple possible outcomes.
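
    Bayes' theorem as used at a single node of such an event tree can be shown in a few lines: updating the probability of an eruption when a particular type of unrest is observed. The Python sketch below uses hypothetical likelihoods and a hypothetical prior; it is not taken from the paper.

      def bayes_update(prior, p_obs_given_eruption, p_obs_given_no_eruption):
          """P(eruption | observation) for a single event-tree node."""
          numerator = p_obs_given_eruption * prior
          evidence = numerator + p_obs_given_no_eruption * (1.0 - prior)
          return numerator / evidence

      # Hypothetical numbers: long-term (prior) monthly eruption probability, and the
      # chance of observing a strong seismic swarm with and without an impending eruption.
      p_prior = 0.01
      p_posterior = bayes_update(p_prior,
                                 p_obs_given_eruption=0.8,
                                 p_obs_given_no_eruption=0.05)
      print(f"P(eruption | swarm) = {p_posterior:.2f}")   # fresh unrest raises the probability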

  17. Weighing costs and losses: A decision making game using probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan

    2017-04-01

    Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit of using probabilistic rather than deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making, and underlies emerging paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic work, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the forecast event becomes reality. In this paper, we present the results of a simple game that aims to explore how decision makers are influenced by the costs required for taking a response and the potential losses they face should the forecast flood event occur. Participants play the role of one of three different shop owners; each type of shop faces losses of quite different magnitude should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring that would inundate their shop and lead to those losses. In response, they have to decide whether to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost, and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions. Players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU Assembly, professionals participating in two other conferences related to hydrometeorology, and a group of students. All audiences were familiar with the principles of forecasting and water-related risks, and one of the audiences comprised a group of experts in probabilistic forecasting. Results show that the different shop owners do take the costs of taking action and the potential losses into account in their decisions. Shop owners with a low cost/loss ratio were found to be more inclined to take action based on the forecasts, though the absolute value of the losses also increased the willingness to take action. Little differentiation was found between the different groups of players.

  18. Using volcanic tremor for eruption forecasting at White Island volcano (Whakaari), New Zealand

    NASA Astrophysics Data System (ADS)

    Chardot, Lauriane; Jolly, Arthur D.; Kennedy, Ben M.; Fournier, Nicolas; Sherburn, Steven

    2015-09-01

    Eruption forecasting is a challenging task because of the inherent complexity of volcanic systems. Despite remarkable efforts to develop complex models to explain volcanic processes prior to eruptions, the material Failure Forecast Method (FFM) is one of the very few techniques that can provide a forecast time for an eruption. However, the method requires testing and automation before being used as a real-time eruption forecasting tool at a volcano. We developed an automatic algorithm to issue forecasts from episodes of increasing volcanic tremor recorded by Real-time Seismic Amplitude Measurement (RSAM) at one station, and optimised this algorithm for the period August 2011 to January 2014, which comprises the recent unrest period at White Island volcano (Whakaari), New Zealand. A detailed residual analysis was paramount in selecting the most appropriate model explaining the RSAM time evolution. In a hindsight simulation, four out of the five small eruptions reported during this period occurred within a failure window forecast by our optimised algorithm, and the probability of an eruption on a day within a failure window was 0.21, which is 37 times higher than the probability of having an eruption on any day during the same period (0.0057). Moreover, the forecasts were issued a few hours prior to the eruptions, which is important from an emergency management point of view. Whereas the RSAM time evolutions preceding these four eruptions have a similar goodness-of-fit with the FFM, their spectral characteristics are different. The duration-amplitude distributions of the precursory tremor episodes support the hypothesis that several processes were likely occurring prior to these eruptions. We propose that slow rock failure and fluid flow processes are plausible candidates for the tremor source of these episodes. This hindsight exercise can be useful for future real-time implementation of the FFM at White Island. A similar methodology could also be tested at other volcanoes, even if only a limited network is available.
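
    The classical FFM forecast, in its simplest (power-law exponent 2) form, extrapolates the inverse of the precursor rate linearly to zero; the time at which the fitted line crosses zero is the forecast failure (eruption) time. The Python sketch below applies this to synthetic RSAM-like rate data and only illustrates the principle; it is not the automated, optimised algorithm described in the study.

      import numpy as np

      # Synthetic accelerating precursor: rate ~ 1/(t_f - t), with t_f = 10 days.
      t_f_true = 10.0
      t = np.linspace(0.0, 8.0, 40)                  # days of observation
      rate = 50.0 / (t_f_true - t)                   # e.g. RSAM-derived rate
      rate *= 1.0 + 0.05 * np.random.default_rng(3).normal(size=t.size)   # noise

      # FFM with exponent alpha = 2: 1/rate decreases linearly and reaches zero at
      # the failure time, so fit a line and find its zero crossing.
      inv_rate = 1.0 / rate
      slope, intercept = np.polyfit(t, inv_rate, 1)
      t_forecast = -intercept / slope
      print(f"forecast failure time ~ day {t_forecast:.1f} (true value {t_f_true})")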

  19. Real-time prediction of rain-triggered lahars: incorporating seasonality and catchment recovery

    NASA Astrophysics Data System (ADS)

    Jones, Robbie; Manville, Vern; Peakall, Jeff; Froude, Melanie J.; Odbert, Henry M.

    2017-12-01

    Rain-triggered lahars are a significant secondary hydrological and geomorphic hazard at volcanoes where unconsolidated pyroclastic material produced by explosive eruptions is exposed to intense rainfall, a hazard that often persists for years to decades after the initial eruptive activity. Previous studies have shown that secondary lahar initiation is a function of rainfall parameters, source material characteristics and time since eruptive activity. In this study, probabilistic rain-triggered lahar forecasting models are developed using the lahar occurrence and rainfall record of the Belham River valley at the Soufrière Hills volcano (SHV), Montserrat, collected between April 2010 and April 2012. In addition to the use of peak rainfall intensity (PRI) as a base forecasting parameter, the effects of rainfall seasonality and catchment evolution upon the initiation of rain-triggered lahars and the predictability of lahar generation are also incorporated into these models. Lahar probability increases with peak 1 h rainfall intensity throughout the 2-year dataset and is higher under given rainfall conditions in year 1 than in year 2. The probability of lahars is also enhanced during the wet season, when large-scale synoptic weather systems (including tropical cyclones) are more common and antecedent rainfall, and thus the level of deposit saturation, is typically increased. The incorporation of antecedent conditions and catchment evolution into logistic-regression-based rain-triggered lahar probability estimation models is shown to enhance model performance and displays the potential for successful real-time prediction of lahars, even in areas featuring strongly seasonal climates and temporal catchment recovery.
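
    The logistic-regression idea described above can be sketched as follows: a binary lahar/no-lahar outcome is regressed on peak 1-h rainfall intensity and, optionally, an antecedent-rainfall term. The Python example below fits such a model to synthetic data with scikit-learn; the data and coefficients are hypothetical, not those of the Belham valley model.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)

      # Synthetic training data: peak 1-h rainfall intensity (mm/h) and antecedent
      # 7-day rainfall (mm) for rainfall events, with a lahar / no-lahar label drawn
      # from an assumed "true" logistic relation.
      n = 400
      pri = rng.gamma(shape=2.0, scale=8.0, size=n)
      antecedent = rng.gamma(shape=2.0, scale=20.0, size=n)
      logit = -6.0 + 0.15 * pri + 0.02 * antecedent
      occurred = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      X = np.column_stack([pri, antecedent])
      model = LogisticRegression().fit(X, occurred)

      # Probability of a lahar for a 30 mm/h peak intensity after a wet week (150 mm).
      p = model.predict_proba([[30.0, 150.0]])[0, 1]
      print(f"P(lahar | PRI=30 mm/h, antecedent=150 mm) = {p:.2f}")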

  20. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2012-12-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started paying attention to ways of communicating probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss if indeed we make better decisions on the basis of probabilistic forecasts.

  1. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  2. Probabilistically modeling lava flows with MOLASSES

    NASA Astrophysics Data System (ADS)

    Richardson, J. A.; Connor, L.; Connor, C.; Gallant, E.

    2017-12-01

    Modeling lava flows with cellular automata methods provides a computationally inexpensive means of quickly forecasting lava flow paths and ultimate areal extents. We have developed a lava flow simulator, MOLASSES, that forecasts lava flow inundation over an elevation model from a point-source eruption. This modular code can be implemented in a deterministic fashion with given user inputs that produce a single lava flow simulation. MOLASSES can also be implemented in a probabilistic fashion, where given user inputs define parameter distributions that are randomly sampled to create many lava flow simulations. This probabilistic approach enables uncertainty in input data to be expressed in the model results, and MOLASSES outputs a probability map of inundation instead of a single determined lava flow extent. Since the code is comparatively fast, we use it probabilistically to investigate where potential vents are located that may impact specific sites and areas, as well as the unconditional probability of lava flow inundation of sites or areas from any vent. We have validated the MOLASSES code against community-defined benchmark tests and against the real-world lava flows at Tolbachik (2012-2013) and Pico do Fogo (2014-2015). To determine the efficacy of the MOLASSES simulator at accurately and precisely mimicking the inundation area of real flows, we report goodness of fit using both model sensitivity and the positive predictive value, the latter of which is a Bayesian posterior statistic. Model sensitivity is often used in evaluating lava flow simulators, as it describes how much of the lava flow was successfully modeled by the simulation. We argue that the positive predictive value is equally important in determining how good a simulator is, as it describes the percentage of the simulated area that was actually inundated by lava.
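
    The two fit metrics named above can be computed directly from boolean inundation grids: model sensitivity is the fraction of the observed flow area that the simulation covered, and the positive predictive value is the fraction of the simulated area that was actually inundated. The Python sketch below is a generic illustration on toy arrays, not the MOLASSES validation code.

      import numpy as np

      def fit_metrics(simulated: np.ndarray, observed: np.ndarray):
          """Sensitivity and positive predictive value for boolean inundation grids."""
          true_pos = np.logical_and(simulated, observed).sum()
          sensitivity = true_pos / observed.sum()     # fraction of the real flow captured
          ppv = true_pos / simulated.sum()            # fraction of the simulated area that is real
          return sensitivity, ppv

      # Toy grids: the simulation captures most of the observed flow but also
      # over-predicts inundation on one side.
      observed = np.zeros((50, 50), dtype=bool);  observed[10:30, 20:30] = True
      simulated = np.zeros((50, 50), dtype=bool); simulated[12:32, 18:32] = True

      sens, ppv = fit_metrics(simulated, observed)
      print(f"sensitivity = {sens:.2f}, positive predictive value = {ppv:.2f}")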

  3. Collaborative Cyber-infrastructures for the Management of the UNESCO-IGCP Research Project "Forecast of tephra fallout"

    NASA Astrophysics Data System (ADS)

    Folch, A.; Costa, A.; Cordoba, G.

    2009-04-01

    Tephra fallout following explosive volcanic eruptions produces several hazardous effects on inhabitants, infrastructure, and property, and represents a serious threat for communities located around active volcanoes. In order to mitigate the effects on the surrounding areas, scientists and civil decision-making authorities need reliable short-term forecasts during episodes of eruptive crisis and long-term probabilistic maps to plan territorial policies and land use. Modelling, together with field studies and volcano monitoring, constitutes an indispensable tool to achieve these objectives. The UNESCO-IGCP research project proposal "Forecast of tephra fallout" aims to produce a series of tools capable of generating both short-term forecasts and long-term hazard assessments using cutting-edge models for tephra transport and sedimentation. A dedicated project website will be designed to supply a set of models, procedures and expertise to several Latin American institutes based in countries seriously threatened by this geo-hazard (Argentina, Chile, Colombia, Ecuador, Mexico, and Nicaragua). This will provide the final users with a tool to generate short-term forecasts of tephra deposition on the ground and to determine airborne ash concentrations (a quantity of special relevance for air navigation safety) during eruptions and emergencies. The project website will have a public section and a password-protected area to exchange information and data among participants and, eventually, to allow remote execution of high-resolution mesoscale meteorological forecasts at the BSC facilities. The public section will be updated periodically and will include pages describing the project objectives and achievements as well as the hazard maps for the investigated volcanoes, and will be linked to other relevant websites such as the IAVCEI, IGCP, IUGS and UNESCO homepages. Part of the public section will be devoted to disseminating scientific results, providing general advice, and displaying hazard maps to a larger public beyond the scientific community. The private section of the website will include a software and documentation download area as well as a gateway to run the WRF mesoscale meteorological model and the parallel version of the FALL3D model at the BSC facilities. This will be invaluable during an emergency if the affected institution does not have an agreement with its national weather service.

  4. Comparison of the economic impact of different wind power forecast systems for producers

    NASA Astrophysics Data System (ADS)

    Alessandrini, S.; Davò, F.; Sperati, S.; Benini, M.; Delle Monache, L.

    2014-05-01

    Deterministic forecasts of wind production for the next 72 h, at a single wind farm or at the regional level, are among the main end-user requirements. However, for optimal management of wind power production and distribution it is important to provide, together with a deterministic prediction, a probabilistic one. A deterministic forecast consists of a single value for each time in the future for the variable to be predicted, while a probabilistic forecast provides probabilities for potential future events. This means providing information about uncertainty (i.e. a forecast of the PDF of power) in addition to the commonly provided single-valued power prediction. A significant probabilistic application is the trading of energy in day-ahead electricity markets. It has been shown that, when trading future wind energy production, using probabilistic wind power predictions can lead to higher benefits than those obtained by using deterministic forecasts alone. In fact, probabilistic forecasting makes it possible to solve economic model equations that optimize the revenue for the producer depending, for example, on the specific penalties for forecast errors that apply in that market. In this work we have applied a probabilistic wind power forecast system based on the "analog ensemble" method to bidding wind energy in the day-ahead market for a wind farm located in Italy. The actual hourly income for the plant is computed considering the actual selling prices of energy and penalties proportional to the imbalance, defined as the difference between the day-ahead offered energy and the actual production. The economic benefits of using a probabilistic approach for day-ahead energy bidding are evaluated, resulting in a 23% increase in the annual income for the wind farm owner in the case of knowing the future energy prices a priori. The uncertainty in price forecasting partly reduces the economic benefit gained by using a probabilistic energy forecast system.
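
    One way to illustrate how a probabilistic forecast enters the bidding problem is a brute-force expected-revenue calculation over candidate bids, using the ensemble members as equally likely production outcomes. The Python sketch below uses a stylized, hypothetical market (fixed price, asymmetric per-MWh imbalance penalties) rather than the Italian market rules used in the study; under such a scheme the optimum typically lands on a quantile of the predictive distribution determined by the penalty asymmetry.

      import numpy as np

      rng = np.random.default_rng(5)

      # Hypothetical analog-ensemble forecast of production (MWh) for one delivery
      # hour, treated as 50 equally likely outcomes, plus a stylized market.
      ensemble = rng.gamma(shape=4.0, scale=3.0, size=50)
      price = 60.0            # EUR/MWh paid for the energy actually produced
      penalty_short = 25.0    # EUR/MWh of shortfall (production below the bid)
      penalty_long = 10.0     # EUR/MWh of surplus (production above the bid)

      def expected_revenue(bid: float) -> float:
          shortfall = np.maximum(bid - ensemble, 0.0)
          surplus = np.maximum(ensemble - bid, 0.0)
          return float(np.mean(price * ensemble
                               - penalty_short * shortfall
                               - penalty_long * surplus))

      bids = np.linspace(0.0, float(ensemble.max()), 200)
      best_bid = bids[int(np.argmax([expected_revenue(b) for b in bids]))]
      q = penalty_long / (penalty_long + penalty_short)   # newsvendor-style quantile
      print(f"best bid ~ {best_bid:.1f} MWh; "
            f"{q:.2f}-quantile of ensemble = {np.quantile(ensemble, q):.1f} MWh")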

  5. Probabilistic approach to decision making under uncertainty during volcanic crises. Retrospective analysis of the 2011 eruption of El Hierro, in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Sobradelo, Rosa; Martí, Joan; Kilburn, Christopher; López, Carmen

    2014-05-01

    Understanding the potential evolution of a volcanic crisis is crucial to improving the design of effective mitigation strategies. This is especially the case for volcanoes close to densely populated regions, where inappropriate decisions may trigger widespread loss of life, economic disruption and public distress. An outstanding goal for improving the management of volcanic crises, therefore, is to develop objective, real-time methodologies for evaluating how an emergency will develop and how scientists communicate with decision makers. Here we present a new model, BADEMO (Bayesian Decision Model), that applies a general, flexible, probabilistic approach to managing volcanic crises. The model combines the hazard and risk factors that decision makers need for a holistic analysis of a volcanic crisis. These factors include eruption scenarios and their probabilities of occurrence, the vulnerability of populations and their activities, and the costs of false alarms and failed forecasts. The model can be implemented before an emergency, to identify actions for reducing the vulnerability of a district; during an emergency, to identify the optimum mitigating actions and how these may change as new information is obtained; and after an emergency, to assess the effectiveness of a mitigating response and, from the results, to improve strategies before another crisis occurs. As illustrated by a retrospective analysis of the 2011 eruption of El Hierro, in the Canary Islands, BADEMO provides the basis for quantifying the uncertainty associated with each recommended action as an emergency evolves, and serves as a mechanism for improving communications between scientists and decision makers.

  6. Global forecasting of thermal health hazards: the skill of probabilistic predictions of the Universal Thermal Climate Index (UTCI).

    PubMed

    Pappenberger, F; Jendritzky, G; Staiger, H; Dutra, E; Di Giuseppe, F; Richardson, D S; Cloke, H L

    2015-03-01

    Although over a hundred thermal indices can be used for assessing thermal health hazards, many ignore the human heat budget, physiology and clothing. The Universal Thermal Climate Index (UTCI) addresses these shortcomings by using an advanced thermo-physiological model. This paper assesses the potential of using the UTCI for forecasting thermal health hazards. Traditionally, such hazard forecasting has had two further limitations: it has been narrowly focused on a particular region or nation and has relied on the use of single 'deterministic' forecasts. Here, the UTCI is computed on a global scale, which is essential for international health-hazard warnings and disaster preparedness, and it is provided as a probabilistic forecast. It is shown that probabilistic UTCI forecasts are superior in skill to deterministic forecasts and that, despite global variations, the UTCI forecast is skilful for lead times up to 10 days. The paper also demonstrates the utility of probabilistic UTCI forecasts using the example of the 2010 heat wave in Russia.

  7. Short-term volcanic hazard assessment through Bayesian inference: retrospective application to the Pinatubo 1991 volcanic crisis

    NASA Astrophysics Data System (ADS)

    Sobradelo, Rosa; Martí, Joan

    2015-01-01

    One of the most challenging aspects of managing a volcanic crisis is the interpretation of the monitoring data so as to anticipate the evolution of the unrest and implement timely mitigation actions. An unrest episode may include different stages or time intervals of increasing activity that may or may not precede a volcanic eruption, depending on the causes of the unrest (magmatic, geothermal or tectonic). Therefore, one of the main goals in monitoring volcanic unrest is to forecast whether or not such an increase in activity will end in an eruption and, if this is the case, how, when, and where this eruption will take place. As an alternative to expert elicitation for assessing and merging monitoring data and relevant past information, we present a probabilistic method that transforms precursory activity into the probability of experiencing a significant variation in the next time interval (i.e. the next step in the unrest), given its preceding evolution, and further estimates the probability of occurrence of a particular eruptive scenario by combining monitoring and past data. With the 1991 Pinatubo volcanic crisis as a reference, we have developed such a method to assess short-term volcanic hazard using Bayesian inference.

  8. Clustering and Hazard Estimation in the Auckland Volcanic Field, New Zealand

    NASA Astrophysics Data System (ADS)

    Cronin, S. J.; Bebbington, M. S.

    2009-12-01

    The Auckland Volcanic Field (AVF), with its 49 eruptive centres formed over the last c. 250 ka, presents several unique challenges to our understanding of distributed volcanic field construction and evolution. Due to the youth of the field, high-resolution stratigraphy of eruption centres and ash-fall sequences is possible, allowing time breaks, and soil and peat formation, between eruption units to be identified. Radiocarbon dating of sediments between volcanic deposits shows that at least five of the centres have erupted on more than one occasion, with time breaks of 50-100 years between episodes. In addition, paleomagnetic and ash-fall evidence implies that there has been strong clustering of eruption events over time, with a specific "flare-up" event involving possibly up to 19 eruptions between 35 and 25 ka, in spatially disparate locations. An additional complicating factor is that the only centre that shows any major evidence for evolution beyond standard alkali basaltic compositions is also the youngest and, by several orders of magnitude, the largest in volume. All of these features of the AVF, along with relatively poor age control for many of the vents, make spatio-temporal hazard forecasting for the field based on assumptions of past behaviour extremely difficult. Any relationships that take volumetric considerations into account are particularly difficult, since any trend analysis produces unreasonably large future eruptions. The most reasonable model is spatial, via eruption location. We have re-examined the age progression of eruptive events in the AVF, incorporating the most reliable sources of age and stratigraphic data, including developing new correlations between ashfall records in cores and likely vent locations via a probabilistic model of tephra dispersal. A Monte Carlo procedure using the age-progression, stratigraphy and dating constraints can then randomly reproduce likely orderings of events in the field. These were fitted by a clustering-based model of vent locations as originally applied by Magill et al. (2005; Mathematical Geology 37: 227-242) to the Allen and Smith (1994; Geosci. Rep. Shizuoka Univ. 20: 5-14) age ordering of volcanism at the AVF. Applying this model, modified by allowing continuation of activity at or around the youngest event, to sampled age orderings from the Monte Carlo procedure gives a very different spatial forecast from the earlier analysis. It is also different from the distribution obtained from randomly ordered events, implying that there is at least some clustering control on the location of eruptions in the field. Further iterations of this modelling approach will be tested in relation to eruptive volume and applied to other comparable volcanic fields.

  9. Forecasting volcanic ash dispersal and coeval resuspension during the April-May 2015 Calbuco eruption

    NASA Astrophysics Data System (ADS)

    Reckziegel, F.; Bustos, E.; Mingari, L.; Báez, W.; Villarosa, G.; Folch, A.; Collini, E.; Viramonte, J.; Romero, J.; Osores, S.

    2016-07-01

    Atmospheric dispersion of volcanic ash from explosive eruptions, or from subsequent resuspension of fallout deposits, causes a range of impacts and disruptions to human activities and ecosystems. The April-May 2015 Calbuco eruption in Chile involved both eruption and resuspension activity. We review the chronology, effects, and products resulting from these events in order to validate an operational forecast strategy for tephra dispersal. The modelling strategy builds on coupling the Weather Research and Forecasting (WRF/ARW) meteorological model with the FALL3D dispersal model for eruptive and resuspension processes. The eruption modelling considers two distinct particle granulometries: a preliminary first-guess distribution used operationally when no field data were yet available, and a refined distribution based on field measurements. Volcanological inputs were inferred from eruption reports and from results of an Argentinian-Chilean ash sample data network, which performed in-situ sampling during the eruption. In order to validate the modelling strategy, results were compared with satellite retrievals and ground deposit measurements. Results indicate that the WRF-FALL3D modelling system can provide reasonable forecasts in both eruption and resuspension modes, particularly when the adjusted granulometry is considered. The study also highlights the importance of having dedicated datasets for active volcanoes to furnish first-guess model inputs during the early stages of an eruption.

  10. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    USGS Publications Warehouse

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.

  11. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling and the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.

  12. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling and the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
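
    The error-modelling chain described above (fit a Gaussian mixture to historical forecasting errors, build its CDF, then draw error scenarios by inverse-transform Monte Carlo sampling) can be sketched as follows in Python. The data are synthetic and the implementation is generic, not the authors' code; ramp extraction with a swinging-door algorithm is only indicated in a comment.

      import numpy as np
      from scipy.stats import norm
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(6)

      # Synthetic historical wind power forecasting errors (MW), skewed and heavy-tailed.
      errors = np.concatenate([rng.normal(-5, 8, 3000), rng.normal(20, 15, 700)])

      # 1) Fit a continuous Gaussian mixture model to the error distribution.
      gmm = GaussianMixture(n_components=3, random_state=0).fit(errors.reshape(-1, 1))
      weights = gmm.weights_
      means = gmm.means_.ravel()
      sds = np.sqrt(gmm.covariances_).ravel()

      # 2) Analytic mixture CDF on a fine grid; 3) inverse-transform sampling:
      # map uniform draws through the numerically inverted CDF to get error scenarios.
      grid = np.linspace(errors.min() - 50, errors.max() + 50, 5000)
      cdf = sum(w * norm.cdf(grid, m, s) for w, m, s in zip(weights, means, sds))
      u = rng.uniform(0.0, 1.0, 10000)
      error_scenarios = np.interp(u, cdf, grid)

      # Each scenario = deterministic forecast + sampled error; ramps would then be
      # extracted from the scenario time series (e.g. with a swinging-door algorithm).
      print(f"scenario error mean = {error_scenarios.mean():.1f} MW, "
            f"5th-95th pct = [{np.percentile(error_scenarios, 5):.1f}, "
            f"{np.percentile(error_scenarios, 95):.1f}] MW")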

  13. Volcanic Eruption Forecasts From Accelerating Rates of Drumbeat Long-Period Earthquakes

    NASA Astrophysics Data System (ADS)

    Bell, Andrew F.; Naylor, Mark; Hernandez, Stephen; Main, Ian G.; Gaunt, H. Elizabeth; Mothes, Patricia; Ruiz, Mario

    2018-02-01

    Accelerating rates of quasiperiodic "drumbeat" long-period earthquakes (LPs) are commonly reported before eruptions at andesite and dacite volcanoes, and promise insights into the nature of fundamental preeruptive processes and improved eruption forecasts. Here we apply a new Bayesian Markov chain Monte Carlo gamma point process methodology to investigate an exceptionally well-developed sequence of drumbeat LPs preceding a recent large vulcanian explosion at Tungurahua volcano, Ecuador. For more than 24 hr, LP rates increased according to the inverse power law trend predicted by material failure theory, and with a retrospectively forecast failure time that agrees with the eruption onset within error. LPs resulted from repeated activation of a single characteristic source driven by accelerating loading, rather than a distributed failure process, showing that similar precursory trends can emerge from quite different underlying physics. Nevertheless, such sequences have clear potential for improving forecasts of eruptions at Tungurahua and analogous volcanoes.

  14. Visualising probabilistic flood forecast information: expert preferences and perceptions of best practice in uncertainty communication

    NASA Astrophysics Data System (ADS)

    Pappenberger, F.; Stephens, E. M.; Thielen, J.; Salomon, P.; Demeritt, D.; van Andel, S.; Wetterhall, F.; Alfieri, L.

    2011-12-01

    The aim of this paper is to understand and to contribute to improved communication of the probabilistic flood forecasts generated by Hydrological Ensemble Prediction Systems (HEPS), with a particular focus on inter-expert communication. Different users are likely to require different kinds of information from HEPS and thus different visualizations. The perceptions of this expert group are important because its members are both the designers and the primary users of existing HEPS. Nevertheless, they have sometimes resisted the release of uncertainty information to the general public because of doubts about whether it can be successfully communicated in ways that would be readily understood by non-experts. In this paper we explore the strengths and weaknesses of existing HEPS visualization methods and thereby formulate some wider recommendations about best practice for HEPS visualization and communication. We suggest that specific training on probabilistic forecasting would foster the use of probabilistic forecasts in a wider range of applications. The results of a case-study exercise showed that there is no overarching agreement between experts on how to display probabilistic forecasts and on what they consider essential information that should accompany plots and diagrams. In this paper we propose a list of minimum properties that, if consistently displayed with probabilistic forecasts, would make the products more easily understandable.

  15. Improvements on the relationship between plume height and mass eruption rate: Implications for volcanic ash cloud forecasting

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Dehn, J.; Mastin, L. G.; Steensen, T. S.

    2011-12-01

    Volcanic ash plumes and the ash clouds dispersing from them in the atmosphere are a hazard for local populations as well as for the aviation industry. Volcanic ash transport and dispersion (VATD) models, used to forecast the movement of these hazardous ash emissions, require eruption source parameters (ESPs) such as plume height, eruption rate and duration. To estimate the mass eruption rate, empirical relationships with observed plume height have been applied. Theoretical relationships defined by Morton et al. (1956) and Wilson et al. (1976) use default values for the environmental lapse rate (ELR), thermal efficiency, density of ash, specific heat capacity, initial temperature of the erupted material and final temperature of the material. Each volcano, based on its magma type, has a different density, specific heat capacity and initial eruptive temperature compared to these default parameters, and local atmospheric conditions can produce a very different ELR. Our research shows that a relationship between plume height and mass eruption rate can be defined for each eruptive event at each volcano. Additionally, using the one-dimensional modeling program Plumeria, our analysis assesses the importance of factors such as vent diameter and eruption velocity on the relationship between the eruption rate and the measured plume height. Coupling such a tool with a VATD model should improve pre-eruptive forecasts of downwind ash emissions and lead to improvements in the ESP data that VATD models use for operational volcanic ash cloud forecasting.
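
    For orientation, a widely used generic empirical fit, H = 2.00 V**0.241 (Mastin et al., 2009; H is plume height above the vent in km, V is the dense-rock-equivalent volumetric flow rate in m^3/s), can be inverted to turn an observed plume height into a mass eruption rate. The Python sketch below shows that conversion with an assumed DRE magma density; it is an illustration of the kind of relationship discussed here, not the refined, volcano-specific relations the study advocates.

      def mass_eruption_rate_from_height(h_km, dre_density=2500.0):
          """Invert the empirical fit H = 2.00 * V**0.241 (Mastin et al., 2009), with
          H the plume height above the vent (km) and V the dense-rock-equivalent
          volumetric flow rate (m^3/s), to obtain a mass eruption rate in kg/s.
          The DRE density of 2500 kg/m^3 is an assumed value."""
          v = (h_km / 2.00) ** (1.0 / 0.241)
          return dre_density * v

      for h in (5.0, 10.0, 15.0):
          print(f"H = {h:4.1f} km above vent  ->  MER ~ {mass_eruption_rate_from_height(h):.1e} kg/s")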

  16. A novel visualisation tool for climate services: a case study of temperature extremes and human mortality in Europe

    NASA Astrophysics Data System (ADS)

    Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.

    2013-12-01

    Users of climate information often require probabilistic information on which to base their decisions. However, communicating the information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, ternary probabilistic forecasts, which assign probabilities to a set of three outcomes (e.g. low, medium, and high risk), are considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long-term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour is used to represent one forecast category on a forecast map (e.g. red = 'dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions, calculated using historical data. Daily mortality data at the NUTS2 level for 16 countries in Europe were obtained for 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarities. Aggregations are restricted to fall within political boundaries to avoid problems related to varying adaptation policies between countries. A statistical model is fitted to the cold and warm tails to estimate future mortality using forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales. We demonstrate the information gained from using this technique compared to more traditional methods of displaying ternary probabilistic forecasts. This technique allows decision makers to identify areas where the model confidently predicts area-specific heat waves or cold snaps, in order to effectively target resources to the areas most at risk for a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of probabilistic forecasts not only for public health decision makers but also within a multi-sectoral climate service framework.
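
    As a rough illustration of the visualisation idea, the following Python sketch maps a ternary probability forecast to a colour using barycentric mixing, with saturation scaled by the information gain (here taken as the Kullback-Leibler divergence) relative to a uniform reference forecast; the corner colours and scaling are assumptions, not the paper's palette:

```python
import numpy as np

# Corner colours for the three categories (illustrative choices, not the
# palette used in the paper): low, medium, high risk.
CORNERS = np.array([[0.0, 0.4, 1.0],   # low  -> blue
                    [1.0, 0.8, 0.0],   # med  -> yellow
                    [1.0, 0.0, 0.0]])  # high -> red

def ternary_colour(probs, reference=(1/3, 1/3, 1/3)):
    """Map a ternary probabilistic forecast to an RGB colour.

    The forecast is treated as barycentric coordinates in a colour
    triangle; saturation increases with the information gain
    (Kullback-Leibler divergence) relative to the reference forecast.
    A sketch of the idea, not the paper's exact implementation.
    """
    p = np.asarray(probs, dtype=float)
    p = p / p.sum()
    q = np.asarray(reference, dtype=float)
    base = p @ CORNERS                              # barycentric colour mix
    info_gain = np.sum(np.where(p > 0, p * np.log(p / q), 0.0))
    saturation = min(info_gain / np.log(3), 1.0)    # 0 (= reference) .. 1
    grey = np.full(3, 0.85)                         # desaturated baseline
    return (1 - saturation) * grey + saturation * base

print(ternary_colour([0.1, 0.2, 0.7]))
```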

  17. Performance of the 'material Failure Forecast Method' in real-time situations: A Bayesian approach applied on effusive and explosive eruptions

    NASA Astrophysics Data System (ADS)

    Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.; Arámbula-Mendoza, R.; Budi-Santoso, A.

    2016-11-01

    Most attempts at deterministic eruption forecasting are based on the material Failure Forecast Method (FFM). This method assumes that a precursory observable, such as the rate of seismic activity, can be described by a simple power law which presents a singularity at a time close to the eruption onset. Until now, this method has been applied only in a small number of cases, generally for forecasts in hindsight. In this paper, a rigorous Bayesian approach to the FFM designed for real-time applications is applied. Using an automatic recognition system, seismo-volcanic events are detected and classified according to their physical mechanism, and time series of probability distributions of the event rates are calculated. At each time of observation, a Bayesian inversion provides estimates of the exponent of the power law and of the time of eruption, together with their probability density functions. Two criteria are defined in order to evaluate the quality and reliability of the forecasts. Our automated procedure has allowed the analysis of long, continuous seismic time series: 13 years from Volcán de Colima, Mexico, 10 years from Piton de la Fournaise, Reunion Island, France, and several months from Merapi volcano, Java, Indonesia. The new forecasting approach has been applied to 64 pre-eruptive sequences which present various types of dominant seismic activity (volcano-tectonic or long-period events) and patterns of seismicity with different levels of complexity. This has allowed us to test the FFM assumptions, to determine under which conditions the method can be applied, and to quantify the success rate of the forecasts. 62% of the precursory sequences analysed are suitable for the application of the FFM, and half of the total number of eruptions are successfully forecast in hindsight. In real time, the method allows for the successful forecast of 36% of all the eruptions considered. Nevertheless, real-time forecasts are successful for 83% of the cases that fulfil the reliability criteria. Therefore, good confidence in the method is obtained when the reliability criteria are met.
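
    For context, the classical deterministic core of the FFM can be written in a few lines: for the commonly assumed power-law exponent of 2, the inverse precursor rate decreases linearly in time and the forecast eruption time is its zero crossing. The Python sketch below uses synthetic data and omits the Bayesian inversion and reliability criteria developed in the paper:

```python
import numpy as np

def ffm_failure_time(times, rates):
    """Classic inverse-rate form of the material Failure Forecast Method.

    For the commonly assumed exponent alpha = 2, the inverse of the
    precursor rate decreases linearly with time, and the forecast failure
    (eruption) time is where a straight-line fit of 1/rate crosses zero.
    A deterministic sketch; the paper wraps this kind of power law in a
    full Bayesian inversion with probability densities.
    """
    inv_rate = 1.0 / np.asarray(rates, dtype=float)
    slope, intercept = np.polyfit(times, inv_rate, 1)
    if slope >= 0:
        raise ValueError("rates are not accelerating; no forecast possible")
    return -intercept / slope  # time at which 1/rate extrapolates to zero

# Synthetic accelerating precursor: rate ~ 1 / (t_f - t), with t_f = 100
t = np.arange(0, 90, 5.0)
rate = 1.0 / (100.0 - t)
print(f"forecast failure time ~ {ffm_failure_time(t, rate):.1f}")
```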

  18. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty information is a double-edged sword: it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.

  19. Visualizing Uncertainty for Probabilistic Weather Forecasting based on Reforecast Analogs

    NASA Astrophysics Data System (ADS)

    Pelorosso, Leandro; Diehl, Alexandra; Matković, Krešimir; Delrieux, Claudio; Ruiz, Juan; Gröeller, M. Eduard; Bruckner, Stefan

    2016-04-01

    Numerical weather forecasts are prone to uncertainty coming from inaccuracies in the initial and boundary conditions and lack of precision in numerical models. Ensembles of forecasts partially address these problems by considering several runs of the numerical model. Each forecast is generated with different initial and boundary conditions and different model configurations [GR05]. The ensembles can be expressed as probabilistic forecasts, which have proven to be very effective in decision-making processes [DE06]. The ensemble of forecasts represents only some of the possible future atmospheric states, usually underestimating the degree of uncertainty in the predictions [KAL03, PH06]. Hamill and Whitaker [HW06] introduced the "Reforecast Analog Regression" (RAR) technique to overcome the limitations of ensemble forecasting. This technique produces probabilistic predictions based on the analysis of historical forecasts and observations. Visual analytics provides tools for processing, visualizing, and exploring data to get new insights and discover hidden information patterns in an interactive exchange between the user and the application [KMS08]. In this work, we introduce Albero, a visual analytics solution for probabilistic weather forecasting based on the RAR technique. Albero targets at least two different types of users: "forecasters", who are meteorologists working in operational weather forecasting, and "researchers", who work on the construction of numerical prediction models. Albero is an efficient tool for analyzing precipitation forecasts, allowing forecasters to make and communicate quick decisions. Our solution facilitates the analysis of a set of probabilistic forecasts, associated statistical data, observations and uncertainty. A dashboard with small multiples of probabilistic forecasts allows the forecasters to analyze at a glance the distribution of probabilities as a function of time, space, and magnitude. It provides the user with a more accurate measure of forecast uncertainty that could result in better decision-making. It offers different levels of abstraction to help with the recalibration of the RAR method. It also has an inspection tool that displays the selected analogs, their observations and statistical data. It gives the users access to inner parts of the method, unveiling hidden information. References [GR05] GNEITING T., RAFTERY A. E.: Weather forecasting with ensemble methods. Science 310, 5746, 248-249, 2005. [KAL03] KALNAY E.: Atmospheric modeling, data assimilation and predictability. Cambridge University Press, 2003. [PH06] PALMER T., HAGEDORN R.: Predictability of weather and climate. Cambridge University Press, 2006. [HW06] HAMILL T. M., WHITAKER J. S.: Probabilistic quantitative precipitation forecasts based on reforecast analogs: Theory and application. Monthly Weather Review 134, 11, 3209-3229, 2006. [DE06] DEITRICK S., EDSALL R.: The influence of uncertainty visualization on decision making: An empirical evaluation. Springer, 2006. [KMS08] KEIM D. A., MANSMANN F., SCHNEIDEWIND J., THOMAS J., ZIEGLER H.: Visual analytics: Scope and challenges. Springer, 2008.
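
    As a toy illustration of the reforecast-analog idea underlying the RAR technique, the Python sketch below estimates an event probability from the verifying observations of the historical forecasts closest to the current forecast; the distance metric, analog count and synthetic data are assumptions, not the implementation used in Albero:

```python
import numpy as np

def reforecast_analog_probability(current_fcst, past_fcsts, past_obs,
                                  threshold, n_analogs=25):
    """Toy reforecast-analog probabilistic forecast.

    Finds the historical forecasts most similar (Euclidean distance) to
    the current forecast and estimates the event probability as the
    fraction of their verifying observations exceeding a threshold.
    Illustrative only; Hamill and Whitaker (2006) describe the full RAR
    methodology.
    """
    d = np.linalg.norm(past_fcsts - current_fcst, axis=1)
    analog_idx = np.argsort(d)[:n_analogs]
    return np.mean(past_obs[analog_idx] > threshold)

rng = np.random.default_rng(0)
past_fcsts = rng.gamma(2.0, 3.0, size=(1000, 4))     # e.g. precip at 4 grid points
past_obs = past_fcsts[:, 0] + rng.normal(0, 2, 1000)  # synthetic verifying obs
print(reforecast_analog_probability(past_fcsts[0], past_fcsts, past_obs, 5.0))
```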

  20. Global link between deformation and volcanic eruption quantified by satellite imagery.

    PubMed

    Biggs, J; Ebmeier, S K; Aspinall, W P; Lu, Z; Pritchard, M E; Sparks, R S J; Mather, T A

    2014-04-03

    A key challenge for volcanological science and hazard management is that few of the world's volcanoes are effectively monitored. Satellite imagery covers volcanoes globally throughout their eruptive cycles, independent of ground-based monitoring, providing a multidecadal archive suitable for probabilistic analysis linking deformation with eruption. Here we show that, of the 198 volcanoes systematically observed for the past 18 years, 54 deformed, of which 25 also erupted. For assessing eruption potential, this high proportion of deforming volcanoes that also erupted (46%), together with the proportion of non-deforming volcanoes that did not erupt (94%), jointly represent indicators with 'strong' evidential worth. Using a larger catalogue of 540 volcanoes observed for 3 years, we demonstrate how this eruption-deformation relationship is influenced by tectonic, petrological and volcanic factors. Satellite technology is rapidly evolving and routine monitoring of the deformation status of all volcanoes from space is anticipated, meaning probabilistic approaches will increasingly inform hazard decisions and strategic development.
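
    The headline numbers quoted above can be turned into simple conditional probabilities and a likelihood ratio; in the Python sketch below, the count of non-deforming, non-erupting volcanoes (~135) is back-calculated from the 94% figure in the abstract, so the result is only an illustration of the arithmetic, not the paper's full analysis:

```python
# Counts quoted in the abstract: 198 systematically observed volcanoes,
# 54 of which deformed (25 of those erupted); 94% of the 144 non-deforming
# volcanoes did not erupt, i.e. roughly 135 (back-calculated, illustrative).
n_total = 198
n_deformed, n_deformed_erupted = 54, 25
n_quiet = n_total - n_deformed                       # 144
n_quiet_no_eruption = round(0.94 * n_quiet)          # ~135

p_erupt_given_deformation = n_deformed_erupted / n_deformed          # ~0.46
p_no_erupt_given_no_deformation = n_quiet_no_eruption / n_quiet      # ~0.94

# Likelihood ratio of deformation as an eruption indicator:
# P(deformation | eruption) / P(deformation | no eruption)
n_erupted = n_deformed_erupted + (n_quiet - n_quiet_no_eruption)
n_not_erupted = n_total - n_erupted
likelihood_ratio = ((n_deformed_erupted / n_erupted)
                    / ((n_deformed - n_deformed_erupted) / n_not_erupted))

print(p_erupt_given_deformation, p_no_erupt_given_no_deformation, likelihood_ratio)
```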

  1. How much are you prepared to PAY for a forecast?

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Coughlan, Erin; Ramos, Maria-Helena; Pappenberger, Florian; Wetterhall, Fredrik; Bachofen, Carina; van Andel, Schalk Jan

    2015-04-01

    Probabilistic hydro-meteorological forecasts are a crucial element of the decision-making chain in the field of flood prevention. The operational use of probabilistic forecasts is increasingly promoted through the development of novel state-of-the-art forecast methods, and numerical skill is continuously increasing. However, the value of such forecasts for flood early-warning systems is a topic of diverging opinions. Indeed, the word value, when applied to flood forecasting, is multifaceted. It refers not only to the raw cost of acquiring and maintaining a probabilistic forecasting system (in terms of human and financial resources, data volume and computational time), but also, and perhaps most importantly, to the use of such products. This game aims at investigating this point. It is a willingness-to-pay game, embedded in a risk-based decision-making experiment. Based on a "Red Cross/Red Crescent Climate Centre" game, it is a contribution to the international Hydrologic Ensemble Prediction Experiment (HEPEX). A limited number of probabilistic forecasts will be auctioned to the participants, the price of these forecasts being market-driven. All participants (irrespective of whether or not they have bought a forecast set) will then be taken through a decision-making process to issue warnings for extreme rainfall. This game will promote discussions around the topic of the value of forecasts for decision-making in the field of flood prevention.

  2. Flash-flood early warning using weather radar data: from nowcasting to forecasting

    NASA Astrophysics Data System (ADS)

    Liechti, Katharina; Panziera, Luca; Germann, Urs; Zappa, Massimiliano

    2013-04-01

    In our study we explore the limits of radar-based forecasting for hydrological runoff prediction. Two novel probabilistic radar-based forecasting chains for flash-flood early warning are investigated in three catchments in the Southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first probabilistic radar-based forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second probabilistic forecasting system evaluated is REAL-C2, where the numerical weather prediction model COSMO-2 is initialized with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for the 1389 hours between June 2007 and December 2010 for which NORA forecasts were issued owing to the presence of orographic forcing. We found a clear preference for the probabilistic approach. Discharge forecasts perform better when forced by NORA rather than by a persistent radar QPE for lead times up to eight hours and for all discharge thresholds analysed. The best results were, however, obtained with the REAL-C2 forecasting chain, which was remarkably skilful even at the highest thresholds. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic forcing.

  3. Flash-flood early warning using weather radar data: from nowcasting to forecasting

    NASA Astrophysics Data System (ADS)

    Liechti, K.; Panziera, L.; Germann, U.; Zappa, M.

    2013-01-01

    This study explores the limits of radar-based forecasting for hydrological runoff prediction. Two novel probabilistic radar-based forecasting chains for flash-flood early warning are investigated in three catchments in the Southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first probabilistic radar-based forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second probabilistic forecasting system evaluated is REAL-C2, where the numerical weather prediction model COSMO-2 is initialized with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for the 1389 h between June 2007 and December 2010 for which NORA forecasts were issued owing to the presence of orographic forcing. We found a clear preference for the probabilistic approach. Discharge forecasts perform better when forced by NORA rather than by a persistent radar QPE for lead times up to eight hours and for all discharge thresholds analysed. The best results were, however, obtained with the REAL-C2 forecasting chain, which was remarkably skilful even at the highest thresholds. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic precipitation.

  4. Understanding causality and uncertainty in volcanic observations: An example of forecasting eruptive activity on Soufrière Hills Volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Sheldrake, T. E.; Aspinall, W. P.; Odbert, H. M.; Wadge, G.; Sparks, R. S. J.

    2017-07-01

    Following a cessation in eruptive activity it is important to understand how a volcano will behave in the future and when it may next erupt. Such an assessment can be based on the volcano's long-term pattern of behaviour and insights into its current state via monitoring observations. We present a Bayesian network that integrates these two strands of evidence to forecast future eruptive scenarios using expert elicitation. The Bayesian approach provides a framework to quantify the magmatic causes in terms of volcanic effects (i.e., eruption and unrest). In October 2013, an expert elicitation was performed to populate a Bayesian network designed to help forecast future eruptive (in-)activity at Soufrière Hills Volcano. The Bayesian network was devised to assess the state of the shallow magmatic system, as a means to forecast the future eruptive activity in the context of the long-term behaviour at similar dome-building volcanoes. The findings highlight coherence amongst experts when interpreting the current behaviour of the volcano, but reveal considerable ambiguity when relating this to longer patterns of volcanism at dome-building volcanoes, as a class. By asking questions in terms of magmatic causes, the Bayesian approach highlights the importance of using short-term unrest indicators from monitoring data as evidence in long-term forecasts at volcanoes. Furthermore, it highlights potential biases in the judgements of volcanologists and identifies sources of uncertainty in terms of magmatic causes rather than scenario-based outcomes.
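
    As a toy illustration of the kind of probabilistic reasoning a Bayesian network encodes, the Python sketch below performs a single-node Bayesian update of the probability that the shallow magmatic system is active given one piece of monitoring evidence; the probabilities are hypothetical, not the elicited values from the study:

```python
def posterior_state_probability(prior_active, p_evidence_given_active,
                                p_evidence_given_inactive):
    """Single-node Bayesian update: the probability that the shallow
    magmatic system is 'active' after observing one piece of monitoring
    evidence. The numbers in the example are hypothetical, not elicited
    values from the Soufriere Hills study."""
    p_evidence = (prior_active * p_evidence_given_active
                  + (1.0 - prior_active) * p_evidence_given_inactive)
    return prior_active * p_evidence_given_active / p_evidence

# Hypothetical example: a weak prior of renewed activity, but the observed
# unrest signal is five times more likely if the system is active.
print(posterior_state_probability(0.2, 0.5, 0.1))   # ~0.56
```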

  5. Improved forecasting of volcanic activity by applying a Kalman filter to the SSEM signal. The case of the El Hierro Island eruption (October 2011)

    NASA Astrophysics Data System (ADS)

    Garcia, A.; Berrocoso, M.; Marrero, J. M.; Ortiz, R.

    2012-04-01

    The FFM (Failure Forecast Method) was developed following the eruption of Mount St. Helens and has been applied repeatedly to forecast eruptions and, more recently, to predict seismic activity in active volcanic areas. The underwater eruption of El Hierro Island was monitored from three months before its onset (October 10, 2011). This yielded a large catalogue of seismic events (over 11000) and continuously recorded seismic signals covering the entire period. Since the beginning of the seismo-volcanic crisis (July 2011), the FFM has been applied to the SSEM signal of the seismic records. Mainly because El Hierro is a very small island, the SSEM has a high noise level (traffic and oceanic noise). A Kalman filter has been used to improve the signal-to-noise ratio. The Kalman filter coefficients are adjusted using an inversion process based on the forecasting errors of the preceding twenty days. The application of this filter has led to a significant improvement in the reliability of the forecasts. The analysis of the results shows that, before the start of the eruption, 90% of the forecasts are obtained with errors of less than 10 minutes and with more than 24 hours of lead time. It is noteworthy that the method predicts the events of greater magnitude and especially the beginning of each swarm of seismic events. Once the eruption started, the efficiency of the forecasts was reduced by 50%, with a dispersion of more than one hour. This is probably due to decreased detectability caused by saturation of some of the seismic stations and a decrease in the average magnitude. However, the events of magnitude greater than 4 were predicted with an error of less than 20 minutes.
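
    A one-dimensional Kalman filter of the sort used to denoise the SSEM signal can be sketched as follows in Python; the process and measurement variances are illustrative tuning parameters, not the coefficients obtained by the inversion described above:

```python
import numpy as np

def kalman_smooth(signal, process_var=1e-3, noise_var=1.0):
    """One-dimensional random-walk Kalman filter.

    Treats the true SSEM level as a slowly varying state observed with
    additive noise. The variances here are illustrative tuning
    parameters, not the coefficients inverted in the study.
    """
    x, p = signal[0], 1.0               # state estimate and its variance
    out = np.empty_like(signal, dtype=float)
    for i, z in enumerate(signal):
        p += process_var                # predict: state assumed nearly constant
        k = p / (p + noise_var)         # Kalman gain
        x += k * (z - x)                # update with the new measurement
        p *= (1.0 - k)
        out[i] = x
    return out

rng = np.random.default_rng(1)
true_level = np.cumsum(rng.normal(0, 0.05, 500))   # synthetic slowly drifting SSEM
noisy = true_level + rng.normal(0, 1.0, 500)       # add traffic/oceanic-type noise
smoothed = kalman_smooth(noisy)
```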

  6. Global link between deformation and volcanic eruption quantified by satellite imagery

    PubMed Central

    Biggs, J.; Ebmeier, S. K.; Aspinall, W. P.; Lu, Z.; Pritchard, M. E.; Sparks, R. S. J.; Mather, T. A.

    2014-01-01

    A key challenge for volcanological science and hazard management is that few of the world’s volcanoes are effectively monitored. Satellite imagery covers volcanoes globally throughout their eruptive cycles, independent of ground-based monitoring, providing a multidecadal archive suitable for probabilistic analysis linking deformation with eruption. Here we show that, of the 198 volcanoes systematically observed for the past 18 years, 54 deformed, of which 25 also erupted. For assessing eruption potential, this high proportion of deforming volcanoes that also erupted (46%), together with the proportion of non-deforming volcanoes that did not erupt (94%), jointly represent indicators with ‘strong’ evidential worth. Using a larger catalogue of 540 volcanoes observed for 3 years, we demonstrate how this eruption–deformation relationship is influenced by tectonic, petrological and volcanic factors. Satellite technology is rapidly evolving and routine monitoring of the deformation status of all volcanoes from space is anticipated, meaning probabilistic approaches will increasingly inform hazard decisions and strategic development. PMID:24699342

  7. Sensitivity analysis of VORIS lava-flow simulations: application to Nyamulagira volcano, Democratic Republic of Congo

    NASA Astrophysics Data System (ADS)

    Syavulisembo, A. M.; Havenith, H.-B.; Smets, B.; d'Oreye, N.; Marti, J.

    2015-03-01

    Assessment and management of volcanic risk are important scientific, economic, and political issues, especially in densely populated areas threatened by volcanoes. The Virunga area in the Democratic Republic of Congo, with over 1 million inhabitants, has to cope permanently with the threat posed by the active Nyamulagira and Nyiragongo volcanoes. During the past century, Nyamulagira erupted at intervals of 1-4 years - mostly in the form of lava flows - at least 30 times. Its summit and flank eruptions lasted for periods of a few days up to more than two years, and produced lava flows sometimes reaching distances of over 20 km from the volcano, thereby affecting very large areas and having a serious impact on the region of Virunga. In order to identify a useful tool for lava flow hazard assessment at the Goma Volcano Observatory (GVO), we tested VORIS 2.0.1 (Felpeto et al., 2007), a freely available software package (http://www.gvb-csic.es) based on a probabilistic model that considers topography as the main parameter controlling lava flow propagation. We tested different Digital Elevation Models (DEMs) - SRTM1, SRTM3, and ASTER GDEM - to analyze the sensitivity of the input parameters of VORIS 2.0.1 in simulations of recent historical lava flows for which the pre-eruption topography is known. The results obtained show that VORIS 2.0.1 is a quick, easy-to-use tool for simulating lava-flow eruptions and that it replicates the eruptions tested to a high degree of accuracy. In practice, these results will be used by GVO to calibrate the VORIS model for lava flow path forecasting during new eruptions, hence contributing to better volcanic crisis management.
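
    The following Python sketch illustrates a topography-controlled probabilistic flow-path model in the spirit of VORIS, sending Monte Carlo paths downslope with transition probabilities proportional to the height drop plus a height correction; the DEM, correction value and path counts are illustrative assumptions, not the settings tested in the study:

```python
import numpy as np

def probabilistic_lava_paths(dem, vent, n_paths=500, max_steps=2000,
                             height_correction=3.0, rng=None):
    """Monte Carlo downslope lava-path sampler.

    A sketch of a topography-controlled probabilistic flow-path model in
    the spirit of VORIS: at each cell the flow moves to a lower neighbour
    with probability proportional to the height drop, after adding a
    height correction that lets the flow overcome small obstacles. The
    parameter values here are illustrative, not those used in the study.
    """
    rng = rng or np.random.default_rng()
    counts = np.zeros_like(dem, dtype=float)
    nrows, ncols = dem.shape
    neigh = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]
    for _ in range(n_paths):
        r, c = vent
        visited = {(r, c)}
        for _ in range(max_steps):
            drops, cells = [], []
            for dr, dc in neigh:
                rr, cc = r + dr, c + dc
                if 0 <= rr < nrows and 0 <= cc < ncols:
                    dh = dem[r, c] + height_correction - dem[rr, cc]
                    if dh > 0 and (rr, cc) not in visited:
                        drops.append(dh)
                        cells.append((rr, cc))
            if not cells:
                break                              # flow trapped or stalled
            probs = np.array(drops) / sum(drops)
            r, c = cells[rng.choice(len(cells), p=probs)]
            visited.add((r, c))
        for cell in visited:
            counts[cell] += 1
    return counts / n_paths                        # inundation probability per cell

# Synthetic planar DEM sloping down towards the lower-right corner
dem = np.add.outer(np.linspace(100.0, 0.0, 50), np.linspace(50.0, 0.0, 50))
prob_map = probabilistic_lava_paths(dem, vent=(5, 5), n_paths=200)
```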

  8. Modelling the dynamics and hazards of explosive eruptions: Where we are now, and confronting the next challenges (Sergey Soloviev Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Neri, Augusto

    2017-04-01

    Understanding of explosive eruption dynamics and assessment of their hazards continue to represent challenging issues for the present-day volcanology community. This is largely due to the complex and diverse nature of the phenomena, and to the variability and unpredictability of volcanic processes. Nevertheless, important and continuing progress has been made in the last few decades in understanding fundamental processes and in forecasting the occurrence of these phenomena, thanks to significant advances in field, experimental and theoretical modeling investigations. For over four decades, for example, volcanologists have made major progress in the description of the nature of explosive eruptions, considerably aided by the development, improvement, and application of physical-mathematical models. Integral steady-state homogeneous flow models were first used to investigate the different controlling mechanisms and to infer the genesis and evolution of the phenomena. Through continuous improvements and quantum-leap developments, a variety of transient, 3D, multiphase flow models of volcanic phenomena can now implement state-of-the-art formulations of the underlying physics, new-generation analytical and experimental data, as well as high-performance computational techniques. These numerical models have proved able to provide key insights into the dynamics of explosive eruptions (e.g. convective plumes, collapsing columns, pyroclastic density currents, short-lived explosions, etc.), as well as to represent a valuable tool in the quantification of potential eruptive scenarios and associated hazards. Simplified models based on a reduction of the system complexity have also proved useful, combined with Monte Carlo and statistical methods, for generating quantitative probabilistic hazard maps at different space and time scales, some including the quantification of important sources of uncertainty. Nevertheless, the development of physical models able to accurately replicate, within acceptable statistical uncertainty, the evolution of explosive eruptions remains a challenging goal still to be achieved. Testing of the developed models against large-scale experimental data and well-measured real events, real-time assimilation of observational data to forecast the nature and evolution of the process, as well as the quantification of the uncertainties affecting our system and modelling representations, appear to be the key next steps for further progress in volcanological research and its essential contribution to the mitigation of volcanic risk.

  9. Beyond eruptive scenarios: assessing tephra fallout hazard from Neapolitan volcanoes.

    PubMed

    Sandri, Laura; Costa, Antonio; Selva, Jacopo; Tonini, Roberto; Macedonio, Giovanni; Folch, Arnau; Sulpizio, Roberto

    2016-04-12

    Assessment of volcanic hazards is necessary for risk mitigation. Typically, hazard assessment is based on one or a few subjectively chosen representative eruptive scenarios, which use a specific combination of eruptive sizes and intensities to represent a particular size class of eruption. While such eruptive scenarios use a range of representative members to capture a range of eruptive sizes and intensities within a wider size class, a scenario approach neglects the intrinsic variability of volcanic eruptions: it implicitly assumes that inter-class size variability (i.e. the size difference between different eruptive size classes) dominates over intra-class size variability (i.e. the size difference within an eruptive size class), the latter being treated as negligible. So far, no quantitative study has been undertaken to verify this assumption. Here, we adopt a novel Probabilistic Volcanic Hazard Analysis (PVHA) strategy, which accounts for intrinsic eruptive variabilities, to quantify the tephra fallout hazard in the Campania area. We compare the results of the new probabilistic approach with the classical scenario approach. The results allow us to determine whether a simplified scenario approach can be considered valid, and to quantify the bias which arises when the full variability is not accounted for.

  10. Online probabilistic learning with an ensemble of forecasts

    NASA Astrophysics Data System (ADS)

    Thorey, Jean; Mallet, Vivien; Chaussin, Christophe

    2016-04-01

    Our objective is to produce a calibrated weighted ensemble to forecast a univariate time series. In addition to a meteorological ensemble of forecasts, we rely on observations or analyses of the target variable. The celebrated Continuous Ranked Probability Score (CRPS) is used to evaluate the probabilistic forecasts. However, applying the CRPS to weighted empirical distribution functions (derived from the weighted ensemble) may introduce a bias, so that minimizing the CRPS does not produce the optimal weights. We therefore propose an unbiased version of the CRPS which relies on clusters of members and is strictly proper. We adapt online learning methods for the minimization of the CRPS. These methods generate the weights associated with the members in the forecast empirical distribution function. The weights are updated before each forecast step using only past observations and forecasts. Our learning algorithms provide the theoretical guarantee that, in the long run, the CRPS of the weighted forecasts is at least as good as the CRPS of any weighted ensemble with weights constant in time. In particular, the performance of our forecast is better than that of any subset ensemble with uniform weights. A noteworthy advantage of our algorithm is that it does not require any assumption on the distributions of the observations and forecasts, both for the application and for the theoretical guarantee to hold. As an application example, using meteorological forecasts for photovoltaic production integration, we show that our algorithm generates a calibrated probabilistic forecast, with significant performance improvements on probabilistic diagnostic tools (the CRPS, the reliability diagram and the rank histogram).
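
    For reference, the standard CRPS of a weighted ensemble against a single observation can be computed from the usual expectation identity, as in the Python sketch below; as the abstract notes, this estimator is biased for finite ensembles, which is precisely what motivates the unbiased, cluster-based variant proposed in the paper (not reproduced here):

```python
import numpy as np

def crps_weighted_ensemble(members, weights, obs):
    """CRPS of a weighted empirical distribution against one observation.

    Uses the standard identity
        CRPS = sum_i w_i |x_i - y| - 0.5 * sum_ij w_i w_j |x_i - x_j|.
    This finite-ensemble estimator is biased, which motivates the
    unbiased variant proposed in the paper.
    """
    x = np.asarray(members, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    term1 = np.sum(w * np.abs(x - obs))
    term2 = 0.5 * np.sum(np.outer(w, w) * np.abs(x[:, None] - x[None, :]))
    return term1 - term2

members = [1.2, 1.5, 2.1, 2.4, 3.0]
print(crps_weighted_ensemble(members, np.ones(5), obs=2.0))
```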

  11. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (e.g. floods, droughts, extreme precipitation) are increasingly being forecast probabilistically, owing to the uncertainties in the underlying causes of the phenomena. In these forecasts, the probability of the event, over some lead time, is estimated based on model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Given this requirement for reliable forecasts to enable effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and an unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and an unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided by the use of the Poisson-Binomial distribution.
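
    A minimal Python sketch of the verification idea: build the Poisson-Binomial distribution of the number of events implied by the issued probabilities and test whether the observed event count is consistent with it; the p-value definition and example numbers are illustrative assumptions:

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """PMF of the number of successes for independent Bernoulli trials
    with different probabilities, built by iterative convolution."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def reliability_p_value(forecast_probs, n_events_observed):
    """Two-sided p-value for the hypothesis that the issued event
    probabilities are reliable, i.e. that the observed event count is
    consistent with a Poisson-Binomial distribution parameterised by the
    forecasts. A sketch of the verification idea in the abstract."""
    pmf = poisson_binomial_pmf(forecast_probs)
    p_obs = pmf[n_events_observed]
    return pmf[pmf <= p_obs].sum()   # mass of outcomes at least as unlikely

probs = [0.1, 0.3, 0.7, 0.2, 0.9, 0.4]
print(reliability_p_value(probs, n_events_observed=5))
```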

  12. Tidal Triggering of Microearthquakes Over an Eruption Cycle at 9°50'N East Pacific Rise

    NASA Astrophysics Data System (ADS)

    Tan, Yen Joe; Tolstoy, Maya; Waldhauser, Felix; Bohnenstiehl, DelWayne R.

    2018-02-01

    Studies have found that earthquake timing often correlates with tides at mid-ocean ridges and some terrestrial settings. Studies have also suggested that tidal triggering may preferentially happen when a region is critically stressed, making it a potential tool to forecast earthquakes and volcanic eruptions. We examine tidal triggering of ˜100,000 microearthquakes near 9°50'N East Pacific Rise recorded between October 2003 and January 2007, which encompasses an eruption in January 2006. This allows us to look at how tidal triggering signal varies over an eruption cycle to examine its utility as a forecasting tool. We find that tidal triggering signal is strong but does not vary systematically in the 2+ years leading up to the eruption. However, tidal triggering signal disappears immediately posteruption. Our findings suggest that tidal triggering variation may not be useful for forecasting mid-ocean ridge eruptions over a 2+ year timescale but might be useful over a longer timescale.

  13. Using Bayesian Model Averaging (BMA) to calibrate probabilistic surface temperature forecasts over Iran

    NASA Astrophysics Data System (ADS)

    Soltanzadeh, I.; Azadi, M.; Vakili, G. A.

    2011-07-01

    Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited-area models (WRF, MM5 and HRM), with WRF used in five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), and for HRM the initial and boundary conditions come from the analysis of the Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days, using a 40-day training sample of forecasts and the corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attributes diagrams. Results showed that the application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
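
    The BMA predictive distribution is a weighted mixture of kernels centred on the member forecasts; the Python sketch below evaluates such a mixture with Gaussian kernels, taking the weights and spread as given rather than estimating them by EM over a training period as done in the study (the numbers are illustrative):

```python
import numpy as np

def bma_predictive_pdf(x, member_forecasts, weights, sigma):
    """BMA predictive density as a weighted mixture of Gaussian kernels
    centred on the (bias-corrected) member forecasts. Weights and the
    common spread sigma would normally be estimated by EM over a training
    period; here they are supplied directly for illustration."""
    x = np.atleast_1d(x)[:, None]
    m = np.asarray(member_forecasts, dtype=float)[None, :]
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    kernels = np.exp(-0.5 * ((x - m) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return kernels @ w

# Hypothetical 7-member 2-m temperature ensemble (degrees C) and weights
members = np.array([21.0, 22.5, 19.8, 20.7, 23.1, 22.0, 21.4])
weights = np.array([0.2, 0.1, 0.15, 0.2, 0.1, 0.1, 0.15])
grid = np.linspace(15, 28, 5)
print(bma_predictive_pdf(grid, members, weights, sigma=1.5))
```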

  14. Communicating uncertainty in hydrological forecasts: mission impossible?

    NASA Astrophysics Data System (ADS)

    Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian

    2010-05-01

    Cascading uncertainty in meteo-hydrological modelling chains for forecasting and integrated flood risk assessment is an essential step to improve the quality of hydrological forecasts. Although the best methodology to quantify the total predictive uncertainty in hydrology is still debated, there is common agreement that one must avoid uncertainty misrepresentation and miscommunication, as well as misinterpretation of information by users. Several recent studies point out that uncertainty, when properly explained and defined, is no longer unwelcome among emergency response organizations, users of flood risk information and the general public. However, efficient communication of uncertain hydro-meteorological forecasts is far from being a resolved issue. This study focuses on the interpretation and communication of uncertain hydrological forecasts, based on (uncertain) meteorological forecasts and (uncertain) rainfall-runoff modelling approaches, to decision-makers such as operational hydrologists and water managers in charge of flood warning and scenario-based reservoir operation. An overview of the typical flow of uncertainties and risk-based decisions in hydrological forecasting systems is presented. The challenges related to the extraction of meaningful information from probabilistic forecasts and the test of its usefulness in assisting operational flood forecasting are illustrated with the help of two case studies: 1) a study on the use and communication of probabilistic flood forecasting within the European Flood Alert System; 2) a case study on the use of probabilistic forecasts by operational forecasters from the hydroelectricity company EDF in France. These examples show that attention must be paid to initiatives that promote or reinforce the active participation of expert forecasters in the forecasting chain. The practice of face-to-face forecast briefings, focusing on sharing how forecasters interpret, describe and perceive the forecast scenarios output by the models, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. Questions remaining unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission, and the suspense kept should instead act as a catalyst for overcoming the remaining challenges.

  15. A probabilistic drought forecasting framework: A combined dynamical and statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Hongxiang; Moradkhani, Hamid; Zarekarizi, Mahkameh

    In order to improve drought forecasting skill, this study develops a probabilistic drought forecasting framework comprising dynamical and statistical modeling components. The novelty of this study is the use of data assimilation to quantify initial condition uncertainty with Monte Carlo ensemble members, rather than relying entirely on the hydrologic model or land surface model to generate a single deterministic initial condition, as currently implemented in operational drought forecasting systems. Next, the initial condition uncertainty is quantified through data assimilation and coupled with a newly developed probabilistic drought forecasting model using a copula function. The initial conditions at each forecast start date are sampled from the data assimilation ensembles for forecast initialization. Finally, seasonal drought forecasting products are generated with the updated initial conditions. This study introduces the theory behind the proposed drought forecasting system, with an application in the Columbia River Basin, Pacific Northwest, United States. Results from both synthetic and real case studies suggest that the proposed drought forecasting system significantly improves seasonal drought forecasting skill and can facilitate state drought preparation and declaration at least three months before the official state drought declaration.

  16. Understanding and forecasting phreatic eruptions driven by magmatic degassing

    NASA Astrophysics Data System (ADS)

    Stix, John; de Moor, J. Maarten

    2018-05-01

    This paper examines phreatic eruptions which are driven by inputs of magma and magmatic gas. We synthesize data from several significant phreatic systems, including two in Costa Rica (Turrialba and Poás) which are currently highly active and hazardous. We define two endmember types of phreatic eruptions: the first (type 1), in which a deeper hydrothermal system fed by magmatic gases is sealed and produces overpressure sufficient to drive explosive eruptions, and the second (type 2), where magmatic gases are supplied via open-vent degassing to a near-surface hydrothermal system, vaporizing liquid water which drives the phreatic eruptions. Type 2 eruptions characteristically have a near-surface source, whereas type 1 eruptions commonly originate at greater depth; hence, type 1 eruptions tend to be more energetic than type 2 eruptions. The first type of eruption we term "phreato-vulcanian", and the second we term "phreato-surtseyan". Some systems (e.g., Ruapehu, Poás) can produce both type 1 and type 2 eruptions, and all systems can undergo sealing at various timescales. We examine a number of precursory signals which appear to be important in understanding and forecasting phreatic eruptions; these include very long period events, banded tremor, and gas ratios, in particular H2S/SO2 and CO2/SO2. We propose that if these datasets are carefully integrated during a monitoring program, it may be possible to accurately forecast phreatic eruptions.

  17. Exploring the calibration of a wind forecast ensemble for energy applications

    NASA Astrophysics Data System (ADS)

    Heppelmann, Tobias; Ben Bouallegue, Zied; Theis, Susanne

    2015-04-01

    In the German research project EWeLiNE, Deutscher Wetterdienst (DWD) and the Fraunhofer Institute for Wind Energy and Energy System Technology (IWES) are collaborating with three German Transmission System Operators (TSOs) in order to provide the TSOs with improved probabilistic power forecasts. Probabilistic power forecasts are derived from probabilistic weather forecasts, themselves derived from ensemble prediction systems (EPS). Since the considered raw ensemble wind forecasts suffer from underdispersiveness and bias, calibration methods are developed for the correction of the model bias and the ensemble spread bias. The overall aim is to improve the ensemble forecasts such that the uncertainty of the possible weather development is depicted by the ensemble spread from the first forecast hours. Additionally, the ensemble members after calibration should remain physically consistent scenarios. We focus on probabilistic hourly wind forecasts with a horizon of 21 h delivered by the convection-permitting high-resolution ensemble system COSMO-DE-EPS, which became operational in 2012 at DWD. The ensemble consists of 20 members driven by four different global models. The model area includes the whole of Germany and parts of Central Europe with a horizontal resolution of 2.8 km and a vertical resolution of 50 model levels. For verification we use wind mast measurements at around 100 m height, which corresponds to the hub height of the wind turbines belonging to wind farms within the model area. Calibration of the ensemble forecasts can be performed by different statistical methods applied to the raw ensemble output. Here, we explore local bivariate Ensemble Model Output Statistics at individual sites and quantile regression with different predictors. Applying different methods, we show an improvement of the ensemble wind forecasts from COSMO-DE-EPS for energy applications. In addition, an ensemble copula coupling approach transfers the time-dependencies of the raw ensemble to the calibrated ensemble. The calibrated wind forecasts are evaluated first with univariate probabilistic scores and additionally with diagnostics of wind ramps in order to assess the time-consistency of the calibrated ensemble members.

  18. Against all odds -- Probabilistic forecasts and decision making

    NASA Astrophysics Data System (ADS)

    Liechti, Katharina; Zappa, Massimiliano

    2015-04-01

    In the city of Zurich (Switzerland), the setting is such that the damage potential due to flooding of the river Sihl is estimated at about 5 billion US dollars. The flood forecasting system that is used by the administration for decision making has run continuously since 2007. It has a maximum time horizon of five days and operates at hourly time steps. The flood forecasting system includes three different model chains. Two of these are driven by the deterministic NWP models COSMO-2 and COSMO-7 and one is driven by the probabilistic NWP model COSMO-LEPS. The model chains have been consistent since February 2010, so five full years are available for the evaluation of the system. The system was evaluated continuously and is a very nice example with which to present the added value that lies in probabilistic forecasts. The forecasts are available to the decision makers on an online platform. Several graphical representations of the forecasts and the forecast history are available to support decision making and to rate the current situation. The communication between forecasters and decision-makers is quite close. In short, an ideal situation. However, an event, or rather a non-event, in summer 2014 showed that knowing about the general superiority of probabilistic forecasts does not necessarily mean that the decisions taken in a specific situation will be based on that probabilistic forecast. Some years of experience allow confidence in the system to be gained, both for the forecasters and for the decision-makers. Even if, from a theoretical point of view, the handling of crisis situations is well designed, a first event demonstrated that the dialogue with the decision-makers still lacks exercise during such situations. We argue that a false alarm is a needed experience to consolidate real-time emergency procedures relying on ensemble predictions. A missed event would probably also fit, but, in our case, we are very happy not to have to report on that option.

  19. Probability hazard map for future vent opening at Etna volcano (Sicily, Italy).

    NASA Astrophysics Data System (ADS)

    Brancato, Alfonso; Tusa, Giuseppina; Coltelli, Mauro; Proietti, Cristina

    2014-05-01

    Mount Etna is a composite stratovolcano located along the Ionian coast of eastern Sicily. The frequent occurrence of flank eruptions (at intervals of years, mostly concentrated along the NE, S and W rift zones) leads to a high volcanic hazard that, combined with intense urbanization, poses a high volcanic risk. A long-term volcanic hazard assessment, mainly based on the past behaviour of the Etna volcano, is the basic tool for the evaluation of this risk. A reliable forecast of where the next eruption will occur is therefore needed. Computer-assisted analysis and probabilistic evaluation provide the corresponding map, allowing identification of the areas prone to the highest hazard. On these grounds, the use of a code such as BET_EF (Bayesian Event Tree for Eruption Forecasting) shows that a suitable analysis can be performed (Selva et al., 2012). In the analysis we are performing, a total of 6886 point vents from the last 4.0 ka of Etna flank activity, spread over an area of 744 km2 (divided into N = 2976 square cells with sides of 500 m), allowed us to estimate a probability density function by applying a Gaussian kernel. The probability values represent a complete set of mutually exclusive outcomes whose sum is normalized to one over the investigated area; hence the basic assumptions of a Dirichlet distribution (the prior distribution used in the BET_EF code; Marzocchi et al., 2004, 2008) still hold. One fundamental parameter is the equivalent number of data, which expresses our confidence in the best-guess probability. The BET_EF code also works with a likelihood function, modelled by a Multinomial distribution with parameters representing the number of vents in each cell and the total number of past data (i.e. the 6886 point vents). Given the grid of N cells, the final posterior distribution is evaluated by combining the a priori Dirichlet probability distribution with the past data in each cell through the likelihood. The probability hazard map shows that vent-opening probability concentrates along the NE and S rifts, as well as in the Valle del Bove, increasing the difference in probability between these areas and the rest of the volcanic edifice. It is worth noting that a higher probability is still evident along the W rift, although not comparable with those of the above-mentioned areas. References Marzocchi W., Sandri L., Gasparini P., Newhall C. and Boschi E.; 2004: Quantifying probabilities of volcanic events: The example of volcanic hazard at Mount Vesuvius, J. Geophys. Res., 109, B11201, doi:10.1029/2004JB003155. Marzocchi W., Sandri L. and Selva J.; 2008: BET_EF: a probabilistic tool for long- and short-term eruption forecasting, Bull. Volcanol., 70, 623-632, doi:10.1007/s00445-007-0157-y. Selva J., Orsi G., Di Vito M.A., Marzocchi W. and Sandri L.; 2012: Probability hazard map for future vent opening at the Campi Flegrei caldera, Italy, Bull. Volcanol., 74, 497-510, doi:10.1007/s00445-011-0528-2.
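
    The kernel-density step described above can be sketched in Python as a Gaussian smoothing of past vent locations normalized to a probability per cell; the bandwidth, grid and synthetic vent coordinates are illustrative assumptions, not the study's values:

```python
import numpy as np

def vent_opening_probability(vents_xy, grid_x, grid_y, bandwidth_m=1000.0):
    """Gaussian-kernel spatial density of past vents, normalized so the
    cell probabilities sum to one over the study area.

    A sketch of the kernel-density step described in the abstract; the
    bandwidth and grid here are illustrative, not the study's values.
    """
    gx, gy = np.meshgrid(grid_x, grid_y)
    cells = np.column_stack([gx.ravel(), gy.ravel()])
    dens = np.zeros(len(cells))
    for vx, vy in vents_xy:
        d2 = (cells[:, 0] - vx) ** 2 + (cells[:, 1] - vy) ** 2
        dens += np.exp(-0.5 * d2 / bandwidth_m ** 2)
    prob = dens / dens.sum()            # complete, mutually exclusive set of cells
    return prob.reshape(gy.shape)

rng = np.random.default_rng(2)
vents = rng.normal(loc=[0, 0], scale=[3000, 6000], size=(200, 2))  # synthetic vents
x = np.arange(-10000, 10001, 500.0)     # 500 m cells, as in the abstract
y = np.arange(-10000, 10001, 500.0)
pmap = vent_opening_probability(vents, x, y)
```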

  20. A parimutuel gambling perspective to compare probabilistic seismicity forecasts

    NASA Astrophysics Data System (ADS)

    Zechar, J. Douglas; Zhuang, Jiancang

    2014-10-01

    Using analogies to gaming, we consider the problem of comparing multiple probabilistic seismicity forecasts. To measure relative model performance, we suggest a parimutuel gambling perspective which addresses shortcomings of other methods such as likelihood ratio, information gain and Molchan diagrams. We describe two variants of the parimutuel approach for a set of forecasts: head-to-head, in which forecasts are compared in pairs, and round table, in which all forecasts are compared simultaneously. For illustration, we compare the 5-yr forecasts of the Regional Earthquake Likelihood Models experiment for M4.95+ seismicity in California.
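
    A hedged sketch of the head-to-head parimutuel idea as described in the abstract: both forecasters stake a unit bet per bin split according to their stated probabilities, and the pool is shared among the bets placed on the realized outcome; the scoring details below are our reading of the concept, not necessarily the authors' exact formulae:

```python
import numpy as np

def head_to_head_parimutuel(p_a, p_b, outcomes):
    """Head-to-head parimutuel comparison of two probabilistic forecasts.

    In each bin both forecasters place a unit bet split according to
    their stated event probability; the pool is then shared among bets
    on the outcome that occurred. Returns the average net gain of
    forecaster A (forecaster B's gain is the negative). A sketch of the
    parimutuel idea described in the abstract, not the authors' exact
    scoring formulae.
    """
    p_a, p_b = np.asarray(p_a, float), np.asarray(p_b, float)
    o = np.asarray(outcomes, int)                 # 1 if an event occurred, else 0
    stake_a = np.where(o == 1, p_a, 1.0 - p_a)    # A's bet on the realized outcome
    stake_b = np.where(o == 1, p_b, 1.0 - p_b)
    payoff_a = 2.0 * stake_a / (stake_a + stake_b)
    return np.mean(payoff_a - 1.0)                # > 0 means A outperforms B

print(head_to_head_parimutuel([0.2, 0.7, 0.1], [0.5, 0.5, 0.5], [0, 1, 0]))
```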

  1. Towards real-time eruption forecasting in the Auckland Volcanic Field: application of BET_EF during the New Zealand National Disaster Exercise `Ruaumoko'

    NASA Astrophysics Data System (ADS)

    Lindsay, Jan; Marzocchi, Warner; Jolly, Gill; Constantinescu, Robert; Selva, Jacopo; Sandri, Laura

    2010-03-01

    The Auckland Volcanic Field (AVF) is a young basaltic field that lies beneath the urban area of Auckland, New Zealand’s largest city. Over the past 250,000 years the AVF has produced at least 49 basaltic centers; the last eruption was only 600 years ago. In recognition of the high risk associated with a possible future eruption in Auckland, the New Zealand government ran Exercise Ruaumoko in March 2008, a test of New Zealand’s nation-wide preparedness for responding to a major disaster resulting from a volcanic eruption in Auckland City. The exercise scenario was developed in secret, and covered the period of precursory activity up until the eruption. During Exercise Ruaumoko we adapted a recently developed statistical code for eruption forecasting, namely BET_EF (Bayesian Event Tree for Eruption Forecasting), to independently track the unrest evolution and to forecast the most likely onset time, location and style of the initial phase of the simulated eruption. The code was set up before the start of the exercise by entering reliable information on the past history of the AVF as well as the monitoring signals expected in the event of magmatic unrest and an impending eruption. The average probabilities calculated by BET_EF during Exercise Ruaumoko corresponded well to the probabilities subjectively (and independently) estimated by the advising scientists (differences of a few percentage points), and provided a sound forecast of the timing (before the event, the eruption probability reached 90%) and location of the eruption. This application of BET_EF to a volcanic field that has experienced no historical activity and for which otherwise limited prior information is available shows its versatility and potential usefulness as a tool to aid decision-making for a wide range of volcano types. Our near real-time application of BET_EF during Exercise Ruaumoko highlighted its potential to clarify and possibly optimize decision-making procedures in a future AVF eruption crisis, and as a rational starting point for discussions in a scientific advisory group. It also stimulated valuable scientific discussion around how a future AVF eruption might progress, and highlighted areas of future volcanological research that would reduce epistemic uncertainties through the development of better input models.

  2. Statistical forecasting of repetitious dome failures during the waning eruption of Redoubt Volcano, Alaska, February-April 1990

    USGS Publications Warehouse

    Page, R.A.; Lahr, J.C.; Chouet, B.A.; Power, J.A.; Stephens, C.D.

    1994-01-01

    The waning phase of the 1989-1990 eruption of Redoubt Volcano in the Cook Inlet region of south-central Alaska comprised a quasi-regular pattern of repetitious dome growth and destruction that lasted from February 15 to late April 1990. The dome failures produced ash plumes hazardous to airline traffic. In response to this hazard, the Alaska Volcano Observatory sought to forecast these ash-producing events using two approaches. One approach built on early successes in issuing warnings before major eruptions on December 14, 1989 and January 2, 1990. These warnings were based largely on changes in seismic activity related to the occurrence of precursory swarms of long-period seismic events. The search for precursory swarms of long-period seismicity was continued through the waning phase of the eruption and led to warnings before tephra eruptions on March 23 and April 6. The observed regularity of dome failures after February 15 suggested that a statistical forecasting method based on a constant-rate failure model might also be successful. The first statistical forecast was issued on March 16 after seven events had occurred, at an average interval of 4.5 days. At this time, the interval between dome failures abruptly lengthened. Accordingly, the forecast was unsuccessful and further forecasting was suspended until the regularity of subsequent failures could be confirmed. Statistical forecasting resumed on April 12, after four dome failure episodes separated by an average of 7.8 days. One dome failure (April 15) was successfully forecast using a 70% confidence window, and a second event (April 21) was narrowly missed before the end of the activity. The cessation of dome failures after April 21 resulted in a concluding false alarm. Although forecasting success during the eruption was limited, retrospective analysis shows that early and consistent application of the statistical method using a constant-rate failure model and a 90% confidence window could have yielded five successful forecasts and two false alarms; no events would have been missed. On closer examination, the intervals between successive dome failures are not uniform but tend to increase with time. This increase attests to the continuous, slowly decreasing supply of magma to the surface vent during the waning phase of the eruption. The domes formed in a precarious position in a breach in the summit crater rim where they were susceptible to gravitational collapse. The instability of the February 15-April 21 domes relative to the earlier domes is attributed to reaming the lip of the vent by a laterally directed explosion during the major dome-destroying eruption of February 15, a process which would leave a less secure foundation for subsequent domes.
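
    The statistical element of such forecasting can be sketched as follows: estimate the mean repose interval from the preceding failures and issue a confidence window for the next one. The Python sketch below assumes, purely for illustration, that the intervals are approximately normally distributed; it is not the observatory's actual procedure:

```python
import numpy as np
from scipy import stats

def forecast_window(event_times, confidence=0.90):
    """Forecast window for the next dome failure from a quasi-regular series.

    Assumes the repose intervals scatter around a constant mean rate and,
    for illustration, that they are approximately normally distributed;
    the window is the central `confidence` interval for the next failure
    time. A sketch of the statistical idea in the abstract, not the
    observatory's exact procedure.
    """
    times = np.sort(np.asarray(event_times, dtype=float))
    intervals = np.diff(times)
    lo, hi = stats.norm.interval(confidence,
                                 loc=intervals.mean(),
                                 scale=intervals.std(ddof=1))
    return times[-1] + lo, times[-1] + hi

# Hypothetical dome-failure dates expressed in days since an arbitrary origin
events = [0.0, 4.3, 9.1, 13.4, 18.2, 22.6, 27.3]
print(forecast_window(events, confidence=0.70))
```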

  3. Integrating geological and geophysical data to improve probabilistic hazard forecasting of Arabian Shield volcanism

    NASA Astrophysics Data System (ADS)

    Runge, Melody G.; Bebbington, Mark S.; Cronin, Shane J.; Lindsay, Jan M.; Moufti, Mohammed R.

    2016-02-01

    During probabilistic volcanic hazard analysis of volcanic fields, a greater variety of spatial data on crustal features should help improve forecasts of future vent locations. Without further examination, however, geophysical estimates of crustal or other features may be non-informative. Here, we present a new, robust, non-parametric method to quantitatively determine the existence of any relationship between natural phenomena (e.g., volcanic eruptions) and a variety of geophysical data. This provides a new validation tool for incorporating a range of potentially hazard-diagnostic observable data into recurrence rate estimates and hazard analyses. Through this study it is shown that the locations of Cenozoic volcanic fields across the Arabian Shield appear to be related to the locations of major and minor faults, to higher elevations, and to regions where gravity anomaly values are between -125 mGal and 0 mGal. These findings support earlier hypotheses that the western shield uplift was related to Cenozoic volcanism. At the harrat (volcanic field) scale, higher vent density regions are related to both elevation and gravity anomaly values. A by-product of this work is the collection of existing data on volcanism across Saudi Arabia, with all vent locations provided herein, as well as updated maps for Harrats Kura, Khaybar, Ithnayn, Kishb, and Rahat. This work also highlights the potential dangers of assuming relationships between observed data and the occurrence of a natural phenomenon without quantitative assessment or proper consideration of the effects of data resolution.

  4. Forecasting deflation, intrusion and eruption at inflating volcanoes

    NASA Astrophysics Data System (ADS)

    Blake, Stephen; Cortés, Joaquín A.

    2018-01-01

    A principal goal of volcanology is to successfully forecast the start of volcanic eruptions. This paper introduces a general forecasting method, which relies on a stream of monitoring data and a statistical description of a given threshold criterion for an eruption to start. Specifically we investigate the timing of intrusive and eruptive events at inflating volcanoes. The gradual inflation of the ground surface is a well-known phenomenon at many volcanoes and is attributable to pressurised magma accumulating within a shallow chamber. Inflation usually culminates in a rapid deflation event caused by magma escaping from the chamber to produce a shallow intrusion and, in some cases, a volcanic eruption. We show that the ground elevation during 15 inflation periods at Krafla volcano, Iceland, increased with time towards a limiting value by following a decaying exponential with characteristic timescale τ. The available data for Krafla, Kilauea and Mauna Loa volcanoes show that the duration of inflation (t*) is approximately equal to τ. The distribution of t* / τ values follows a log-logistic distribution in which the central 60% of the data lie between 0.99

  5. Probabilistic-numerical assessment of pyroclastic current hazard at Campi Flegrei and Naples city: Multi-VEI scenarios as a tool for "full-scale" risk management.

    PubMed

    Mastrolorenzo, Giuseppe; Palladino, Danilo M; Pappalardo, Lucia; Rossano, Sergio

    2017-01-01

    The Campi Flegrei volcanic field (Italy) poses very high risk to the highly urbanized Neapolitan area. Its eruptive history was dominated by explosive activity producing pyroclastic currents (hereafter PCs) ranging in scale from localized base surges to regional flows. Here we apply probabilistic numerical simulation approaches to produce PC hazard maps, based on a comprehensive spectrum of flow properties and vent locations. These maps are incorporated in a Geographic Information System (GIS) and cover all probable Volcanic Explosivity Index (VEI) scenarios from different source vents in the caldera, relevant for risk management planning. For each VEI scenario, we report the conditional probability for PCs (i.e., the probability for a given area to be affected by the passage of PCs in case of a PC-forming explosive event) and the related dynamic pressure. Model results indicate that PCs from VEI<4 events would be confined within the Campi Flegrei caldera, PC propagation being impeded by the northern and eastern caldera walls. Conversely, PCs from VEI 4-5 events could invade a wide area beyond the northern caldera rim, as well as part of the Naples metropolitan area to the east. A major controlling factor of PC dispersal is the location of the vent area. PCs from the potentially largest eruption scenarios (analogous to the ~15 ka, VEI 6 Neapolitan Yellow Tuff or even the ~39 ka, VEI 7 Campanian Ignimbrite extreme event) would affect a large part of the Campanian Plain to the north and the city of Naples to the east. Thus, in case of renewed eruptive activity at Campi Flegrei, up to 3 million people would potentially be exposed to volcanic hazard, underscoring the urgency of an emergency plan. Considering the present level of uncertainty in forecasting the type, size and location of a future eruption (essentially based on statistical analysis of previous activity), we suggest that appropriate planning measures should address at least the VEI 5 reference scenario (at least 2 occurrences documented in the last 10 ka).
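
As an illustration of how vent-conditional results of this kind can feed risk planning, the sketch below (with entirely hypothetical probabilities, not the paper's simulation output) marginalises conditional PC-invasion probabilities over vent sectors and VEI classes to obtain a single probability that a target area is reached, given a PC-forming eruption.

```python
# A minimal sketch with hypothetical numbers: combine conditional PC-invasion
# probabilities (per vent sector and VEI class) into a single probability that
# a target area is reached, given that a PC-forming eruption occurs.
import numpy as np

vents = ["western caldera", "central caldera", "eastern caldera"]
vei_classes = [3, 4, 5]

p_vent = np.array([0.3, 0.5, 0.2])     # P(vent sector | eruption), hypothetical
p_vei = np.array([0.6, 0.3, 0.1])      # P(VEI class | eruption), hypothetical

# P(target area invaded | vent sector, VEI class), hypothetical model output
p_hit = np.array([[0.00, 0.10, 0.45],
                  [0.02, 0.25, 0.70],
                  [0.05, 0.40, 0.85]])  # rows: vent sectors, cols: VEI classes

# Marginalise over vent location and eruption size
p_area_hit = float(p_vent @ p_hit @ p_vei)
print(f"P(target area reached by PCs | PC-forming eruption) = {p_area_hit:.3f}")
```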

  7. Probabilistic Solar Wind and Geomagnetic Forecasting Using an Analogue Ensemble or "Similar Day" Approach

    NASA Astrophysics Data System (ADS)

    Owens, M. J.; Riley, P.; Horbury, T. S.

    2017-05-01

    Effective space-weather prediction and mitigation requires accurate forecasting of near-Earth solar-wind conditions. Numerical magnetohydrodynamic models of the solar wind, driven by remote solar observations, are gaining skill at forecasting the large-scale solar-wind features that give rise to near-Earth variations over days and weeks. There remains a need for accurate short-term (hours to days) solar-wind forecasts, however. In this study we investigate the analogue ensemble (AnEn), or "similar day", approach that was developed for atmospheric weather forecasting. The central premise of the AnEn is that past variations that are analogous or similar to current conditions can be used to provide a good estimate of future variations. By considering an ensemble of past analogues, the AnEn forecast is inherently probabilistic and provides a measure of the forecast uncertainty. We show that forecasts of solar-wind speed can be improved by considering both speed and density when determining past analogues, whereas forecasts of the out-of-ecliptic magnetic field [BN] are improved by also considering the in-ecliptic magnetic-field components. In general, the best forecasts are found by considering only the previous 6 - 12 hours of observations. Using these parameters, the AnEn provides a valuable probabilistic forecast for solar-wind speed, density, and in-ecliptic magnetic field over lead times from a few hours to around four days. For BN, which is central to space-weather disturbance, the AnEn only provides a valuable forecast out to around six to seven hours. As the inherent predictability of this parameter is low, this is still likely a marked improvement over other forecast methods. We also investigate the use of the AnEn in forecasting geomagnetic indices Dst and Kp. The AnEn provides a valuable probabilistic forecast of both indices out to around four days. We outline a number of future improvements to AnEn forecasts of near-Earth solar-wind and geomagnetic conditions.
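
A minimal sketch of the analogue-ensemble idea follows, using a single synthetic variable (the study finds that matching on several variables, e.g. speed and density, works better); the names and parameters are illustrative.

```python
# A minimal sketch of the analogue-ensemble idea: find the k past trailing
# windows most similar to the latest observations and use the values that
# followed them as a probabilistic forecast.
import numpy as np

def analogue_ensemble(history, window=12, lead=6, k=20):
    """history: 1-D array of hourly observations (e.g. solar-wind speed).
    Returns a (k, lead) array of analogue forecasts for the next `lead` hours."""
    current = history[-window:]
    # Candidate trailing windows must leave room for a `lead`-long outcome
    stops = np.arange(window, len(history) - lead)
    distances = np.array([np.linalg.norm(history[s - window:s] - current) for s in stops])
    best = stops[np.argsort(distances)[:k]]
    return np.stack([history[s:s + lead] for s in best])

rng = np.random.default_rng(1)
speed = 400 + 50 * np.sin(np.arange(5000) / 40.0) + rng.normal(0, 20, 5000)  # synthetic series

ens = analogue_ensemble(speed, window=12, lead=6, k=20)
print("ensemble mean:", ens.mean(axis=0).round(1))
print("10-90% range :", np.percentile(ens, [10, 90], axis=0).round(1))
```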

  8. Generating short-term probabilistic wind power scenarios via nonparametric forecast error density estimators

    DOE PAGES

    Staid, Andrea; Watson, Jean-Paul; Wets, Roger J.-B.; ...

    2017-07-11

    Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured. We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Here, our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.
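
The sketch below illustrates the scenario-generation idea only; a Gaussian kernel density estimate stands in for the epi-spline error densities used in the paper, and the data are synthetic.

```python
# Illustrative sketch only: a Gaussian KDE stands in for the paper's epi-spline
# error densities, just to show the scenario-generation idea (sample forecast
# errors and add them to the point forecast).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)

# Hypothetical historical data: point forecasts and observed wind power (MW)
forecast_hist = rng.uniform(0, 100, size=2000)
observed_hist = np.clip(forecast_hist + rng.gamma(2.0, 5.0, 2000) - 10.0, 0, 120)
errors = observed_hist - forecast_hist          # skewed, as in real data

error_density = gaussian_kde(errors)            # nonparametric error density

def generate_scenarios(point_forecast, n_scenarios=100):
    """Sample wind-power scenarios (MW) around a point forecast."""
    sampled_errors = error_density.resample(n_scenarios, seed=3)[0]
    return np.clip(point_forecast + sampled_errors, 0.0, None)

scenarios = generate_scenarios(point_forecast=55.0, n_scenarios=10)
print(np.round(scenarios, 1))
```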

  9. Probabilistic versus deterministic skill in predicting the western North Pacific-East Asian summer monsoon variability with multimodel ensembles

    NASA Astrophysics Data System (ADS)

    Yang, Xiu-Qun; Yang, Dejian; Xie, Qian; Zhang, Yaocun; Ren, Xuejuan; Tang, Youmin

    2017-04-01

    Based on historical forecasts of three quasi-operational multi-model ensemble (MME) systems, this study assesses the superiority of the coupled MME over the contributing single-model ensembles (SMEs) and over the uncoupled atmospheric MME in predicting the Western North Pacific-East Asian summer monsoon variability. The probabilistic and deterministic forecast skills are measured by the Brier skill score (BSS) and anomaly correlation (AC), respectively. A forecast-format-dependent MME superiority over the SMEs is found. The probabilistic forecast skill of the MME is always significantly better than that of each SME, while the deterministic forecast skill of the MME can be lower than that of some SMEs. The MME superiority arises from both the model diversity and the ensemble size increase in the tropics, and primarily from the ensemble size increase in the subtropics. The BSS is composed of reliability and resolution, two attributes characterizing probabilistic forecast skill. The probabilistic skill increase of the MME is dominated by the dramatic improvement in reliability, while resolution is not always improved, similar to AC. A monotonic resolution-AC relationship is further found and qualitatively explained, whereas little relationship can be identified between reliability and AC. It is argued that the MME's success in improving reliability arises from an effective reduction of the overconfidence in forecast distributions. Moreover, seasonal predictions with the coupled MME are shown to be more skillful than those with the uncoupled atmospheric MME forced by persisted sea surface temperature (SST) anomalies, since the coupled MME better predicts the SST anomaly evolution in three key regions.
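
For readers unfamiliar with the reliability and resolution attributes discussed here, the following self-contained sketch computes the Brier score and its standard Murphy decomposition on synthetic probability forecasts.

```python
# A self-contained sketch of the Brier score and its Murphy decomposition into
# reliability, resolution and uncertainty (the attributes discussed above).
import numpy as np

def brier_decomposition(p, o, bins=10):
    """p: forecast probabilities in [0,1]; o: binary outcomes (0/1).
    Returns (brier_score, reliability, resolution, uncertainty)."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    n = len(p)
    bs = np.mean((p - o) ** 2)
    obar = o.mean()
    uncertainty = obar * (1.0 - obar)
    edges = np.linspace(0.0, 1.0, bins + 1)
    idx = np.clip(np.digitize(p, edges) - 1, 0, bins - 1)
    reliability = resolution = 0.0
    for k in range(bins):
        mask = idx == k
        nk = mask.sum()
        if nk == 0:
            continue
        pk = p[mask].mean()      # mean forecast probability in bin k
        ok = o[mask].mean()      # observed relative frequency in bin k
        reliability += nk * (pk - ok) ** 2 / n
        resolution += nk * (ok - obar) ** 2 / n
    return bs, reliability, resolution, uncertainty

rng = np.random.default_rng(3)
truth_p = rng.uniform(0, 1, 5000)
outcomes = rng.binomial(1, truth_p)
forecasts = np.clip(truth_p + rng.normal(0, 0.1, 5000), 0, 1)   # slightly noisy forecasts

bs, rel, res, unc = brier_decomposition(forecasts, outcomes)
# Within binning effects, BS ~ REL - RES + UNC, and BSS = 1 - BS/UNC
print(f"BS={bs:.4f}  REL={rel:.4f}  RES={res:.4f}  UNC={unc:.4f}  BSS={1 - bs/unc:.3f}")
```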

  11. Beyond eruptive scenarios: assessing tephra fallout hazard from Neapolitan volcanoes

    PubMed Central

    Sandri, Laura; Costa, Antonio; Selva, Jacopo; Tonini, Roberto; Macedonio, Giovanni; Folch, Arnau; Sulpizio, Roberto

    2016-01-01

    Assessment of volcanic hazards is necessary for risk mitigation. Typically, hazard assessment is based on one or a few subjectively chosen representative eruptive scenarios, which use a specific combination of eruptive sizes and intensities to represent a particular size class of eruption. While such eruptive scenarios use a range of representative members to capture a range of eruptive sizes and intensities in order to reflect a wider size class, a scenario approach neglects the intrinsic variability of volcanic eruptions and implicitly assumes that inter-class size variability (i.e. size differences between eruptive size classes) dominates over intra-class size variability (i.e. size differences within an eruptive size class), the latter of which is treated as negligible. So far, no quantitative study has been undertaken to verify this assumption. Here, we adopt a novel Probabilistic Volcanic Hazard Analysis (PVHA) strategy, which accounts for intrinsic eruptive variability, to quantify the tephra fallout hazard in the Campania area. We compare the results of the new probabilistic approach with the classical scenario approach. The results allow us to determine whether a simplified scenario approach can be considered valid, and to quantify the bias which arises when the full variability is not accounted for. PMID:27067389

  12. Detecting and Characterizing Repeating Earthquake Sequences During Volcanic Eruptions

    NASA Astrophysics Data System (ADS)

    Tepp, G.; Haney, M. M.; Wech, A.

    2017-12-01

    A major challenge in volcano seismology is forecasting eruptions. Repeating earthquake sequences often precede volcanic eruptions or lava dome activity, providing an opportunity for short-term eruption forecasting. Automatic detection of these sequences can lead to timely eruption notification and aid in continuous monitoring of volcanic systems. However, repeating earthquake sequences may also occur after eruptions or alongside magma intrusions that do not immediately lead to an eruption. This additional challenge requires a better understanding of the processes involved in producing these sequences in order to distinguish those that are precursory. Calculation of the inverse moment rate and concepts from the material failure forecast method can lead to such insights. The temporal evolution of the inverse moment rate is observed to differ for precursory and non-precursory sequences, and multiple earthquake sequences may occur concurrently. These observations suggest that sequences may occur in different locations or through different processes. We developed an automated repeating earthquake sequence detector and near real-time alarm to send alerts when an in-progress sequence is identified. Near real-time inverse moment rate measurements can further improve our ability to forecast eruptions by allowing sequences to be characterized. We apply the detector to eruptions of two Alaskan volcanoes: Bogoslof in 2016-2017 and Redoubt Volcano in 2009. The Bogoslof eruption produced almost 40 repeating earthquake sequences between its start in mid-December 2016 and early June 2017, 21 of which preceded an explosive eruption, along with 2 sequences in the months before eruptive activity began. Three of the sequences occurred after the implementation of the alarm in late March 2017 and successfully triggered alerts. The nearest seismometers to Bogoslof are over 45 km away, requiring a detector that can work with few stations and a relatively low signal-to-noise ratio. During the Redoubt eruption, earthquake sequences were observed in the months leading up to the eruptive activity beginning in March 2009 as well as immediately preceding 7 of the 19 explosive events. In contrast to Bogoslof, Redoubt has a local monitoring network, which allows for better detection and more detailed analysis of the repeating earthquake sequences.
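
The material failure forecast method referred to above is often applied, in its simplest form, by extrapolating a linearly declining inverse rate to zero; the sketch below demonstrates that idea on synthetic data (it is not the authors' detector or alarm implementation).

```python
# A minimal sketch of the material failure forecast method (FFM) idea: when the
# inverse event (or moment) rate declines roughly linearly with time,
# extrapolating it to zero gives a forecast of the failure/eruption time.
import numpy as np

def ffm_forecast(times, rates):
    """Fit a straight line to the inverse rate and return its zero crossing."""
    inv_rate = 1.0 / np.asarray(rates, float)
    slope, intercept = np.polyfit(times, inv_rate, 1)
    if slope >= 0:
        raise ValueError("inverse rate is not decreasing; no failure forecast")
    return -intercept / slope          # time at which 1/rate extrapolates to zero

# Synthetic accelerating sequence: rate ~ 1/(t_f - t) with t_f = 10.0 days
t_f_true = 10.0
t = np.linspace(2.0, 8.0, 25)
rates = 1.0 / (t_f_true - t) * (1.0 + 0.05 * np.random.default_rng(4).normal(size=t.size))

print(f"forecast failure time: {ffm_forecast(t, rates):.2f} days (true {t_f_true})")
```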

  13. WIPCast: Probabilistic Forecasting for Aviation Decision Aid Applications

    DTIC Science & Technology

    2011-06-01

    traders, or families planning an outing – manage weather-related risk. By quantifying risk, probabilistic forecasting enables optimization of actions via... confidence interval to the user’s risk tolerance helps drive highly effective and innovative decision support mechanisms for visually quantifying risk for

  14. Probabilistic forecasting for extreme NO2 pollution episodes.

    PubMed

    Aznarte, José L

    2017-10-01

    In this study, we investigate the suitability of quantile regression for predicting extreme concentrations of NO2. In contrast to the usual point forecasting, where a single value is forecast for each horizon, probabilistic forecasting through quantile regression allows for the prediction of the full probability distribution, which in turn allows us to build models specifically fitted to the tails of this distribution. Using data from the city of Madrid, including NO2 concentrations as well as meteorological measurements, we build models that predict extreme NO2 concentrations, outperforming point-forecasting alternatives, and we show that the predictions are accurate, reliable and sharp. In addition, we study the relative importance of the independent variables involved, and show how the variables important for the median quantile differ from those important for the upper quantiles. Furthermore, we present a method to compute the probability of exceedance of thresholds, which is a simple and comprehensible way to present probabilistic forecasts while maximizing their usefulness. Copyright © 2017 Elsevier Ltd. All rights reserved.
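
A hedged sketch of the general approach follows: it uses gradient-boosted quantile regression (not necessarily the paper's exact model) on synthetic data and converts the predicted quantiles into an exceedance probability for a chosen threshold.

```python
# Illustrative sketch: quantile regression of an NO2-like target on synthetic
# meteorological predictors, plus a simple conversion of the predicted
# quantiles into a threshold-exceedance probability.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(5)
n = 3000
X = rng.normal(size=(n, 3))                    # e.g. wind speed, temperature, traffic proxy
y = 60 + 15 * X[:, 0] - 10 * X[:, 1] + rng.gamma(2.0, 8.0, n)   # skewed target (ug/m3)

quantiles = np.array([0.05, 0.25, 0.5, 0.75, 0.9, 0.95, 0.99])
models = {q: GradientBoostingRegressor(loss="quantile", alpha=q, n_estimators=200).fit(X, y)
          for q in quantiles}

def exceedance_probability(x_new, threshold):
    """Estimate P(target > threshold | x_new) by interpolating predicted quantiles."""
    preds = np.array([models[q].predict(x_new.reshape(1, -1))[0] for q in quantiles])
    preds = np.sort(preds)                     # enforce monotone quantiles (crossings can occur)
    if threshold <= preds[0]:
        return 1.0 - quantiles[0]
    if threshold >= preds[-1]:
        return 1.0 - quantiles[-1]
    level = np.interp(threshold, preds, quantiles)
    return 1.0 - level

x_tomorrow = np.array([1.0, -0.5, 0.3])
print(f"P(concentration > 120) ~ {exceedance_probability(x_tomorrow, 120.0):.2f}")
```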

  15. Forecasting the Onset Time of Volcanic Eruptions Using Ground Deformation Data

    NASA Astrophysics Data System (ADS)

    Blake, S.; Cortes, J. A.

    2016-12-01

    The pre-eruptive inflation of the ground surface is a well-known phenomenon at many volcanoes. In a number of intensively studied cases, elevation and/or radial tilt increase with time (t) towards a limiting value by following a decaying exponential with characteristic timescale τ (Kilauea and Mauna Loa: Dvorak and Okamura 1987, Lengliné et al., 2008) or, after sufficiently long times, by following the sum of two such functions such that two timescales, τ1 and τ2, are required to describe the temporal pattern of inflation (Axial Seamount: Nooner and Chadwick, 2009). We have used the Levenberg-Marquardt non-linear fit algorithm to analyse data for 18 inflation periods at Krafla volcano, Iceland (Björnsson and Eysteinsson, 1998), and found the same functional relationship. Pooling all of the available data from 25 eruptions at 4 volcanoes shows that the duration of inflation before an eruption or shallow intrusion (t*) is comparable to τ (or the longer of τ1 and τ2) and follows an almost 1:1 linear relationship (r2 ≈ 0.8). We also find that this scaling is replicated by Monte Carlo simulations of physics-based forward models of hydraulically connected dual magma chamber systems which erupt when the chamber pressure reaches a threshold value. These results lead to a new forecasting method which we describe and assess here: if τ can be constrained during an ongoing inflation period, then the statistical distribution of t*/τ values, calibrated from other pre-eruptive inflation periods, allows the probability of an eruption starting before (or after) a specified time to be estimated. The time at which there is a specified probability of an eruption starting can also be forecast. These approaches rely on fitting deformation data up to time t in order to obtain τ(t), which is then used to forecast t*. Forecasts can be updated after each new deformation measurement.
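
A minimal sketch of this recipe is given below: fit the decaying-exponential inflation curve to synthetic data to obtain τ, then convert an assumed log-logistic distribution of t*/τ (its parameters here are illustrative, not the calibrated values) into the probability of eruption onset before a chosen time.

```python
# A sketch of the forecasting recipe described above, on synthetic data; the
# log-logistic shape and scale parameters are assumptions, not calibrated values.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import fisk                    # log-logistic distribution

def inflation(t, u_max, tau):
    """Ground elevation approaching u_max with characteristic timescale tau."""
    return u_max * (1.0 - np.exp(-t / tau))

# Synthetic elevation data from an ongoing inflation episode
rng = np.random.default_rng(6)
t_obs = np.linspace(0, 200, 60)                                # days
u_obs = inflation(t_obs, 0.50, 150.0) + rng.normal(0, 0.01, t_obs.size)

(u_max_fit, tau_fit), _ = curve_fit(inflation, t_obs, u_obs, p0=(0.4, 100.0))

# Illustrative (not calibrated) log-logistic model for the ratio t*/tau
ratio_dist = fisk(c=5.0, scale=1.0)

def p_eruption_before(t_days):
    """P(eruption onset before t_days), given inflation started at t = 0."""
    return ratio_dist.cdf(t_days / tau_fit)

print(f"tau ~ {tau_fit:.0f} days;  P(onset within 150 d) = {p_eruption_before(150):.2f};  "
      f"P(onset within 300 d) = {p_eruption_before(300):.2f}")
```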

  16. Real Time Volcanic Cloud Products and Predictions for Aviation Alerts

    NASA Technical Reports Server (NTRS)

    Krotkov, Nickolay A.; Habib, Shahid; da Silva, Arlindo; Hughes, Eric; Yang, Kai; Brentzel, Kelvin; Seftor, Colin; Li, Jason Y.; Schneider, David; Guffanti, Marianne

    2014-01-01

    Volcanic eruptions can inject significant amounts of sulfur dioxide (SO2) and volcanic ash into the atmosphere, posing a substantial risk to aviation safety. Ingesting near-real-time and Direct Readout satellite volcanic cloud data is vital for improving the reliability of volcanic ash forecasts and mitigating the effects of volcanic eruptions on aviation and the economy. NASA volcanic products from the Ozone Monitoring Instrument (OMI) aboard the Aura satellite have been incorporated into the Decision Support Systems of many operational agencies. With the Aura mission approaching its 10th anniversary, there is an urgent need to replace OMI data with those from the next-generation operational NASA/NOAA Suomi National Polar-orbiting Partnership (SNPP) satellite. The data provided from these instruments are being incorporated into forecasting models to provide quantitative ash forecasts for air traffic management. This study demonstrates the feasibility of the volcanic near-real-time and Direct Readout data products from the new Ozone Mapping and Profiler Suite (OMPS) ultraviolet sensor onboard SNPP for monitoring and forecasting volcanic clouds. The transition of NASA data production to our operational partners is outlined. Satellite observations are used to constrain volcanic cloud simulations and improve estimates of eruption parameters, resulting in more accurate forecasts. This is demonstrated for the 2012 eruption of Copahue. Volcanic eruptions are modeled using the Goddard Earth Observing System, Version 5 (GEOS-5) and the Goddard Chemistry Aerosol and Radiation Transport (GOCART) model. A hindcast of the disruptive eruption of Iceland's Eyjafjallajokull is used to estimate aviation re-routing costs using Metron Aviation's ATM Tools.

  17. Forecasting eruption size: what we know, what we don't know

    NASA Astrophysics Data System (ADS)

    Papale, Paolo

    2017-04-01

    Any eruption forecast includes an evaluation of the expected size of the forthcoming eruption, usually expressed as the probability associated with given size classes. Such an evaluation is mostly based on the previous volcanic history at the specific volcano, or it refers to a broader class of volcanoes regarded as "analogues" of the one under consideration. In either case, using knowledge from past eruptions implies considering the completeness of the reference catalogue and, most importantly, the existence of systematic biases in the catalogue that may affect probability estimates and translate into biased volcanic hazard forecasts. An analysis of existing catalogues, with major reference to the catalogue of the Smithsonian Global Volcanism Program, suggests that systematic biases largely dominate at global, regional and local scales: volcanic histories reconstructed at individual volcanoes, often used as a reference for volcanic hazard forecasts, are the result of systematic loss of information with time and poor sample representativeness. That situation requires the use of techniques to complete existing catalogues, as well as careful consideration of the uncertainties deriving from inadequate knowledge and model-dependent data elaboration. A reconstructed global eruption size distribution, obtained by merging information from different existing catalogues, shows a mode in the VEI 1-2 range, a <0.1% incidence of eruptions with VEI 7 or larger, and substantial uncertainties associated with individual VEI frequencies. Even larger uncertainties are expected to derive from application to individual volcanoes or classes of analogue volcanoes, suggesting large to very large uncertainties associated with volcanic hazard forecasts at virtually any individual volcano worldwide.
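
One simple way to express size-class probabilities with quantified uncertainty, separate from the catalogue-completion techniques discussed above, is a Dirichlet posterior over VEI classes; the sketch below uses hypothetical counts.

```python
# A small sketch (not the catalogue-completion techniques discussed above):
# turn eruption counts per VEI class into size-class probabilities with
# credible intervals, using a Dirichlet posterior. Counts are hypothetical.
import numpy as np

rng = np.random.default_rng(7)

vei_classes = np.arange(0, 8)                       # VEI 0 ... 7
counts = np.array([40, 120, 110, 45, 15, 4, 1, 0])  # hypothetical catalogue counts
prior = np.ones_like(counts, dtype=float)           # flat Dirichlet prior

samples = rng.dirichlet(counts + prior, size=10000) # posterior samples of class probabilities

for vei, med, lo, hi in zip(vei_classes,
                            np.median(samples, axis=0),
                            np.percentile(samples, 2.5, axis=0),
                            np.percentile(samples, 97.5, axis=0)):
    print(f"VEI {vei}: P = {med:.3f}  (95% CI {lo:.3f}-{hi:.3f})")
```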

  18. Probabilistic forecasting of extreme weather events based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Van De Vyver, Hans; Van Schaeybroeck, Bert

    2016-04-01

    Extreme events in weather and climate, such as high wind gusts, heavy precipitation or extreme temperatures, are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X, Y), where X represents a deterministic forecast and Y the observation variable (for instance wind speed), and study the combined events in which Y exceeds a high threshold y while the corresponding forecast X also exceeds a high forecast threshold. More specifically, two problems are addressed: (1) given a high forecast X = x_0, what is the probability that Y > y? In other words, provide inference on the conditional probability Pr{Y > y | X = x_0}; (2) given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification framework of Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X, Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008). References: Coles, S. (2001) An Introduction to Statistical Modelling of Extreme Values. Springer-Verlag. Ferro, C.A.T. (2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting 22, 1089-1100. Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter 117, 8-13. Ramos, A., Ledford, A. (2009) A new class of models for bivariate joint tails. J. R. Statist. Soc. B 71, 219-241.

  19. Evaluation of ensemble forecast uncertainty using a new proper score: application to medium-range and seasonal forecasts

    NASA Astrophysics Data System (ADS)

    Christensen, Hannah; Moroz, Irene; Palmer, Tim

    2015-04-01

    Forecast verification is important across scientific disciplines as it provides a framework for evaluating the performance of a forecasting system. In the atmospheric sciences, probabilistic skill scores are often used for verification as they provide a way of unambiguously ranking the performance of different probabilistic forecasts. In order to be useful, a skill score must be proper -- it must encourage honesty in the forecaster, and reward forecasts which are reliable and which have good resolution. A new score, the Error-spread Score (ES), is proposed which is particularly suitable for evaluation of ensemble forecasts. It is formulated with respect to the moments of the forecast. The ES is confirmed to be a proper score, and is therefore sensitive to both resolution and reliability. The ES is tested on forecasts made using the Lorenz '96 system, and found to be useful for summarising the skill of the forecasts. The European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system (EPS) is evaluated using the ES. Its performance is compared to a perfect statistical probabilistic forecast -- the ECMWF high resolution deterministic forecast dressed with the observed error distribution. This generates a forecast that is perfectly reliable if considered over all time, but which does not vary from day to day with the predictability of the atmospheric flow. The ES distinguishes between the dynamically reliable EPS forecasts and the statically reliable dressed deterministic forecasts. Other skill scores are tested and found to be comparatively insensitive to this desirable forecast quality. The ES is used to evaluate seasonal range ensemble forecasts made with the ECMWF System 4. The ensemble forecasts are found to be skilful when compared with climatological or persistence forecasts, though this skill is dependent on region and time of year.

  20. The longevity of lava dome eruptions

    NASA Astrophysics Data System (ADS)

    Wolpert, Robert L.; Ogburn, Sarah E.; Calder, Eliza S.

    2016-02-01

    Understanding the duration of past, ongoing, and future volcanic eruptions is an important scientific goal and a key societal need. We present a new methodology for forecasting the duration of ongoing and future lava dome eruptions based on a database (DomeHaz) recently compiled by the authors. The database includes duration and composition for 177 such eruptions, with "eruption" defined as the period encompassing individual episodes of dome growth along with associated quiescent periods during which extrusion pauses but unrest continues. In a key finding, we show that probability distributions for dome eruption durations are both heavy tailed and composition dependent. We construct objective Bayesian statistical models featuring heavy-tailed Generalized Pareto distributions with composition-specific parameters to make forecasts about the durations of new and ongoing eruptions that depend on both the eruption duration to date and the composition. Our Bayesian predictive distributions reflect both uncertainty about model parameter values (epistemic uncertainty) and the natural variability of the geologic processes (aleatoric uncertainty). The results are illustrated by presenting likely trajectories for 14 dome-building eruptions ongoing in 2015. Full representation of the uncertainty is presented for two key eruptions, Soufrière Hills Volcano in Montserrat (10-139 years, median 35 years) and Sinabung, Indonesia (1-17 years, median 4 years). Uncertainties are high but, importantly, quantifiable. This work provides for the first time a quantitative and transferable method and rationale on which to base long-term planning decisions for lava dome-forming volcanoes, with wide potential applicability to forecasts of other types of eruptions and other adverse events across the geohazard spectrum.
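
The heavy-tailed behaviour described above can be illustrated with a Generalized Pareto duration model; the parameters below are illustrative, not the paper's composition-specific fits, and the conditional survival probability shows how the forecast depends on the duration to date.

```python
# A hedged sketch of the heavy-tailed idea: with a Generalized Pareto model for
# eruption durations (illustrative parameters), forecast the probability that
# an ongoing dome eruption continues for at least another d years.
import numpy as np
from scipy.stats import genpareto

# Illustrative GPD for durations (years): shape xi > 0 gives a heavy tail
xi, scale = 0.5, 4.0
duration = genpareto(c=xi, scale=scale)

def p_continues(elapsed_years, extra_years):
    """P(total duration > elapsed + extra | duration already > elapsed)."""
    return duration.sf(elapsed_years + extra_years) / duration.sf(elapsed_years)

for elapsed in (1.0, 5.0, 20.0):
    print(f"after {elapsed:>4.0f} yr: P(lasts 5 more yr) = {p_continues(elapsed, 5.0):.2f}")
```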

  1. Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)

    NASA Astrophysics Data System (ADS)

    OConnor, A.; Kirtman, B. P.; Harrison, S.; Gorman, J.

    2016-02-01

    Current US Navy forecasting systems cannot easily incorporate extended-range forecasts that can improve mission readiness and effectiveness; ensure safety; and reduce cost, labor, and resource requirements. If Navy operational planners had systems that incorporated these forecasts, they could plan missions using more reliable and longer-term weather and climate predictions. Further, using multi-model forecast ensembles instead of single forecasts would produce higher predictive performance. Extended-range multi-model forecast ensembles, such as those available in the North American Multi-Model Ensemble (NMME), are ideal for system integration because of their high-skill predictions; however, even higher-skill predictions can be produced if forecast model ensembles are combined correctly. While many methods for weighting models exist, choosing the best method in a given environment requires expert knowledge of the models and combination methods. We present an innovative approach that uses machine learning to combine extended-range predictions from multi-model forecast ensembles and generate a probabilistic forecast for any region of the globe up to 12 months in advance. Our machine-learning approach uses 30 years of hindcast predictions to learn patterns of forecast model successes and failures. Each model is assigned a weight for each environmental condition, 100 km2 region, and day, given any expected environmental information. These weights are then applied to the respective predictions for the region and time of interest to effectively stitch together a single, coherent probabilistic forecast. Our experimental results demonstrate the benefits of our approach to produce extended-range probabilistic forecasts for regions and time periods of interest that are superior, in terms of skill, to individual NMME forecast models and commonly weighted models. The probabilistic forecast leverages the strengths of three NMME forecast models to predict environmental conditions for an area spanning from San Diego, CA to Honolulu, HI, seven months in advance. Key findings include: weighted combinations of models are strictly better than individual models; machine-learned combinations are better still; and forecasts produced using our approach have the highest rank probability skill score most often.
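
The machine-learning weighting scheme is not described in enough detail here to reproduce; the sketch below shows a much simpler stand-in, inverse-MSE weighting of models based on hindcast errors, purely to illustrate how hindcasts can inform a combined probabilistic forecast.

```python
# An illustrative stand-in for the approach above: weight each forecast model
# by its inverse hindcast mean-squared error and combine the members into one
# probabilistic forecast. All data are synthetic.
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical hindcasts: 3 models x 360 past months, plus the verifying truth
truth = rng.normal(25.0, 2.0, size=360)                     # e.g. SST (deg C)
hindcasts = truth + rng.normal(0, [[0.5], [1.0], [2.0]], size=(3, 360))

mse = ((hindcasts - truth) ** 2).mean(axis=1)
weights = (1.0 / mse) / (1.0 / mse).sum()                   # inverse-MSE weights

# New forecasts from each model for the coming season (ensembles of 10 members)
new_forecasts = 26.0 + rng.normal(0, [[0.5], [1.0], [2.0]], size=(3, 10))

# Weighted probabilistic forecast: resample members in proportion to the weights
picks = rng.choice(3, size=5000, p=weights)
combined = new_forecasts[picks, rng.integers(0, 10, size=5000)]
print("weights:", weights.round(2), " combined mean:", combined.mean().round(2),
      " 10-90%:", np.percentile(combined, [10, 90]).round(2))
```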

  2. Improved water allocation utilizing probabilistic climate forecasts: Short-term water contracts in a risk management framework

    NASA Astrophysics Data System (ADS)

    Sankarasubramanian, A.; Lall, Upmanu; Souza Filho, Francisco Assis; Sharma, Ashish

    2009-11-01

    Probabilistic, seasonal to interannual streamflow forecasts are becoming increasingly available as the ability to model climate teleconnections is improving. However, water managers and practitioners have been slow to adopt such products, citing concerns with forecast skill. Essentially, a management risk is perceived in "gambling" with operations using a probabilistic forecast, while a system failure upon following existing operating policies is "protected" by the official rules or guidebook. In the presence of a prescribed system of prior allocation of releases under different storage or water availability conditions, the manager has little incentive to change. Innovation in allocation and operation is hence key to improved risk management using such forecasts. A participatory water allocation process that can effectively use probabilistic forecasts as part of an adaptive management strategy is introduced here. Users can express their demand for water through statements that cover the quantity needed at a particular reliability, the temporal distribution of the "allocation," the associated willingness to pay, and compensation in the event of contract nonperformance. The water manager then assesses feasible allocations using the probabilistic forecast that try to meet these criteria across all users. An iterative process between users and water manager could be used to formalize a set of short-term contracts that represent the resulting prioritized water allocation strategy over the operating period for which the forecast was issued. These contracts can be used to allocate water each year/season beyond long-term contracts that may have precedence. Thus, integrated supply and demand management can be achieved. In this paper, a single period multiuser optimization model that can support such an allocation process is presented. The application of this conceptual model is explored using data for the Jaguaribe Metropolitan Hydro System in Ceara, Brazil. The performance relative to the current allocation process is assessed in the context of whether such a model could support the proposed short-term contract based participatory process. A synthetic forecasting example is also used to explore the relative roles of forecast skill and reservoir storage in this framework.

  3. A temporal-spatial postprocessing model for probabilistic run-off forecast. With a case study from Ulla-Førre with five catchments and ten lead times

    NASA Astrophysics Data System (ADS)

    Engeland, K.; Steinsland, I.

    2012-04-01

    This work is driven by the needs of the next generation of short-term optimization methodology for hydropower production. Stochastic optimization is about to be introduced, i.e. optimizing when the available resources (water) and utility (prices) are uncertain. In this paper we focus on the available resources, i.e. water, where uncertainty mainly comes from uncertainty in future runoff. When optimizing a water system, all catchments and several lead times have to be considered simultaneously. Depending on the system of hydropower reservoirs, it might be a set of headwater catchments, a system of upstream/downstream reservoirs where water used from one catchment/dam arrives in a lower catchment maybe days later, or a combination of both. The aim of this paper is therefore to construct a simultaneous probabilistic forecast for several catchments and lead times, i.e. to provide a predictive distribution for the forecasts. Stochastic optimization methods need samples/ensembles of run-off forecasts as input. Hence, it should also be possible to sample from our probabilistic forecast. A post-processing approach is taken, and an error model based on a Box-Cox transformation, a power transform and a temporal-spatial copula model is used. It accounts for both between-catchment and between-lead-time dependencies. In operational use it is straightforward to sample run-off ensembles from this model, and these inherit the catchment and lead-time dependencies. The methodology is tested and demonstrated in the Ulla-Førre river system, and simultaneous probabilistic forecasts for five catchments and ten lead times are constructed. The methodology has enough flexibility to model operationally important features in this case study, such as heteroscedasticity, lead-time-varying temporal dependency and lead-time-varying inter-catchment dependency. Our model is evaluated using the CRPS for the marginal predictive distributions and the energy score for the joint predictive distribution. It is tested against a deterministic run-off forecast, a climatology forecast and a persistence forecast, and is found to be the best probabilistic forecast for lead times greater than two. From an operational point of view the results are interesting, as the between-catchment dependency gets stronger with longer lead times.
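
A simplified sketch of the post-processing idea follows: a Gaussian copula fitted to historical forecast errors stacked over catchments and lead times (the paper's model additionally uses Box-Cox/power transforms of the margins), from which dependent error fields are sampled to turn a deterministic forecast into an ensemble. All data here are synthetic.

```python
# A simplified copula sketch (the paper's model also transforms the margins):
# fit a Gaussian copula to historical forecast errors over catchments and lead
# times, then sample dependent error fields to build a run-off ensemble.
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(9)

n_hist, n_catch, n_lead = 500, 5, 10
dim = n_catch * n_lead

# Hypothetical historical forecast errors (m3/s), flattened over (catchment, lead time)
hist_errors = rng.multivariate_normal(np.zeros(dim),
                                      0.5 * np.eye(dim) + 0.5 * np.ones((dim, dim)),
                                      size=n_hist)

# 1) Transform each margin to normal scores and estimate the copula correlation
u = (rankdata(hist_errors, axis=0) - 0.5) / n_hist
z = norm.ppf(u)
corr = np.corrcoef(z, rowvar=False)

def sample_ensemble(det_forecast, n_members=50):
    """det_forecast: (n_catch, n_lead) deterministic run-off forecast."""
    z_new = rng.multivariate_normal(np.zeros(dim), corr, size=n_members)
    u_new = norm.cdf(z_new)
    # 2) Map copula samples back through the empirical error margins
    errors = np.stack([np.quantile(hist_errors[:, j], u_new[:, j]) for j in range(dim)], axis=1)
    return det_forecast[None, :, :] + errors.reshape(n_members, n_catch, n_lead)

det = np.full((n_catch, n_lead), 30.0)          # hypothetical deterministic forecast
ens = sample_ensemble(det, n_members=50)
print("ensemble spread per lead time (catchment 1):", ens[:, 0, :].std(axis=0).round(2))
```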

  4. Multi-parametric variational data assimilation for hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.

    2017-12-01

    Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to support better-informed decisions. A common practice in probabilistic streamflow forecasting is to force a deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at representing meteorological uncertainty but neglects the uncertainty of the hydrological model as well as of its initial conditions. Complementary approaches use probabilistic data assimilation techniques to obtain a variety of initial states, or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of the River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation on the lead-time performance of perfect forecasts (i.e. observed data as forcing variables) as well as of deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% in CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of the rank histogram and produces a narrower ensemble spread.

  5. Hazard Monitoring of Growing Lava Flow Fields Using Seismic Tremor

    NASA Astrophysics Data System (ADS)

    Eibl, E. P. S.; Bean, C. J.; Jónsdottir, I.; Hoskuldsson, A.; Thordarson, T.; Coppola, D.; Witt, T.; Walter, T. R.

    2017-12-01

    An effusive eruption in 2014/15 created an 85 km2 lava flow field in a remote location in the Icelandic highlands. The lava flows did not threaten any settlements or paved roads, but they were nevertheless monitored in detail by several disciplines. Images from satellites and aircraft, ground-based video monitoring, GPS and seismic recordings allowed the monitoring and reconstruction of a detailed time series of the growing lava flow field. While satellite images and probabilistic modelling of lava flows are common tools to monitor the current growth direction and forecast the future one, here we show that seismic recordings can be of use too. We installed a cluster of seismometers 15 km from the vents and recorded the ground vibrations associated with the eruption. This seismic tremor was generated not only below the vents, but also at the edges of the growing lava flow field, and it indicated the parts of the lava flow field that were growing most actively. Whilst the time resolution of satellites is in the range of days, seismic stations easily sample continuously at 100 Hz and could therefore provide a much better resolution and estimate of the lava flow hazard in real time.

  6. An operational hydrological ensemble prediction system for the city of Zurich (Switzerland): assessing the added value of probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Addor, N.; Jaun, S.; Fundel, F.; Zappa, M.

    2012-04-01

    The Sihl River flows through Zurich, Switzerland's most populated city, for which it represents the largest flood threat. To anticipate extreme discharge events and provide decision support in case of flood risk, a hydrometeorological ensemble prediction system (HEPS) was launched operationally in 2008. This model chain relies on deterministic (COSMO-7) and probabilistic (COSMO-LEPS) atmospheric forecasts, which are used to force a semi-distributed hydrological model (PREVAH) coupled to a hydraulic model (FLORIS). The resulting hydrological forecasts are eventually communicated to the stakeholders involved in the Sihl discharge management. This fully operational setting provides a real framework with which we assessed the potential of deterministic and probabilistic discharge forecasts for flood mitigation. To study the suitability of HEPS for small-scale basins and to quantify the added value conveyed by the probability information, a 31-month reforecast was produced for the Sihl catchment (336 km2). Several metrics support the conclusion that the performance gain is up to 2 days of lead time for the catchment considered. Brier skill scores show that probabilistic hydrological forecasts outperform their deterministic counterparts for all the lead times and event intensities considered. The small size of the Sihl catchment does not prevent skillful discharge forecasts, but makes them particularly dependent on correct precipitation forecasts. Our evaluation stresses that the capacity of the model to provide confident and reliable mid-term probability forecasts for high discharges is limited. We finally highlight challenges in making decisions on the basis of hydrological predictions, and discuss the need for a tool to be used in addition to forecasts to compare the different mitigation actions possible in the Sihl catchment.

  7. Visualisation and communication of probabilistic climate forecasts to renewable-energy policy makers

    NASA Astrophysics Data System (ADS)

    Steffen, Sophie; Lowe, Rachel; Davis, Melanie; Doblas-Reyes, Francisco J.; Rodó, Xavier

    2014-05-01

    Despite the renewable-energy industry's strong dependence on weather and climate variability, and the existence of several initiatives demonstrating the added benefits of integrating probabilistic forecasts into energy decision-making processes, weather and climate forecasts are still under-utilised within the sector. Improved communication is fundamental to stimulate the use of climate forecast information within decision-making processes, in order to adapt to a highly climate-dependent renewable-energy industry. This work focuses on improving the visualisation of climate forecast information, paying special attention to seasonal time scales. This activity is central to enhancing climate services for renewable energy and to optimising the usefulness and usability of inherently complex climate information. In the realm of the Global Framework for Climate Services (GFCS) initiative, and the subsequent European projects Seasonal-to-Decadal Climate Prediction for the Improvement of European Climate Service (SPECS) and the European Provision of Regional Impacts Assessment in Seasonal and Decadal Timescales (EUPORIAS), this paper investigates the visualisation and communication of seasonal forecasts with regard to their usefulness and usability, to enable the development of a European climate service. The target end users are renewable-energy policy makers, who are central to enhancing climate services for the energy industry. The overall objective is to promote the wide-ranging dissemination and exchange of actionable climate information based on seasonal forecasts from Global Producing Centres (GPCs). It examines the main existing barriers and deficits. Examples of probabilistic climate forecasts from different GPCs are used to make a catalogue of current approaches, to assess their advantages and limitations and, finally, to recommend better alternatives. Interviews have been conducted with renewable-energy stakeholders to obtain feedback for the improvement of existing forecast visualisation techniques. The overall aim is to establish a communication protocol for the visualisation of probabilistic climate forecasts, which does not currently exist. GPCs present their own probabilistic forecasts with limited consistency in their communication across different centres, which complicates understanding for the end user. The recommended communication protocol for both the visualisation and description of climate forecasts can help to introduce a standard format and message for end users from several climate-sensitive sectors, such as energy, tourism, agriculture and health.

  8. New Aspects of Probabilistic Forecast Verification Using Information Theory

    NASA Astrophysics Data System (ADS)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, and a consistent generalization to continuous forecasts is then motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. In particular, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can likewise be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its reliability, resolution, and uncertainty components for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.

  9. Communicating weather forecast uncertainty: Do individual differences matter?

    PubMed

    Grounds, Margaret A; Joslyn, Susan L

    2018-03-01

    Research suggests that people make better weather-related decisions when they are given numeric probabilities for critical outcomes (Joslyn & Leclerc, 2012, 2013). However, it is unclear whether all users can take advantage of probabilistic forecasts to the same extent. The research reported here assessed key cognitive and demographic factors to determine their relationship to the use of probabilistic forecasts to improve decision quality. In two studies, participants decided between spending resources to prevent icy conditions on roadways or risk a larger penalty when freezing temperatures occurred. Several forecast formats were tested, including a control condition with the night-time low temperature alone and experimental conditions that also included the probability of freezing and advice based on expected value. All but those with extremely low numeracy scores made better decisions with probabilistic forecasts. Importantly, no groups made worse decisions when probabilities were included. Moreover, numeracy was the best predictor of decision quality, regardless of forecast format, suggesting that the advantage may extend beyond understanding the forecast to general decision strategy issues. This research adds to a growing body of evidence that numerical uncertainty estimates may be an effective way to communicate weather danger to general public end users. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Anticipatory Competence and Ability to Probabilistic Forecasting in Adolescents: Research Results

    ERIC Educational Resources Information Center

    Akhmetzyanova, Anna I.

    2016-01-01

    The relevance of this problem is related to the urgent need to explain peculiarities of anticipation and probabilistic forecasting in adolescence. It has revealed a contradiction: on the one hand, the problem of anticipation in ontogenesis is well developed, and, on the other hand, there remain understudied mechanisms of anticipation in…

  11. Probabilistic accounting of uncertainty in forecasts of species distributions under climate change

    Treesearch

    Seth J. Wenger; Nicholas A. Som; Daniel C. Dauwalter; Daniel J. Isaak; Helen M. Neville; Charles H. Luce; Jason B. Dunham; Michael K. Young; Kurt D. Fausch; Bruce E. Rieman

    2013-01-01

    Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing...
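
A minimal sketch of the Monte Carlo idea follows, covering only the parameter-uncertainty part (the paper also propagates residual error and further sources of uncertainty); it samples GLM coefficients from the fitted covariance and produces a probabilistic presence forecast under a warmer covariate. Data are synthetic.

```python
# A minimal sketch of the Monte Carlo idea: propagate GLM parameter uncertainty
# (sampled from the fitted coefficient covariance) into probabilistic presence
# forecasts under a changed covariate. Data and covariates are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)

# Synthetic occurrence data: presence probability declines with stream temperature
n = 400
temp = rng.uniform(8, 20, n)
p_true = 1.0 / (1.0 + np.exp(-(6.0 - 0.4 * temp)))
presence = rng.binomial(1, p_true)

X = sm.add_constant(temp)
fit = sm.GLM(presence, X, family=sm.families.Binomial()).fit()

# Draw coefficient sets from the estimated sampling distribution
draws = rng.multivariate_normal(fit.params, fit.cov_params(), size=5000)

# Forecast occupancy at a site that warms from 14 to 16 deg C
for t_future in (14.0, 16.0):
    eta = draws[:, 0] + draws[:, 1] * t_future
    prob = 1.0 / (1.0 + np.exp(-eta))
    print(f"T={t_future:.0f}C: median P(presence)={np.median(prob):.2f}, "
          f"90% interval=({np.percentile(prob, 5):.2f}, {np.percentile(prob, 95):.2f})")
```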

  12. Forecasting volcanic unrest using seismicity: The good, the bad and the time consuming

    NASA Astrophysics Data System (ADS)

    Salvage, Rebecca; Neuberg, Jurgen W.

    2013-04-01

    Volcanic eruptions are inherently unpredictable in nature, and scientists struggle to forecast the type and timing of events, in particular in real-time scenarios. Current understanding suggests that statistical patterns within precursory datasets of seismicity prior to eruptive events could serve as real-time forecasting tools. They allow us to determine times of clear deviation in the data, which might be indicative of volcanic unrest. The identification of low-frequency seismic swarms and the acceleration of this seismicity prior to observed volcanic unrest may be key to developing forecasting tools. The development of real-time forecasting models that can be implemented at volcano observatories is of particular importance, since the identification of early warning signals allows danger to the proximal population to be minimized. We concentrate on understanding the significance and development of these seismic swarms as unrest develops at the volcano. In particular, analysis of accelerations in the event rate, amplitude and energy release rate of seismicity prior to eruption suggests that these are important indicators of developing unrest. Real-time analysis of these parameters simultaneously allows possible improvements to forecasting models. Although more time- and computationally intensive, cross-correlation techniques applied to continuous seismicity prior to volcanic unrest allow all significant seismic events to be analysed, rather than only those detected by an automated identification system. This may allow a more accurate forecast, since all precursory seismicity can be taken into account. In addition, the classification of seismic events based on spectral characteristics may allow us to isolate individual types of signals that are responsible for certain types of unrest. In this way, we may be able to better forecast the type of eruption that may ensue, or at least some of its prevailing characteristics.

  13. Is there a basis for preferring characteristic earthquakes over a Gutenberg–Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, Thomas E.; Geist, Eric L.

    2009-01-01

    The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg–Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg–Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
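
The Gutenberg-Richter alternative can be made concrete with a few lines of code: fit the b-value to a synthetic catalogue with the Aki maximum-likelihood estimator and extrapolate the annual rate of large events on the fault zone.

```python
# A small sketch of the Gutenberg-Richter alternative discussed above: fit the
# b-value to a catalogue of small-to-moderate events on a fault zone and
# extrapolate the annual rate of large earthquakes. Catalogue values are synthetic.
import numpy as np

rng = np.random.default_rng(11)

# Synthetic 50-year catalogue, complete above magnitude Mc = 3.0, b ~ 1.0
years, m_c, b_true = 50.0, 3.0, 1.0
mags = m_c + rng.exponential(scale=1.0 / (b_true * np.log(10.0)), size=800)

# Aki (1965) maximum-likelihood b-value estimate
b_hat = 1.0 / (np.log(10.0) * (mags.mean() - m_c))
a_rate = len(mags) / years                     # annual rate of M >= Mc events

def annual_rate(m):
    """Annual rate of events with magnitude >= m from the fitted G-R law."""
    return a_rate * 10.0 ** (-b_hat * (m - m_c))

for m in (6.0, 7.0):
    rate = annual_rate(m)
    print(f"M>={m:.1f}: rate = {rate:.4f}/yr, mean recurrence ~ {1.0/rate:.0f} yr")
```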

  14. Forecasting the climate response to volcanic eruptions: prediction skill related to stratospheric aerosol forcing

    NASA Astrophysics Data System (ADS)

    Ménégoz, M.; Bilbao, R.; Bellprat, O.; Guemas, V.; Doblas-Reyes, F. J.

    2018-06-01

    The last major volcanic eruptions, the Agung in 1963, El Chichon in 1982 and Pinatubo in 1991, were each associated with a cooling of the troposphere that has been observed over large continental areas and over the Western Pacific, the Indian Ocean and the Southern Atlantic. Simultaneously, Eastern tropical Pacific temperatures increased due to prevailing El Niño conditions. Here we show that the pattern of these near-surface temperature anomalies is partly reproduced with decadal simulations of the EC-Earth model initialised with climate observations and forced with an estimate of the observed volcanic aerosol optical thickness. Sensitivity experiments highlight a cooling induced by the volcanic forcing, whereas El Niño events following the eruptions would have occurred even without volcanic eruptions. Focusing on the period 1961–2001, the main source of skill of this decadal forecast system during the first 2 years is related to the initialisation of the model. The contribution of the initialisation to the skill becomes smaller than the contribution of the volcanic forcing after two years, the latter being substantial in the Western Pacific, the Indian Ocean and the Western Atlantic. Two simple protocols for real time forecasts are investigated: using the forcing of a past volcanic eruption to simulate the forcing of a new one, and applying a two-year exponential decay to the initial stratospheric aerosol load observed at the beginning of the forecast. This second protocol applied in retrospective forecasts allows a partial reproduction of the skill attained with observed forcing.
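
    To illustrate the second real-time protocol mentioned above, the short sketch below decays an initial stratospheric aerosol optical depth exponentially over the forecast lead times. The initial AOD and the e-folding time are placeholder assumptions, not values taken from the paper.

        import numpy as np

        aod0 = 0.15            # assumed initial stratospheric aerosol optical depth
        tau_months = 12.0      # assumed e-folding time for the "two-year exponential decay" protocol
        months = np.arange(0, 61)               # forecast lead times, in months

        aod_forcing = aod0 * np.exp(-months / tau_months)
        for m in (0, 12, 24, 36):
            print(f"month {m:2d}: AOD ~ {aod_forcing[m]:.3f}")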

  15. Sensitivity analysis and uncertainty estimation in ash concentration simulations and tephra deposit daily forecasted at Mt. Etna, in Italy

    NASA Astrophysics Data System (ADS)

    Prestifilippo, Michele; Scollo, Simona; Tarantola, Stefano

    2015-04-01

    The uncertainty in volcanic ash forecasts may depend on our knowledge of the model input parameters and on our capability to represent the dynamics of an incoming eruption. Forecasts help governments to reduce the risks associated with volcanic eruptions, and for this reason different kinds of analysis are needed to understand the effect that each input parameter has on the model outputs. We present an iterative approach based on the sequential combination of sensitivity analysis, a parameter estimation procedure and Monte Carlo-based uncertainty analysis, applied to the Lagrangian volcanic ash dispersal model PUFF. We vary the main input parameters, such as the total mass, the total grain-size distribution, the plume thickness, the shape of the eruption column, the sedimentation models and the diffusion coefficient, perform thousands of simulations and analyze the results. The study is carried out on two different Etna scenarios: the sub-Plinian eruption of 22 July 1998, which formed an eruption column rising 12 km above sea level and lasted some minutes, and a lava fountain eruption with features similar to the 2011-2013 events, which produced eruption columns up to several kilometers above sea level and lasted some hours. The sensitivity analyses and uncertainty estimation results help us identify the measurements that volcanologists should perform during a volcanic crisis to reduce model uncertainty.
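
    The Monte Carlo uncertainty/sensitivity workflow can be illustrated with a toy forward model standing in for PUFF; the parameter ranges and the mass-loading proxy below are purely illustrative, and Spearman rank correlations stand in for a formal sensitivity analysis.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)
        n = 5000

        # Illustrative input ranges (not the values used in the study)
        column_height = rng.uniform(5e3, 15e3, n)     # m above vent
        diffusion_k   = rng.uniform(100.0, 5000.0, n) # m^2/s
        median_phi    = rng.uniform(-1.0, 3.0, n)     # median of the total grain-size distribution

        # Toy proxy for ground mass loading at a fixed downwind site (not a dispersal model)
        loading = (column_height / 1e3) ** 4 * np.exp(-diffusion_k / 2e3) * (1 + 0.2 * median_phi)

        for name, x in [("column height", column_height),
                        ("diffusion coefficient", diffusion_k),
                        ("median grain size", median_phi)]:
            rho, _ = spearmanr(x, loading)
            print(f"{name:22s} rank correlation with loading: {rho:+.2f}")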

  16. A global empirical system for probabilistic seasonal climate prediction

    NASA Astrophysics Data System (ADS)

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
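
    A minimal sketch of the regression-based probabilistic setup described above: fit a linear model with the CO2-equivalent concentration (and, optionally, a climate index) as predictors, then turn the fitted value and the residual spread into tercile probabilities. The synthetic data and predictor names below are placeholders, not the system's actual inputs.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(1)
        years = np.arange(1961, 2014)
        co2eq = 330 + 1.8 * (years - 1961)              # placeholder CO2-eq series (ppm)
        nino34 = rng.normal(0, 1, years.size)           # placeholder ENSO index
        temp = 0.02 * (co2eq - co2eq.mean()) + 0.3 * nino34 + rng.normal(0, 0.4, years.size)

        # Least-squares fit of seasonal temperature on the predictors
        X = np.column_stack([np.ones_like(co2eq), co2eq, nino34])
        beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
        resid_sd = np.std(temp - X @ beta, ddof=X.shape[1])

        # Probabilistic forecast for a new season: Gaussian centred on the regression prediction
        x_new = np.array([1.0, co2eq[-1] + 1.8, 0.5])
        mu = x_new @ beta
        terciles = np.quantile(temp, [1/3, 2/3])        # climatological tercile boundaries
        p_below = norm.cdf(terciles[0], mu, resid_sd)
        p_above = 1 - norm.cdf(terciles[1], mu, resid_sd)
        print(f"P(below)={p_below:.2f}  P(normal)={1 - p_below - p_above:.2f}  P(above)={p_above:.2f}")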

  17. An empirical system for probabilistic seasonal climate prediction

    NASA Astrophysics Data System (ADS)

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.

  18. Decision Analysis Tools for Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.

  19. The Hawaiian Volcano Observatory's current approach to forecasting lava flow hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Kauahikaua, J. P.

    2013-12-01

    Hawaiian volcanoes are best known for their frequent basaltic eruptions, which typically start with fast-moving channelized `a`a flows fed by high eruption rates. If the flows continue, they generally transition into pahoehoe flows, fed by lower eruption rates, after a few days to weeks. Kilauea Volcano's ongoing eruption illustrates this: since 1986, effusion at Kilauea has mostly produced pahoehoe. The current state of lava flow simulation is quite advanced, but the simplicity of the models means that they are most appropriately used during the first, most vigorous, days to weeks of an eruption, during the effusion of `a`a flows. Colleagues at INGV in Catania have shown decisively that MAGFLOW simulations utilizing satellite-derived eruption rates can be effective at estimating hazards during the initial periods of an eruption crisis. However, the algorithms do not simulate the complexity of pahoehoe flows. Forecasts of lava flow hazards are the most common form of volcanic hazard assessment made in Hawai`i. Communications with emergency managers over the last decade have relied on simple steepest-descent line maps, coupled with empirical lava flow advance rate information, to portray the imminence of lava flow hazard to nearby communities. Lavasheds, calculated as watersheds, are used as a broader context for future flow paths and to advise on the utility of diversion efforts, should they be contemplated. The key is to communicate the uncertainty of any approach used to formulate a forecast; if the forecast uses simple tools, these communications can be fairly straightforward. The calculation of steepest-descent paths and lavasheds relies on the accuracy of the digital elevation model (DEM) used, so the choice of DEM is critical. In Hawai`i, the best choice is not the most recent but a 1980s-vintage 10-m DEM; more recent LIDAR and satellite radar DEMs are referenced to the ellipsoid and include vegetation effects. On low-slope terrain, steepest-descent lines calculated on a geoid-based DEM may differ significantly from those calculated on an ellipsoid-based DEM. Good estimates of lava flow advance rates can be obtained from empirical compilations of historical advance rates of Hawaiian lava flows. In this way, rates appropriate for the observed flow types (`a`a or pahoehoe, channelized or not) can be applied. Eruption rate is arguably the most important factor, while slope is also significant at low eruption rates. Eruption rate, however, remains the most difficult parameter to estimate during an active eruption. The simplicity of the HVO approach is its major benefit. How much better can lava-flow advance be forecast for all types of lava flows? Will the improvements outweigh the increased uncertainty propagated through the simulation calculations? HVO continues to improve and evaluate its lava flow forecasting tools to provide better hazard assessments to emergency personnel.
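
    The steepest-descent line maps mentioned above can be approximated with a simple walk to the lowest neighbouring cell of a DEM grid. This sketch uses a synthetic elevation array and makes no correction for geoid versus ellipsoid referencing or vegetation; it is an illustration of the general idea, not HVO's operational tool.

        import numpy as np

        def steepest_descent_path(dem, start, max_steps=10000):
            """Follow the lowest of the 8 neighbours until a local minimum (or the edge) is reached."""
            path = [start]
            r, c = start
            for _ in range(max_steps):
                window = dem[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
                dr, dc = np.unravel_index(np.argmin(window), window.shape)
                nr, nc = max(r - 1, 0) + dr, max(c - 1, 0) + dc
                if dem[nr, nc] >= dem[r, c]:       # local minimum: stop
                    break
                r, c = nr, nc
                path.append((r, c))
            return path

        # Synthetic shield-like topography with a gentle regional tilt
        y, x = np.mgrid[0:200, 0:200]
        dem = 1000 * np.exp(-((x - 60) ** 2 + (y - 60) ** 2) / 5000.0) - 0.5 * x

        print(steepest_descent_path(dem, start=(60, 60))[:5])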

  20. Evaluating sub-seasonal skill in probabilistic forecasts of Atmospheric Rivers and associated extreme events

    NASA Astrophysics Data System (ADS)

    Subramanian, A. C.; Lavers, D.; Matsueda, M.; Shukla, S.; Cayan, D. R.; Ralph, M.

    2017-12-01

    Atmospheric rivers (ARs) - elongated plumes of intense moisture transport - are a primary source of hydrological extremes, water resources and impactful weather along the West Coast of North America and Europe. There is strong demand in the water management, societal infrastructure and humanitarian sectors for reliable sub-seasonal forecasts, particularly of extreme events, such as floods and droughts so that actions to mitigate disastrous impacts can be taken with sufficient lead-time. Many recent studies have shown that ARs in the Pacific and the Atlantic are modulated by large-scale modes of climate variability. Leveraging the improved understanding of how these large-scale climate modes modulate the ARs in these two basins, we use the state-of-the-art multi-model forecast systems such as the North American Multi-Model Ensemble (NMME) and the Subseasonal-to-Seasonal (S2S) database to help inform and assess the probabilistic prediction of ARs and related extreme weather events over the North American and European West Coasts. We will present results from evaluating probabilistic forecasts of extreme precipitation and AR activity at the sub-seasonal scale. In particular, results from the comparison of two winters (2015-16 and 2016-17) will be shown, winters which defied canonical El Niño teleconnection patterns over North America and Europe. We further extend this study to analyze probabilistic forecast skill of AR events in these two basins and the variability in forecast skill during certain regimes of large-scale climate modes.

  1. The Ongoing 2011 Eruption of Cordón Caulle (Southern Andes) and its Related Hazards

    NASA Astrophysics Data System (ADS)

    Amigo, A.; Lara, L. E.; Silva, C.; Orozco, G.; Bertin, D.

    2011-12-01

    On June 4, 2011, at 18:45 UTC, Cordón Caulle volcano (Southern Andes, 40.52S, 72.14W) erupted explosively after 51 years of quiescence. The last eruption occurred in 1960 and was triggered by the great Mw 9.5 Chile earthquake. The ongoing eruption started after 2 months of increased shallow seismicity, as recorded by OVDAS (the volcano observatory at Sernageomin). This close monitoring effort allowed a timely eruption forecast with at least 3 hours of warning, which facilitated the crisis response. In addition to this successful performance, for the first time in Chile volcanic hazards were assessed in advance to support the emergency management. In particular, tephra dispersal was forecast daily using the ASHFALL advection-diffusion model, and potential lahar and PDC impact zones were delineated using numerical approaches. The first eruptive stage lasted 27 hours. It was characterized by a strong, ca. 15-km-high Plinian-like column associated with the emission of 0.2-0.4 km3 of magma (DRE). Tephra fallout mostly occurred in Chile and Argentina, although fine particles and aerosols circumnavigated the globe twice, causing disruptions to air navigation across the Southern Hemisphere. The second, ongoing eruptive stage has been characterized by persistent weak plumes and lava emission at effusion rates in the range of 20-60 m3/s, with a total volume estimated at <0.20 km3 (at the end of July 2011). The eruptive products have virtually the same bulk composition as those of the historical 1921 and 1960 eruptions, corresponding to phenocryst-poor rhyodacites (67-70% SiO2), from which a pre-eruptive temperature of ca. 920°C can be inferred. In contrast to the previous eruptive cycles, the ongoing eruption has not evolved (at the time of writing) into a fissure eruption, although the vent is atop a fault scarp that borders the Pleistocene-Holocene extensional graben of the Cordón Caulle. This episode is a good case of successful eruption forecasting and hazard assessment, but it is also an important case study of a silicic eruption in an arc segment where mostly mafic magmas have been erupted during the Holocene.

  2. Real-time eruption forecasting using the material Failure Forecast Method with a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.

    2015-04-01

    Many attempts at deterministic forecasting of eruptions and landslides have been made using the material Failure Forecast Method (FFM). This method consists of fitting an empirical power law to precursory patterns of seismicity or deformation. Until now, most studies have presented hindsight forecasts based on complete time series of precursors and have not evaluated the ability of the method to carry out real-time forecasting with partial precursory sequences. In this study, we present a rigorous approach to the FFM designed for real-time applications to volcano-seismic precursors. We use a Bayesian approach based on the FFM theory and an automatic classification of seismic events. The probability distributions of the data deduced from the performance of this classification are used as input. As output, it provides the probability of the forecast time at each observation time before the eruption. The spread of the a posteriori probability density function of the prediction time and its stability with respect to the observation time are used as criteria to evaluate the reliability of the forecast. We test the method on precursory accelerations of long-period seismicity prior to Vulcanian explosions at Volcán de Colima (Mexico). For explosions preceded by a single phase of seismic acceleration, we obtain accurate and reliable forecasts using approximately 80% of the whole precursory sequence. It is, however, more difficult to apply the method to multiple acceleration patterns.
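
    The sketch below is not the authors' Bayesian scheme but illustrates the FFM idea that underlies it: when the precursor rate accelerates as a power law toward failure, the inverse rate tends linearly to zero at the failure time, so a straight-line fit to the inverse rate extrapolated to zero gives a forecast time. The data are synthetic, assuming the common power-law exponent of 2 (linear inverse rate).

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic precursory sequence: rate ~ k / (t_f - t), the alpha = 2 case of the FFM
        t_f_true, k = 100.0, 50.0
        t = np.arange(0.0, 90.0, 1.0)                       # observation times (e.g. hours)
        rate = k / (t_f_true - t) * rng.lognormal(0.0, 0.1, t.size)

        # FFM linearisation: fit inverse rate vs time and extrapolate to zero
        inv_rate = 1.0 / rate
        slope, intercept = np.polyfit(t, inv_rate, 1)
        t_f_hat = -intercept / slope

        print(f"forecast failure time ~ {t_f_hat:.1f} (true value {t_f_true})")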

  3. Improving medium-range ensemble streamflow forecasts through statistical post-processing

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo; Wood, Andy; Clark, Elizabeth; Nijssen, Bart; Clark, Martyn; Ramos, Maria-Helena; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    Probabilistic hydrologic forecasts are a powerful source of information for decision-making in water resources operations. A common approach is the hydrologic model-based generation of streamflow forecast ensembles, which can be implemented to account for different sources of uncertainties - e.g., from initial hydrologic conditions (IHCs), weather forecasts, and hydrologic model structure and parameters. In practice, hydrologic ensemble forecasts typically have biases and spread errors stemming from errors in the aforementioned elements, resulting in a degradation of probabilistic properties. In this work, we compare several statistical post-processing techniques applied to medium-range ensemble streamflow forecasts obtained with the System for Hydromet Applications, Research and Prediction (SHARP). SHARP is a fully automated prediction system for the assessment and demonstration of short-term to seasonal streamflow forecasting applications, developed by the National Center for Atmospheric Research, University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. The suite of post-processing techniques includes linear blending, quantile mapping, extended logistic regression, quantile regression, ensemble analogs, and the generalized linear model post-processor (GLMPP). We assess and compare these techniques using multi-year hindcasts in several river basins in the western US. This presentation discusses preliminary findings about the effectiveness of the techniques for improving probabilistic skill, reliability, discrimination, sharpness and resolution.

  4. The visualisation and communication of probabilistic climate forecasts to renewable energy policy makers

    NASA Astrophysics Data System (ADS)

    Doblas-Reyes, F.; Steffen, S.; Lowe, R.; Davis, M.; Rodó, X.

    2013-12-01

    Despite the strong dependence of the renewable energy industry on weather and climate variability, and several initiatives towards demonstrating the added benefits of integrating probabilistic forecasts into the energy decision-making process, such forecasts are still under-utilised within the sector. Improved communication is fundamental to stimulating the use of climate forecast information within decision-making processes, in order to adapt to a highly climate-dependent renewable energy industry. This paper focuses on improving the visualisation of climate forecast information, paying special attention to seasonal-to-decadal (s2d) timescales. This is central to enhancing climate services for renewable energy and to optimising the usefulness and usability of inherently complex climate information. In the context of the Global Framework for Climate Services (GFCS) initiative and the subsequent European projects Seasonal-to-Decadal Climate Prediction for the Improvement of European Climate Service (SPECS) and the European Provision of Regional Impacts Assessment in Seasonal and Decadal Timescales (EUPORIAS), this paper investigates the visualisation and communication of s2d forecasts with regard to their usefulness and usability, to enable the development of a European climate service. The target end users are renewable energy policy makers, who are central to enhancing climate services for the energy industry. The overall objective is to promote the wide-ranging dissemination and exchange of actionable climate information based on s2d forecasts from Global Producing Centres (GPCs). It is therefore crucial to examine the main existing barriers and deficits. Examples of probabilistic climate forecasts from different GPCs were used to prepare a catalogue of current approaches, to assess their advantages and limitations, and finally to recommend better alternatives. In parallel, interviews were conducted with renewable energy stakeholders to receive feedback for the improvement of existing forecast visualisation techniques. The overall aim is to establish a communication protocol for the visualisation of probabilistic climate forecasts, which does not currently exist. Global Producing Centres present their own probabilistic forecasts with limited consistency in communication across centres, which complicates understanding for the end user. A communication protocol for both the visualisation and the description of climate forecasts can help to introduce a standard format and message for end users from several climate-sensitive sectors, such as energy, tourism, agriculture and health. It is hoped that this work will facilitate the improvement of decision-making processes relying on forecast information and enable their wide-ranging dissemination based on a standardised approach.

  5. Added value of non-calibrated and BMA calibrated AEMET-SREPS probabilistic forecasts: the 24 January 2009 extreme wind event over Catalonia

    NASA Astrophysics Data System (ADS)

    Escriba, P. A.; Callado, A.; Santos, D.; Santos, C.; Simarro, J.; García-Moya, J. A.

    2009-09-01

    At 00 UTC on 24 January 2009, an explosive cyclogenesis that originated over the Atlantic Ocean reached its maximum intensity, with observed surface pressures lower than 970 hPa at its centre, located over the Bay of Biscay. During its path across southern France this low caused strong westerly and north-westerly winds over the Iberian Peninsula, higher than 150 km/h in some places. These extreme winds left 10 casualties in Spain, 8 of them in Catalonia. The aim of this work is to show whether there is added value in the short-range prediction of the 24 January 2009 strong winds when using the Short Range Ensemble Prediction System (SREPS) of the Spanish Meteorological Agency (AEMET), with respect to the operational forecasting tools. This study emphasizes two aspects of probabilistic forecasting: the ability of a 3-day forecast to warn of an extreme wind event, and the ability to quantify the predictability of the event, thereby giving value to the deterministic forecast. Two types of probabilistic wind forecast are carried out, a non-calibrated one and one calibrated using Bayesian Model Averaging (BMA). AEMET runs SREPS experimentally twice a day (00 and 12 UTC). This system consists of 20 members that are constructed by integrating 5 limited-area models, COSMO (COSMO), HIRLAM (HIRLAM Consortium), HRM (DWD), MM5 (NOAA) and UM (UKMO), at 25 km horizontal resolution. Each model uses 4 different initial and boundary conditions, from the global models GFS (NCEP), GME (DWD), IFS (ECMWF) and UM. In this way a probabilistic forecast is obtained that takes into account initial-condition, boundary-condition and model errors. BMA is a statistical tool for combining predictive probability functions from different sources. The BMA predictive probability density function (PDF) is a weighted average of PDFs centred on the individual bias-corrected forecasts. The weights are equal to the posterior probabilities of the models generating the forecasts and reflect the skill of the ensemble members. Here BMA is applied to provide probabilistic forecasts of wind speed. In this work, several forecasts of 10-m wind speed over Catalonia for different time ranges (H+72, H+48 and H+24) are verified subjectively at one of the instants of maximum intensity, 12 UTC on 24 January 2009. On the one hand, three probabilistic forecasts are compared: ECMWF EPS, non-calibrated SREPS and calibrated SREPS. On the other hand, the relationship between predictability and the skill of the deterministic forecast is studied by looking at HIRLAM 0.16 deterministic forecasts of the event. Verification is focused on the location and intensity of 10-m wind speed, and 10-min measurements from AEMET automatic ground stations are used as observations. The results indicate that SREPS is able to forecast, three days ahead, mean winds higher than 36 km/h and to localize them correctly, with a significant probability of occurrence in the affected area. The probability is higher after BMA calibration of the ensemble. The fact that the probability of strong winds is high allows us to state that the predictability of the event is also high and, as a consequence, that deterministic forecasts are more reliable. This is confirmed when verifying HIRLAM deterministic forecasts against observed values.
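
    A minimal sketch of the BMA predictive distribution described above: a weighted mixture of member-centred PDFs (Gaussian kernels here for simplicity, although wind speed is often modelled with other kernels), from which an exceedance probability for a warning threshold can be read off. The member forecasts, weights and spread below are illustrative, not values from the study.

        import numpy as np
        from scipy.stats import norm

        # Illustrative bias-corrected member forecasts of 10-m wind speed (km/h) at one station
        members = np.array([62.0, 75.0, 55.0, 81.0, 70.0])
        weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # BMA weights (sum to 1), e.g. from training
        sigma   = 8.0                                        # common kernel spread from training

        x = np.linspace(0, 150, 601)
        bma_pdf = sum(w * norm.pdf(x, m, sigma) for w, m in zip(weights, members))

        threshold = 90.0
        p_exceed = sum(w * (1 - norm.cdf(threshold, m, sigma)) for w, m in zip(weights, members))
        print(f"P(wind speed > {threshold} km/h) = {p_exceed:.2f}")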

  6. Bayesian Probabilistic Projections of Life Expectancy for All Countries

    PubMed Central

    Raftery, Adrian E.; Chunn, Jennifer L.; Gerland, Patrick; Ševčíková, Hana

    2014-01-01

    We propose a Bayesian hierarchical model for producing probabilistic forecasts of male period life expectancy at birth for all the countries of the world from the present to 2100. Such forecasts would be an input to the production of probabilistic population projections for all countries, which is currently being considered by the United Nations. To evaluate the method, we did an out-of-sample cross-validation experiment, fitting the model to the data from 1950–1995, and using the estimated model to forecast for the subsequent ten years. The ten-year predictions had a mean absolute error of about 1 year, about 40% less than the current UN methodology. The probabilistic forecasts were calibrated, in the sense that (for example) the 80% prediction intervals contained the truth about 80% of the time. We illustrate our method with results from Madagascar (a typical country with steadily improving life expectancy), Latvia (a country that has had a mortality crisis), and Japan (a leading country). We also show aggregated results for South Asia, a region with eight countries. Free publicly available R software packages called bayesLife and bayesDem are available to implement the method. PMID:23494599

  7. On the skill of various ensemble spread estimators for probabilistic short range wind forecasting

    NASA Astrophysics Data System (ADS)

    Kann, A.

    2012-05-01

    A variety of applications, ranging from civil protection associated with severe weather to economic interests, are heavily dependent on meteorological information. For example, precise planning of the energy supply with a high share of renewables requires detailed meteorological information at high temporal and spatial resolution. With respect to wind power, detailed analyses and forecasts of wind speed are of crucial interest for energy management. Although the applicability and the current skill of state-of-the-art probabilistic short-range forecasts have increased during the last years, ensemble systems still show systematic deficiencies which limit their practical use. This paper presents methods to improve the ensemble skill of 10-m wind speed forecasts by combining deterministic information from a nowcasting system at very high horizontal resolution with uncertainty estimates from a limited-area ensemble system. It is shown for a one-month validation period that a statistical post-processing procedure (a modified non-homogeneous Gaussian regression) adds further skill to the probabilistic forecasts, especially beyond the nowcasting range after +6 h.

  8. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian

    2016-04-01

    Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational use, due partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation and called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and of their value as perceived by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers. Balancing avoided costs against the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.

  9. The Forecast Interpretation Tool—a Monte Carlo technique for blending climatic distributions with probabilistic forecasts

    USGS Publications Warehouse

    Husak, Gregory J.; Michaelsen, Joel; Kyriakidis, P.; Verdin, James P.; Funk, Chris; Galu, Gideon

    2011-01-01

    Probabilistic forecasts are produced by a variety of outlets to help predict rainfall, and other meteorological events, for periods of 1 month or more. Such forecasts are expressed as probabilities of a rainfall event, e.g. being in the upper, middle, or lower third of the relevant distribution of rainfall in the region. The impact of these forecasts on the expectation for the event is not always clear or easily conveyed. This article proposes a technique based on Monte Carlo simulation for adjusting existing climatological statistical parameters to match forecast information, resulting in new parameters defining the probability of events for the forecast interval. The resulting parameters are shown to approximate the forecasts with reasonable accuracy. To show the value of the technique as an application for seasonal rainfall, it is used with the consensus forecast developed for the Greater Horn of Africa for the 2009 March-April-May season. An alternative, analytical approach is also proposed and discussed in comparison to the first, simulation-based technique.
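
    A minimal sketch of the simulation idea (not the Forecast Interpretation Tool code itself): draw rainfall values from a climatological gamma distribution, but in proportions set by the forecast tercile probabilities, then refit the gamma parameters to the blended sample. All parameter values below are placeholders.

        import numpy as np
        from scipy.stats import gamma

        rng = np.random.default_rng(3)

        # Placeholder climatological seasonal-rainfall distribution (mm)
        shape, scale = 4.0, 50.0
        clim = gamma(a=shape, scale=scale)

        # Forecast tercile probabilities (below, near, above normal)
        p = np.array([0.20, 0.35, 0.45])

        # Sample by inverse CDF: uniform draws restricted to the CDF band of each tercile
        n = 20000
        counts = rng.multinomial(n, p)
        bands = [(1e-9, 1/3), (1/3, 2/3), (2/3, 1.0)]
        u = np.concatenate([rng.uniform(lo, hi, c) for (lo, hi), c in zip(bands, counts)])
        sample = clim.ppf(u)

        # New "forecast-adjusted" gamma parameters fitted to the blended sample
        fit_shape, _, fit_scale = gamma.fit(sample, floc=0)
        print(f"climatology: shape={shape:.2f}, scale={scale:.1f}; "
              f"adjusted: shape={fit_shape:.2f}, scale={fit_scale:.1f}")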

  10. Benefits of volcano monitoring far outweigh costs - the case of Mount Pinatubo

    USGS Publications Warehouse

    Newhall, Chris G.; Hendley, James W.; Stauffer, Peter H.

    1997-01-01

    The climactic June 1991 eruption of Mount Pinatubo, Philippines, was the largest volcanic eruption in this century to affect a heavily populated area. Because it was forecast by scientists from the Philippine Institute of Volcanology and Seismology and the U.S. Geological Survey, civil and military leaders were able to order massive evacuations and take measures to protect property before the eruption. Thousands of lives were saved and hundreds of millions of dollars in property losses averted. The savings in property alone were many times the total costs of the forecasting and evacuations.

  11. Evaluation of the Plant-Craig stochastic convection scheme in an ensemble forecasting system

    NASA Astrophysics Data System (ADS)

    Keane, R. J.; Plant, R. S.; Tennant, W. J.

    2015-12-01

    The Plant-Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed in comparison with the standard convection scheme with a simple stochastic element only, from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered, over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant-Craig parameterization is found to improve probabilistic forecast measures, particularly the results for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant-Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.

  12. Forecasts and predictions of eruptive activity at Mount St. Helens, USA: 1975-1984

    USGS Publications Warehouse

    Swanson, D.A.; Casadevall, T.J.; Dzurisin, D.; Holcomb, R.T.; Newhall, C.G.; Malone, S.D.; Weaver, C.S.

    1985-01-01

    Public statements about volcanic activity at Mount St. Helens include factual statements, forecasts, and predictions. A factual statement describes current conditions but does not anticipate future events. A forecast is a comparatively imprecise statement of the time, place, and nature of expected activity. A prediction is a comparatively precise statement of the time, place, and ideally, the nature and size of impending activity. A prediction usually covers a shorter time period than a forecast and is generally based dominantly on interpretations and measurements of ongoing processes and secondarily on a projection of past history. The three types of statements grade from one to another, and distinctions are sometimes arbitrary. Forecasts and predictions at Mount St. Helens became increasingly precise from 1975 to 1982. Stratigraphic studies led to a long-range forecast in 1975 of renewed eruptive activity at Mount St. Helens, possibly before the end of the century. On the basis of seismic, geodetic and geologic data, general forecasts for a landslide and eruption were issued in April 1980, before the catastrophic blast and landslide on 18 May 1980. All extrusions except two from June 1980 to the end of 1984 were predicted on the basis of integrated geophysical, geochemical, and geologic monitoring. The two extrusions that were not predicted were preceded by explosions that removed a substantial part of the dome, reducing confining pressure and essentially short-circuiting the normal precursors. © 1985.

  13. The VUELCO project consortium: new interdisciplinary research for improved risk mitigation and management during volcanic unrest

    NASA Astrophysics Data System (ADS)

    Gottsmann, J.

    2012-04-01

    Volcanic unrest is a complex multi-hazard phenomenon of volcanism. The fact that unrest may, but not necessarily must lead to an imminent eruption contributes significant uncertainty to short-term hazard assessment of volcanic activity world-wide. Although it is reasonable to assume that all eruptions are associated with precursory activity of some sort, the knowledge of the causative links between subsurface processes, resulting unrest signals and imminent eruption is, today, inadequate to deal effectively with crises of volcanic unrest. This results predominantly from the uncertainties in identifying the causative processes of unrest and as a consequence in forecasting its short-term evolution. However, key for effective risk mitigation and management during unrest is the early and reliable identification of changes in the subsurface dynamics of a volcano and their assessment as precursors to an impending eruption. The VUELCO project consortium has come together for a multi-disciplinary attack on the origin, nature and significance of volcanic unrest from the scientific contributions generated by collaboration of ten partners in Europe and Latin America. Dissecting the science of monitoring data from unrest periods at six type volcanoes in Italy, Spain, the West Indies, Mexico and Ecuador the consortium will create global strategies for 1) enhanced monitoring capacity and value, 2) mechanistic data interpretation and 3) identification of reliable eruption precursors; all from the geophysical, geochemical and geodetic fingerprints of unrest episodes. Experiments will establish a mechanistic understanding of subsurface processes capable of inducing unrest and aid in identifying key volcano monitoring parameters indicative of the nature of unrest processes. Numerical models will help establish a link between the processes and volcano monitoring data to inform on the causes of unrest and its short-term evolution. Using uncertainty assessment and new short-term probabilistic hazard forecasting tools the scientific knowledge base will provide the crucial parameters for a comprehensive and best-practice approach to 1) risk mitigation, 2) communication, 3) decision-making and 4) crisis management during unrest periods. The VUELCO project consortium efforts will generate guidance in the definition and implementation of strategic options for effective risk mitigation, management and governance during unrest episodes. Such a mechanistic platform of understanding, impacting on the synergy of scientists, policy-makers, civil protection authorities, decision-makers, and the public, will place volcanic unrest management on a new basis, with European expertise at its peak. The project is financed by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment".

  14. Improving Global Forecast System of extreme precipitation events with regional statistical model: Application of quantile-based probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar

    2017-02-01

    Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density, leading to high vulnerability. Although significant scientific improvements have taken place in the global models for weather forecasting, they are still not adequate at a regional scale (e.g., for an urban region), with high false-alarm and low detection rates. There has been a need to improve weather forecast skill at a local scale with a probabilistic outcome. Here we develop a methodology based on quantile regression, where the reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast, with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in the ensemble uncertainty of precipitation across realizations with respect to that of the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the need to implement such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.
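
    A minimal sketch of quantile-based probabilistic downscaling using statsmodels' QuantReg: several conditional quantiles of rainfall are fitted against large-scale predictors. The predictors and rainfall series below are synthetic stand-ins; a real application would use the reliably simulated GFS variables as predictors.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 2000

        # Synthetic "large-scale" predictors standing in for GFS output (e.g. humidity, vertical velocity)
        humidity = rng.normal(0, 1, n)
        omega    = rng.normal(0, 1, n)
        rain     = np.maximum(0, 5 + 8 * humidity + 4 * omega + rng.gumbel(0, 6, n))

        X = sm.add_constant(np.column_stack([humidity, omega]))

        # Fit several conditional quantiles of rainfall given the predictors
        quantiles = [0.5, 0.9, 0.99]
        fits = {q: sm.QuantReg(rain, X).fit(q=q) for q in quantiles}

        x_new = np.array([[1.0, 2.0, 1.5]])    # a day with strong moisture and ascent
        for q in quantiles:
            print(f"q={q:.2f}: predicted rainfall {float(fits[q].predict(x_new)[0]):.1f} mm")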

  15. IEA Wind Task 36 Forecasting

    NASA Astrophysics Data System (ADS)

    Giebel, Gregor; Cline, Joel; Frank, Helmut; Shaw, Will; Pinson, Pierre; Hodge, Bri-Mathias; Kariniotakis, Georges; Sempreviva, Anna Maria; Draxl, Caroline

    2017-04-01

    Wind power forecasts have been used operationally for over 20 years. Despite this, there are still several possibilities to improve the forecasts, both on the weather prediction side and in the usage of the forecasts. The new International Energy Agency (IEA) Task on Wind Power Forecasting aims to organise international collaboration among national weather centres with an interest in and/or large projects on wind forecast improvement (NOAA, DWD, UK Met Office, …), operational forecasters and forecast users. The Task is divided into three work packages. Firstly, a collaboration on the improvement of the scientific basis for the wind predictions themselves; this includes numerical weather prediction model physics, but also widely distributed information on accessible datasets for verification. Secondly, we will be aiming at an international pre-standard (an IEA Recommended Practice) on benchmarking and comparing wind power forecasts, including probabilistic forecasts, aimed at industry and forecasters alike; this work package will also organise benchmarks, in cooperation with the IEA Task WakeBench. Thirdly, we will engage end users, aiming at dissemination of best practice in the usage of wind power predictions, especially probabilistic ones. The Operating Agent is Gregor Giebel of DTU; the Co-Operating Agent is Joel Cline of the US Department of Energy. Collaboration in the Task is solicited from everyone interested in the forecasting business. We will collaborate with IEA Task 31 WakeBench, which developed the Windbench benchmarking platform that this Task will use for forecasting benchmarks. The Task runs for three years, 2016-2018. Main deliverables are an up-to-date list of current projects and main project results, including datasets which can be used by researchers around the world to improve their own models; an IEA Recommended Practice on the performance evaluation of probabilistic forecasts; a position paper regarding the use of probabilistic forecasts; and one or more benchmark studies implemented on the Windbench platform hosted at CENER. Additionally, spreading relevant information in both the forecaster and user communities is paramount. The poster also shows the work done in the first half of the Task, e.g. the collection of available datasets and the lessons learned from a public workshop held on 9 June in Barcelona on experiences with the use of forecasts and gaps in research. Participation is open to all interested parties in member states of the IEA Annex on Wind Power; see ieawind.org for the up-to-date list. For collaboration, please contact the author (grgi@dtu.dk).

  16. Probabilistic evaluation of the physical impact of future tephra fallout events for the Island of Vulcano, Italy

    NASA Astrophysics Data System (ADS)

    Biass, Sebastien; Bonadonna, Costanza; di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino

    2016-05-01

    A first probabilistic scenario-based hazard assessment for tephra fallout is presented for La Fossa volcano (Vulcano Island, Italy) and subsequently used to assess the impact on the built environment. Eruption scenarios are based upon the stratigraphy produced by the last 1000 years of activity at Vulcano and include long-lasting Vulcanian and sub-Plinian eruptions. A new method is proposed to quantify the evolution through time of the hazard associated with pulsatory Vulcanian eruptions lasting from weeks to years, and the increase in hazard related to typical rainfall events around Sicily is also accounted for. The impact assessment on the roofs is performed by combining a field characterization of the buildings with the composite European vulnerability curves for typical roofing stocks. Results show that a sub-Plinian eruption of VEI 2 is not likely to affect buildings, whereas a sub-Plinian eruption of VEI 3 results in 90 % of the building stock having a ≥12 % probability of collapse. The hazard related to long-lasting Vulcanian eruptions evolves through time, and our analysis shows that the town of Il Piano, located downwind of the preferential wind patterns, is likely to reach critical tephra accumulations for roof collapse 5-9 months after the onset of the eruption. If no cleaning measures are taken, half of the building stock has a probability >20 % of suffering roof collapse.

  17. Analog-Based Postprocessing of Navigation-Related Hydrological Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, S.; Klein, B.

    2017-11-01

    Inland waterway transport benefits from probabilistic forecasts of water levels, as they allow the ship load to be optimized and, hence, the transport costs to be minimized. Probabilistic state-of-the-art hydrologic ensemble forecasts inherit biases and dispersion errors from the atmospheric ensemble forecasts they are driven with. The use of statistical postprocessing techniques like ensemble model output statistics (EMOS) allows a reduction of these systematic errors by fitting a statistical model to training data. In this study, training periods for EMOS are selected based on forecast analogs, i.e., historical forecasts that are similar to the forecast to be verified. Due to the strong autocorrelation of water levels, forecast analogs have to be selected based on entire forecast hydrographs in order to guarantee similar hydrograph shapes. Custom-tailored measures of similarity for forecast hydrographs comprise the hydrological series distance (SD), the hydrological matching algorithm (HMA), and dynamic time warping (DTW). Verification against observations reveals that EMOS forecasts for water level at three gauges along the river Rhine, with training periods selected based on SD, HMA, and DTW, compare favorably with reference EMOS forecasts, which are based either on seasonal training periods or on training periods obtained by dividing the hydrological forecast trajectories into runoff regimes.
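
    To illustrate the analog-selection step, here is a minimal dynamic time warping (DTW) distance between forecast hydrographs. The hydrographs are synthetic; a real application would compare the current forecast trajectory against an archive of historical forecasts and fit EMOS on the closest analogs.

        import numpy as np

        def dtw_distance(a, b):
            """Classic O(len(a)*len(b)) dynamic time warping distance between two series."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        t = np.linspace(0, 10, 40)
        current  = 200 + 80 * np.exp(-((t - 4) ** 2))          # current forecast hydrograph (cm)
        analog_1 = 200 + 78 * np.exp(-((t - 4.5) ** 2))        # similar shape, shifted peak
        analog_2 = 200 + 10 * np.sin(t)                        # dissimilar regime

        print("analog 1:", round(dtw_distance(current, analog_1), 1))
        print("analog 2:", round(dtw_distance(current, analog_2), 1))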

  18. A probabilistic verification score for contours demonstrated with idealized ice-edge forecasts

    NASA Astrophysics Data System (ADS)

    Goessling, Helge; Jung, Thomas

    2017-04-01

    We introduce a probabilistic verification score for ensemble-based forecasts of contours: the Spatial Probability Score (SPS). Defined as the spatial integral of local (half) Brier scores, the SPS can be considered the spatial analog of the Continuous Ranked Probability Score (CRPS). Applying the SPS to idealized seasonal ensemble forecasts of the Arctic sea-ice edge in a global coupled climate model, we demonstrate that the SPS responds properly to ensemble size, bias, and spread. When applied to individual forecasts or ensemble means (or quantiles), the SPS reduces to the 'volume' of mismatch, which in the case of the ice edge corresponds to the Integrated Ice Edge Error (IIEE).
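
    A minimal grid-based sketch of the score as described above: the local ensemble probability of ice presence is compared with the binary observation at every cell, and the squared differences are integrated over area. The fields and cell areas below are synthetic.

        import numpy as np

        rng = np.random.default_rng(5)

        ny, nx, n_members = 50, 80, 20
        cell_area = np.full((ny, nx), 25.0)                     # km^2 per grid cell (placeholder)

        # Synthetic binary fields: 1 = sea ice present in the cell
        truth = (np.arange(nx)[None, :] < 40).astype(float).repeat(ny, axis=0)
        ensemble = np.stack([(np.arange(nx)[None, :] < 40 + rng.integers(-6, 7)).astype(float).repeat(ny, axis=0)
                             for _ in range(n_members)])

        p_ice = ensemble.mean(axis=0)                           # local ensemble probability
        sps = np.sum(cell_area * (p_ice - truth) ** 2)          # spatial integral of local Brier scores
        print(f"SPS = {sps:.0f} km^2")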

  19. Probabilistic Volcanic Multi-Hazard Assessment at Somma-Vesuvius (Italy): coupling Bayesian Belief Networks with a physical model for lahar propagation

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Woodhouse, Mark; Phillips, Jeremy; Sandri, Laura; Selva, Jacopo; Marzocchi, Warner; Odbert, Henry

    2017-04-01

    Volcanoes are extremely complex physico-chemical systems in which magma formed at depth breaks through to the planet's surface, resulting in major hazards from local to global scales. Volcano physics is dominated by non-linearities and complicated spatio-temporal interrelationships, which make volcanic hazards stochastic (i.e. not deterministic) by nature. In this context, probabilistic assessments are required to quantify the large uncertainties related to volcanic hazards. Moreover, volcanoes are typically multi-hazard environments where different hazardous processes can occur either simultaneously or in succession. In particular, explosive volcanoes are able to accumulate, through tephra fallout and Pyroclastic Density Currents (PDCs), large amounts of pyroclastic material in the drainage basins surrounding the volcano. This addition of fresh particulate material alters the local/regional hydrogeological equilibrium and increases the frequency and magnitude of sediment-rich aqueous flows, commonly known as lahars. The initiation and volume of rain-triggered lahars may depend on rainfall intensity and duration; antecedent rainfall; terrain slope; thickness, permeability and hydraulic diffusivity of the tephra deposit; etc. Quantifying these complex interrelationships (and their uncertainties) in a tractable manner requires a structured but flexible probabilistic approach. A Bayesian Belief Network (BBN) is a directed acyclic graph that allows the joint probability distribution for a set of uncertain variables to be represented in a compact and efficient way, by exploiting unconditional and conditional independences between these variables. Once constructed and parametrized, the BBN uses Bayesian inference to perform causal (e.g. forecast) and/or evidential reasoning (e.g. explanation) about query variables, given some evidence. In this work, we illustrate how BBNs can be used to model the influence of several variables on the generation of rain-triggered lahars and, finally, to assess the probability of occurrence of lahars of different volumes. The information utilized to parametrize the BBNs includes: (1) datasets of lahar observations; (2) numerical modelling of tephra fallout and PDCs; and (3) literature data. The BBN framework provides an opportunity to combine these different types of evidence quantitatively and use them to derive a rational approach to lahar forecasting. Lastly, we couple the BBN assessments with a shallow-water physical model for lahar propagation in order to attach probabilities to the simulated hazard footprints. We develop our methodology at Somma-Vesuvius (Italy), an explosive volcano prone to rain-triggered lahars or debris flows both right after an eruption and during inter-eruptive periods. Accounting for the variability in tephra-fallout and dense-PDC propagation and for the main geomorphological features of the catchments around Somma-Vesuvius, the areas most likely to form medium-to-large lahars are the flanks of the volcano and the Sarno mountains to the east.
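
    A minimal hand-rolled illustration of BBN-style inference by enumeration, with two discrete parents (rainfall intensity and tephra thickness) conditioning a lahar-occurrence node. The network structure and all conditional probabilities are invented for illustration and are not those of the Somma-Vesuvius assessment.

        from itertools import product

        # Prior probabilities of the parent states (illustrative values only)
        p_rain   = {"light": 0.6, "heavy": 0.4}
        p_tephra = {"thin": 0.7, "thick": 0.3}

        # P(lahar = yes | rain, tephra), illustrative conditional probability table
        p_lahar = {("light", "thin"): 0.02, ("light", "thick"): 0.15,
                   ("heavy", "thin"): 0.20, ("heavy", "thick"): 0.70}

        # Marginal probability of a lahar (sum over parent states)
        marginal = sum(p_rain[r] * p_tephra[t] * p_lahar[(r, t)]
                       for r, t in product(p_rain, p_tephra))

        # Forecast conditional on observed evidence, e.g. a thick fresh tephra deposit
        given_thick = sum(p_rain[r] * p_lahar[(r, "thick")] for r in p_rain)

        print(f"P(lahar) = {marginal:.2f}")
        print(f"P(lahar | thick tephra) = {given_thick:.2f}")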

  20. Exploring the influence of vent location and eruption style on tephra fall hazard from the Okataina Volcanic Centre, New Zealand

    NASA Astrophysics Data System (ADS)

    Thompson, Mary Anne; Lindsay, Jan M.; Sandri, Laura; Biass, Sébastien; Bonadonna, Costanza; Jolly, Gill; Marzocchi, Warner

    2015-05-01

    Uncertainties in modelling volcanic hazards are often amplified in geographically large systems which have a diverse eruption history that comprises variable eruption styles from many different vent locations. The ~700 km2 Okataina Volcanic Centre (OVC) is a caldera complex in New Zealand which has displayed a range of eruption styles and compositions over its current phase of activity (26 ka-present), including one basaltic maar-forming eruption, one basaltic Plinian eruption and nine rhyolitic Plinian eruptions. All three of these eruption styles occurred within the past 3.5 ky, and any of these styles could occur in the event of a future eruption. The location of a future eruption is also unknown. Future vents could potentially open in one of three different areas which have been activated in the past 26 ky at the OVC: the Tarawera linear vent zone (LVZ) (five eruptions), the Haroharo LVZ (five eruptions) or outside of these LVZs (one eruption). A future rhyolitic or basaltic Plinian eruption from the OVC is likely to generate widespread tephra fall in loads that will cause significant disruption and have severe socio-economic impacts. Past OVC tephra hazard studies have focused on evaluating hazard from a rhyolitic Plinian eruption at select vent locations in the OVC's Tarawera LVZ. Here, we expand upon past studies by evaluating tephra hazard for all possible OVC eruption vent areas and for both rhyolitic and basaltic Plinian eruption styles, and explore how these parameters influence tephra hazard forecasts. Probabilistic volcanic hazard model BET_VH and advection-diffusion model TEPHRA2 were used to assess the hazard of accumulating ≥10 kg m-2 of tephra from both basaltic Plinian and rhyolitic Plinian eruption styles, occurring from within the Tarawera LVZ, the Haroharo LVZ or other potential vent areas within the caldera. Our results highlight the importance of considering all the potential vent locations of a volcanic system, in order to capture the full eruption catalogue in analyses (e.g. 11 eruptions over 26 ky for the OVC versus only five eruptions over 26 ky for the Tarawera LVZ), as well as the full spatial distribution of tephra hazard. Although the Tarawera LVZ has been prominently discussed in studies of OVC hazard because of its recent activity (1886 and ~1315 ad), we find that in the event of a future eruption, the estimated likelihood of a vent opening within the Haroharo LVZ (last eruption 5.6 ka) is equivalent (<1 % difference) to that for the Tarawera LVZ (31.8 compared to 32.5 %). Including both the Haroharo LVZ and the Tarawera LVZ as possible source areas in the hazard analysis allows us to assess the full spatial extent of OVC tephra fall hazard. By considering both basaltic Plinian and rhyolitic Plinian eruption styles, as well as multiple vent location areas, we present a hazard assessment which aims to reduce bias through incorporating a greater range of eruption variables.

  1. Meta-heuristic CRPS minimization for the calibration of short-range probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Mohammadi, Seyedeh Atefeh; Rahmani, Morteza; Azadi, Majid

    2016-08-01

    This paper deals with probabilistic short-range temperature forecasts over synoptic meteorological stations across Iran using non-homogeneous Gaussian regression (NGR). NGR creates a Gaussian forecast probability density function (PDF) from the ensemble output. The mean of the normal predictive PDF is a bias-corrected weighted average of the ensemble members, and its variance is a linear function of the raw ensemble variance. The coefficients for the mean and variance are estimated by minimizing the continuous ranked probability score (CRPS) over a training period. The CRPS is a scoring rule for distributional forecasts. In the paper of Gneiting et al. (Mon Weather Rev 133:1098-1118, 2005), the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method is used to minimize the CRPS. Since BFGS is a conventional optimization method with its own limitations, we suggest using particle swarm optimization (PSO), a robust meta-heuristic method, to minimize the CRPS. The ensemble prediction system used in this study consists of nine different configurations of the Weather Research and Forecasting model for 48-h forecasts of temperature during autumn and winter 2011 and 2012. The probabilistic forecasts were evaluated using several common verification scores, including the Brier score, the attributes diagram and the rank histogram. Results show that both BFGS and PSO find the optimal solution and yield the same evaluation scores, but PSO can do so from a feasible random first guess and with much less computational complexity.
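
    A compact sketch of the general approach described above: the NGR predictive distribution N(a + b·ensmean, c + d·ensvar) is calibrated by minimizing the closed-form Gaussian CRPS over a training set, here with a deliberately simple particle swarm optimizer. The training data are synthetic and the PSO settings arbitrary; this is an illustration of the technique, not the authors' implementation.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(6)

        # Synthetic training data: ensemble mean/variance and verifying temperature
        n = 500
        ens_mean = rng.normal(15, 5, n)
        ens_var  = rng.uniform(0.5, 4.0, n)
        obs      = 1.0 + 0.95 * ens_mean + rng.normal(0, np.sqrt(1.0 + 1.5 * ens_var))

        def crps_gaussian(mu, sigma, y):
            """Closed-form CRPS of a N(mu, sigma^2) forecast against observation y."""
            z = (y - mu) / sigma
            return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

        def mean_crps(params):
            a, b, c, d = params
            sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))   # keep the variance positive
            return np.mean(crps_gaussian(a + b * ens_mean, sigma, obs))

        # Deliberately simple particle swarm optimizer (not a tuned production PSO)
        n_particles, n_iter = 30, 200
        x = rng.uniform([-5, 0, 0, 0], [5, 2, 5, 5], size=(n_particles, 4))
        v = np.zeros_like(x)
        pbest, pbest_val = x.copy(), np.array([mean_crps(p) for p in x])
        gbest = pbest[np.argmin(pbest_val)]

        for _ in range(n_iter):
            r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
            x = x + v
            vals = np.array([mean_crps(p) for p in x])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            gbest = pbest[np.argmin(pbest_val)]

        print("NGR coefficients (a, b, c, d):", np.round(gbest, 2),
              " mean CRPS:", round(mean_crps(gbest), 3))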

  2. Forecasting volcanic eruptions and other material failure phenomena: An evaluation of the failure forecast method

    NASA Astrophysics Data System (ADS)

    Bell, Andrew F.; Naylor, Mark; Heap, Michael J.; Main, Ian G.

    2011-08-01

    Power-law accelerations in the mean rate of strain, earthquakes and other precursors have been widely reported prior to material failure phenomena, including volcanic eruptions, landslides and laboratory deformation experiments, as predicted by several theoretical models. The Failure Forecast Method (FFM), which linearizes the power-law trend, has been routinely used to forecast the failure time in retrospective analyses; however, its performance has never been formally evaluated. Here we use synthetic and real data, recorded in laboratory brittle creep experiments and at volcanoes, to show that the assumptions of the FFM are inconsistent with the error structure of the data, leading to biased and imprecise forecasts. We show that a Generalized Linear Model method provides higher-quality forecasts that converge more accurately to the eventual failure time, accounting for the appropriate error distributions. This approach should be employed in place of the FFM to provide reliable quantitative forecasts and estimate their associated uncertainties.
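    As a point of reference for the method under evaluation, the sketch below illustrates the graphical FFM in its commonly assumed exponent-2 form, where the inverse precursor rate decays linearly to zero at the failure time and the forecast is the x-intercept of an ordinary least-squares fit. The synthetic data and noise model are assumptions; the paper's argument is precisely that such least-squares fits are biased under realistic error structures.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic precursor rates accelerating toward a "failure" at t_f = 100 (arbitrary units),
      # following the classic case where the inverse rate decays linearly to zero.
      t_f = 100.0
      t = np.arange(0.0, 90.0, 1.0)
      rate = 1.0 / (0.05 * (t_f - t))                  # idealized accelerating rate
      rate = rate * rng.lognormal(0.0, 0.15, t.size)   # multiplicative noise

      # Failure Forecast Method (graphical form): straight line through inverse rate vs time,
      # extrapolated to the time axis.
      inv_rate = 1.0 / rate
      slope, intercept = np.polyfit(t, inv_rate, 1)
      t_fail_est = -intercept / slope                  # x-intercept = forecast failure time

      print(f"forecast failure time: {t_fail_est:.1f} (true value: {t_f})")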

  3. The quality and value of seasonal precipitation forecasts for an early warning of large-scale droughts and floods in West Africa

    NASA Astrophysics Data System (ADS)

    Bliefernicht, Jan; Seidel, Jochen; Salack, Seyni; Waongo, Moussa; Laux, Patrick; Kunstmann, Harald

    2017-04-01

    Seasonal precipitation forecasts are a crucial source of information for an early warning of hydro-meteorological extremes in West Africa. However, the current seasonal forecasting system used by the West African weather services in the framework of the West African Climate Outlook Forum (PRESAO) is limited to probabilistic precipitation forecasts of 1-month lead time. To improve this provision, we use an ensemble-based quantile-quantile transformation for bias correction of precipitation forecasts provided by a global seasonal ensemble prediction system, the Climate Forecast System Version 2 (CFS2). The statistical technique eliminates systematic differences between global forecasts and observations with the potential to preserve the signal from the model. The technique also has the advantage that it can be easily implemented at national weather services with low capacities. The statistical technique is used to generate probabilistic forecasts of monthly and seasonal precipitation amounts and other precipitation indices useful for an early warning of large-scale droughts and floods in West Africa. The evaluation of the statistical technique is done using CFS hindcasts (1982 to 2009) in a cross-validation mode to determine the performance of the precipitation forecasts for several lead times, focusing on drought and flood events over the Volta and Niger basins. In addition, operational forecasts provided by PRESAO are analyzed from 1998 to 2015. The precipitation forecasts are compared to low-skill reference forecasts generated from gridded observations (i.e. GPCC, CHIRPS) and a novel in-situ gauge database from national observation networks (see Poster EGU2017-10271). The forecasts are evaluated using state-of-the-art verification techniques to determine specific quality attributes of probabilistic forecasts such as reliability, accuracy and skill. In addition, cost-loss approaches are used to determine the value of probabilistic forecasts for multiple users in warning situations. The outcomes of the hindcast experiment for the Volta basin illustrate that the statistical technique can clearly improve the CFS precipitation forecasts, with the potential to provide skillful and valuable early precipitation warnings for large-scale drought and flood situations several months ahead. In this presentation we give a detailed overview of the ensemble-based quantile-quantile transformation, its validation and verification, and the potential of this technique to complement PRESAO. We also highlight the performance of this technique for extremes such as the Sahel drought of the 1980s, in comparison to the various reference data sets (e.g. CFS2, PRESAO, observational data sets) used in this study.
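    The core of a quantile-quantile transformation can be illustrated with a short empirical quantile-mapping sketch (synthetic climatologies, not the authors' ensemble-based implementation): each raw forecast value is passed through the model's empirical CDF and then through the inverse of the observed CDF.

      import numpy as np

      rng = np.random.default_rng(2)

      # Toy quantile-quantile (quantile mapping) bias correction.
      obs_hist = rng.gamma(2.0, 40.0, 1000)        # "observed" seasonal rainfall climatology (mm)
      fc_hist = rng.gamma(2.0, 25.0, 1000) + 30.0  # systematically biased model climatology

      def quantile_map(x, model_clim, obs_clim):
          """Empirical quantile mapping: F_obs^{-1}(F_model(x))."""
          # empirical non-exceedance probability of x in the model climatology
          p = np.searchsorted(np.sort(model_clim), x, side="right") / model_clim.size
          p = np.clip(p, 1e-6, 1 - 1e-6)
          return np.quantile(obs_clim, p)

      raw_members = np.array([60.0, 95.0, 140.0, 210.0])   # one raw ensemble forecast (mm)
      corrected = quantile_map(raw_members, fc_hist, obs_hist)
      print(np.round(corrected, 1))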

  4. An online tool for Operational Probabilistic Drought Forecasting System (OPDFS): a Statistical-Dynamical Framework

    NASA Astrophysics Data System (ADS)

    Zarekarizi, M.; Moradkhani, H.; Yan, H.

    2017-12-01

    The Operational Probabilistic Drought Forecasting System (OPDFS) is an online tool recently developed at Portland State University for operational agricultural drought forecasting. This is an integrated statistical-dynamical framework issuing probabilistic drought forecasts monthly for lead times of 1, 2, and 3 months. The statistical drought forecasting method utilizes copula functions in order to condition future soil moisture values on the antecedent states. Due to the stochastic nature of land surface properties, the antecedent soil moisture states are uncertain; therefore, a data assimilation system based on Particle Filtering (PF) is employed to quantify the uncertainties associated with the initial condition of the land state, i.e. soil moisture. The PF assimilates satellite soil moisture data into the Variable Infiltration Capacity (VIC) land surface model and ultimately updates the simulated soil moisture. The OPDFS builds on NOAA's seasonal drought outlook by offering drought probabilities instead of qualitative ordinal categories and provides the user with probability maps associated with a particular drought category. A retrospective assessment of the OPDFS showed that forecasting the 2012 Great Plains and 2014 California droughts was possible at least one month in advance. The OPDFS offers timely assistance to water managers, stakeholders and decision-makers to develop resilience against uncertain upcoming droughts.
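    The role of the particle filter in constraining uncertain antecedent soil moisture can be illustrated with a minimal bootstrap filter on a toy water-balance store; the store dynamics, noise levels and retrieval error below are assumptions, not the VIC/PF configuration of the OPDFS.

      import numpy as np

      rng = np.random.default_rng(3)

      # Minimal bootstrap particle filter: weight particles by a noisy "satellite" soil moisture
      # observation and resample, so the analysis ensemble carries the initial-condition uncertainty.
      n_particles, n_steps = 1000, 30
      sm = rng.uniform(0.2, 0.4, n_particles)          # initial soil moisture particles (m3/m3)
      obs_err = 0.03                                    # assumed retrieval error (std dev)

      truth = 0.3
      for _ in range(n_steps):
          # toy water balance: random rain input minus moisture-dependent drainage
          rain = rng.exponential(0.01)
          truth = np.clip(truth + rain - 0.05 * truth, 0.05, 0.5)
          sm = np.clip(sm + rain - 0.05 * sm + rng.normal(0, 0.01, n_particles), 0.05, 0.5)

          # assimilation step: Gaussian likelihood weights, then resampling
          y = truth + rng.normal(0, obs_err)
          w = np.exp(-0.5 * ((y - sm) / obs_err) ** 2)
          w /= w.sum()
          idx = rng.choice(n_particles, n_particles, p=w)
          sm = sm[idx]

      print(f"truth: {truth:.3f}  analysis mean: {sm.mean():.3f}  spread: {sm.std():.3f}")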

  5. Naples between two fires: eruptive scenarios for the next eruptions by an integrated volcanological-probabilistic approach.

    NASA Astrophysics Data System (ADS)

    Mastrolorenzo, G.; Pappalardo, L.; de Natale, G.; Troise, C.; Rossano, S.; Panizza, A.

    2009-04-01

    Probabilistic approaches based on available volcanological data from real eruptions of Campi Flegrei and Somma-Vesuvius are assembled in a comprehensive assessment of volcanic hazards in the Neapolitan area. This allows us to compare the volcanic hazards related to the different types of events, which can be used for evaluating the conditional probability of flow and fall hazards in the case of a volcanic crisis. Hazard maps are presented, based on a rather complete set of numerical simulations, produced using field and laboratory data as input parameters for a large range (VEI 1 to 5) of fallout and pyroclastic-flow events and their relative occurrence. The results allow us to quantitatively evaluate and compare the hazard related to pyroclastic fallout and pyroclastic density currents (PDCs) at the Neapolitan volcanoes and their surroundings, including the city of Naples. Due to its position between the two volcanic areas, the city of Naples is particularly exposed to volcanic risk from VEI>2 eruptions, as recorded in the local volcanic succession. Because of dominant wind directions, the area of Naples is particularly prone to fallout hazard from Campi Flegrei caldera eruptions in the VEI range 2-5. The hazard from PDCs decreases roughly radially with distance from the eruptive vents and is strongly controlled by the topographic heights. Campi Flegrei eruptions are particularly hazardous for Naples, although the Camaldoli and Posillipo hills provide an effective barrier against propagation into the very central part of Naples. PDCs from Vesuvius eruptions with VEI>4 can cover the city of Naples, whereas even VEI>3 eruptions pose a moderate fallout hazard there.

  6. Evaluation of the Plant-Craig stochastic convection scheme (v2.0) in the ensemble forecasting system MOGREPS-R (24 km) based on the Unified Model (v7.3)

    NASA Astrophysics Data System (ADS)

    Keane, Richard J.; Plant, Robert S.; Tennant, Warren J.

    2016-05-01

    The Plant-Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed in comparison with the standard convection scheme, which includes only a simple stochastic element from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered, over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant-Craig parameterization is found to improve probabilistic forecast measures, particularly the results for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant-Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.

  7. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil.

    PubMed

    Lowe, Rachel; Coelho, Caio As; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-02-24

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics.
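    The hit-rate comparison quoted above reduces to a 2x2 contingency table of categorical forecasts against observed outcomes; a minimal sketch with made-up 0/1 vectors (not the Brazilian microregion data) is shown below.

      import numpy as np

      # Categorical "high dengue risk" forecasts vs observed outcomes (toy 0/1 arrays).
      forecast_high = np.array([1, 1, 0, 1, 0, 0, 1, 0, 1, 0])
      observed_high = np.array([1, 0, 0, 1, 1, 0, 1, 1, 0, 0])

      hits = np.sum((forecast_high == 1) & (observed_high == 1))
      misses = np.sum((forecast_high == 0) & (observed_high == 1))
      false_alarms = np.sum((forecast_high == 1) & (observed_high == 0))

      hit_rate = hits / (hits + misses)                   # probability of detection
      false_alarm_ratio = false_alarms / (hits + false_alarms)
      print(f"hit rate: {hit_rate:.2f}, false alarm ratio: {false_alarm_ratio:.2f}")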

  8. On a Possible Unified Scaling Law for Volcanic Eruption Durations

    PubMed Central

    Cannavò, Flavio; Nunnari, Giuseppe

    2016-01-01

    Volcanoes constitute dissipative systems with many degrees of freedom. Their eruptions are the result of complex processes that involve interacting chemical-physical systems. At present, due to the complexity of the phenomena involved and to the lack of precise measurements, both analytical and numerical models are unable to simultaneously include the main processes involved in eruptions, thus making forecasts of volcanic dynamics rather unreliable. On the other hand, accurate forecasts of some eruption parameters, such as the duration, could be a key factor in natural hazard estimation and mitigation. Analyzing a large database containing most of the known volcanic eruptions, we have determined that the duration of eruptions seems to be described by a universal distribution which characterizes eruption duration dynamics. In particular, this paper presents a plausible global power-law distribution of the durations of volcanic eruptions that holds worldwide for different volcanic environments. We also introduce a new, simple and realistic pipe model that reproduces the same empirical distribution. Since the proposed model belongs to the family of self-organized systems, it may support the hypothesis that simple mechanisms can lead naturally to the emergent complexity in volcanic behaviour. PMID:26926425

  9. On a Possible Unified Scaling Law for Volcanic Eruption Durations.

    PubMed

    Cannavò, Flavio; Nunnari, Giuseppe

    2016-03-01

    Volcanoes constitute dissipative systems with many degrees of freedom. Their eruptions are the result of complex processes that involve interacting chemical-physical systems. At present, due to the complexity of the phenomena involved and to the lack of precise measurements, both analytical and numerical models are unable to simultaneously include the main processes involved in eruptions, thus making forecasts of volcanic dynamics rather unreliable. On the other hand, accurate forecasts of some eruption parameters, such as the duration, could be a key factor in natural hazard estimation and mitigation. Analyzing a large database containing most of the known volcanic eruptions, we have determined that the duration of eruptions seems to be described by a universal distribution which characterizes eruption duration dynamics. In particular, this paper presents a plausible global power-law distribution of the durations of volcanic eruptions that holds worldwide for different volcanic environments. We also introduce a new, simple and realistic pipe model that reproduces the same empirical distribution. Since the proposed model belongs to the family of self-organized systems, it may support the hypothesis that simple mechanisms can lead naturally to the emergent complexity in volcanic behaviour.
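    A power-law duration distribution of this kind is usually characterized by its tail exponent; the sketch below fits the standard continuous maximum-likelihood estimator to synthetic durations (the exponent, cut-off and data are assumptions, not the catalogue analysed by the authors).

      import numpy as np

      rng = np.random.default_rng(4)

      # Synthetic eruption durations (days) drawn from a Pareto-type power-law tail.
      alpha_true, d_min = 1.7, 1.0
      u = rng.random(5000)
      durations = d_min * (1 - u) ** (-1 / (alpha_true - 1))   # inverse-CDF sampling

      # Continuous power-law MLE (Clauset-style estimator) for durations >= d_min
      x = durations[durations >= d_min]
      alpha_hat = 1 + x.size / np.log(x / d_min).sum()
      print(f"estimated exponent: {alpha_hat:.2f} (true: {alpha_true})")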

  10. The climatic effect of explosive volcanic activity: Analysis of the historical data

    NASA Technical Reports Server (NTRS)

    Bryson, R. A.; Goodman, B. M.

    1982-01-01

    By using the most complete available records of direct beam radiation and volcanic eruptions, an historical analysis of the role of the latter in modulating the former was made. A very simple fallout and dispersion model was applied to the historical chronology of explosive eruptions. The resulting time series explains about 77 percent of the radiation variance and suggests that tropical and subpolar eruptions are more important than mid-latitude eruptions in their impact on the stratospheric aerosol optical depth. The simpler climatic models indicate that past hemispheric temperature can be simulated very well with volcanic and CO2 inputs, and suggest that climate forecasting will also require volcano forecasting. There is some evidence that this is possible some years in advance.
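    The general idea of a simple fallout and dispersion model can be sketched as an exponentially decaying aerosol optical-depth perturbation summed over an eruption chronology; the injection magnitudes, residence time and example dates below are assumptions for illustration, not the model or data used in the study.

      import numpy as np

      # Each explosive eruption injects an aerosol optical-depth (AOD) perturbation that decays
      # exponentially with an assumed stratospheric residence time (values are illustrative only).
      eruption_years = np.array([1883.6, 1902.8, 1912.5, 1963.2])   # example injection dates
      injection_aod = np.array([0.15, 0.07, 0.06, 0.08])            # assumed initial optical depths
      tau = 1.0                                                      # e-folding residence time, years

      years = np.arange(1880.0, 1980.0, 0.1)
      aod = np.zeros_like(years)
      for t0, a0 in zip(eruption_years, injection_aod):
          active = years >= t0
          aod[active] += a0 * np.exp(-(years[active] - t0) / tau)

      # Direct-beam transmission falls roughly as exp(-AOD)
      print(f"peak volcanic AOD: {aod.max():.2f}, minimum relative direct beam: {np.exp(-aod.max()):.2f}")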

  11. New insights into the Kawah Ijen hydrothermal system from geophysical data

    USGS Publications Warehouse

    Caudron, Corentin; Mauri, G.; Williams-Jones, Glyn; Lecocq, Thomas; Syahbana, Devy Kamil; de Plaen, Raphael; Peiffer, Loic; Bernard, Alain; Saracco, Ginette

    2017-01-01

    Volcanoes with crater lakes and/or extensive hydrothermal systems pose significant challenges with respect to monitoring and forecasting eruptions, but they also provide new opportunities to enhance our understanding of magmatic–hydrothermal processes. Their lakes and hydrothermal systems serve as reservoirs for magmatic heat and fluid emissions, filtering and delaying the surface expressions of magmatic unrest and eruption, yet they also enable sampling and monitoring of geochemical tracers. Here, we describe the outcomes of a highly focused international experimental campaign and workshop carried out at Kawah Ijen volcano, Indonesia, in September 2014, designed to answer fundamental questions about how to improve monitoring and eruption forecasting at wet volcanoes.

  12. The game of making decisions under uncertainty: How sure must one be?

    NASA Astrophysics Data System (ADS)

    Werner, Micha; Verkade, Jan; Wetterhall, Fredrik; van Andel, Schalk-Jan; Ramos, Maria-Helena

    2016-04-01

    Probabilistic hydrometeorological forecasting is now widely accepted to be more skillful than deterministic forecasting, and is increasingly being integrated into operational practice. Provided they are reliable and unbiased, probabilistic forecasts have the advantage that they give the decision maker not only the forecast value but also the uncertainty associated with that prediction. Though that information provides more insight, it does leave the forecaster/decision maker with the challenge of deciding at what probability of a threshold being exceeded the decision to act should be taken. According to cost-loss theory, that probability should be related to the impact of the threshold being exceeded. However, it is not entirely clear how easy it is for decision makers to follow that rule, even when the impact of a threshold being exceeded and the actions to choose from are known. To continue the tradition in the "Ensemble Hydrometeorological Forecast" session, we will address the challenge of making decisions based on probabilistic forecasts through a game to be played with the audience. We will explore how the decisions made differ depending on the known impacts of the forecasted events. Participants will be divided into a number of groups with differing levels of impact, and will be faced with a number of forecast situations. They will be asked to make decisions and record the consequences of those decisions. A discussion of the differences in the decisions made will be presented at the end of the game, with a fuller analysis later posted on the HEPEX web site blog (www.hepex.org).
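    The cost-loss rule mentioned above reduces to a one-line check: act when the forecast probability of exceedance is greater than the cost-loss ratio C/L. A minimal sketch with hypothetical numbers:

      # Cost-loss decision rule: with protection cost C and avoidable loss L, acting minimizes
      # the expected expense whenever the forecast exceedance probability p exceeds C/L.
      def should_act(p_exceed: float, cost: float, loss: float) -> bool:
          """Return True if taking protective action minimizes the expected expense."""
          return p_exceed > cost / loss

      # Cheap mitigation (C/L = 0.1) triggers at much lower probabilities than expensive
      # mitigation (C/L = 0.6) for the same forecast probability.
      p = 0.35
      print(should_act(p, cost=10, loss=100))   # True  -> act
      print(should_act(p, cost=60, loss=100))   # False -> do not act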

  13. Towards an improved ensemble precipitation forecast: A probabilistic post-processing approach

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2017-03-01

    Recently, ensemble post-processing (EPP) has become a commonly used approach for reducing the uncertainty in forcing data and hence in hydrologic simulation. The procedure was introduced to build ensemble precipitation forecasts based on the statistical relationship between observations and forecasts. More specifically, the approach relies on a transfer function that is developed based on a bivariate joint distribution between the observations and the simulations in the historical period. The transfer function is used to post-process the forecast. In this study, we propose a Bayesian EPP approach based on copula functions (COP-EPP) to improve the reliability of the precipitation ensemble forecast. Evaluation of the copula-based method is carried out by comparing the performance of the generated ensemble precipitation with the outputs from an existing procedure, i.e. the mixed-type meta-Gaussian distribution. Monthly precipitation from the Climate Forecast System Reanalysis (CFS) and gridded observations from the Parameter-Elevation Relationships on Independent Slopes Model (PRISM) have been employed to generate the post-processed ensemble precipitation. Deterministic and probabilistic verification frameworks are utilized in order to evaluate the outputs from the proposed technique. The distribution of seasonal precipitation for the ensemble generated with the copula-based technique is compared to the observations and raw forecasts for three sub-basins located in the western United States. Results show that both techniques are successful in producing reliable and unbiased ensemble forecasts; however, COP-EPP demonstrates considerable improvement in both deterministic and probabilistic verification, in particular in characterizing the extreme events in wet seasons.
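    The copula idea of conditioning the observation on the forecast can be sketched with a bare-bones Gaussian copula (empirical margins, a single correlation parameter). This is an illustration under simplifying assumptions, not the authors' COP-EPP implementation.

      import numpy as np
      from scipy.stats import norm, rankdata

      rng = np.random.default_rng(5)

      # Historical pairs of observed and raw-forecast monthly precipitation (synthetic, in mm).
      n = 2000
      obs = rng.gamma(2.0, 30.0, n)
      fcst = 0.6 * obs + rng.gamma(2.0, 15.0, n)          # correlated, biased raw forecasts

      def to_normal_scores(x):
          """Empirical probability integral transform followed by the standard normal quantile."""
          p = rankdata(x) / (x.size + 1)
          return norm.ppf(p)

      z_obs, z_fc = to_normal_scores(obs), to_normal_scores(fcst)
      rho = np.corrcoef(z_obs, z_fc)[0, 1]                # Gaussian copula parameter

      def post_process(raw_value, n_members=20):
          """Generate a calibrated ensemble for one new raw forecast value."""
          p_raw = np.clip(np.searchsorted(np.sort(fcst), raw_value) / fcst.size, 1e-3, 1 - 1e-3)
          z = norm.ppf(p_raw)
          cond = rng.normal(rho * z, np.sqrt(1 - rho**2), n_members)   # conditional draw in copula space
          return np.quantile(obs, norm.cdf(cond))                      # back-transform with observed margins

      print(np.round(np.sort(post_process(80.0)), 1))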

  14. Predictability of short-range forecasting: a multimodel approach

    NASA Astrophysics Data System (ADS)

    García-Moya, Jose-Antonio; Callado, Alfons; Escribà, Pau; Santos, Carlos; Santos-Muñoz, Daniel; Simarro, Juan

    2011-05-01

    Numerical weather prediction (NWP) models (including mesoscale) have limitations when it comes to dealing with severe weather events because extreme weather is highly unpredictable, even in the short range. A probabilistic forecast based on an ensemble of slightly different model runs may help to address this issue. Among other ensemble techniques, Multimodel ensemble prediction systems (EPSs) are proving to be useful for adding probabilistic value to mesoscale deterministic models. A Multimodel Short Range Ensemble Prediction System (SREPS) focused on forecasting the weather up to 72 h has been developed at the Spanish Meteorological Service (AEMET). The system uses five different limited area models (LAMs), namely HIRLAM (HIRLAM Consortium), HRM (DWD), the UM (UKMO), MM5 (PSU/NCAR) and COSMO (COSMO Consortium). These models run with initial and boundary conditions provided by five different global deterministic models, namely IFS (ECMWF), UM (UKMO), GME (DWD), GFS (NCEP) and CMC (MSC). AEMET-SREPS (AE) validation on the large-scale flow, using ECMWF analysis, shows a consistent and slightly underdispersive system. For surface parameters, the system shows high skill forecasting binary events. 24-h precipitation probabilistic forecasts are verified using an up-scaling grid of observations from European high-resolution precipitation networks, and compared with ECMWF-EPS (EC).

  15. Seismic energy data analysis of Merapi volcano to test the eruption time prediction using materials failure forecast method (FFM)

    NASA Astrophysics Data System (ADS)

    Anggraeni, Novia Antika

    2015-04-01

    The test of eruption time prediction is an effort to support volcanic disaster mitigation, especially on a volcano's inhabited slopes, such as those of Merapi Volcano. The test can be conducted by observing increases in volcanic activity, such as seismicity, deformation and SO2 gas emission. One of the methods that can be used to predict the time of eruption is the Materials Failure Forecast Method (FFM). The FFM is a predictive method for determining the time of a volcanic eruption which was introduced by Voight (1988). This method requires an increase in the rate of change, or acceleration, of the observed volcanic activity parameters. The parameter used in this study is the seismic energy of Merapi Volcano from 1990 to 2012. The data were plotted as graphs of inverse seismic energy rate versus time, and the FFM graphical technique was applied using simple linear regression. Quality control to improve the temporal precision uses the correlation coefficient of the inverse seismic energy rate versus time. From the graph analysis, the difference between the predicted and actual eruption times varies between -2.86 and 5.49 days.

  16. Seismic energy data analysis of Merapi volcano to test the eruption time prediction using materials failure forecast method (FFM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anggraeni, Novia Antika, E-mail: novia.antika.a@gmail.com

    The test of eruption time prediction is an effort to support volcanic disaster mitigation, especially on a volcano's inhabited slopes, such as those of Merapi Volcano. The test can be conducted by observing increases in volcanic activity, such as seismicity, deformation and SO2 gas emission. One of the methods that can be used to predict the time of eruption is the Materials Failure Forecast Method (FFM). The FFM is a predictive method for determining the time of a volcanic eruption which was introduced by Voight (1988). This method requires an increase in the rate of change, or acceleration, of the observed volcanic activity parameters. The parameter used in this study is the seismic energy of Merapi Volcano from 1990 to 2012. The data were plotted as graphs of inverse seismic energy rate versus time, and the FFM graphical technique was applied using simple linear regression. Quality control to improve the temporal precision uses the correlation coefficient of the inverse seismic energy rate versus time. From the graph analysis, the difference between the predicted and actual eruption times varies between -2.86 and 5.49 days.

  17. Eruption-induced modifications to volcanic seismicity at Ruapehu, New Zealand, and its implications for eruption forecasting

    USGS Publications Warehouse

    Bryan, C.J.; Sherburn, S.

    2003-01-01

    Broadband seismic data collected on Ruapehu volcano, New Zealand, in 1994 and 1998 show that the 1995-1996 eruptions of Ruapehu resulted in a significant change in the frequency content of tremor and volcanic earthquakes at the volcano. The pre-eruption volcanic seismicity was characterized by several independent dominant frequencies, with a 2 Hz spectral peak dominating the strongest tremor and volcanic earthquakes and higher frequencies forming the background signal. The post-eruption volcanic seismicity was dominated by a 0.8-1.4 Hz spectral peak not seen before the eruptions. The 2 Hz and higher frequency signals remained, but were subordinate to the 0.8-1.4 Hz energy. That the dominant frequencies of volcanic tremor and volcanic earthquakes were identical during the individual time periods prior to and following the 1995-1996 eruptions suggests that during each of these time periods the volcanic tremor and earthquakes were generated by the same source process. The overall change in the frequency content, which occurred during the 1995-1996 eruptions and remains as of the time of the writing of this paper, most likely resulted from changes in the volcanic plumbing system and has significant implications for forecasting and real-time assessment of future eruptive activity at Ruapehu.

  18. Changes in shear-wave splitting before volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Liu, Sha; Crampin, Stuart

    2015-04-01

    We have shown that observations of shear-wave splitting (SWS) monitor stress-accumulation and stress-relaxation before earthquakes, which allows the time, magnitude, and in some circumstances the fault plane of impending earthquakes to be stress-forecast. (We call this procedure stress-forecasting rather than predicting or forecasting to emphasise the different formalism.) We have stress-forecast these parameters successfully three days before a 1988 M5 earthquake in SW Iceland, and identified characteristic anomalies retrospectively before ~16 other earthquakes in Iceland and elsewhere. SWS monitors microcrack geometry and shows that microcracks are so closely spaced that they verge on fracturing and earthquakes. Phenomena verging on failure in this way are critical systems with 'butterfly wings' sensitivity. Such critical systems are very common. The Earth is an archetypal complex heterogeneous interactive phenomenon and must be expected to be a critical system. We refer to this critical system as a New Geophysics of a critically microcracked rock mass. Such critical systems impose a range of fundamentally new properties on conventional sub-critical physics/geophysics, one of which is universality. Consequently, we expect to observe similar stress-accumulation and stress-relaxation before volcanic eruptions to those before earthquakes. There are three eruptions where appropriate changes in SWS have been observed, similar to those observed before earthquakes. These are: the 1996 Gjálp fissure eruption, Vatnajökull, Iceland; a 2001 flank eruption on Mount Etna, Sicily (reported by Francesca Bianco, INGV, Naples); and the 2010 Eyjafjallajökull ash-cloud eruption, SW Iceland. These will be presented in the same normalised format as is used before earthquakes. The 1996 Gjálp eruption showed a 2½-month stress-accumulation, and a ~1-year stress-relaxation (attributed to the North Atlantic Ridge adjusting to the magma injection beneath the Vatnajökull Ice Cap). The 2001 flank eruption of Etna showed stress-accumulation and stress-relaxation typical of a small earthquake. However, the changes in SWS before the 2010 Eyjafjallajökull eruption, SW Iceland, showed the most distinctive correlations with earthquakes, as it was only ~90 km west of the 1988 M5 earthquake in SW Iceland, which was successfully stress-forecast. The behaviour of SWS before the M5 earthquake and the Eyjafjallajökull flank (ash-cloud) eruption is almost identical, with both showing linear stress-accumulation increases and linear stress-relaxation decreases up to the earthquake and the onset of the flank eruption, respectively, with comparable slopes and durations. We consider this strong confirmation of the universality property of the New Geophysics of a critically microcracked Earth. Papers referring to these developments can be found at geos.ed.ac.uk/home/scrampin/opinion. Also see abstracts in EGU2015 Sessions: Crampin & Gao (SM1.1), Gao & Crampin (SM3.1), and Crampin & Gao (GD.1).

  19. Identifying open and closed system behaviors at Tungurahua volcano (Ecuador) using SO2 and seismo-acoustic measurements

    NASA Astrophysics Data System (ADS)

    Hidalgo, Silvana; Battaglia, Jean; Bernard, Benjamin; Steele, Alexander; Arellano, Santiago; Galle, Bo

    2014-05-01

    Tungurahua is one of the most active volcanoes in Ecuador. It is located in central Ecuador, 160 km south of Quito and 8 km south of the touristic town of Baños. Tungurahua has had roughly one eruption per century since 1500, with activity characterized by ash fallout and pyroclastic and lava flows. The current eruptive period of Tungurahua began in 1999 with multiple episodes of explosive activity that have threatened the local population. The monitoring network consists of 5 short-period and 5 broadband seismic stations, 4 permanent DOAS instruments, 4 tiltmeters, 2 permanent high-resolution GPS stations, 4 digital cameras and 10 acoustic flow monitors. The correct interpretation of the different data acquired by this network allows a better understanding of the eruptive behavior of Tungurahua in order to provide early warning to the local population. Tungurahua changed its behavior from a continuously erupting volcano, as it was until 2008, to a sporadically erupting one, showing clear quiescence phases lasting from 40 to 184 days, and intense activity phases lasting from 15 to 70 days. Activity phases are characterized by Strombolian and Vulcanian eruptive styles, producing ash fallout and, on a few occasions, pyroclastic flows. In terms of hazard to the local population, one of the goals of monitoring Tungurahua is to forecast the onset and evolution of eruptive phases. In particular, the occurrence of large Vulcanian explosions, which occur when the conduit is closed, is a major issue. Since 2010 we have focused our study on the relation between SO2 gas emissions, the seismic and acoustic energies of explosions, and the tremor amplitudes. The first observation from comparing these different datasets is that the correlation between seismic activity and SO2 degassing is not straightforward; the relation actually reflects the conditions at the vent: open or closed. The onset of eruptive phases in open-conduit conditions can be identified, which leads to effective eruption forecasting. An example of this behavior is the eruptive phase between December 2009 and March 2010, when SO2 measurements increased 4 days before the amplitude of tremor and 9 days before the occurrence of the first explosions. Conversely, if the vent is closed at the beginning of a phase and no evident seismic precursors are observed, forecasting is hardly possible. During an ongoing eruptive phase, the relation between these parameters allows us to identify periods when the conduit is totally open, as degassing may occur almost without generating any seismicity. Therefore the forecasting of escalating open-conduit activity or a partial closing of the system is possible. Such a case was observed and forecast in December 2011. In this work, we present observational evidence of these mechanisms, which are used to identify possible patterns of evolution of the activity, contributing to a more effective volcanic hazard assessment.

  20. Leveraging Past and Current Measurements to Probabilistically Nowcast Low Visibility Procedures at an Airport

    NASA Astrophysics Data System (ADS)

    Mayr, G. J.; Kneringer, P.; Dietz, S. J.; Zeileis, A.

    2016-12-01

    Low visibility or a low cloud ceiling reduces the capacity of an airport by requiring special low visibility procedures (LVP) for incoming/departing aircraft. Probabilistic forecasts of when such procedures will become necessary help to mitigate delays and economic losses. We compare the performance of probabilistic nowcasts from two statistical methods: ordered logistic regression, and trees and random forests. These models harness historical and current meteorological measurements in the vicinity of the airport together with LVP states, and incorporate diurnal and seasonal climatological information via generalized additive models (GAM). The methods are applied at Vienna International Airport (Austria). The performance is benchmarked against climatology, persistence and human forecasters.
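    A hedged sketch of the tree-based half of this setup, using synthetic lagged measurements and scikit-learn's random forest to produce probabilities of the LVP state one hour ahead (the ordered-logit and GAM components, and the real predictor set, are not reproduced here):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(6)

      # Synthetic predictors: current visibility, one-hour visibility trend, current LVP state.
      n = 5000
      visibility = rng.gamma(3.0, 1500.0, n)          # current visibility (m)
      vis_trend = rng.normal(0, 300.0, n)             # change over the last hour (m)
      lvp_now = (visibility < 2000).astype(int)       # current LVP state
      # synthetic future state: persistence plus a visibility/trend effect
      p_future = 0.8 * lvp_now + 0.2 * (visibility + 2 * vis_trend < 2500)
      lvp_in_1h = (rng.random(n) < p_future).astype(int)

      X = np.column_stack([visibility, vis_trend, lvp_now])
      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:4000], lvp_in_1h[:4000])

      # probabilistic nowcast for the hold-out cases: P(LVP in one hour)
      p_lvp = clf.predict_proba(X[4000:])[:, 1]
      print("mean forecast probability:", round(p_lvp.mean(), 3))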

  1. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…

  2. Probabilistic Hazard Estimation at a Densely Urbanised Area: the Naples Volcanoes

    NASA Astrophysics Data System (ADS)

    de Natale, G.; Mastrolorenzo, G.; Panizza, A.; Pappalardo, L.; Claudia, T.

    2005-12-01

    The Naples volcanic area (southern Italy), including Vesuvius, the Campi Flegrei caldera and Ischia island, is the highest-risk volcanic area in the world, where more than 2 million people live within about 10 km of an active volcanic vent. Such an extreme risk calls for accurate methodologies aimed at quantifying it, in a probabilistic way, considering all the available volcanological information as well as modelling results. In fact, simple hazard maps based on the observation of deposits from past eruptions have the major problem that the eruptive history generally samples a very limited number of possible outcomes, making them almost meaningless for estimating event probabilities in the area. This work describes a methodology making the best use (from a Bayesian point of view) of volcanological data and modelling results to compute probabilistic hazard maps for multi-vent explosive eruptions. The method, which follows an approach recently developed by the same authors for pyroclastic flow hazard, has been improved here and extended to also compute fallout hazard. The application of the method to the Neapolitan volcanic area, including the densely populated city of Naples, allows us, for the first time, to obtain a global picture of the areal distribution of the main hazards from multi-vent explosive eruptions. From a joint consideration of the hazard contributions from all three volcanic areas, new insight into the volcanic hazard distribution emerges, which will have strong implications for urban and emergency planning in the area.

  3. Enhancing Community Based Early Warning Systems in Nepal with Flood Forecasting Using Local and Global Models

    NASA Astrophysics Data System (ADS)

    Dugar, Sumit; Smith, Paul; Parajuli, Binod; Khanal, Sonu; Brown, Sarah; Gautam, Dilip; Bhandari, Dinanath; Gurung, Gehendra; Shakya, Puja; Kharbuja, RamGopal; Uprety, Madhab

    2017-04-01

    Operationalising effective Flood Early Warning Systems (EWS) in developing countries like Nepal poses numerous challenges, with complex topography and geology, a sparse network of river and rainfall gauging stations, and diverse socio-economic conditions. Despite these challenges, simple real-time monitoring based EWSs have been in place for the past decade. A key constraint of these simple systems is the very limited lead time for response - as little as 2-3 hours, especially for rivers originating from steep mountainous catchments. Efforts to increase lead time for early warning are focusing on embedding forecasts into the existing early warning systems. In 2016, the Nepal Department of Hydrology and Meteorology (DHM) piloted an operational Probabilistic Flood Forecasting Model in major river basins across Nepal. This comprised a low-data approach to forecast water levels, developed jointly through a research/practitioner partnership with Lancaster University and WaterNumbers (UK) and the international NGO Practical Action. Using Data-Based Mechanistic Modelling (DBM) techniques, the model assimilated rainfall and water levels to generate localised hourly flood predictions, which are presented as probabilistic forecasts, increasing lead times from 2-3 hours to 7-8 hours. The Nepal DHM has simultaneously started utilizing forecasts from the Global Flood Awareness System (GLoFAS), which provides streamflow predictions at the global scale based upon distributed hydrological simulations using numerical ensemble weather forecasts from the ECMWF (European Centre for Medium-Range Weather Forecasts). The aforementioned global and local models have already affected the approach to early warning in Nepal, being operational during the 2016 monsoon in the West Rapti basin in western Nepal. On 24 July 2016, GLoFAS hydrological forecasts for the West Rapti indicated a sharp rise in river discharge above 1500 m3/sec (equivalent to the river warning level at 5 meters) with a 53% probability of exceeding the Medium Level Alert in two days. Rainfall stations upstream of the West Rapti catchment recorded heavy rainfall on 26 July, and localized forecasts from the probabilistic model at 8 am suggested that the water level would cross a pre-determined warning level in the next 3 hours. The Flood Forecasting Section at DHM issued a flood advisory and disseminated SMS flood alerts to more than 13,000 at-risk people residing along the floodplains. Water levels crossed the danger threshold (5.4 meters) at 11 am, peaking at 8.15 meters at 10 pm. Extension of the warning lead time from probabilistic forecasts was significant in minimising the risk to lives and livelihoods, as communities gained extra time to prepare, evacuate and respond. Likewise, longer-timescale forecasts from GLoFAS could potentially be linked with no-regret early actions, leading to improved preparedness and emergency response. These forecasting tools have contributed to enhancing the effectiveness and efficiency of existing community-based systems, increasing the lead time for response. Nevertheless, extensive work is required on appropriate ways to interpret and disseminate probabilistic forecasts with longer (2-14 days) and shorter (3-5 hours) time horizons for operational deployment, as there are numerous uncertainties associated with the predictions.
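    The probabilistic trigger in such a system can be as simple as the fraction of ensemble members exceeding a threshold; a minimal sketch using the 5 m warning and 5.4 m danger levels mentioned above, with made-up ensemble stages:

      import numpy as np

      # Ensemble of forecast river stages (metres) for one forecast issue time (illustrative values).
      ensemble_stage = np.array([4.6, 4.9, 5.1, 5.3, 5.6, 5.8, 5.2, 4.8, 6.1, 5.5])

      warning_level, danger_level = 5.0, 5.4
      p_warning = np.mean(ensemble_stage >= warning_level)   # fraction of members above warning level
      p_danger = np.mean(ensemble_stage >= danger_level)     # fraction of members above danger level
      print(f"P(stage >= warning) = {p_warning:.0%}, P(stage >= danger) = {p_danger:.0%}")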

  4. Towards a Proactive Risk Mitigation Strategy at La Fossa Volcano, Vulcano Island

    NASA Astrophysics Data System (ADS)

    Biass, S.; Gregg, C. E.; Frischknecht, C.; Falcone, J. L.; Lestuzzi, P.; di Traglia, F.; Rosi, M.; Bonadonna, C.

    2014-12-01

    A comprehensive risk assessment framework was built to develop proactive risk reduction measures for Vulcano Island, Italy. This framework includes identification of eruption scenarios, probabilistic hazard assessment, quantification of hazard impacts on the built environment, an accessibility assessment for the island, and a risk perception study. Vulcano, a 21 km2 island whose two primary communities host 900 permanent residents and up to 10,000 visitors during summer, shows a strong dependency on the mainland for basic needs (water, energy) and relies on a ~2-month tourism season for its economy. The recent stratigraphy reveals a dominance of vulcanian and subplinian eruptions, producing a range of hazards acting at different time scales. We developed new methods to probabilistically quantify the hazard related to ballistics, lahars and tephra for all eruption styles. We also elaborated field- and GIS-based methods to assess the physical vulnerability of the built environment and created dynamic models of accessibility. Results outline the difference in hazard between short- and long-lasting eruptions. A subplinian eruption has a 50% probability of impacting ~30% of the buildings within days after the eruption, but the year-long damage resulting from a long-lasting vulcanian eruption is similar if tephra is not removed from rooftops. Similarly, a subplinian eruption results in a volume of 7x10^5 m3 of material potentially remobilized into lahars soon after the eruption. Similar volumes are expected for vulcanian activity over years, increasing the hazard of small lahars. Preferential lahar paths affect critical infrastructures lacking redundancy, such as the road network, communications systems, the island's only gas station, and access to the island's two evacuation ports. Such results on hazard and on physical and systemic vulnerability help establish proactive volcanic risk mitigation strategies and may be applicable in other island settings.

  5. Calibration of decadal ensemble predictions

    NASA Astrophysics Data System (ADS)

    Pasternack, Alexander; Rust, Henning W.; Bhend, Jonas; Liniger, Mark; Grieger, Jens; Müller, Wolfgang; Ulbrich, Uwe

    2017-04-01

    Decadal climate predictions are of great socio-economic interest due to the corresponding planning horizons of several political and economic decisions. Due to the uncertainties of weather and climate forecasts (e.g. from initial condition uncertainty), they are issued in a probabilistic way. One issue frequently observed for probabilistic forecasts is that they tend not to be reliable, i.e. the forecast probabilities are not consistent with the relative frequency of the associated observed events. Thus, these kinds of forecasts need to be re-calibrated. While re-calibration methods for seasonal time scales are available and frequently applied, these methods still have to be adapted to decadal time scales and their characteristic problems, such as climate trends and lead-time-dependent bias. Regarding this, we propose a method to re-calibrate decadal ensemble predictions that takes the above-mentioned characteristics into account. Finally, this method is applied to and validated on decadal forecasts from the MiKlip system (Germany's initiative for decadal prediction).

  6. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    A search for space and time regularities in volcanic and seismic events for the purpose of forecast method development seems to be of current concern, both scientifically and practically. The seismic and volcanic processes take place in the Earth's field of gravity which in turn is closely related to gravitational fields of the Moon, the Sun, and the planets of the Solar System. It is mostly gravity and tidal forces that exercise control over the Earth's configuration and relief. Dynamic gravitational interaction between the Earth and other celestial bodies makes itself evident in tidal phenomena and other effects in the geospheres (including the Earth's crust). Dynamics of the tidal and attractive forces is responsible for periodical changes in gravity force, both in value and direction [Darwin, 1965], in the rate of rotation and orbital speed; that implies related changes in the endogenic activity of the Earth. The Earth's rotation in the alternating gravitational field accounts to a considerable extent for regular pattern of crustal deformations and dislocations; it is among principal factors that control the Earth's form and structure, distribution of oceans and continents and, probably, continental drift [Peive, 1969; Khain, 1973; Kosygin, 1983]. The energy of gravitational interaction is transmitted through the tidal energy to planetary spheres and feeds various processes there, including volcanic and seismic ones. To determine degree, character and special features of tidal force contribution to the volcanic and seismic processes is of primary importance for understanding of genetic and dynamic aspects of volcanism and seismicity. Both volcanic and seismic processes are involved in evolution of celestial bodies; they are operative on the planets of the Earth group and many satellites [Essays…, 1981; Lukashov, 1996]. From this standpoint, studies of those processes are essential with a view to development of scenarios of the Earth's evolution as a celestial body, as well as to forecast of changes in its relief. As the volcanic and seismic processes are of cosmic nature and occurrence, it seems logical to investigate their chronological structure in terms of astronomical time reference system or in parameters of the Earth orbital movement. Gravitational interaction of the Earth with the moon, the Sun and planets of the Solar system forms the physical basis of this multidimensional system; it manifests itself in tidal deformations of the Earth's lithosphere and in periodical changes in the planet rotation and orbital speed. A search for chronological correlation between the Earth's volcanism and seismicity on one hand and the orbital parameters dynamic on the other shows a certain promise in relation to prognostic decisions. It should be kept in mind that the calculation of astronomical characteristics (Ephemerides), which is one of the main lines in theoretical astronomy, spans many years both in the past and in future. It seems appropriate therefore to apply the astronomical time reference system to investigations of chronological structure of volcanic and seismic processes from the methodical viewpoint, as well as for retrospective and prognostic analyses. To investigate temporal pattern of the volcanic and seismic processes and to find a degree of their dependence on tidal forces, we used the astronomical time reference system as related to the Earth's orbital movement. 
The system is based on substituting the calendar dates of eruptions and earthquakes with the corresponding values of known astronomical characteristics, such as the Earth-Sun and Earth-Moon distances, the ecliptic latitude of the Moon, etc. In coordinates of astronomical parameters (JPL Planetary and Lunar Ephemerides, 1997, as compiled by the Jet Propulsion Laboratory, California Institute of Technology, on the basis of the DE 406 block developed by NASA), we analyzed large bodies of volcanological (Catalogue of the World volcanic eruptions by I.I. Gushchenko, 1979) and seismological (database of USGS/NEIC Significant Worldwide Earthquakes, 2150 B.C.-1994 A.D.) information which display the dynamics of endogenic relief-forming processes over the period 1900 to 1994. In the course of the analysis, the calendar variable was replaced by a corresponding astronomical one, and the epoch superposition method was applied. In essence, the method consists of binning the information on volcanic eruptions (1900 to 1977) and seismic events (1900-1994) with respect to the values of the astronomical parameters corresponding to the calendar dates of the known eruptions and earthquakes, regardless of the calendar year. The resulting distributions of volcanic eruptions and violent earthquakes in the space of the Earth's orbital parameters were used as a basis for calculating frequency spectra and the diurnal probability of volcanic and seismic activity. The objective of the proposed investigations is the development of a probabilistic model of volcanic and seismic events, as well as the design of a GIS for monitoring and forecasting volcanic and seismic activity. In accordance with this objective, three probability parameters have been derived in the course of preliminary studies; they form the basis for GIS monitoring and forecast development. 1. A multidimensional analysis of volcanic eruptions and earthquakes (of magnitude 7) has been performed in terms of the Earth's orbital movement. Probability characteristics of volcanism and seismicity have been defined for the Earth as a whole. Time intervals have been identified with a diurnal probability twice as great as the mean value. The diurnal probability of volcanic and seismic events has been calculated up to 2020. 2. A regularity in the duration of dormant (repose) periods has been established: a relationship has been found between the probability density of repose periods and their duration. 3. Features of the spatial distribution of volcanic eruptions and earthquakes of magnitude 7 were analyzed, and those related to the Earth's rotation identified; the frequencies of their spatial distribution were calculated. Using those parameters as a basis, a scheme (algorithm) of probabilistic monitoring (long-range forecasting) has been developed for volcanic and seismic events.
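The binning step of the epoch superposition method can be illustrated with a toy sketch: event dates are replaced by the value of an astronomical parameter (here a generic 0-1 orbital phase), pooled across years, and binned to give an empirical occurrence probability per phase interval. The synthetic phases and the mild clustering below are assumptions, not the catalogues analyzed in the study.

      import numpy as np

      rng = np.random.default_rng(7)

      # Synthetic "eruption" phases with a mild preference for phases near 0.25
      phases = np.concatenate([rng.random(800), rng.normal(0.25, 0.05, 200) % 1.0])

      bins = np.linspace(0.0, 1.0, 21)                      # 20 phase intervals
      counts, _ = np.histogram(phases, bins=bins)
      prob_per_bin = counts / counts.sum()                  # empirical probability of an event per bin

      peak = bins[np.argmax(prob_per_bin)]
      print(f"most active phase interval starts at {peak:.2f}, P = {prob_per_bin.max():.3f}")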

  7. Testing for ontological errors in probabilistic forecasting models of natural systems

    PubMed Central

    Marzocchi, Warner; Jordan, Thomas H.

    2014-01-01

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265

  8. Forecasting magma-chamber rupture at Santorini volcano, Greece.

    PubMed

    Browning, John; Drymoni, Kyriaki; Gudmundsson, Agust

    2015-10-28

    How much magma needs to be added to a shallow magma chamber to cause rupture, dyke injection, and a potential eruption? Models that yield reliable answers to this question are needed in order to facilitate eruption forecasting. Development of a long-lived shallow magma chamber requires periodic influx of magmas from a parental body at depth. This redistribution process does not necessarily cause an eruption but produces a net volume change that can be measured geodetically by inversion techniques. Using continuum-mechanics and fracture-mechanics principles, we calculate the amount of magma contained at shallow depth beneath Santorini volcano, Greece. We demonstrate through structural analysis of dykes exposed within the Santorini caldera, previously published data on the volume of recent eruptions, and geodetic measurements of the 2011-2012 unrest period, that the measured 0.02% increase in volume of Santorini's shallow magma chamber was associated with a magmatic excess pressure increase of around 1.1 MPa. This excess pressure was high enough to bring the chamber roof close to rupture and dyke injection. For volcanoes with known typical extrusion and intrusion (dyke) volumes, the new methodology presented here makes it possible to forecast the conditions for magma-chamber failure and dyke injection at any geodetically well-monitored volcano.
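    The order of magnitude of that pressure estimate can be recovered with the standard elastic relation between fractional chamber volume change and excess pressure; the moduli below are generic assumed crustal/magma values for illustration, not the parameters calibrated in the paper.

      # Back-of-the-envelope check: dV/V = dP * (beta_magma + beta_chamber), with
      # beta_chamber ~ 3/(4*mu) for a roughly equidimensional chamber in an elastic host.
      dV_over_V = 0.0002          # 0.02 % volume increase (geodetic estimate cited above)
      K_magma = 1.0e10            # magma bulk modulus, Pa (assumed)
      mu_crust = 1.0e10           # host-rock shear modulus, Pa (assumed)

      beta_magma = 1.0 / K_magma
      beta_chamber = 3.0 / (4.0 * mu_crust)
      dP = dV_over_V / (beta_magma + beta_chamber)
      print(f"excess pressure ~ {dP / 1e6:.1f} MPa")   # ~1.1 MPa, consistent with the value above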

  9. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil

    PubMed Central

    Lowe, Rachel; Coelho, Caio AS; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-01-01

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics. DOI: http://dx.doi.org/10.7554/eLife.11285.001 PMID:26910315

  10. A Local Forecast of Land Surface Wetness Conditions, Drought, and St. Louis Encephalitis Virus Transmission Derived from Seasonal Climate Predictions

    NASA Astrophysics Data System (ADS)

    Shaman, J.; Stieglitz, M.; Zebiak, S.; Cane, M.; Day, J. F.

    2002-12-01

    We present an ensemble local hydrologic forecast derived from the seasonal forecasts of the International Research Institute (IRI) for Climate Prediction. Three- month seasonal forecasts were used to resample historical meteorological conditions and generate ensemble forcing datasets for a TOPMODEL-based hydrology model. Eleven retrospective forecasts were run at a Florida and New York site. Forecast skill was assessed for mean area modeled water table depth (WTD), i.e. near surface soil wetness conditions, and compared with WTD simulated with observed data. Hydrology model forecast skill was evident at the Florida site but not at the New York site. At the Florida site, persistence of hydrologic conditions and local skill of the IRI seasonal forecast contributed to the local hydrologic forecast skill. This forecast will permit probabilistic prediction of future hydrologic conditions. At the Florida site, we have also quantified the link between modeled WTD (i.e. drought) and the amplification and transmission of St. Louis Encephalitis virus (SLEV). We derive an empirical relationship between modeled land surface wetness and levels of SLEV transmission associated with human clinical cases. We then combine the seasonal forecasts of local, modeled WTD with this empirical relationship and produce retrospective probabilistic seasonal forecasts of epidemic SLEV transmission in Florida. Epidemic SLEV transmission forecast skill is demonstrated. These findings will permit real-time forecast of drought and resultant SLEV transmission in Florida.

  11. Parametric decadal climate forecast recalibration (DeFoReSt 1.0)

    NASA Astrophysics Data System (ADS)

    Pasternack, Alexander; Bhend, Jonas; Liniger, Mark A.; Rust, Henning W.; Müller, Wolfgang A.; Ulbrich, Uwe

    2018-01-01

    Near-term climate predictions such as decadal climate forecasts are increasingly being used to guide adaptation measures. For near-term probabilistic predictions to be useful, systematic errors of the forecasting systems have to be corrected. While methods for the calibration of probabilistic forecasts are readily available, these have to be adapted to the specifics of decadal climate forecasts, including their long time horizon, lead-time-dependent systematic errors (drift), and errors in the representation of long-term changes and variability. These features are compounded by small ensemble sizes for describing forecast uncertainty and by the relatively short period for which pairs of reforecasts and observations are typically available to estimate calibration parameters. We introduce the Decadal Climate Forecast Recalibration Strategy (DeFoReSt), a parametric approach to recalibrating decadal ensemble forecasts that takes the above specifics into account. DeFoReSt optimizes forecast quality as measured by the continuous ranked probability score (CRPS). Using a toy model to generate synthetic forecast-observation pairs, we demonstrate the positive effect on forecast quality in situations with both pronounced and limited predictability. Finally, we apply DeFoReSt to decadal surface temperature forecasts from the MiKlip prototype system and find consistent, and sometimes considerable, improvements in forecast quality compared with a simple calibration of the lead-time-dependent systematic errors.
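
    A minimal sketch of the recalibration idea described above: a Gaussian predictive distribution whose bias and spread corrections depend linearly on lead time, with parameters chosen by minimizing the mean CRPS over synthetic reforecast/observation pairs. This is a simplification of DeFoReSt (which uses higher-order polynomials in lead and start time), and all data and parameter choices here are toy assumptions.

      # Toy lead-time-dependent recalibration by CRPS minimization.
      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      lead = np.repeat(np.arange(1, 11), 30)                        # lead time in years
      truth = rng.normal(0.0, 1.0, lead.size)
      ens_mean = truth + 0.3 * lead + rng.normal(0, 0.5, lead.size) # drifting ensemble mean
      ens_spread = np.full(lead.size, 0.3)                          # underdispersive spread

      def gaussian_crps(mu, sigma, y):
          z = (y - mu) / sigma
          return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

      def mean_crps(params):
          a0, a1, b0, b1 = params                            # linear-in-lead-time corrections
          mu = ens_mean - (a0 + a1 * lead)                   # drift (bias) correction
          sigma = np.abs(b0 + b1 * lead) * ens_spread + 1e-6 # spread adjustment
          return gaussian_crps(mu, sigma, truth).mean()

      raw = mean_crps([0.0, 0.0, 1.0, 0.0])
      fit = minimize(mean_crps, x0=[0.0, 0.0, 1.0, 0.0], method="Nelder-Mead")
      print(f"mean CRPS raw {raw:.3f} -> recalibrated {fit.fun:.3f}")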

  12. Probabilistic Space Weather Forecasting: a Bayesian Perspective

    NASA Astrophysics Data System (ADS)

    Camporeale, E.; Chandorkar, M.; Borovsky, J.; Carè, A.

    2017-12-01

    Most Space Weather forecasts, at both the operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven machine learning algorithms to physics-based approximations of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to the finite-precision implementation of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is by embracing a Bayesian probabilistic approach. One of the simplest and most robust machine learning techniques in the Bayesian framework is Gaussian Process regression and classification. Here, we present the application of Gaussian Processes to the problems of forecasting the Dst geomagnetic index, classifying the solar wind type, and estimating diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We show that we have achieved excellent results in all of the standard metrics used to evaluate our models, with very modest computational cost.
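
    The kind of predictive distribution described above can be illustrated with an off-the-shelf Gaussian Process regressor. The snippet below fits a GP to a synthetic, Dst-like storm signature and returns a forecast mean and standard deviation; it is a generic illustration of GP regression, not the authors' models or data.

      # Gaussian Process regression returning a predictive distribution (mean, std).
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(1)
      t = np.linspace(0, 10, 80)[:, None]                            # time axis (arbitrary units)
      dst_like = -30 * np.exp(-0.5 * (t.ravel() - 5) ** 2) + rng.normal(0, 2, t.shape[0])

      kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, dst_like)

      t_new = np.array([[10.5], [11.0]])                             # forecast times
      mean, std = gp.predict(t_new, return_std=True)
      for tn, m, s in zip(t_new.ravel(), mean, std):
          print(f"t={tn:4.1f}: {m:6.1f} +/- {2 * s:.1f} (approx. 95% interval)")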

  13. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    USGS Publications Warehouse

    Bonasia, Rosanna; Scaini, Chiara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2013-01-01

    Popocatépetl is one of Mexico’s most active volcanoes threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene–Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl’s reawakening in 1994 produced a crisis that culminated with the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars. However, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the situation defined in Europe during 2010, and still under discussion. Tephra dispersal modeling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the “Ochre Pumice” Plinian eruption (4965 14C yr BP). FALL3D model input eruptive parameters are constrained through an inversion method carried out with the semi-analytical HAZMAP model and are varied by sampling them using probability density functions. We analyze the influence of seasonal variations on ash dispersal and estimate the average persistence of critical ash concentrations at relevant locations and airports. This study assesses the impact that a Plinian eruption similar to the Ochre Pumice eruption would have on the main airports of Mexico and adjacent areas. The hazard maps presented here can support long-term planning that would help minimize the impacts of such an eruption on civil aviation.
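
    The probabilistic workflow summarized above (sample eruption source parameters from probability density functions, run the dispersal model for each sample, count threshold exceedances per grid cell) can be sketched generically as follows. The dispersal model here is a crude stand-in, not FALL3D, and every number is illustrative.

      # Monte Carlo exceedance-probability map for an ash-concentration threshold.
      import numpy as np

      rng = np.random.default_rng(42)
      n_runs, grid_shape = 500, (50, 50)
      threshold = 2e-3                      # g/m3, an illustrative flight-level threshold
      exceed_count = np.zeros(grid_shape)

      def toy_dispersal_model(column_height_km, mass_eruption_rate, wind_dir_deg):
          """Stand-in for FALL3D: a smooth 2-D ash-concentration field (g/m3)."""
          y, x = np.mgrid[0:grid_shape[0], 0:grid_shape[1]]
          cx = 25 + 15 * np.cos(np.deg2rad(wind_dir_deg))
          cy = 25 + 15 * np.sin(np.deg2rad(wind_dir_deg))
          plume = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * column_height_km ** 2))
          return 1e-9 * mass_eruption_rate * plume

      for _ in range(n_runs):
          h = rng.uniform(20, 30)               # column height (km) sampled from a PDF
          mer = 10 ** rng.uniform(7, 8)         # mass eruption rate (kg/s), log-uniform
          wind = rng.choice([60.0, 80.0, 100.0, 120.0])   # seasonal wind directions (deg)
          exceed_count += toy_dispersal_model(h, mer, wind) > threshold

      probability_map = exceed_count / n_runs   # P(concentration > threshold) per cell
      print("maximum exceedance probability:", probability_map.max())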

  14. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    NASA Astrophysics Data System (ADS)

    Bonasia, Rosanna; Scaini, Chiara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2014-01-01

    Popocatépetl is one of Mexico's most active volcanoes threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene-Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl's reawakening in 1994 produced a crisis that culminated with the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars. However, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the situation defined in Europe during 2010, and still under discussion. Tephra dispersal modeling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the "Ochre Pumice" Plinian eruption (4965 14C yr BP). FALL3D model input eruptive parameters are constrained through an inversion method carried out with the semi-analytical HAZMAP model and are varied by sampling them using probability density functions. We analyze the influence of seasonal variations on ash dispersal and estimate the average persistence of critical ash concentrations at relevant locations and airports. This study assesses the impact that a Plinian eruption similar to the Ochre Pumice eruption would have on the main airports of Mexico and adjacent areas. The hazard maps presented here can support long-term planning that would help minimize the impacts of such an eruption on civil aviation.

  15. Near-term probabilistic forecast of significant wildfire events for the Western United States

    Treesearch

    Haiganoush K. Preisler; Karin L. Riley; Crystal S. Stonesifer; Dave E. Calkin; Matt Jolly

    2016-01-01

    Fire danger and potential for large fires in the United States (US) is currently indicated via several forecasted qualitative indices. However, landscape-level quantitative forecasts of the probability of a large fire are currently lacking. In this study, we present a framework for forecasting large fire occurrence - an extreme value event - and evaluating...

  16. The longevity of lava dome eruptions: analysis of the global DomeHaz database

    NASA Astrophysics Data System (ADS)

    Ogburn, S. E.; Wolpert, R.; Calder, E.; Pallister, J. S.; Wright, H. M. N.

    2015-12-01

    The likely duration of ongoing volcanic eruptions is a topic of great interest to volcanologists, volcano observatories, and communities near volcanoes. Lava dome forming eruptions can last from days to centuries, and can produce violent, difficult-to-forecast activity including Vulcanian to Plinian explosions and pyroclastic density currents. Periods of active dome extrusion are often interspersed with periods of relative quiescence, during which extrusion may slow or pause altogether, but persistent volcanic unrest continues. This contribution focuses on the durations of these longer-term unrest phases, hereafter eruptions, that include periods of both lava extrusion and quiescence. A new database of lava dome eruptions, DomeHaz, provides characteristics of 228 eruptions at 127 volcanoes, of which 177 have duration information. We find that while 78% of dome-forming eruptions do not continue for more than 5 years, the remainder can be very long-lived. The probability distributions of eruption durations are shown to be heavy-tailed and vary by magma composition. For this reason, eruption durations are modeled with generalized Pareto distributions whose governing parameters depend on each volcano's composition and eruption duration to date. Bayesian predictive distributions and associated uncertainties are presented for the remaining duration of ongoing eruptions of specified composition and duration to date. Forecasts of such natural events will always have large uncertainties, but the ability to quantify such uncertainty is key to effective communication with stakeholders and to mitigation of hazards. Projections are made for the remaining eruption durations of ongoing eruptions, including those at Soufrière Hills Volcano, Montserrat, and Sinabung, Indonesia. This work provides a quantitative, transferable method and rationale on which to base long-term planning decisions for dome forming volcanoes of different compositions, regardless of the quality of an individual volcano's eruptive record, by leveraging a global database.
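
    The abstract's key quantity, the probability that an ongoing dome-forming eruption continues for some additional time given how long it has already lasted, follows directly from the threshold-stability property of the generalized Pareto distribution. The sketch below uses placeholder shape and scale values, not the composition-dependent DomeHaz estimates.

      # Conditional survival under a generalized Pareto duration model:
      # P(D > d + t | D > d) = (1 + xi * t / (sigma + xi * d)) ** (-1 / xi).
      from math import exp

      def prob_continues(duration_so_far, extra_years, sigma=2.0, xi=0.6):
          """Probability the eruption lasts at least `extra_years` more."""
          if xi == 0:                                     # exponential (light-tailed) limit
              return exp(-extra_years / sigma)
          return (1 + xi * extra_years / (sigma + xi * duration_so_far)) ** (-1 / xi)

      for d in (1, 5, 20):                                # years of activity to date
          print(f"{d:2d} yr ongoing -> P(>= 5 more yr) = {prob_continues(d, 5):.2f}")

    With a heavy-tailed shape parameter (xi > 0), the probability of continuing rises with the duration already observed, which is the qualitative behaviour the database analysis above describes for long-lived eruptions.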

  17. Satellite-driven modeling approach for monitoring lava flow hazards during the 2017 Etna eruption

    NASA Astrophysics Data System (ADS)

    Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.; Zago, V.

    2017-12-01

    The integration of satellite data and modeling represents an efficient strategy that may provide immediate answers to the main issues raised at the onset of a new effusive eruption. Satellite-based thermal remote sensing of hotspots related to effusive activity can effectively provide a variety of products suited to timing, locating, and tracking the radiant character of lava flows. Hotspots show the location and occurrence of eruptive events (vents). Discharge rate estimates may indicate the current intensity (effusion rate) and potential magnitude (volume). High-spatial-resolution multispectral satellite data can complement field observations for monitoring the front position (length) and extension of flows (area). Physics-based models driven, or validated, by satellite-derived parameters are now capable of fast and accurate forecasts of lava flow inundation scenarios (hazard). Here, we demonstrate the potential of the integrated application of satellite remote-sensing techniques and lava flow models during the 2017 effusive eruption at Mount Etna in Italy. This combined approach provided insights into lava flow field evolution by supplying detailed views of flow field construction (e.g., the opening of ephemeral vents) that were useful for more accurate and reliable forecasts of eruptive activity. Moreover, we gave a detailed chronology of the lava flow activity based on field observations and satellite images, assessed the potential extent of impacted areas, mapped the evolution of the lava flow field, and executed hazard projections. The downside of this combination is the high sensitivity of lava flow inundation scenarios to uncertainties in vent location, discharge rate, and other parameters, which can make interpreting hazard forecasts difficult during an effusive crisis. However, such integration at last makes timely forecasts of lava flow hazards during effusive crises possible even at the great majority of volcanoes for which no monitoring exists.
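
    One step in the chain described above, turning satellite-derived radiant power into a time-averaged discharge rate (TADR) and a cumulative erupted volume, reduces to a simple proportionality if a calibrated "radiant density" coefficient is available. The coefficient and retrieval values below are placeholders; operational work calibrates them per volcano and lava type.

      # Back-of-envelope TADR and volume from hypothetical radiant-power retrievals.
      c_rad = 2.5e8                                         # J per m3 of lava (assumed coefficient)
      radiant_power_watts = [1.2e9, 3.4e9, 2.1e9, 0.8e9]    # hypothetical satellite values
      dt_seconds = 6 * 3600                                 # one retrieval every 6 hours

      tadr = [p / c_rad for p in radiant_power_watts]       # m3/s for each time slice
      volume_m3 = sum(q * dt_seconds for q in tadr)         # cumulative erupted volume
      print([round(q, 1) for q in tadr], f"~{volume_m3 / 1e6:.2f} million m3")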

  18. Ensemble superparameterization versus stochastic parameterization: A comparison of model uncertainty representation in tropical weather prediction

    NASA Astrophysics Data System (ADS)

    Subramanian, Aneesh C.; Palmer, Tim N.

    2017-06-01

    Stochastic schemes to represent model uncertainty in the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system has helped improve its probabilistic forecast skill over the past decade by both improving its reliability and reducing the ensemble mean error. The largest uncertainties in the model arise from the model physics parameterizations. In the tropics, the parameterization of moist convection presents a major challenge for the accurate prediction of weather and climate. Superparameterization is a promising alternative strategy for including the effects of moist convection through explicit turbulent fluxes calculated from a cloud-resolving model (CRM) embedded within a global climate model (GCM). In this paper, we compare the impact of initial random perturbations in embedded CRMs, within the ECMWF ensemble prediction system, with stochastically perturbed physical tendency (SPPT) scheme as a way to represent model uncertainty in medium-range tropical weather forecasts. We especially focus on forecasts of tropical convection and dynamics during MJO events in October-November 2011. These are well-studied events for MJO dynamics as they were also heavily observed during the DYNAMO field campaign. We show that a multiscale ensemble modeling approach helps improve forecasts of certain aspects of tropical convection during the MJO events, while it also tends to deteriorate certain large-scale dynamic fields with respect to stochastically perturbed physical tendencies approach that is used operationally at ECMWF.Plain Language SummaryProbabilistic weather forecasts, especially for tropical weather, is still a significant challenge for global weather forecasting systems. Expressing uncertainty along with weather forecasts is important for informed decision making. Hence, we explore the use of a relatively new approach in using super-parameterization, where a cloud resolving model is embedded within a global model, in probabilistic tropical weather forecasts at medium range. We show that this approach helps improve modeling uncertainty in forecasts of certain features such as precipitation magnitude and location better, but forecasts of tropical winds are not necessarily improved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014GeoRL..41.2637B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014GeoRL..41.2637B"><span>Improving volcanic sulfur dioxide cloud dispersal forecasts by progressive assimilation of satellite observations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Boichu, Marie; Clarisse, Lieven; Khvorostyanov, Dmitry; Clerbaux, Cathy</p> <p>2014-04-01</p> <p>Forecasting the dispersal of volcanic clouds during an eruption is of primary importance, especially for ensuring aviation safety. As volcanic emissions are characterized by rapid variations of emission rate and height, the (generally) high level of uncertainty in the emission parameters represents a critical issue that limits the robustness of volcanic cloud dispersal forecasts. An inverse modeling scheme, combining satellite observations of the volcanic cloud with a regional chemistry-transport model, allows reconstructing this source term at high temporal resolution. 
We demonstrate here how a progressive assimilation of freshly acquired satellite observations, via such an inverse modeling procedure, allows for delivering robust sulfur dioxide (SO2) cloud dispersal forecasts during the eruption. This approach provides a computationally cheap estimate of the expected location and mass loading of volcanic clouds, including the identification of SO2-rich parts.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5565406','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5565406"><span>Empirical prediction intervals improve energy forecasting</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Kaack, Lynn H.; Apt, Jay; Morgan, M. Granger; McSharry, Patrick</p> <p>2017-01-01</p> <p>Hundreds of organizations and analysts use energy projections, such as those contained in the US Energy Information Administration (EIA)’s Annual Energy Outlook (AEO), for investment and policy decisions. Retrospective analyses of past AEO projections have shown that observed values can differ from the projection by several hundred percent, and thus a thorough treatment of uncertainty is essential. We evaluate the out-of-sample forecasting performance of several empirical density forecasting methods, using the continuous ranked probability score (CRPS). The analysis confirms that a Gaussian density, estimated on past forecasting errors, gives comparatively accurate uncertainty estimates over a variety of energy quantities in the AEO, in particular outperforming scenario projections provided in the AEO. We report probabilistic uncertainties for 18 core quantities of the AEO 2016 projections. Our work frames how to produce, evaluate, and rank probabilistic forecasts in this setting. We propose a log transformation of forecast errors for price projections and a modified nonparametric empirical density forecasting method. Our findings give guidance on how to evaluate and communicate uncertainty in future energy outlooks. 
PMID:28760997</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_7");'>7</a></li> <li><a href="#" onclick='return showDiv("page_8");'>8</a></li> <li class="active"><span>9</span></li> <li><a href="#" onclick='return showDiv("page_10");'>10</a></li> <li><a href="#" onclick='return showDiv("page_11");'>11</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_9 --> <div id="page_10" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_8");'>8</a></li> <li><a href="#" onclick='return showDiv("page_9");'>9</a></li> <li class="active"><span>10</span></li> <li><a href="#" onclick='return showDiv("page_11");'>11</a></li> <li><a href="#" onclick='return showDiv("page_12");'>12</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="181"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ACP....18.4019M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ACP....18.4019M"><span>Volcanic ash modeling with the NMMB-MONARCH-ASH model: quantification of offline modeling errors</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Marti, Alejandro; Folch, Arnau</p> <p>2018-03-01</p> <p>Volcanic ash modeling systems are used to simulate the atmospheric dispersion of volcanic ash and to generate forecasts that quantify the impacts from volcanic eruptions on infrastructures, air quality, aviation, and climate. The efficiency of response and mitigation actions is directly associated with the accuracy of the volcanic ash cloud detection and modeling systems. Operational forecasts build on offline coupled modeling systems in which meteorological variables are updated at the specified coupling intervals. Despite the concerns from other communities regarding the accuracy of this strategy, the quantification of the systematic errors and shortcomings associated with the offline modeling systems has received no attention. This paper employs the NMMB-MONARCH-ASH model to quantify these errors by employing different quantitative and categorical evaluation scores. The skills of the offline coupling strategy are compared against those from an online forecast considered to be the best estimate of the true outcome. Case studies are considered for a synthetic eruption with constant eruption source parameters and for two historical events, which suitably illustrate the severe aviation disruptive effects of European (2010 Eyjafjallajökull) and South American (2011 Cordón Caulle) volcanic eruptions. Evaluation scores indicate that systematic errors due to the offline modeling are of the same order of magnitude as those associated with the source term uncertainties. In particular, traditional offline forecasts employed in operational model setups can result in significant uncertainties, failing to reproduce, in the worst cases, up to 45-70 % of the ash cloud of an online forecast. 
These inconsistencies are anticipated to be even more relevant in scenarios in which the meteorological conditions change rapidly in time. The outcome of this paper encourages operational groups responsible for real-time advisories for aviation to consider employing computationally efficient online dispersal models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..16.1414S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..16.1414S"><span>Seasonal streamflow prediction using ensemble streamflow prediction technique for the Rangitata and Waitaki River basins on the South Island of New Zealand</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Singh, Shailesh Kumar</p> <p>2014-05-01</p> <p>Streamflow forecasts are essential for making critical decision for optimal allocation of water supplies for various demands that include irrigation for agriculture, habitat for fisheries, hydropower production and flood warning. The major objective of this study is to explore the Ensemble Streamflow Prediction (ESP) based forecast in New Zealand catchments and to highlights the present capability of seasonal flow forecasting of National Institute of Water and Atmospheric Research (NIWA). In this study a probabilistic forecast framework for ESP is presented. The basic assumption in ESP is that future weather pattern were experienced historically. Hence, past forcing data can be used with current initial condition to generate an ensemble of prediction. Small differences in initial conditions can result in large difference in the forecast. The initial state of catchment can be obtained by continuously running the model till current time and use this initial state with past forcing data to generate ensemble of flow for future. The approach taken here is to run TopNet hydrological models with a range of past forcing data (precipitation, temperature etc.) with current initial conditions. The collection of runs is called the ensemble. ESP give probabilistic forecasts for flow. From ensemble members the probability distributions can be derived. The probability distributions capture part of the intrinsic uncertainty in weather or climate. An ensemble stream flow prediction which provide probabilistic hydrological forecast with lead time up to 3 months is presented for Rangitata, Ahuriri, and Hooker and Jollie rivers in South Island of New Zealand. ESP based seasonal forecast have better skill than climatology. This system can provide better over all information for holistic water resource management.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..14..528L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..14..528L"><span>Probabilistic forecasts based on radar rainfall uncertainty</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Liguori, S.; Rico-Ramirez, M. A.</p> <p>2012-04-01</p> <p>The potential advantages resulting from integrating weather radar rainfall estimates in hydro-meteorological forecasting systems is limited by the inherent uncertainty affecting radar rainfall measurements, which is due to various sources of error [1-3]. 
The improvement of quality control and correction techniques is recognized to play a role for the future improvement of radar-based flow predictions. However, the knowledge of the uncertainty affecting radar rainfall data can also be effectively used to build a hydro-meteorological forecasting system in a probabilistic framework. This work discusses the results of the implementation of a novel probabilistic forecasting system developed to improve ensemble predictions over a small urban area located in the North of England. An ensemble of radar rainfall fields can be determined as the sum of a deterministic component and a perturbation field, the latter being informed by the knowledge of the spatial-temporal characteristics of the radar error assessed with reference to rain-gauges measurements. This approach is similar to the REAL system [4] developed for use in the Southern-Alps. The radar uncertainty estimate can then be propagated with a nowcasting model, used to extrapolate an ensemble of radar rainfall forecasts, which can ultimately drive hydrological ensemble predictions. A radar ensemble generator has been calibrated using radar rainfall data made available from the UK Met Office after applying post-processing and corrections algorithms [5-6]. One hour rainfall accumulations from 235 rain gauges recorded for the year 2007 have provided the reference to determine the radar error. Statistics describing the spatial characteristics of the error (i.e. mean and covariance) have been computed off-line at gauges location, along with the parameters describing the error temporal correlation. A system has then been set up to impose the space-time error properties to stochastic perturbations, generated in real-time at gauges location, and then interpolated back onto the radar domain, in order to obtain probabilistic radar rainfall fields in real time. The deterministic nowcasting model integrated in the STEPS system [7-8] has been used for the purpose of propagating the uncertainty and assessing the benefit of implementing the radar ensemble generator for probabilistic rainfall forecasts and ultimately sewer flow predictions. For this purpose, events representative of different types of precipitation (i.e. stratiform/convective) and significant at the urban catchment scale (i.e. in terms of sewer overflow within the urban drainage system) have been selected. As high spatial/temporal resolution is required to the forecasts for their use in urban areas [9-11], the probabilistic nowcasts have been set up to be produced at 1 km resolution and 5 min intervals. The forecasting chain is completed by a hydrodynamic model of the urban drainage network. The aim of this work is to discuss the implementation of this probabilistic system, which takes into account the radar error to characterize the forecast uncertainty, with consequent potential benefits in the management of urban systems. It will also allow a comparison with previous findings related to the analysis of different approaches to uncertainty estimation and quantification in terms of rainfall [12] and flows at the urban scale [13]. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and Dr. Alan Seed from the Australian Bureau of Meteorology for providing the radar data and the nowcasting model. 
The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.ars.usda.gov/research/publications/publication/?seqNo115=330499','TEKTRAN'); return false;" href="http://www.ars.usda.gov/research/publications/publication/?seqNo115=330499"><span>Predicting the US Drought Monitor (USDM) using precipitation, soil noisture, and evapotranspiration anomalies, Part II: Intraseasonal drought intensification forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ars.usda.gov/research/publications/find-a-publication/">USDA-ARS?s Scientific Manuscript database</a></p> <p></p> <p></p> <p>Probabilistic forecasts of US Drought Monitor (USDM) intensification over two, four and eight week time periods are developed based on recent anomalies in precipitation, evapotranspiration and soil moisture. These statistical forecasts are computed using logistic regression with cross validation. Wh...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..1513596D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..1513596D"><span>A Decision Support System for effective use of probability forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>De Kleermaeker, Simone; Verkade, Jan</p> <p>2013-04-01</p> <p>Often, water management decisions are based on hydrological forecasts. These forecasts, however, are affected by inherent uncertainties. It is increasingly common for forecasting agencies to make explicit estimates of these uncertainties and thus produce probabilistic forecasts. Associated benefits include the decision makers' increased awareness of forecasting uncertainties and the potential for risk-based decision-making. Also, a stricter separation of responsibilities between forecasters and decision maker can be made. However, simply having probabilistic forecasts available is not sufficient to realise the associated benefits. Additional effort is required in areas such as forecast visualisation and communication, decision making in uncertainty and forecast verification. Also, revised separation of responsibilities requires a shift in institutional arrangements and responsibilities. A recent study identified a number of additional issues related to the effective use of probability forecasts. When moving from deterministic to probability forecasting, a dimension is added to an already multi-dimensional problem; this makes it increasingly difficult for forecast users to extract relevant information from a forecast. A second issue is that while probability forecasts provide a necessary ingredient for risk-based decision making, other ingredients may not be present. For example, in many cases no estimates of flood damage, of costs of management measures and of damage reduction are available. This paper presents the results of the study, including some suggestions for resolving these issues and the integration of those solutions in a prototype decision support system (DSS). 
A pathway for further development of the DSS is outlined.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70175554','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70175554"><span>Volcano-tectonic earthquakes: A new tool for estimating intrusive volumes and forecasting eruptions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>White, Randall A.; McCausland, Wendy</p> <p>2016-01-01</p> <p>Notable cases in which distal VT events preceded eruptions at long-dormant volcanoes include: Nevado del Ruiz (1984–1985), Pinatubo (1991), Unzen (1989–1995), Soufriere Hills (1995), Shishaldin (1989–1999), Tacana' (1985–1986), Pacaya (1980–1984), Rabaul (1994), and Cotopaxi (2001). Additional cases are recognized at frequently active volcanoes including Popocateptl (2001–2003) and Mauna Loa (1984). We present four case studies (Pinatubo, Soufriere Hills, Unzen, and Tacana') in which we demonstrate the above mentioned VT characteristics prior to eruptions. Using regional data recorded by NEIC, we recognized in near-real time that a huge distal VT swarm was occurring, deduced that a proportionately huge magmatic intrusion was taking place beneath the long dormant Sulu Range, New Britain Island, Papua New Guinea, that it was likely to lead to eruptive activity, and warned Rabaul Volcano Observatory days before a phreatic eruption occurred. This confirms the value of this technique for eruption forecasting. We also present a counter-example where we deduced that a VT swarm at Volcan Cosiguina, Nicaragua, indicated a small intrusion, insufficient to reach the surface and erupt. Finally, we discuss limitations of the method and propose a mechanism by which this distal VT seismicity is triggered by magmatic intrusion.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..14.7286T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..14.7286T"><span>Generalization of information-based concepts in forecast verification</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tödter, J.; Ahrens, B.</p> <p>2012-04-01</p> <p>This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are shortly reviewed, then the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. 
Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN is illustrated, together with an algorithm to evaluate its components reliability, resolution, and uncertainty for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2008JGRB..113.7203M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2008JGRB..113.7203M"><span>Probabilistic tephra hazard maps for the Neapolitan area: Quantitative volcanological study of Campi Flegrei eruptions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.</p> <p>2008-07-01</p> <p>Tephra fall is a relevant hazard of Campi Flegrei caldera (Southern Italy), due to the high vulnerability of Naples metropolitan area to such an event. Here, tephra derive from magmatic as well as phreatomagmatic activity. On the basis of both new and literature data on known, past eruptions (Volcanic Explosivity Index (VEI), grain size parameters, velocity at the vent, column heights and erupted mass), and factors controlling tephra dispersion (wind velocity and direction), 2D numerical simulations of fallout dispersion and deposition have been performed for a large number of case events. A bayesian inversion has been applied to retrieve the best values of critical parameters (e.g., vertical mass distribution, diffusion coefficients, velocity at the vent), not directly inferable by volcanological study. Simulations are run in parallel on multiple processors to allow a fully probabilistic analysis, on a very large catalogue preserving the statistical proprieties of past eruptive history. Using simulation results, hazard maps have been computed for different scenarios: upper limit scenario (worst-expected scenario), eruption-range scenario, and whole-eruption scenario. Results indicate that although high hazard characterizes the Campi Flegrei caldera, the territory to the east of the caldera center, including the whole district of Naples, is exposed to high hazard values due to the dominant westerly winds. Consistently with the stratigraphic evidence of nature of past eruptions, our numerical simulations reveal that even in the case of a subplinian eruption (VEI = 3), Naples is exposed to tephra fall thicknesses of some decimeters, thereby exceeding the critical limit for roof collapse. Because of the total number of people living in Campi Flegrei and the city of Naples (ca. 
two million of inhabitants), the tephra fallout risk related to a plinian eruption of Campi Flegrei largely matches or exceeds the risk related to a similar eruption at Vesuvius.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFMGC53G1293H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFMGC53G1293H"><span>Some Advances in Downscaling Probabilistic Climate Forecasts for Agricultural Decision Support</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Han, E.; Ines, A.</p> <p>2015-12-01</p> <p>Seasonal climate forecasts, commonly provided in tercile-probabilities format (below-, near- and above-normal), need to be translated into more meaningful information for decision support of practitioners in agriculture. In this paper, we will present two new novel approaches to temporally downscale probabilistic seasonal climate forecasts: one non-parametric and another parametric method. First, the non-parametric downscaling approach called FResampler1 uses the concept of 'conditional block sampling' of weather data to create daily weather realizations of a tercile-based seasonal climate forecasts. FResampler1 randomly draws time series of daily weather parameters (e.g., rainfall, maximum and minimum temperature and solar radiation) from historical records, for the season of interest from years that belong to a certain rainfall tercile category (e.g., being below-, near- and above-normal). In this way, FResampler1 preserves the covariance between rainfall and other weather parameters as if conditionally sampling maximum and minimum temperature and solar radiation if that day is wet or dry. The second approach called predictWTD is a parametric method based on a conditional stochastic weather generator. The tercile-based seasonal climate forecast is converted into a theoretical forecast cumulative probability curve. Then the deviates for each percentile is converted into rainfall amount or frequency or intensity to downscale the 'full' distribution of probabilistic seasonal climate forecasts. Those seasonal deviates are then disaggregated on a monthly basis and used to constrain the downscaling of forecast realizations at different percentile values of the theoretical forecast curve. As well as the theoretical basis of the approaches we will discuss sensitivity analysis (length of data and size of samples) of them. In addition their potential applications for managing climate-related risks in agriculture will be shown through a couple of case studies based on actual seasonal climate forecasts for: rice cropping in the Philippines and maize cropping in India and Kenya.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFMGC43A0679A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFMGC43A0679A"><span>Probabilistic Weather Information Tailored to the Needs of Transmission System Operators</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Alberts, I.; Stauch, V.; Lee, D.; Hagedorn, R.</p> <p>2014-12-01</p> <p>Reliable and accurate forecasts for wind and photovoltaic (PV) power production are essential for stable transmission systems. 
A high potential for improving the wind and PV power forecasts lies in optimizing the weather forecasts, since these energy sources are highly weather dependent. For this reason the main objective of the German research project EWeLiNE is to improve the quality the underlying numerical weather predictions towards energy operations. In this project, the German Meteorological Service (DWD), the Fraunhofer Institute for Wind Energy and Energy System Technology, and three of the German transmission system operators (TSOs) are working together to improve the weather and power forecasts. Probabilistic predictions are of particular interest, as the quantification of uncertainties provides an important tool for risk management. Theoretical considerations suggest that it can be advantageous to use probabilistic information to represent and respond to the remaining uncertainties in the forecasts. However, it remains a challenge to integrate this information into the decision making processes related to market participation and power systems operations. The project is planned and carried out in close cooperation with the involved TSOs in order to ensure the usability of the products developed. It will conclude with a demonstration phase, in which the improved models and newly developed products are combined into a process chain and used to provide information to TSOs in a real-time decision support tool. The use of a web-based development platform enables short development cycles and agile adaptation to evolving user needs. This contribution will present the EWeLiNE project and discuss ideas on how to incorporate probabilistic information into the users' current decision making processes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70095772','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70095772"><span>Volcanology: Look up for magma insights</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Segall, Paul; Anderson, Kyle</p> <p>2014-01-01</p> <p>Volcanic plumes can be hazardous to aircraft. A correlation between plume height and ground deformation during an eruption of Grímsvötn Volcano, Iceland, allows us to peer into the properties of the magma chamber and may improve eruption forecasts.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2005AGUSM.H51D..02J','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2005AGUSM.H51D..02J"><span>Ensemble Streamflow Prediction in Korea: Past and Future 5 Years</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Jeong, D.; Kim, Y.; Lee, J.</p> <p>2005-05-01</p> <p>The Ensemble Streamflow Prediction (ESP) approach was first introduced in 2000 by the Hydrology Research Group (HRG) at Seoul National University as an alternative probabilistic forecasting technique for improving the 'Water Supply Outlook' That is issued every month by the Ministry of Construction and Transportation in Korea. That study motivated the Korea Water Resources Corporation (KOWACO) to establish their seasonal probabilistic forecasting system for the 5 major river basins using the ESP approach. 
In cooperation with the HRG, the KOWACO developed monthly optimal multi-reservoir operating systems for the Geum river basin in 2004, which coupled the ESP forecasts with an optimization model using sampling stochastic dynamic programming. The user interfaces for both ESP and SSDP have also been designed for the developed computer systems to become more practical. More projects for developing ESP systems to the other 3 major river basins (i.e. the Nakdong, Han and Seomjin river basins) was also completed by the HRG and KOWACO at the end of December 2004. Therefore, the ESP system has become the most important mid- and long-term streamflow forecast technique in Korea. In addition to the practical aspects, resent research experience on ESP has raised some concerns into ways of improving the accuracy of ESP in Korea. Jeong and Kim (2002) performed an error analysis on its resulting probabilistic forecasts and found that the modeling error is dominant in the dry season, while the meteorological error is dominant in the flood season. To address the first issue, Kim et al. (2004) tested various combinations and/or combining techniques and showed that the ESP probabilistic accuracy could be improved considerably during the dry season when the hydrologic models were combined and/or corrected. In addition, an attempt was also made to improve the ESP accuracy for the flood season using climate forecast information. This ongoing project handles three types of climate forecast information: (1) the Monthly Industrial Meteorology Information Magazine (MIMIM) of the Korea Meteorological Administration (2) the Global Data Assimilation Prediction System (GDAPS), and (3) the US National Centers for Environmental Prediction (NCEP). Each of these forecasts is issued in a unique format: (1) MIMIM is a most-probable-event forecast, (2) GDAPS is a single series of deterministic forecasts, and (3) NCEP is an ensemble of deterministic forecasts. Other minor issues include how long the initial conditions influences the ESP accuracy, and how many ESP scenarios are needed to obtain the best accuracy. This presentation also addresses some future research that is needed for ESP in Korea.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2008AGUFM.V51D2058M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2008AGUFM.V51D2058M"><span>Fracture Mechanics Approach to Forecasting Volcanic Eruptions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Matthews, C.; Sammonds, P.; Kilburn, C.; Woo, G.</p> <p>2008-12-01</p> <p>A medium to short-term increase in the rate of volcano-tectonic earthquake events provides one of the most useful and promising tools for eruption forecasting, particularly at subduction-zone volcanoes reawakening after a long repose interval. Two basic patterns of accelerating seismicity observed prior to eruptions are exponential and faster than exponential increases with time. While theoretical and empirical models exist that can explain these observed trends, less is known about seismic unrest at volcanoes that does not end in eruption. A comprehensive model of fracturing and failure within an edifice must also explain why volcanoes do not erupt. 
We have developed a numerical fracture mechanical model for simulating precursory seismic sequences, associated with the opening of a new magmatic pathway to the surface. The model reproduces the basic patterns of precursory seismicity and shows that the signals produced vary according to changes in the extent of damage and in the mechanical properties of the host rock. Local stress conditions and material property distributions exist under which the model is also able to produce seismic swarms that do not lead to failure and eruption. It can therefore provide insight into factors determining whether or not a seismic crisis leads to eruption. Critically, when combined with field data this may provide information on how often 'failed' eruptions can be expected, or suggest a step towards an observational method for distinguishing between a seismic swarm leading to quiescence and a pre-eruptive seismic sequence.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1812217O','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1812217O"><span>Climatological attribution of wind power ramp events in East Japan and their probabilistic forecast based on multi-model ensembles downscaled by analog ensemble using self-organizing maps</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ohba, Masamichi; Nohara, Daisuke; Kadokura, Shinji</p> <p>2016-04-01</p> <p>Severe storms or other extreme weather events can interrupt the spin of wind turbines in large scale that cause unexpected "wind ramp events". In this study, we present an application of self-organizing maps (SOMs) for climatological attribution of the wind ramp events and their probabilistic prediction. The SOM is an automatic data-mining clustering technique, which allows us to summarize a high-dimensional data space in terms of a set of reference vectors. The SOM is applied to analyze and connect the relationship between atmospheric patterns over Japan and wind power generation. SOM is employed on sea level pressure derived from the JRA55 reanalysis over the target area (Tohoku region in Japan), whereby a two-dimensional lattice of weather patterns (WPs) classified during the 1977-2013 period is obtained. To compare with the atmospheric data, the long-term wind power generation is reconstructed by using a high-resolution surface observation network AMeDAS (Automated Meteorological Data Acquisition System) in Japan. Our analysis extracts seven typical WPs, which are linked to frequent occurrences of wind ramp events. Probabilistic forecasts to wind power generation and ramps are conducted by using the obtained SOM. The probability are derived from the multiple SOM lattices based on the matching of output from TIGGE multi-model global forecast to the WPs on the lattices. Since this method effectively takes care of the empirical uncertainties from the historical data, wind power generation and ramp is probabilistically forecasted from the forecasts of global models. The predictability skill of the forecasts for the wind power generation and ramp events show the relatively good skill score under the downscaling technique. 
It is expected that the results of this study provides better guidance to the user community and contribute to future development of system operation model for the transmission grid operator.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1711727H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1711727H"><span>Trends in the predictive performance of raw ensemble weather forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hemri, Stephan; Scheuerer, Michael; Pappenberger, Florian; Bogner, Konrad; Haiden, Thomas</p> <p>2015-04-01</p> <p>Over the last two decades the paradigm in weather forecasting has shifted from being deterministic to probabilistic. Accordingly, numerical weather prediction (NWP) models have been run increasingly as ensemble forecasting systems. The goal of such ensemble forecasts is to approximate the forecast probability distribution by a finite sample of scenarios. Global ensemble forecast systems, like the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble, are prone to probabilistic biases, and are therefore not reliable. They particularly tend to be underdispersive for surface weather parameters. Hence, statistical post-processing is required in order to obtain reliable and sharp forecasts. In this study we apply statistical post-processing to ensemble forecasts of near-surface temperature, 24-hour precipitation totals, and near-surface wind speed from the global ECMWF model. Our main objective is to evaluate the evolution of the difference in skill between the raw ensemble and the post-processed forecasts. The ECMWF ensemble is under continuous development, and hence its forecast skill improves over time. Parts of these improvements may be due to a reduction of probabilistic bias. Thus, we first hypothesize that the gain by post-processing decreases over time. Based on ECMWF forecasts from January 2002 to March 2014 and corresponding observations from globally distributed stations we generate post-processed forecasts by ensemble model output statistics (EMOS) for each station and variable. Parameter estimates are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over rolling training periods that consist of the n days preceding the initialization dates. Given the higher average skill in terms of CRPS of the post-processed forecasts for all three variables, we analyze the evolution of the difference in skill between raw ensemble and EMOS forecasts. The fact that the gap in skill remains almost constant over time, especially for near-surface wind speed, suggests that improvements to the atmospheric model have an effect quite different from what calibration by statistical post-processing is doing. That is, they are increasing potential skill. 
  196. Forecasting magma-chamber rupture at Santorini volcano, Greece

    PubMed Central

    Browning, John; Drymoni, Kyriaki; Gudmundsson, Agust

    2015-01-01

    How much magma needs to be added to a shallow magma chamber to cause rupture, dyke injection, and a potential eruption? Models that yield reliable answers to this question are needed in order to facilitate eruption forecasting. Development of a long-lived shallow magma chamber requires periodic influx of magmas from a parental body at depth. This redistribution process does not necessarily cause an eruption but produces a net volume change that can be measured geodetically by inversion techniques. Using continuum-mechanics and fracture-mechanics principles, we calculate the amount of magma contained at shallow depth beneath Santorini volcano, Greece. We demonstrate, through structural analysis of dykes exposed within the Santorini caldera, previously published data on the volume of recent eruptions, and geodetic measurements of the 2011–2012 unrest period, that the measured 0.02% increase in volume of Santorini's shallow magma chamber was associated with a magmatic excess pressure increase of around 1.1 MPa. This excess pressure was high enough to bring the chamber roof close to rupture and dyke injection. For volcanoes with known typical extrusion and intrusion (dyke) volumes, the new methodology presented here makes it possible to forecast the conditions for magma-chamber failure and dyke injection at any geodetically well-monitored volcano. PMID:26507183
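    A back-of-the-envelope check of the quoted figures, using a generic elastic relation rather than anything taken from the paper (the compressibility values below are assumptions chosen only for illustration): for a fractional volume change dV/V, the excess pressure is roughly p_e = (dV/V) / (beta_magma + beta_chamber).

      # Hypothetical illustration: excess pressure implied by a fractional volume change.
      dV_over_V = 2e-4          # the 0.02% volume increase reported for 2011-2012
      beta_magma = 1.25e-10     # assumed magma compressibility, 1/Pa
      beta_chamber = 0.75e-10   # assumed host-rock (chamber) compressibility, 1/Pa

      p_excess = dV_over_V / (beta_magma + beta_chamber)       # Pa
      print(f"excess pressure ~ {p_excess / 1e6:.1f} MPa")      # ~1 MPa, same order as the quoted 1.1 MPa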
  197. Tools and techniques for developing tephra stratigraphies in lake cores: A case study from the basaltic Auckland Volcanic Field, New Zealand

    NASA Astrophysics Data System (ADS)

    Hopkins, Jenni L.; Millet, Marc-Alban; Timm, Christian; Wilson, Colin J. N.; Leonard, Graham S.; Palin, J. Michael; Neil, Helen

    2015-09-01

    Probabilistic hazard forecasting for a volcanic region relies on understanding and reconstructing the eruptive record (derived potentially from proximal as well as distal volcanoes). Tephrostratigraphy is commonly used as a reconstructive tool by cross-correlating tephra deposits to create a stratigraphic framework that can be used to assess magnitude-frequency relationships for eruptive histories. When applied to widespread rhyolitic deposits, tephra identifications and correlations have been successful; however, the identification and correlation of basaltic tephras are more problematic. Here, using tephras in drill cores from six maars in the Auckland Volcanic Field (AVF), New Zealand, we show how X-ray density scanning coupled with magnetic susceptibility analysis can be used to accurately and reliably identify basaltic glass shard-bearing horizons in lacustrine sediments, and how these methods, when combined with the major and trace element signatures of the tephras, can be used to distinguish primary from reworked layers. After reliably identifying primary vs. reworked basaltic horizons within the cores, we detail an improved method for cross-core correlation based on stratigraphy and geochemical fingerprinting. We present major and trace element data for individual glass shards from 57 separate basaltic horizons identified within the cores. Our results suggest that in cases where major element compositions (SiO2, CaO, Al2O3, FeO, MgO) do not provide unambiguous correlations, trace elements (e.g. La, Gd, Yb, Zr, Nb, Nd) and trace element ratios (e.g. [La/Yb]N, [Gd/Yb]N, [Zr/Yb]N) are successful in improving the compositional distinction between the AVF basaltic tephra horizons, thereby allowing an improved eruptive history of the AVF to be reconstructed.

  198. Extended Range Prediction of Indian Summer Monsoon: Current status

    NASA Astrophysics Data System (ADS)

    Sahai, A. K.; Abhilash, S.; Borah, N.; Joseph, S.; Chattopadhyay, R.; S, S.; Rajeevan, M.; Mandal, R.; Dey, A.

    2014-12-01

    The main focus of this study is to develop a forecast consensus in the extended range prediction (ERP) of monsoon intraseasonal oscillations using a suite of different variants of the Climate Forecast System (CFS) model. In this CFS-based grand MME prediction system (CGMME), the ensemble members are generated by perturbing the initial conditions and using different configurations of CFSv2. This is to address the role of different physical mechanisms known to control error growth in the ERP on the 15-20 day time scale. The final formulation of CGMME is based on 21 ensemble members of the standalone Global Forecast System (GFS) forced with bias-corrected forecast SST from CFS, 11 low-resolution CFS T126 members, and 11 high-resolution CFS T382 members. Thus, we develop a multi-model consensus forecast for the ERP of the Indian summer monsoon (ISM) using a suite of different variants of the CFS model. This coordinated international effort has led to the development of specific tailor-made regional forecast products over the Indian region. The skill of deterministic and probabilistic categorical rainfall forecasts, as well as the verification of large-scale, low-frequency monsoon intraseasonal oscillations, has been assessed using hindcasts from 2001-2012 for the monsoon season, with all models initialized every five days from 16 May to 28 September. The deterministic forecast skill of CGMME is better than that of the best participating single-model ensemble configuration (SME). The CGMME approach is believed to quantify the uncertainty in both initial conditions and model formulation. The main improvement is attained in the probabilistic forecast, owing to an increase in ensemble spread that reduces the error arising from over-confident ensembles in a single model configuration. For the probabilistic forecast, three tercile categories are determined by a ranking method based on the percentage of ensemble members from all participating models falling in each category. CGMME adds value to both the deterministic and probabilistic forecasts compared to the raw SMEs, and this better skill probably flows from the larger spread and improved spread-error relationship. The CGMME system is currently capable of generating extended range predictions in real time and has successfully delivered experimental operational ER forecasts of the ISM for the last few years.
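    A schematic of the tercile step described above (simplified, with made-up member values; not the operational CGMME code): pool the members from all participating model configurations and report the fraction falling in each climatological tercile category.

      import numpy as np

      def tercile_probabilities(pooled_members, climo_terciles):
          """Fraction of pooled multi-model ensemble members in each tercile category.
          climo_terciles = (lower, upper) boundaries taken from the hindcast climatology."""
          lo, hi = climo_terciles
          m = np.asarray(pooled_members, dtype=float)
          p_below = np.mean(m < lo)
          p_normal = np.mean((m >= lo) & (m <= hi))
          p_above = np.mean(m > hi)
          return p_below, p_normal, p_above

      # toy example: 43 pooled members (21 GFS + 11 CFS T126 + 11 CFS T382), arbitrary rainfall values
      members = np.random.gamma(shape=2.0, scale=5.0, size=43)
      print(tercile_probabilities(members, climo_terciles=(5.0, 12.0)))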
  199. Reducing Probabilistic Weather Forecasts to the Worst-Case Scenario: Anchoring Effects

    ERIC Educational Resources Information Center

    Joslyn, Susan; Savelli, Sonia; Nadav-Greenberg, Limor

    2011-01-01

    Many weather forecast providers believe that forecast uncertainty in the form of the worst-case scenario would be useful for general public end users. We tested this suggestion in 4 studies using realistic weather-related decision tasks involving high winds and low temperatures. College undergraduates, given the statistical equivalent of the…

  200. Probabilistic Storm Surge Forecast For Venice

    NASA Astrophysics Data System (ADS)

    Mel, Riccardo; Lionello, Piero

    2013-04-01

    This study describes an ensemble storm surge prediction procedure for the city of Venice, which is potentially very useful for its management and maintenance and for operating the movable barriers that are presently being built. The Ensemble Prediction System (EPS) is meant to complement the existing sea level (SL) forecast system by providing a probabilistic forecast and information on the uncertainty of SL prediction. The procedure is applied to storm surge events in the period 2009-2010, producing for each of them an ensemble of 50 simulations. It is shown that the EPS slightly increases the accuracy of SL prediction with respect to the deterministic forecast (DF) and is more reliable. Though the results are biased low and forecast uncertainty is underestimated, the probability distribution of maximum sea level produced by the EPS is acceptably realistic. The error of the EPS mean is shown to be correlated with the EPS spread. SL peaks correspond to maxima of uncertainty, and uncertainty increases linearly with the forecast range. The quasi-linear dynamics of the storm surges produce a modulation of the uncertainty after the SL peak, with a period corresponding to that of the main Adriatic seiche.
  201. Influences on the variability of eruption sequences and style transitions in the Auckland Volcanic Field, New Zealand

    NASA Astrophysics Data System (ADS)

    Kereszturi, Gábor; Németh, Károly; Cronin, Shane J.; Procter, Jonathan; Agustín-Flores, Javier

    2014-10-01

    Monogenetic basaltic volcanism is characterised by a complex array of eruptive behaviours, reflecting spatial and temporal variability of the magmatic properties (e.g. composition, eruptive volume, magma flux) as well as environmental factors at the vent site (e.g. availability of water, country rock geology, faulting). These combine to produce changes in eruption style over brief periods (minutes to days) in many eruption episodes. Monogenetic eruptions in some volcanic fields often start with a phreatomagmatic vent-opening phase that later transforms into "dry" magmatic explosive or effusive activity, with a strong variation in the duration and importance of this first phase. Such an eruption sequence pattern occurred in 83% of the known eruptions in the 0.25 My-old Auckland Volcanic Field (AVF), New Zealand. In this investigation, the eruptive volumes were compared with the sequences of eruption styles preserved in the pyroclastic record at each volcano of the AVF, as well as with environmental influencing factors such as the distribution and thickness of water-saturated semi- to unconsolidated sediments, topographic position, and distances from known fault lines. There is no correlation between ejecta ring volumes and environmental influencing factors that holds for the entire AVF.
    By contrast, using a set of comparisons of single volcanoes with well-known and documented sequences, the resultant eruption sequences could be explained by the predominant patterns of the environment in which these volcanoes erupted. Based on the spatial variability of these environmental factors, a first-order susceptibility hazard map was constructed for the AVF that forecasts the areas with the largest likelihood of phreatomagmatic eruptions by overlaying topographical and shallow geological information. Combining detailed phase-by-phase breakdowns of eruptive volumes and the event sequences of the AVF, along with the new susceptibility map, more realistic eruption scenarios can be developed for different parts of the volcanic field. This approach can be applied to tailor field- and sub-field-specific hazard forecasting at similar volcanic fields worldwide.

  202. Using a cross correlation technique to refine the accuracy of the Failure Forecast Method: Application to Soufrière Hills volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Salvage, R. O.; Neuberg, J. W.

    2016-09-01

    Prior to many volcanic eruptions, an acceleration in seismicity has been observed, suggesting the potential for this as a forecasting tool. The Failure Forecast Method (FFM) relates an accelerating precursor to the timing of failure by an empirical power law, with failure being defined in this context as the onset of an eruption. Previous applications of the FFM have used a wide variety of accelerating time series, often generating questionable forecasts with large misfits between the data and the forecast, as well as generating a number of different forecasts from the same data series. Here, we show an alternative approach applying the FFM in combination with a cross correlation technique which identifies seismicity from a single active source mechanism and location at depth. Isolating a single system at depth avoids additional uncertainties introduced by averaging data over a number of different accelerating phenomena, and consequently reduces the misfit between the data and the forecast. Similar seismic waveforms were identified in the precursory accelerating seismicity to dome collapses at Soufrière Hills volcano, Montserrat, in June 1997, July 2003 and February 2010. These events were specifically chosen since they represent a spectrum of collapse scenarios at this volcano. The cross correlation technique generates a five-fold increase in the number of seismic events which could be identified from continuous seismic data rather than using triggered data, thus providing a more holistic understanding of the ongoing seismicity at the time. The use of similar seismicity as a forecasting tool for the collapses in 1997 and 2003 greatly improved the forecast timing of the dome collapse, as well as improving the confidence in the forecast, thereby outperforming the classical application of the FFM. We suggest that focusing on a single active seismic system at depth allows a more accurate forecast of some of the major dome collapses from the ongoing eruption at Soufrière Hills volcano, and provides a simple addition to the well-used methodology of the FFM.
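    For reference, a minimal sketch of the classical FFM step that the authors refine (illustrative only; their cross-correlation event selection is not reproduced here): for the commonly assumed exponent alpha = 2, the inverse event rate decays linearly in time, so a straight-line fit extrapolated to zero gives the forecast failure time.

      import numpy as np

      def ffm_failure_time(times, event_rate):
          """Classical FFM with alpha = 2: fit a line to the inverse rate and
          extrapolate to zero to estimate the failure (eruption/collapse) time."""
          inv_rate = 1.0 / np.asarray(event_rate, dtype=float)
          slope, intercept = np.polyfit(times, inv_rate, 1)
          return -intercept / slope    # time at which 1/rate reaches zero

      # toy accelerating sequence: rate grows as 1/(t_f - t) with t_f = 10 days
      t = np.linspace(0.0, 8.0, 20)
      rate = 1.0 / (10.0 - t)
      print(ffm_failure_time(t, rate))   # ~10.0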
  203. Multivariate postprocessing techniques for probabilistic hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2016-04-01

    Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process, with typical events lasting from hours in the case of floods to weeks or even months in the case of droughts. This calls for multivariate postprocessing techniques that yield well calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015).

    References: Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein (2015), Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. Girard (2012), Evaluating the quality of scenarios of short-term wind power generation, Applied Energy, 96, 12-20, DOI: 10.1016/j.apenergy.2011.11.004. Schefzik, R., T. L. Thorarinsdottir, and T. Gneiting (2013), Uncertainty quantification in complex simulation models using ensemble copula coupling, Statistical Science, 28, 616-640, DOI: 10.1214/13-STS443.
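    A compact sketch of the ECC idea (hypothetical inputs, not the study's implementation): calibrated samples at each lead time are reordered according to the ranks of the raw ensemble, so the raw ensemble's temporal dependence structure is preserved.

      import numpy as np

      def ecc_reorder(raw_ensemble, calibrated_samples):
          """Ensemble copula coupling: impose the rank order of the raw ensemble
          (members x lead_times) on the calibrated samples, lead time by lead time."""
          raw = np.asarray(raw_ensemble, dtype=float)
          cal = np.sort(np.asarray(calibrated_samples, dtype=float), axis=0)
          out = np.empty_like(cal)
          for t in range(raw.shape[1]):
              ranks = np.argsort(np.argsort(raw[:, t]))   # rank of each raw member at lead time t
              out[:, t] = cal[ranks, t]
          return out

      # toy example: 5 members, 3 lead times; calibrated samples could be EMOS quantile draws
      raw = np.random.rand(5, 3)
      cal = np.random.normal(10.0, 2.0, size=(5, 3))
      print(ecc_reorder(raw, cal))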
  204. Ensemble Bayesian forecasting system Part I: Theory and algorithms

    NASA Astrophysics Data System (ADS)

    Herr, Henry D.; Krzysztofowicz, Roman

    2015-05-01

    The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of the predictand, possesses a Bayesian coherence property, constitutes a random sample of the predictand, and has an acceptable sampling error, which makes it suitable for rational decision making under uncertainty.
  205. ENSO-based probabilistic forecasts of March-May U.S. tornado and hail activity

    NASA Astrophysics Data System (ADS)

    Lepore, Chiara; Tippett, Michael K.; Allen, John T.

    2017-09-01

    Extended logistic regression is used to predict March-May severe convective storm (SCS) activity based on the preceding December-February (DJF) El Niño-Southern Oscillation (ENSO) state. The spatially resolved probabilistic forecasts are verified against U.S. tornado counts, hail events, and two environmental indices for severe convection. The cross-validated skill is positive for roughly a quarter of the U.S. Overall, indices are predicted with more skill than are storm reports, and hail events are predicted with more skill than tornado counts. Skill is higher in the cool phase of ENSO (La Niña like), when overall SCS activity is higher. SCS forecasts based on the predicted DJF ENSO state from coupled dynamical models initialized in October of the previous year extend the lead time with only a modest reduction in skill compared to forecasts based on the observed DJF ENSO state.
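    As a simplified illustration of this type of ENSO-conditioned prediction (ordinary logistic regression on synthetic data, not the authors' extended logistic regression or their dataset), the probability that spring activity exceeds its median can be modelled as a function of the preceding DJF Niño-3.4 value:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      nino34_djf = rng.normal(0.0, 1.0, 200)                 # synthetic DJF ENSO index values
      # synthetic outcome: above-median MAM activity, made more likely under La Nina (negative index)
      p_true = 1.0 / (1.0 + np.exp(1.2 * nino34_djf))
      above_median = rng.random(200) < p_true

      model = LogisticRegression().fit(nino34_djf.reshape(-1, 1), above_median)
      print(model.predict_proba([[-1.5]])[0, 1])             # P(above-median activity | strong La Nina)
      print(model.predict_proba([[+1.5]])[0, 1])             # P(above-median activity | strong El Nino)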
  206. Classifying Volcanic Activity Using an Empirical Decision Making Algorithm

    NASA Astrophysics Data System (ADS)

    Junek, W. N.; Jones, W. L.; Woods, M. T.

    2012-12-01

    Detection and classification of developing volcanic activity is vital to eruption forecasting. Timely information regarding an impending eruption would aid civil authorities in determining the proper response to a developing crisis. In this presentation, volcanic activity is characterized using an event tree classifier and a suite of empirical statistical models derived through logistic regression. Forecasts are reported in terms of the United States Geological Survey (USGS) volcano alert level system. The algorithm employs multidisciplinary data (e.g., seismic, GPS, InSAR) acquired by various volcano monitoring systems and source modeling information to forecast the likelihood that an eruption, with a volcanic explosivity index (VEI) > 1, will occur within a quantitatively constrained area. Logistic models are constructed from a sparse and geographically diverse dataset assembled from a collection of historic volcanic unrest episodes. Bootstrapping techniques are applied to the training data to allow for the estimation of robust logistic model coefficients. Cross validation produced a series of receiver operating characteristic (ROC) curves with areas ranging between 0.78 and 0.81, which indicates the algorithm has good predictive capabilities. The ROC curves also allowed for the determination of a false positive rate and optimum detection for each stage of the algorithm. Forecasts for historic volcanic unrest episodes in North America and Iceland were computed and are consistent with the actual outcome of the events.

  207. Bayesian quantitative precipitation forecasts in terms of quantiles

    NASA Astrophysics Data System (ADS)

    Bentzien, Sabrina; Friederichs, Petra

    2014-05-01

    Ensemble prediction systems (EPS) for numerical weather predictions on the mesoscale are particularly developed to obtain probabilistic guidance for high impact weather. An EPS not only issues a deterministic future state of the atmosphere but a sample of possible future states. Ensemble postprocessing then translates such a sample of forecasts into probabilistic measures. This study focuses on probabilistic quantitative precipitation forecasts in terms of quantiles. Quantiles are particularly suitable for describing precipitation at various locations, since no assumption is required about the distribution of precipitation. The focus is on prediction during high-impact events, and the work is related to the Volkswagen Stiftung-funded project WEX-MOP (Mesoscale Weather Extremes - Theory, Spatial Modeling and Prediction). Quantile forecasts are derived from the raw ensemble and via quantile regression. Neighborhood methods and time-lagging are effective tools to inexpensively increase the ensemble spread, which results in more reliable forecasts, especially for extreme precipitation events. Since an EPS provides a large number of potentially informative predictors, a variable selection is required in order to obtain a stable statistical model. A Bayesian formulation of quantile regression allows for inference about the selection of predictive covariates by the use of appropriate prior distributions. Moreover, the implementation of an additional process layer for the regression parameters accounts for spatial variations of the parameters. Bayesian quantile regression and its spatially adaptive extension are illustrated for the German-focused mesoscale weather prediction ensemble COSMO-DE-EPS, which has run (pre)operationally since December 2010 at the German Meteorological Service (DWD). Objective out-of-sample verification uses the quantile score (QS), a weighted absolute error between quantile forecasts and observations. The QS is a proper scoring function and can be decomposed into reliability, resolution, and uncertainty parts. A quantile reliability plot gives detailed insight into the predictive performance of the quantile forecasts.
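    The quantile score used for verification above has a simple pinball-loss form; a small sketch with hypothetical precipitation values:

      import numpy as np

      def quantile_score(q_forecast, y_obs, tau):
          """Pinball (quantile) score for quantile level tau; lower is better."""
          q = np.asarray(q_forecast, dtype=float)
          y = np.asarray(y_obs, dtype=float)
          return np.mean(np.where(y >= q, tau * (y - q), (1 - tau) * (q - y)))

      # toy check: for the 0.9 quantile, under-forecasting is penalised more strongly
      obs = np.array([0.0, 3.0, 12.0, 25.0])                      # 24-h precipitation totals, mm
      print(quantile_score([5.0, 5.0, 5.0, 5.0], obs, tau=0.9))
      print(quantile_score([20.0, 20.0, 20.0, 20.0], obs, tau=0.9))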
  208. Evaluation of probabilistic forecasts with the scoringRules package

    NASA Astrophysics Data System (ADS)

    Jordan, Alexander; Krüger, Fabian; Lerch, Sebastian

    2017-04-01

    Over the last decades, probabilistic forecasts in the form of predictive distributions have become popular in many scientific disciplines. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way, in order to better understand sources of prediction errors and to improve the models. Proper scoring rules are functions S(F, y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. In coherence with decision-theoretic principles, they allow alternative models to be compared, a crucial ability given the variety of theories, data sources and statistical specifications available in many situations. This contribution presents the software package scoringRules for the statistical programming language R, which provides functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. For univariate variables, two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically but are indirectly described through a sample of simulation draws. For example, ensemble weather forecasts take this form. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices. Recent developments include the addition of scoring rules to evaluate multivariate forecast distributions. The use of the scoringRules package is illustrated in an example on post-processing ensemble forecasts of temperature.
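    To illustrate the kind of computation such a package wraps (this sketch is in Python and is not the R package itself), the standard sample-based CRPS estimator for an ensemble forecast is approximately mean|x_i - y| - 0.5 * mean|x_i - x_j|:

      import numpy as np

      def crps_sample(ensemble, y):
          """Sample-based CRPS estimator: E|X - y| - 0.5 * E|X - X'|."""
          x = np.asarray(ensemble, dtype=float)
          term1 = np.mean(np.abs(x - y))
          term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
          return term1 - term2

      # toy example: 50-member temperature ensemble verified against one observation
      print(crps_sample(np.random.normal(20.0, 2.0, 50), 21.3))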
  209. Assessing North American multimodel ensemble (NMME) seasonal forecast skill to assist in the early warning of hydrometeorological extremes over East Africa

    USGS Publications Warehouse

    Shukla, Shraddhanand; Roberts, Jason B.; Hoell, Andrew; Funk, Chris; Robertson, Franklin R.; Kirtman, Benjamin

    2016-01-01

    The skill of North American multimodel ensemble (NMME) seasonal forecasts in East Africa (EA), which encompasses one of the most food and water insecure areas of the world, is evaluated using deterministic, categorical, and probabilistic evaluation methods. The skill is estimated for all three primary growing seasons: March-May (MAM), July-September (JAS), and October-December (OND). It is found that the precipitation forecast skill in this region is generally limited and statistically significant over only a small part of the domain. In the case of the MAM (JAS) [OND] season it exceeds the skill of climatological forecasts in parts of equatorial EA (northern Ethiopia) [equatorial EA] for up to 2 (5) [5] months lead. Temperature forecast skill is generally much higher than precipitation forecast skill (in terms of deterministic and probabilistic skill scores) and statistically significant over a majority of the region. Over the region as a whole, temperature forecasts also exhibit greater reliability than the precipitation forecasts. The NMME ensemble forecasts are found to be more skillful and reliable than the forecast from any individual model. The results also demonstrate that for some seasons (e.g. JAS), the predictability of precipitation signals varies and is higher during certain climate events (e.g. ENSO). Finally, potential room for improvement in forecast skill is identified in some models by comparing homogeneous predictability in individual NMME models with their respective forecast skill.
  210. Magma Supply Rate Controls Vigor (And Longevity) of Kīlauea's Ongoing East Rift Zone Eruption

    NASA Astrophysics Data System (ADS)

    Poland, M. P.; Anderson, K. R.

    2015-12-01

    Since 1983, Kīlauea Volcano, Hawai'i, has erupted almost continuously from vents on the East Rift Zone—at 32 years and counting, this is the longest-duration eruption in the past 500 years. Although forecasting the onset of eruptive activity using geophysical, geochemical, and geological monitoring has been demonstrated repeatedly at Kīlauea and elsewhere, little progress has been made in forecasting an eruption's waning or end, particularly in the case of long-lived eruptions. This is especially important at Kīlauea for at least two reasons: (1) caldera formation at the end of another decades-long eruption, in the 15th century, raises the possibility of a link between eruption duration and caldera formation; and (2) long-lived eruptions can have an enduring effect on local population and infrastructure, as demonstrated by the repeated destruction of property by Kīlauea's ongoing rift zone eruption. Data from the past 15 years indicate that the magma supply rate to Kīlauea is an important control on eruptive activity. Joint inversions of geophysical, geochemical, and geological observations demonstrate that in 2006 the supply rate was nearly double that of 2000-2001, resulting in an increase in lava discharge, summit inflation, and the formation of new eruptive vents. In contrast, the magma supply during 2012, and likely through 2014, was less than that of 2000-2001. This lower supply rate was associated with a lower lava discharge and may have played a role in the stalling of lava flows above population centers in the Puna District during 2014-2015. Heightened eruptive vigor may be expected if magma supply increases in the future; however, a further decrease in supply rate—which is likely already below the long-term average—may result in cessation of the eruption. Multidisciplinary monitoring, and particularly tracking of CO2 emissions and surface deformation, should be able to detect changes in supply rate before they are strongly manifested at the surface.
  211. The Volcano Disaster Assistance Program: Working with International Partners to Reduce the Risk from Volcanic Eruptions Worldwide

    NASA Astrophysics Data System (ADS)

    Mayberry, G. C.; Pallister, J. S.

    2015-12-01

    The Volcano Disaster Assistance Program (VDAP) is a joint effort between the USGS and the U.S. Agency for International Development's (USAID) Office of U.S. Foreign Disaster Assistance (OFDA). OFDA leads and coordinates disaster responses overseas for the U.S. government and is a unique stakeholder concerned with volcano disaster risk reduction as an international humanitarian assistance donor. One year after the tragic eruption of Nevado del Ruiz in 1985, OFDA began funding USGS to implement VDAP. VDAP's mission is to reduce the loss of life and property and limit the economic impact from foreign volcano crises, thereby preventing such crises from becoming disasters. VDAP fulfills this mission and complements OFDA's humanitarian assistance by providing crisis response, capacity-building, technical training, and hazard assessments to developing countries before, during, and after eruptions. During the past 30 years, VDAP has responded to more than 27 major volcanic crises, built capacity in 12+ countries, and helped counterparts save tens of thousands of lives and hundreds of millions of dollars in property. VDAP responses have evolved as host-country capabilities have grown, but the pace of work has not diminished; as a result of VDAP's work at 27 volcanoes in fiscal year 2014, more than 1.3 million people who could have been impacted by volcanic activity benefitted from VDAP assistance, 11 geological policies were modified, 188 scientists were trained, and several successful eruption forecasts were made. VDAP is developing new initiatives to help counterparts monitor volcanoes and communicate volcanic risk. These include developing the Eruption Forecasting Information System (EFIS) to learn from compiled crisis data from 30 years of VDAP responses, creating event trees to forecast eruptions at restless volcanoes, and exploring the use of unmanned aerial systems for monitoring. The use of these new methods, along with traditional VDAP assistance, has improved VDAP's ability to assist counterparts with preparing for eruptions.

  212. Assessing probabilistic predictions of ENSO phase and intensity from the North American Multimodel Ensemble

    NASA Astrophysics Data System (ADS)

    Tippett, Michael K.; Ranganathan, Meghana; L'Heureux, Michelle; Barnston, Anthony G.; DelSole, Timothy

    2017-05-01

    Here we examine the skill of three-, five-, and seven-category monthly ENSO probability forecasts (1982-2015) from single and multi-model ensemble integrations of the North American Multimodel Ensemble (NMME) project. Three-category forecasts are typical and provide probabilities for the ENSO phase (El Niño, La Niña or neutral). Additional forecast categories indicate the likelihood of ENSO conditions being weak, moderate or strong. The level of skill observed for differing numbers of forecast categories can help to determine the appropriate degree of forecast precision. However, the dependence of the skill score itself on the number of forecast categories must be taken into account. For reliable forecasts of the same quality, the ranked probability skill score (RPSS) is fairly insensitive to the number of categories, while the logarithmic skill score (LSS) is an information measure and increases as categories are added. The ignorance skill score decreases to zero as forecast categories are added, regardless of skill level. For all models, forecast formats and skill scores, the northern spring predictability barrier explains much of the dependence of skill on target month and forecast lead. RPSS values for monthly ENSO forecasts show little dependence on the number of categories. However, the LSS of multimodel ensemble forecasts with five and seven categories shows statistically significant advantages over the three-category forecasts for the targets and leads that are least affected by the spring predictability barrier. These findings indicate that current prediction systems are capable of providing more detailed probabilistic forecasts of ENSO phase and amplitude than are typically provided.
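    A small sketch of the ranked probability score and skill score for a single categorical forecast (illustrative probabilities, not the NMME verification): the RPS compares cumulative forecast probabilities with the cumulative observation vector, and RPSS = 1 - RPS / RPS_climatology.

      import numpy as np

      def rps(probs, obs_category):
          """Ranked probability score for one forecast over ordered categories."""
          p_cum = np.cumsum(probs)
          o = np.zeros(len(probs))
          o[obs_category] = 1.0
          return float(np.sum((p_cum - np.cumsum(o)) ** 2))

      # 3-category ENSO forecast (La Nina, neutral, El Nino); El Nino (index 2) was observed
      forecast = [0.1, 0.3, 0.6]
      climatology = [1 / 3, 1 / 3, 1 / 3]
      rpss = 1.0 - rps(forecast, 2) / rps(climatology, 2)
      print(rpss)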
  213. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with respect to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated, and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov Chain Monte Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy for complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner than the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
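    A toy sketch of the Hermite-expansion idea (one uncertain parameter only; the study's probabilistic collocation and multivariate analysis steps are not reproduced here): an uncertain model output is written as a series in probabilists' Hermite polynomials of a standard normal variable, with coefficients fitted by least squares on sampled model runs, after which the expansion serves as a cheap surrogate.

      import numpy as np
      from numpy.polynomial import hermite_e as He   # probabilists' Hermite polynomials

      def fit_pce(xi_samples, model_outputs, degree=3):
          """Least-squares fit of coefficients c_k in y = sum_k c_k He_k(xi)."""
          return He.hermefit(xi_samples, model_outputs, degree)

      rng = np.random.default_rng(2)
      xi = rng.standard_normal(500)
      y = np.exp(0.3 * xi) + 0.1 * xi ** 2          # stand-in for a hydrological model output
      coeffs = fit_pce(xi, y)

      # the fitted expansion acts as a surrogate, e.g. for cheap Monte Carlo propagation
      xi_new = rng.standard_normal(10000)
      y_surrogate = He.hermeval(xi_new, coeffs)
      print(y_surrogate.mean(), y_surrogate.std())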
  214. A Wind Forecasting System for Energy Application

    NASA Astrophysics Data System (ADS)

    Courtney, Jennifer; Lynch, Peter; Sweeney, Conor

    2010-05-01

    Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to get the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system. BMA is a promising technique that will offer calibrated probabilistic wind forecasts which will be invaluable in wind energy management. In brief, this method turns the ensemble forecasts into a calibrated predictive probability distribution. Each ensemble member is provided with a 'weight' determined by its relative predictive skill over a training period of around 30 days. Verification is carried out using observed wind data from operational wind farms, and the forecasts are compared with existing forecasts produced by ECMWF and Met Eireann in terms of skill scores. We are developing decision-making models to show the benefits achieved using the data produced by our wind energy forecasting system. An energy trading model will be developed, based on the rules currently used by the Single Electricity Market Operator for energy trading in Ireland. This trading model will illustrate the potential for financial savings by using the forecast data generated by this research.
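    A heavily simplified stand-in for the BMA weighting step (toy data; genuine BMA estimates member weights and spread jointly by EM after bias correction, which is not done here): each member is weighted by its mean Gaussian likelihood over a roughly 30-day training window, and the forecast PDF would be the corresponding weighted mixture.

      import numpy as np
      from scipy.stats import norm

      def crude_bma_weights(member_forecasts, obs, sigma=2.0):
          """Crude illustration of BMA-style weighting: weight each member by its mean
          Gaussian likelihood over the training window (members x days arrays)."""
          lik = norm.pdf(obs[None, :], loc=member_forecasts, scale=sigma)
          w = lik.mean(axis=1)
          return w / w.sum()

      # toy training window: 30 days of observed 10-m wind speed and 5 biased members
      rng = np.random.default_rng(3)
      obs = rng.gamma(4.0, 2.0, 30)
      members = obs[None, :] + rng.normal(0.0, 2.0, (5, 30)) + np.linspace(-2, 2, 5)[:, None]
      print(crude_bma_weights(members, obs))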
  215. Validation of Volcanic Ash Forecasting Performed by the Washington Volcanic Ash Advisory Center

    NASA Astrophysics Data System (ADS)

    Salemi, A.; Hanna, J.

    2009-12-01

    In support of NOAA's mission to protect life and property, the Satellite Analysis Branch (SAB) uses satellite imagery to monitor volcanic eruptions and track volcanic ash. The Washington Volcanic Ash Advisory Center (VAAC) was established in late 1997 through an agreement with the International Civil Aviation Organization (ICAO). A volcanic ash advisory (VAA) is issued every 6 hours while an eruption is occurring. Information about the current location and height of the volcanic ash, as well as any pertinent meteorological information, is contained within the VAA. In addition, when ash is detected in satellite imagery, 6-, 12- and 18-hour forecasts of ash height and location are provided. This information is garnered from many sources, including Meteorological Watch Offices (MWOs), pilot reports (PIREPs), model forecast winds, radiosondes and volcano observatories. The Washington VAAC has performed a validation of their 6-, 12- and 18-hour airborne volcanic ash forecasts issued since October 2007. The volcanic ash forecasts are viewed dichotomously (yes/no), with the frequency of yes and no events placed into a contingency table. A large variety of categorical statistics useful in describing forecast performance are then computed from the resulting contingency table.
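    For illustration, a few of the standard categorical statistics that can be computed from such a dichotomous contingency table (the counts below are made up):

      def categorical_stats(hits, misses, false_alarms, correct_negatives):
          """Common dichotomous verification scores from a 2x2 contingency table."""
          pod = hits / (hits + misses)                      # probability of detection
          far = false_alarms / (hits + false_alarms)        # false alarm ratio
          csi = hits / (hits + misses + false_alarms)       # critical success index
          bias = (hits + false_alarms) / (hits + misses)    # frequency bias
          return {"POD": pod, "FAR": far, "CSI": csi, "bias": bias}

      print(categorical_stats(hits=42, misses=8, false_alarms=15, correct_negatives=300))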
  216. Ensemble flare forecasting: using numerical weather prediction techniques to improve space weather operations

    NASA Astrophysics Data System (ADS)

    Murray, S.; Guerra, J. A.

    2017-12-01

    One essential component of operational space weather forecasting is the prediction of solar flares. Early flare forecasting work focused on statistical methods based on historical flaring rates, but more complex machine learning methods have been developed in recent years. A multitude of flare forecasting methods are now available; however, it is still unclear which of these methods performs best, and none are substantially better than climatological forecasts. Current operational space weather centres cannot rely on automated methods, and generally use statistical forecasts with a little human intervention. Space weather researchers are increasingly looking towards methods used in terrestrial weather to improve current forecasting techniques. Ensemble forecasting has been used in numerical weather prediction for many years as a way to combine different predictions in order to obtain a more accurate result. It has proved useful in areas such as magnetospheric modelling and coronal mass ejection arrival analysis; however, it has not yet been implemented in operational flare forecasting. Here we construct ensemble forecasts for major solar flares by linearly combining the full-disk probabilistic forecasts from a group of operational forecasting methods (ASSA, ASAP, MAG4, MOSWOC, NOAA, and Solar Monitor). Forecasts from each method are weighted by a factor that accounts for the method's ability to predict previous events, and several performance metrics (both probabilistic and categorical) are considered. The results provide space weather forecasters with a set of parameters (combination weights, thresholds) that allow them to select the most appropriate values for constructing the 'best' ensemble forecast probability value, according to the performance metric of their choice. In this way different forecasts can be made to fit different end-user needs.

  217. Storm Prediction Center Day 3-8 Fire Weather Forecast Issued on May 27,

    Science.gov Websites

    Note: Through September 29, 2015 the SPC will issue Experimental Probabilistic

  218. Contrasting catastrophic eruptions predicted by different intrusion and collapse scenarios

    PubMed

    Rincón, M; Márquez, A; Herrera, R; Alonso-Torres, A; Granja-Bruña, J L; van Wyk de Vries, B

    2018-04-18

    Catastrophic volcanic eruptions triggered by landslide collapses can jet upwards or blast sideways. Magma intrusion is related to both landslide-triggered eruptive scenarios (lateral or vertical), but it is not clear how such different responses are produced, nor whether any precursor can be used to forecast them. We approach this problem with physical analogue modelling enhanced by X-ray Multiple Detector Computed Tomography scanning, used to track the evolution of the internal intrusion and its related faulting and surface deformation. We find that intrusions produce three different volcano deformation patterns, one of them involving asymmetric intrusion and deformation, with the early development of a listric slump fault producing pronounced slippage of one sector. This previously undescribed early deep potential slip surface provides a unified explanation for the two different eruptive scenarios (lateral vs. vertical). Lateral blast only occurs in flank collapse when the intrusion has risen into the sliding block. Otherwise, vertical rather than lateral expansion of magma is promoted by summit dilatation and flank buttressing. The distinctive surface deformation evolution detected opens the possibility to forecast the possible eruptive scenarios: a laterally directed blast should only be expected when surface deformation begins to develop oblique to the first major fault.
  219. Regional model studies of the atmospheric dispersion of fine volcanic ash after the eruption of Eyjafjallajoekull

    NASA Astrophysics Data System (ADS)

    Langmann, B.; Hort, M. K.

    2010-12-01

    During the eruption of Eyjafjallajoekull on Iceland in April/May 2010, air traffic over Europe was repeatedly interrupted because of volcanic ash in the atmosphere. This completely unusual situation in Europe led to demands for improved crisis management, e.g. Europe-wide regulations of volcanic ash thresholds and improved forecasts of these thresholds. However, the quality of the forecast of fine volcanic ash concentrations in the atmosphere depends to a great extent on a realistic description of the erupted mass flux of fine ash particles, which is rather uncertain. Numerous aerosol measurements (ground-based and satellite remote sensing, and in situ measurements) all over Europe tracked the volcanic ash clouds during the eruption of Eyjafjallajoekull, offering the possibility for an interdisciplinary effort between volcanologists and aerosol researchers to analyse the release and dispersion of fine volcanic ash in order to better understand the needs for realistic volcanic ash forecasts. This contribution describes the uncertainties related to the amount of fine volcanic ash released from Eyjafjallajoekull and its influence on the dispersion of volcanic ash over Europe by numerical modeling. We use the three-dimensional Eulerian atmosphere-chemistry/aerosol model REMOTE (Langmann et al., 2008) to simulate the distribution of volcanic ash as well as its deposition after the eruptions of Eyjafjallajoekull during April and May 2010. The model has been used before to simulate the fate of the volcanic ash after the volcanic eruptions of Kasatochi in 2008 (Langmann et al., 2010) and Mt. Pinatubo in 1991. Comparing our model results with available measurements for the Eyjafjallajoekull eruption, we find quite good agreement with ash concentration data measured over Europe as well as with the results from other models.

    References: Langmann, B., K. Zakšek and M. Hort (2010), Atmospheric distribution and removal of volcanic ash after the eruption of Kasatochi volcano: A regional model study, J. Geophys. Res., 115, D00L06, doi:10.1029/2009JD013298. Langmann, B., S. Varghese, E. Marmer, E. Vignati, J. Wilson, P. Stier and C. O'Dowd (2008), Aerosol distribution over Europe: A model evaluation study with detailed aerosol microphysics, Atmos. Chem. Phys., 8, 1591-1607.
Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.
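The record above describes fusing deterministic NWP output with past measurements to produce a probabilistic wind speed forecast. The sketch below is a deliberately minimal, single-site Gaussian version of that idea (the paper itself uses a multivariate space-time framework); the synthetic training archive, fitted coefficients, and the clip at zero are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training archive: past NWP wind-speed forecasts and matching observations (m/s).
nwp_past = rng.uniform(2.0, 12.0, size=500)
obs_past = 1.1 * nwp_past - 0.5 + rng.normal(0.0, 1.2, size=500)

# Fit a simple linear bias correction obs ~ a*nwp + b and estimate the residual spread.
a, b = np.polyfit(nwp_past, obs_past, deg=1)
sigma = np.std(obs_past - (a * nwp_past + b), ddof=2)

def probabilistic_wind_forecast(nwp_value, n_samples=1000):
    """Return an ensemble of wind-speed scenarios for one NWP forecast value."""
    mean = a * nwp_value + b
    samples = rng.normal(mean, sigma, size=n_samples)
    return np.clip(samples, 0.0, None)   # wind speed cannot be negative

ens = probabilistic_wind_forecast(8.0)
print("median %.1f m/s, 90%% interval [%.1f, %.1f] m/s"
      % (np.median(ens), np.quantile(ens, 0.05), np.quantile(ens, 0.95)))
```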
Mount St. Helens a decade after the 1980 eruptions: magmatic models, chemical cycles, and a revised hazards assessment

USGS Publications Warehouse

Pallister, J.S.; Hoblitt, R.P.; Crandell, D.R.; Mullineaux, D.R.

1992-01-01

Available geophysical and geologic data provide a simplified model of the current magmatic plumbing system of Mount St. Helens (MSH). This model and new geochemical data are the basis for the revised hazards assessment presented here. The assessment is weighted by the style of eruptions and the chemistry of magmas erupted during the past 500 years, the interval for which the most detailed stratigraphic and geochemical data are available. This interval includes the Kalama (A.D. 1480-1770s?), Goat Rocks (A.D. 1800-1857), and current eruptive periods. In each of these periods, silica content decreased, then increased. The Kalama is a large-amplitude chemical cycle (SiO2: 57%-67%), produced by mixing of arc dacite, which is depleted in high field-strength and incompatible elements, with enriched (OIB-like) basalt. The Goat Rocks and current cycles are of small amplitude (SiO2: 61%-64% and 62%-65%) and are related to the fluid dynamics of magma withdrawal from a zoned reservoir. The cyclic behavior is used to forecast future activity. The 1980-1986 chemical cycle, and consequently the current eruptive period, appears to be virtually complete. This inference is supported by the progressively decreasing volumes and volatile contents of magma erupted since 1980, both changes that suggest a decreasing potential for a major explosive eruption in the near future. However, recent changes in seismicity and a series of small gas-release explosions (beginning in late 1989 and accompanied by eruption of a minor fraction of relatively low-silica tephra on 6 January and 5 November 1990) suggest that the current eruptive period may continue to produce small explosions and that a small amount of magma may still be present within the conduit. The gas-release explosions occur without warning and pose a continuing hazard, especially in the crater area. An eruption as large as or larger than that of 18 May 1980 (~0.5 km3 dense-rock equivalent) probably will occur only if magma rises from an inferred deep (~7 km), relatively large (5-7 km3) reservoir. A conservative approach to hazard assessment is to assume that this deep magma is rich in volatiles and capable of erupting explosively to produce voluminous fall deposits and pyroclastic flows. Warning of such an eruption is expectable, however, because magma ascent would probably be accompanied by shallow seismicity that could be detected by the existing seismic-monitoring system. A future large-volume eruption (≥0.1 km3) is virtually certain; the eruptive history of the past 500 years indicates the probability of a large explosive eruption is at least 1% annually. Intervals between large eruptions at Mount St. Helens have varied widely; consequently, we cannot confidently forecast whether the next large eruption will be years, decades, or further in the future. However, we can forecast the types of hazards, and the areas that will be most affected by future large-volume eruptions, as well as hazards associated with the approaching end of the current eruptive period. © 1992 Springer-Verlag.
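The assessment above quotes an annual probability of at least 1% for a large explosive eruption. As a quick illustration of what a stationary annual rate implies over planning horizons (assuming, simplistically, independent years and a constant rate, which the abstract itself cautions against), the probability of at least one such eruption in N years is 1 - (1 - p)^N:

```python
# Probability of at least one large explosive eruption within a planning horizon,
# assuming (as a simplification) independent years with a constant annual probability p.
p_annual = 0.01          # "at least 1% annually" from the abstract

for years in (10, 30, 100):
    p_at_least_one = 1.0 - (1.0 - p_annual) ** years
    print(f"{years:3d} years: P(>=1 large eruption) >= {p_at_least_one:.2f}")
```

Run as written, this gives roughly 0.10, 0.26, and 0.63 for 10, 30, and 100 years, which is why "at least 1% annually" is a strong statement for land-use planning.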
Performance assessment of deterministic and probabilistic weather predictions for the short-term optimization of a tropical hydropower reservoir

NASA Astrophysics Data System (ADS)

Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter

2016-04-01

Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than for thermal power plants; therefore, there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to the risks of power production deficits during droughts and safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of the release strategy. While deterministic forecasts and optimization schemes are the established techniques for short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques is receiving growing attention, and a number of studies have shown their benefit. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region of Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City, located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, as well as deterministic and probabilistic forecasts with 50 ensemble members from the ECMWF, are used as forcing of the MGB-IPH hydrological model to generate streamflow forecasts over a period of 2 years. The online optimization depends on a deterministic and a multi-stage stochastic version of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days. In comparison, the use of actual forecasts with shorter lead times of up to 15 days shows the practical benefit of actual operational data.
It appears that the use of stochastic optimization combined with ensemble forecasts leads to a significantly higher level of flood protection without compromising the HPP's energy production.

Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times

NASA Astrophysics Data System (ADS)

Engeland, Kolbjorn; Steinsland, Ingelin

2014-05-01

This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criteria for the evaluation of multivariate forecasts. A post-processing approach is used, and a Gaussian model is applied to transformed variables. The post-processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions of forecasted inflow for each catchment and lead time, whereas the dependency model is used to estimate the full multivariate distribution of forecasts, i.e. the covariances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles that inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated on the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding-window climatology forecasts. It also deals with variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in the original space using cross-validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding-window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study, such as heteroscedasticity and dependency structures that vary with lead time and catchment, are captured. Two criteria were used for evaluating the added value of the dependency model. The first was the energy score (ES), a multi-dimensional generalization of the continuous ranked probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was to use the CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas the CRPS for accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
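For readers unfamiliar with these scores, the sketch below shows common sample-based estimators of the CRPS and the energy score applied to a toy multivariate ensemble. The data are synthetic and the "accumulated total" is only schematic; it is not the study's actual catchment/lead-time layout.

```python
import numpy as np

def crps_ensemble(ens, obs):
    """Sample-based CRPS estimate for a univariate ensemble forecast."""
    ens = np.asarray(ens, dtype=float)
    term1 = np.mean(np.abs(ens - obs))
    term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
    return term1 - term2

def energy_score(ens, obs):
    """Sample-based energy score for a multivariate ensemble (members x dimensions)."""
    ens = np.asarray(ens, dtype=float)
    obs = np.asarray(obs, dtype=float)
    term1 = np.mean(np.linalg.norm(ens - obs, axis=1))
    diffs = ens[:, None, :] - ens[None, :, :]
    term2 = 0.5 * np.mean(np.linalg.norm(diffs, axis=-1))
    return term1 - term2

rng = np.random.default_rng(1)
# Toy forecast: 50 members for inflow at 3 catchments x 4 lead times, flattened to 12 dimensions.
members = rng.normal(loc=100.0, scale=10.0, size=(50, 12))
observed = rng.normal(loc=100.0, scale=10.0, size=12)

print("CRPS, first dimension only  :", round(crps_ensemble(members[:, 0], observed[0]), 2))
print("Energy score, all dimensions:", round(energy_score(members, observed), 2))
# CRPS of the accumulated total (sum over catchments/lead times), in the spirit of the abstract:
print("CRPS of accumulated total   :", round(crps_ensemble(members.sum(axis=1), observed.sum()), 2))
```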
Three-dimensional Probabilistic Earthquake Location Applied to the 2002-2003 Mt. Etna Eruption

NASA Astrophysics Data System (ADS)

Mostaccio, A.; Tuve', T.; Zuccarello, L.; Patane', D.; Saccorotti, G.; D'Agostino, M.

2005-12-01

Seismicity recorded at Mt. Etna during the 2002-2003 eruption has been relocated using a probabilistic, non-linear earthquake location approach. We used the software package NonLinLoc (Lomax et al., 2000), adopting the 3D velocity model obtained by Cocina et al., 2005. We applied our data through different algorithms: (1) a grid search; (2) a Metropolis-Gibbs sampler; and (3) an oct-tree search. The oct-tree algorithm gives efficient, faster and accurate mapping of the PDF (probability density function) of the earthquake location problem. More than 300 seismic events were analyzed in order to compare the non-linear location results with those obtained using a traditional, linearized earthquake location algorithm such as Hypoellipse, and a 3D linearized inversion (Thurber, 1983). Moreover, we compare 38 focal mechanisms, chosen following strict selection criteria, with those obtained from the 3D and 1D results. Although the presented approach is more of a traditional relocation application, probabilistic earthquake location could also be used in routine surveillance.

Simulating the propagation of sulphur dioxide emissions from the fissure eruption in the Holuhraun lava field (Iceland) with the EURAD-IM

NASA Astrophysics Data System (ADS)

Fröhlich, Luise; Franke, Philipp; Friese, Elmar; Haas, Sarah; Lange, Anne Caroline; Elbern, Hendrik

2015-04-01

In the emergency case of a volcanic eruption, accurate forecasts of the transport of ash and gas emissions are crucial for health protection and aviation safety. In the frame of the Earth System Knowledge Platform (ESKP), near real-time forecasts of ash and SO2 dispersion emitted by active volcanoes are simulated with the European Air pollution Dispersion Inverse Model (EURAD-IM). The model is driven by the Weather Research and Forecasting Model (WRF) and includes detailed gas-phase and particle dynamics modules, which allow for quantitative estimates of measured volcano releases. Former simulations, for example related to the Eyjafjallajökull outbreak in 2010, were in good agreement with measurement records of particle number and SO2 at several European stations. At the end of August 2014, a fissure eruption began on Iceland in the Holuhraun lava field to the north-east of the Bardarbunga volcano system.
In contrast to the explosive eruption of Eyjafjallajökull in 2010, the Holuhraun eruption is rather effusive, with a large and continuous flow of lava and a significant release of sulphur dioxide (SO2) in the lower troposphere, while ash emissions are insignificant. Since the Holuhraun fissure eruption started, daily forecasts of SO2 dispersion have been produced for the European region (15 km horizontal resolution grid) and published on our website (http://apps.fz-juelich.de/iek-8/RIU/vorhersage_node.php). To simulate the transport of volcanic emissions, realistic source terms such as mass release rates of ash and SO2 or plume heights are required. Since no representative measurements are currently available for the simulations, rough qualitative assumptions, based on reports from the Icelandic Met Office (IMO), are used. However, frequent comparisons with satellite observations show that the actual propagation of the volcanic emissions is generally well reflected by the model. In the middle of September 2014, several European measurement sites recorded extremely high SO2 concentrations at ground level, which were predicted quite accurately in advance by the EURAD-IM. Furthermore, the simulations indicate that the unusually high SO2 values are due to the transport of sulphur-dioxide-rich air from the Bardarbunga towards continental Europe. Presently, SO2 dispersion forecasts are also conducted on a finer spatial resolution grid (1 km) for the Icelandic region. These simulations will be validated against measurements from different observation sites in Iceland.

Long-term ensemble forecast of snowmelt inflow into the Cheboksary Reservoir under two different weather scenarios

NASA Astrophysics Data System (ADS)

Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.

2018-04-01

A long-term forecasting ensemble methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble compared to the ESP-based ensemble. We have found that for the 35-year period beginning with the reservoir filling in 1982, both continuous and binary model-based ensemble forecasts (issued in deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of additional water regime characteristics besides the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, appear to be more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.
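The ESP idea referred to above, namely driving a hydrological model from today's initial conditions with each historical weather trace to obtain one ensemble member per year, can be sketched with a toy bucket model. Everything below (the synthetic precipitation archive, the two-parameter model, the initial storage) is an illustrative assumption, not the ECOMAG setup used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical archive of historical daily precipitation traces (30 past years x 90-day lead period, mm/day).
historical_precip = rng.gamma(shape=0.6, scale=5.0, size=(30, 90))

def toy_runoff_model(storage0, precip, k=0.05, runoff_coeff=0.4):
    """Very simplified bucket model: returns total 90-day inflow volume (mm)."""
    storage, total = storage0, 0.0
    for p in precip:
        storage += runoff_coeff * p        # part of precipitation reaches the bucket
        outflow = k * storage              # linear-reservoir release
        storage -= outflow
        total += outflow
    return total

current_storage = 120.0   # assumed snow/soil storage at forecast issue time (mm)

# ESP: one ensemble member per historical weather trace, all starting from the same initial state.
esp_ensemble = np.array([toy_runoff_model(current_storage, trace) for trace in historical_precip])

print("ESP seasonal inflow forecast (mm): median %.0f, 10-90%% range [%.0f, %.0f]"
      % (np.median(esp_ensemble), np.quantile(esp_ensemble, 0.1), np.quantile(esp_ensemble, 0.9)))
```

A weather-generator-based forecast of the kind compared in the record would replace the 30 observed traces with a much larger number of synthetic but statistically consistent traces, which is where the gain in statistical reliability comes from.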
Simultaneous calibration of ensemble river flow predictions over an entire range of lead times

NASA Astrophysics Data System (ADS)

Hemri, S.; Fundel, F.; Zappa, M.

2013-10-01

Probabilistic estimates of future water levels and river discharge are usually simulated with hydrologic models using ensemble weather forecasts as the main inputs. As hydrologic models are imperfect and the meteorological ensembles tend to be biased and underdispersed, the ensemble forecasts for river runoff are typically biased and underdispersed, too. Thus, in order to achieve both reliable and sharp predictions, statistical postprocessing is required. In this work, Bayesian model averaging (BMA) is applied to statistically postprocess raw ensemble runoff forecasts for a catchment in Switzerland, at lead times ranging from 1 to 240 h. The raw forecasts have been obtained using deterministic and ensemble forcing meteorological models with different forecast lead time ranges. First, BMA is applied based on mixtures of univariate normal distributions, subject to the assumption of independence between distinct lead times. Then, the independence assumption is relaxed in order to estimate multivariate runoff forecasts over the entire range of lead times simultaneously, based on a BMA version that uses multivariate normal distributions. Since river runoff is a highly skewed variable, Box-Cox transformations are applied in order to achieve approximate normality. Both univariate and multivariate BMA approaches are able to generate well-calibrated probabilistic forecasts that are considerably sharper than climatological forecasts. Additionally, multivariate BMA provides a promising approach for incorporating temporal dependencies into the postprocessed forecasts. Its major advantage over univariate BMA is an increase in reliability when the forecast system is changing due to model availability.
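As a hedged illustration of the kind of predictive distribution BMA produces, the sketch below evaluates a mixture-of-normals CDF in Box-Cox space for a handful of competing runoff forecasts. The weights, spread, and transformation parameter are assumed values chosen for the example; in an operational fit they would come from an EM estimation over a training period, as in the study.

```python
import numpy as np

def box_cox(y, lam=0.2):
    """Box-Cox transform used to bring skewed runoff closer to normality."""
    return (y ** lam - 1.0) / lam if lam != 0 else np.log(y)

def bma_predictive_cdf(q, member_forecasts, weights, sigma, lam=0.2):
    """P(runoff <= q) under a BMA mixture of normals in Box-Cox space.

    member_forecasts: forecasts from the individual models (raw runoff units)
    weights, sigma:   BMA weights and common spread, here assumed already estimated.
    """
    from math import erf, sqrt
    zq = box_cox(q, lam)
    cdf = 0.0
    for w, f in zip(weights, member_forecasts):
        mu = box_cox(f, lam)
        cdf += w * 0.5 * (1.0 + erf((zq - mu) / (sigma * sqrt(2.0))))
    return cdf

# Toy example: three competing runoff forecasts (m3/s) with assumed BMA weights and spread.
forecasts = [42.0, 55.0, 48.0]
weights = [0.5, 0.2, 0.3]
sigma = 0.35   # spread in Box-Cox space

for q in (40.0, 50.0, 60.0):
    print(f"P(runoff <= {q:.0f} m3/s) = {bma_predictive_cdf(q, forecasts, weights, sigma):.2f}")
```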
Improving inflow forecasting into hydropower reservoirs through a complementary modelling framework

NASA Astrophysics Data System (ADS)

Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.

2014-10-01

The accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. We present here a new approach for issuing hourly reservoir inflow forecasts that aims to improve on existing forecasting models that are in place operationally, without needing to modify the pre-existing approach, but instead formulating an additive or complementary model that is independent and captures the structure the existing model may be missing. Besides improving the forecast skill of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that entrain suitable information for reducing uncertainty in the decision-making processes of hydropower systems operation. The procedure presented comprises an error model added on top of an unalterable constant-parameter conceptual model, the models being demonstrated with reference to the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Season-based evaluations indicated that the improvement in inflow forecasts varies across seasons, and inflow forecasts in autumn and spring are less successful, with the 95% prediction interval bracketing less than 95% of the observations for lead times beyond 17 h.

Using NMME in Region-Specific Operational Seasonal Climate Forecasts

NASA Astrophysics Data System (ADS)

Gronewold, A.; Bolinger, R. A.; Fry, L. M.; Kompoltowicz, K.

2015-12-01

The National Oceanic and Atmospheric Administration's Climate Prediction Center (NOAA/CPC) provides access to a suite of real-time monthly climate forecasts that comprise the North American Multi-Model Ensemble (NMME) in an attempt to meet increasing demands for monthly to seasonal climate prediction. While the graphical map forecasts of the NMME are informative, there is a need to provide decision-makers with probabilistic forecasts specific to their region of interest. Here, we demonstrate the potential application of the NMME to address regional climate projection needs by developing new forecasts of temperature and precipitation for the North American Great Lakes, the largest system of lakes on Earth. Regional operational water budget forecasts rely on these outlooks to initiate monthly forecasts not only of the water budget, but of monthly lake water levels as well. More specifically, we present an alternative for improving existing operational protocols that currently involve a relatively time-consuming and subjective procedure based on interpreting the maps of the NMME. In addition, all forecasts are currently presented in the NMME in a probabilistic format, with equal weighting given to each member of the ensemble. In our new evolution of this product, we provide historical context for the forecasts by superimposing them (in an online graphical user interface) on the historical range of observations.
Implementation of this new tool has already led to noticeable advantages in regional water budget forecasting, and it has the potential to be transferred to other regional decision-making authorities as well.

Precursory swarms of long-period events at Redoubt Volcano (1989-1990), Alaska: Their origin and use as a forecasting tool

USGS Publications Warehouse

Chouet, B.A.; Page, R.A.; Stephens, C.D.; Lahr, J.C.; Power, J.A.

1994-01-01

During the eruption of Redoubt Volcano from December 1989 through April 1990, the Alaska Volcano Observatory issued advance warnings of several tephra eruptions based on changes in seismic activity related to the occurrence of precursory swarms of long-period (LP) seismic events (dominant period of about 0.5 s). The initial eruption on December 14 occurred after 23 years of quiescence and was heralded by a 23-hour swarm of LP events that ended abruptly with the eruption. After a series of vent-clearing explosions over the next few days, dome growth began on December 21. Another swarm, with LP events similar to those of the first, began on the 26th and ended in a major tephra eruption on January 2. Eruptions continued over the next two weeks and then ceased until February 15, when a large eruption initiated a long phase of repetitive dome-building and dome-destroying episodes that continued into April. Warnings were issued before the major events on December 14 and January 2, but as the eruptive sequence continued after January 2, the energy of the swarms decreased and forecasting became more difficult. A significant but less intense swarm preceded the February 15 eruption, which was not forecast. This eruption destroyed the only seismograph on the volcanic edifice and stymied forecasting until March 4, when the first of three new stations was installed within 3 km of the active vent. From March 4 to the end of the sequence on April 21, there were eight eruptions, six of which were preceded by detectable swarms of LP events. Although weak, these swarms provided the basis for warnings issued before the eruptions on March 23 and April 6. The initial swarm on December 13 had the following features: (1) short duration (23 hours); (2) a rapidly accelerating rate of seismic energy release over the first 18 hours of the swarm, followed by a decline of activity during the 5 hours preceding the eruption; (3) a magnitude range from -0.4 to 1.6; (4) nearly identical LP signatures with a dominant period near 0.5 s; (5) dilatational first motions everywhere; and (6) a stationary source location at a depth of 1.4 km beneath the crater. This occurrence of long-period events suggests a model involving the interaction of magma with groundwater, in which magmatic gases, steam and water drive a fixed conduit at a stationary point throughout the swarm. The initiation of that sequence of events is analogous to the failure of a pressure-relief valve connecting a lower, supercharged magma-dominated reservoir to a shallow hydrothermal system.
A three-dimensional model of a vibrating fluid-filled crack recently developed by Chouet is found to be compatible with the seismic data and yields the following parameters for the LP source: crack length, 280-380 m; crack width, 140-190 m; crack thickness, 0.05-0.20 m; crack stiffness, 100-200; sound speed of the fluid, 0.8-1.3 km/s; compressional-wave speed of the rock, 5.1 km/s; density ratio of fluid to rock, ~0.4; and ratio of bulk modulus of fluid to rigidity of rock, 0.03-0.07. The fluid-filled crack is excited intermittently by an impulsive pressure drop that varies in magnitude within the range of 0.4 to 40 bar. Such a disturbance appears to be consistent with a triggering mechanism associated with choked flow conditions in the crack. © 1994.

Probabilistic Forecasting of Arctic Sea Ice Extent

NASA Astrophysics Data System (ADS)

Slater, A. G.

2013-12-01

Sea ice in the Arctic is changing rapidly. Most noticeable has been the series of record, or near-record, annual minimums in sea ice extent in the past six years. The changing regime of sea ice has prompted much interest in seasonal prediction of sea ice extent, particularly as opportunities for Arctic shipping and resource exploration or extraction increase. This study presents a daily sea ice extent probabilistic forecast method with a 50-day lead time. A base projection is made from historical data, and near-real-time sea ice concentration is assimilated on the issue date of the forecast. When considering the September mean ice extent for the period 1995-2012, the performance of the 50-day lead time forecast is very good: correlation = 0.94, bias = 0.14 × 10^6 km^2 and RMSE = 0.36 × 10^6 km^2. Forecasts for the daily minimum contain equal skill levels. The system is highly competitive with any of the SEARCH Sea Ice Outlook estimates. The primary finding of this study is that large amounts of forecast skill can be gained from knowledge of the initial conditions of concentration (perhaps more than previously thought). Given the simplicity of the forecast model, improved skill should be available from system refinement and with suitable proxies for large-scale atmosphere and ocean circulation.

Bayesian Processor of Output for Probabilistic Quantitative Precipitation Forecasting

NASA Astrophysics Data System (ADS)

Krzysztofowicz, R.; Maranzano, C. J.

2006-05-01

The Bayesian Processor of Output (BPO) is a new, theoretically based technique for probabilistic forecasting of weather variates. It processes output from a numerical weather prediction (NWP) model and optimally fuses it with climatic data in order to quantify uncertainty about a predictand. The BPO is being tested by producing Probabilistic Quantitative Precipitation Forecasts (PQPFs) for a set of climatically diverse stations in the contiguous U.S.
For each station, the PQPFs are produced for the same 6-h, 12-h, and 24-h periods up to 84 h ahead for which operational forecasts are produced by the AVN-MOS (the Model Output Statistics technique applied to output fields from the Global Spectral Model run under the code name AVN). The inputs into the BPO are estimated as follows. The prior distribution is estimated from a (relatively long) climatic sample of the predictand; this sample is retrieved from the archives of the National Climatic Data Center. The family of likelihood functions is estimated from a (relatively short) joint sample of the predictor vector and the predictand; this sample is retrieved from the same archive that the Meteorological Development Laboratory of the National Weather Service utilized to develop the AVN-MOS system. This talk gives a tutorial introduction to the principles and procedures behind the BPO, and highlights some results from the testing: a numerical example of the estimation of the BPO, and a comparative verification of the BPO forecasts and the MOS forecasts. It concludes with a list of demonstrated attributes of the BPO (vis-à-vis the MOS): more parsimonious definitions of predictors, more efficient extraction of predictive information, better representation of the distribution function of the predictand, and equal or better performance (in terms of calibration and informativeness).
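The prior-plus-likelihood fusion described above can be illustrated, under a strong simplifying assumption of Gaussian variates, with a conjugate update: the climatic sample supplies the prior on the predictand, and a regression of model output on the predictand supplies the likelihood. The linear-Gaussian form and all numbers below are assumptions for illustration only, not the operational BPO formulation (which works with transformed, non-Gaussian variates).

```python
import numpy as np

def bpo_gaussian_update(x_model, prior_mean, prior_var, a, b, tau2):
    """Posterior of the predictand w given model output x, in a Gaussian caricature of the BPO.

    Prior (from climatology):       w ~ N(prior_mean, prior_var)
    Likelihood (from joint sample): x | w ~ N(a*w + b, tau2)
    """
    post_var = 1.0 / (1.0 / prior_var + a ** 2 / tau2)
    post_mean = post_var * (prior_mean / prior_var + a * (x_model - b) / tau2)
    return post_mean, post_var

# Assumed numbers for illustration (transformed precipitation amounts, arbitrary units):
prior_mean, prior_var = 0.0, 1.0      # climatic prior
a, b, tau2 = 0.8, 0.1, 0.5            # regression of model output on the predictand

m, v = bpo_gaussian_update(x_model=1.5, prior_mean=prior_mean, prior_var=prior_var, a=a, b=b, tau2=tau2)
print(f"posterior mean {m:.2f}, posterior std {np.sqrt(v):.2f}")
# With an uninformative model (tau2 very large) the posterior collapses back to the climatic prior,
# which is the sense in which the BPO "falls back" on climatology when the NWP output carries no information.
```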
Assessment of SWE data assimilation for ensemble streamflow predictions

NASA Astrophysics Data System (ADS)

Franz, Kristie J.; Hogue, Terri S.; Barik, Muhammad; He, Minxue

2014-11-01

An assessment of data assimilation (DA) for Ensemble Streamflow Prediction (ESP) is undertaken using seasonal water supply hindcasting in the North Fork of the American River Basin (NFARB) and the National Weather Service (NWS) hydrologic forecast models. Two parameter sets, one from the California Nevada River Forecast Center (RFC) and one from the Differential Evolution Adaptive Metropolis (DREAM) algorithm, are tested. For each parameter set, hindcasts are generated using initial conditions derived with and without the inclusion of a DA scheme that integrates snow water equivalent (SWE) observations. The DREAM-DA scenario uses an Integrated Uncertainty and Ensemble-based data Assimilation (ICEA) framework that also considers model and parameter uncertainty. Hindcasts are evaluated using deterministic and probabilistic forecast verification metrics. In general, the impact of DA on the skill of the seasonal water supply predictions is mixed. For deterministic (ensemble mean) predictions, the percent bias (PBias) is improved with integration of the DA. DREAM-DA and the RFC-DA have the lowest biases, and the RFC-DA has the lowest root mean squared error (RMSE). However, the RFC and DREAM-DA have similar RMSE scores. For the probabilistic predictions, the RFC and DREAM have the highest Continuous Ranked Probability Skill Scores (CRPSS), and the RFC has the best discrimination for low flows. Reliability results are similar between the non-DA and DA tests, and the DREAM and DREAM-DA have better reliability than the RFC and RFC-DA for forecast dates of February 1 and later. Despite producing improved streamflow simulations in previous studies, the hindcast analysis suggests that the DA method tested may not result in obvious improvements in streamflow forecasts. We advocate that the integration of hindcasting and probabilistic metrics provides more rigorous insight into model performance for forecasting applications, such as in this study.

A multidisciplinary system for monitoring and forecasting Etna volcanic plumes

NASA Astrophysics Data System (ADS)

Coltelli, Mauro; Prestifilippo, Michele; Spata, Gaetano; Scollo, Simona; Andronico, Daniele

2010-05-01

One of the most active volcanoes in the world is Mt. Etna, in Italy, characterized by frequent explosive activity from the central craters and from fractures opened along the volcano flanks, which, during the last years, caused damage to aviation and forced the closure of the Catania International Airport. To give precise warnings to the aviation authorities and air traffic controllers and to assist the work of VAACs, a novel system for monitoring and forecasting Etna volcanic plumes was developed at the Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania, the managing institution for the surveillance of Etna volcano. Monitoring is carried out using multispectral infrared measurements from the Spin Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation geosynchronous satellite, able to track the volcanic plume with high time resolution; visual and thermal cameras used to monitor the explosive activity; three continuous-wave X-band disdrometers, which detect ash dispersal and fallout; sounding balloons used to evaluate the atmospheric fields; and, finally, field data collected after the end of the eruptive event, needed to extrapolate important features of the explosive activity. Forecasting is carried out daily using automatic procedures which download weather forecast data obtained from meteorological mesoscale models of the Italian Air Force national Meteorological Office and of the hydrometeorological service of ARPA-SIM; run four different tephra dispersal models using input parameters obtained from the analysis of deposits collected within a few hours of eruptive events similar to the 22 July 1998, 21-24 July 2001 and 2002-03 Etna eruptions; plot hazard maps on the ground and in the air; and finally publish them on a website dedicated to the Italian Civil Protection. The system has already been tested successfully during several explosive events occurring at Etna in 2006, 2007 and 2008. These events produced eruption columns up to several kilometers above sea level and, on the basis of parameters such as mass eruption rate and total grain-size distribution, showed different explosive styles. The monitoring and forecasting system is being further developed through the installation of new instruments able to detect different features of the volcanic plumes (e.g. the dispersal and sedimentation processes) in order to reduce the uncertainty of the input parameters used in the modeling. This is crucial for reliable forecasting.
We show that multidisciplinary approaches can provide useful information on the presence of volcanic ash and consequently help prevent damage and airport disruptions.

Seismic Forecasting of Eruptions at Dormant Stratovolcanoes

NASA Astrophysics Data System (ADS)

White, R. A.

2015-12-01

Seismic monitoring data provide important constraints for tracking magmatic ascent and eruption. Based on direct experience with over 25, and review of over 10 additional, eruption sequences at 24 volcanoes, we have identified 4 phases of precursory seismicity. (1) Deep (>20 km) low-frequency (DLF) earthquakes occur near the base of the crust as magma rises toward crustal reservoirs. This seismicity is the most difficult to observe, owing to generally small magnitudes (M<2.5) and the significant depth. (2) Distal volcano-tectonic (DVT) earthquakes occur on tectonic faults at distances of 2 to 30+ km laterally from (not beneath) the eventual eruption site as magma intrudes into and rises out of upper crustal reservoirs to depths of 2-3 km. A survey of 111 eruptions of 83 previously dormant volcanoes (including all eruptions of VEI >4 since 1955) shows they were all preceded by significant DVT seismicity, usually felt. This DVT seismicity is easily observed owing to magnitudes generally reaching M>3.5. The cumulative DVT energy correlates with the intruding magma volume. (3) Low-frequency (LF) earthquakes, LF tremor and contained explosions occur as magma interacts with the shallow hydrothermal system (<2 km depth), while the distal seismicity dies off. (4) Shortly after this, repetitive self-similar proximal seismicity may occur and may dominate the seismic records as magma rises to the surface. We present some examples of this seismic progression to demonstrate that data from a single short-period vertical station are often sufficient to forecast eruption onsets.

Ash cloud aviation advisories

DOE Office of Scientific and Technical Information (OSTI.GOV)

Sullivan, T.J.; Ellis, J.S.; Schalk, W.W.

1992-06-25

During the recent (12-22 June 1991) Mount Pinatubo volcano eruptions, the US Air Force Global Weather Central (AFGWC) requested the assistance of the US Department of Energy's Atmospheric Release Advisory Capability (ARAC) in creating volcanic ash cloud aviation advisories for the region of the Philippine Islands. Through application of its three-dimensional material transport and diffusion models, using AFGWC meteorological analysis and forecast wind fields, ARAC developed extensive analyses and 12-hourly forecast ash cloud position advisories extending to 48 hours for a period of five days. The advisories consisted of "relative" ash cloud concentrations in ten layers (surface-5,000 feet, 5,000-10,000 feet, and every 10,000 feet to 90,000 feet).
The ash was represented as a log-normal size distribution of 10-200 μm diameter solid particles. Size-dependent "ashfall" was simulated over time as the eruption clouds dispersed. Except for an internal experimental attempt to model one of the Mount Redoubt, Alaska, eruptions (12/89), ARAC had no prior experience in modeling volcanic eruption ash hazards. For the cataclysmic eruption of 15-16 June, the complex three-dimensional atmospheric structure of the region produced dramatically divergent ash cloud patterns. The large eruptions (>7-10 km) produced ash plume clouds with strong westward transport over the South China Sea, Southeast Asia, India and beyond. The low-level eruptions (<7 km) and quasi-steady-state venting produced a plume which generally dispersed to the north and east throughout the support period. Modeling the sequence of eruptions presented a unique challenge. Although the initial approach proved viable, further refinement is necessary and possible. A distinct need exists to quantify eruptions consistently such that "relative" ash concentrations relate to specific aviation hazard categories.

Recurrent patterns in fluid geochemistry data prior to phreatic eruptions

NASA Astrophysics Data System (ADS)

Rouwet, Dmitri; Sandri, Laura; Todesco, Micol; Tonini, Roberto; Pecoraino, Giovannella; Diliberto, Iole Serena

2016-04-01

Not all volcanic eruptions are magma-driven: the sudden evaporation and expansion of heated groundwater may cause phreatic eruptions, where magma involvement is absent or negligible. Active crater lakes top some of the volcanoes prone to phreatic activity. This kind of eruption may occur suddenly and without clear warning: on September 27, 2014, a phreatic eruption of Ontake, Japan, occurred without timely precursors, killing 57 tourists near the volcano summit. Phreatic eruptions can thus be as fatal as higher-VEI events, due to the lack of recognised precursory signals and because of their explosive and violent nature. In this study, we tackle the challenge of recognising precursors to phreatic eruptions by analysing the records of two "phreatically" active volcanoes in Costa Rica, i.e. Poás and Turrialba, respectively with and without a crater lake. These volcanoes cover a wide range of time scales in eruptive behaviour, possibly culminating in magmatic activity, and have a long-term multi-parameter dataset mostly describing fluid geochemistry. Such a dataset is suitable for analysis with objective pattern recognition techniques, in search of recurrent schemes. The aim is to verify the existence and nature of potential precursory patterns, which will improve our understanding of phreatic events and allow the assessment of the associated hazard at other volcanoes, such as Campi Flegrei or Vulcano, in Italy. Quantitative forecasting of phreatic activity will be performed with BET_UNREST, a Bayesian Event Tree tool recently developed within the framework of the FP7 EU VUELCO project. The study will combine the analysis of fluid geochemistry data with pattern recognition and phreatic eruption forecasting on medium and short terms. The study will also provide interesting hints on the features that promote or hinder phreatic activity in volcanoes that host well-developed hydrothermal circulation.
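A Bayesian event tree of the BET_UNREST kind chains conditional probabilities from node to node (e.g. unrest, its hydrothermal or magmatic nature, eruption) and carries epistemic uncertainty at each node. The sketch below is a deliberately reduced, generic three-node illustration with made-up Beta parameters; it is not the BET_UNREST node structure or its actual priors.

```python
import numpy as np

rng = np.random.default_rng(3)

# Conditional probabilities at each node of a simplified event tree, expressed as Beta
# distributions to carry epistemic uncertainty (alpha/beta values are illustrative only).
nodes = {
    "unrest":                   (8, 2),   # P(unrest in the next month)
    "hydrothermal_involvement": (6, 4),   # P(unrest is hydrothermally driven | unrest)
    "phreatic_eruption":        (2, 8),   # P(phreatic eruption | hydrothermal unrest)
}

n = 10000
prob = np.ones(n)
for name, (alpha, beta) in nodes.items():
    prob *= rng.beta(alpha, beta, size=n)   # multiply conditional probabilities along the branch

print("P(phreatic eruption next month): mean %.3f, 90%% credible interval [%.3f, %.3f]"
      % (prob.mean(), np.quantile(prob, 0.05), np.quantile(prob, 0.95)))
```

In an operational setting the Beta parameters at each node would be updated as monitoring or fluid geochemistry anomalies cross pre-defined thresholds, which is how the pattern recognition feeds the forecast.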
Medium Range Flood Forecasting for Agriculture Damage Reduction

NASA Astrophysics Data System (ADS)

Fakhruddin, S. H. M.

2014-12-01

Early warning is a key element of disaster risk reduction. In recent decades, major advancements have been made in medium-range and seasonal flood forecasting. This progress provides a great opportunity to reduce agricultural damage and improve advisories for early action and planning for flood hazards. This approach can facilitate proactive rather than reactive management of the adverse consequences of floods. In the agricultural sector, for instance, farmers can take a diversity of options such as changing cropping patterns, applying fertilizer, irrigating and changing planting timing. An experimental medium-range (1-10 day) flood forecasting model has been developed for Bangladesh and Thailand. It provides 51 sets of discharge ensemble forecasts for 1-10 days with significant persistence and high certainty. This type of forecast could assist farmers and other stakeholders with differentiated preparedness activities. These probabilistic ensemble flood forecasts have been customized based on user needs for community-level application focused on the agricultural system. The vulnerabilities of the agricultural system were calculated based on exposure, sensitivity and adaptive capacity. Indicators for risk and vulnerability assessment were developed through community consultations. The forecast lead-time requirements, user needs, impacts and management options for crops were identified through focus group discussions, informal interviews and community surveys. This paper illustrates potential applications of such ensembles for probabilistic medium-range flood forecasts in a way that is not commonly practiced globally today.
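One simple way such an ensemble supports early action is to turn the 51 members into a per-lead-day probability of exceeding a damage threshold, as sketched below; the discharge values, threshold, and action trigger are illustrative assumptions, not figures from the Bangladesh or Thailand systems.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical 51-member, 10-day ensemble discharge forecast (m3/s); rows = members, cols = lead days.
ensemble = rng.lognormal(mean=np.log(800), sigma=0.25, size=(51, 10)) * np.linspace(1.0, 1.6, 10)

danger_level = 1200.0   # assumed agricultural-damage threshold

# Probability, per lead day, that discharge exceeds the danger level = fraction of members above it.
p_exceed = (ensemble > danger_level).mean(axis=0)

for day, p in enumerate(p_exceed, start=1):
    flag = "  <-- consider early action" if p >= 0.4 else ""
    print(f"day {day:2d}: P(Q > {danger_level:.0f} m3/s) = {p:.2f}{flag}")
```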
National Centers for Environmental Prediction

Science.gov Websites

ENSEMBLE PRODUCTS & DATA SOURCES. Probabilistic Forecasts of Quantitative Precipitation from the NCEP. Predictability Research with Indian Monsoon Examples - PDF - 28 Mar 2005. North American Ensemble Forecast System. QUANTITATIVE PRECIPITATION (PQPF): in these charts, the probability that 24-hour precipitation amounts over a

Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

NASA Technical Reports Server (NTRS)

Duda, David P.; Minnis, Patrick

2009-01-01

Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC), as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence.
The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
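As a hedged sketch of the general technique (not the paper's actual predictor set, coefficients, or training data), the code below fits a two-predictor logistic regression to synthetic humidity and temperature data and converts it into a contrail-occurrence probability.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic training sample standing in for the meteorological predictors used in such studies
# (e.g. upper-tropospheric relative humidity with respect to ice, and ambient temperature).
n = 2000
rhi = rng.uniform(40.0, 140.0, n)                      # relative humidity w.r.t. ice (%)
temp = rng.uniform(-70.0, -35.0, n)                    # temperature (deg C)
logit_true = 0.08 * (rhi - 100.0) - 0.10 * (temp + 50.0)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_true))).astype(float)  # 1 = persistent contrail observed

# Standardise predictors so a plain gradient ascent on the log-likelihood is well behaved.
feats = np.column_stack([rhi, temp])
mu, sd = feats.mean(axis=0), feats.std(axis=0)
X = np.column_stack([np.ones(n), (feats - mu) / sd])

w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.5 * X.T @ (y - p) / n          # average-gradient ascent step

def contrail_probability(rhi_value, temp_value):
    z = np.concatenate(([1.0], (np.array([rhi_value, temp_value]) - mu) / sd)) @ w
    return 1.0 / (1.0 + np.exp(-z))

print("P(persistent contrail) at RHi=120%, T=-55C:", round(contrail_probability(120.0, -55.0), 2))
print("P(persistent contrail) at RHi= 70%, T=-40C:", round(contrail_probability(70.0, -40.0), 2))
```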
Verification of Space Weather Forecasts using Terrestrial Weather Approaches

NASA Astrophysics Data System (ADS)

Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

2015-12-01

The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times and assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modeling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the ranked probability skill score, and comparison of forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.

Optical properties of volcanic ash: improving remote sensing observations

NASA Astrophysics Data System (ADS)

Whelley, Patrick; Colarco, Peter; Aquila, Valentina; Krotkov, Nickolay; Bleacher, Jake; Garry, Brent; Young, Kelsey; Rocha Lima, Adriana; Martins, Vanderlei; Carn, Simon

2016-04-01

Many times each year, explosive volcanic eruptions loft ash into the atmosphere. Global travel and trade rely on aircraft vulnerable to encounters with airborne ash. Volcanic ash advisory centers (VAACs) rely on dispersion forecasts and satellite data to issue timely warnings. To improve ash forecasts, model developers and satellite data providers need realistic information about volcanic ash microphysical and optical properties. In anticipation of future large eruptions, we can study smaller events to improve our remote sensing and modeling skills, so that when the next Pinatubo 1991 or larger eruption occurs, ash can confidently be tracked in a quantitative way. At distances >100 km from their sources, drifting ash plumes, often above meteorological clouds, are not easily detected from conventional remote sensing platforms, let alone used to derive quantitative characteristics such as mass density. Quantitative interpretation of these observations depends on a priori knowledge of the spectral optical properties of the ash at UV (>0.3 μm) and TIR wavelengths (>10 μm). Incorrect assumptions about the optical properties result in large errors in the inferred column mass loading and size distribution, which misguide operational ash forecasts. Similarly, simulating ash properties in global climate models also requires some knowledge of optical properties to improve aerosol speciation.

Assessing the impact of a future volcanic eruption on decadal predictions

NASA Astrophysics Data System (ADS)

Illing, Sebastian; Kadow, Christopher; Pohlmann, Holger; Timmreck, Claudia

2018-06-01

The likelihood of a large volcanic eruption in the future provides the largest uncertainty concerning the evolution of the climate system on the timescale of a few years, but also an excellent opportunity to learn about the behavior of the climate system, and our models thereof. So the following question emerges: how predictable is the response of the climate system to future eruptions? By this we mean to what extent will the volcanic perturbation affect decadal climate predictions, and how does the pre-eruption climate state influence the impact of the volcanic signal on the predictions?
    To address these questions, we performed decadal forecasts with the MiKlip prediction system, which is based on the MPI-ESM, in the low-resolution configuration for the initialization years 2012 and 2014, which differ in the Pacific Decadal Oscillation (PDO) and North Atlantic Oscillation (NAO) phase. Each forecast contains an artificial Pinatubo-like eruption starting in June of the first prediction year and consists of 10 ensemble members. For the construction of the aerosol radiative forcing, we used the global aerosol model ECHAM5-HAM in a version adapted for volcanic eruptions. We investigate the response of different climate variables, including near-surface air temperature, precipitation, frost days, and sea ice area fraction. Our results show that the average global cooling response over 4 years of about 0.2 K and the precipitation decrease of about 0.025 mm day-1 are relatively robust throughout the different experiments and seemingly independent of the initialization state. However, on a regional scale, we find substantial differences between the initializations. The cooling effect in the North Atlantic and Europe lasts longer and the Arctic sea ice increase is stronger in the simulations initialized in 2014. In contrast, the forecast initialized in 2012 with a negative PDO shows a prolonged cooling in the North Pacific basin.

  245. Advances in volcano monitoring and risk reduction in Latin America

    NASA Astrophysics Data System (ADS)

    McCausland, W. A.; White, R. A.; Lockhart, A. B.; Marso, J. N.; Volcano Disaster Assistance Program; Latin American Volcano Observatories

    2014-12-01

    We describe results of cooperative work that advanced volcanic monitoring and risk reduction. The USGS-USAID Volcano Disaster Assistance Program (VDAP) was initiated in 1986 after disastrous lahars during the 1985 eruption of Nevado del Ruiz dramatized the need to advance international capabilities in volcanic monitoring, eruption forecasting and hazard communication. For the past 28 years, VDAP has worked with our partners to improve observatories, strengthen monitoring networks, and train observatory personnel. We highlight a few of the many accomplishments by Latin American volcano observatories. Advances in monitoring, assessment and communication, and lessons learned from the lahars of the 1985 Nevado del Ruiz eruption and the 1994 Paez earthquake enabled the Servicio Geológico Colombiano to issue timely, life-saving warnings for 3 large syn-eruptive lahars at Nevado del Huila in 2007 and 2008. In Chile, the 2008 eruption of Chaitén prompted SERNAGEOMIN to complete a national volcanic vulnerability assessment that led to a major increase in volcano monitoring. Throughout Latin America improved seismic networks now telemeter data to observatories where the decades-long background rates and types of seismicity have been characterized at over 50 volcanoes. Standardization of the Earthworm data acquisition system has enabled data sharing across international boundaries, of paramount importance during both regional tectonic earthquakes and during volcanic crises when vulnerabilities cross international borders.
    Sharing of seismic forecasting methods led to the formation of the international organization of Latin American Volcano Seismologists (LAVAS). LAVAS courses and other VDAP training sessions have led to international sharing of methods to forecast eruptions through recognition of precursors and to reduce vulnerabilities from all volcano hazards (flows, falls, surges, gas) through hazard assessment, mapping and modeling. Satellite remote sensing data-sharing facilitates cross-border identification and warnings of ash plumes for aviation. Overall, long-term strategies of data collection and experience-sharing have helped Latin American observatories improve their monitoring and create informed communities cognizant of vulnerabilities inherent in living near volcanoes.

  246. Super Ensemble-based Aviation Turbulence Guidance (SEATG) for Air Traffic Management (ATM)

    NASA Astrophysics Data System (ADS)

    Kim, Jung-Hoon; Chan, William; Sridhar, Banavar; Sharman, Robert

    2014-05-01

    Super Ensemble-based Aviation Turbulence Guidance (SEATG), an ensemble of ten turbulence metrics derived from time-lagged ensemble members of weather forecast data, is developed using the Weather Research and Forecasting (WRF) model and in-situ eddy dissipation rate (EDR) observations from instrumented commercial aircraft over the contiguous United States. SEATG is a sequence of five procedures: weather modeling, calculating turbulence metrics, mapping to the EDR scale, evaluating metrics, and producing the final SEATG forecast. It uses a methodology similar to the operational Graphical Turbulence Guidance (GTG), with three major improvements. First, SEATG uses a higher-resolution (3-km) WRF model to capture cloud-resolving-scale phenomena. Second, SEATG computes turbulence metrics for multiple forecasts that are combined at the same valid time, resulting in a time-lagged ensemble of multiple turbulence metrics. Third, SEATG provides both deterministic and probabilistic turbulence forecasts to take into account weather uncertainties and user demands. It is found that the SEATG forecasts match well with observed radar reflectivity along a surface front as well as with convectively induced turbulence outside the clouds on 7-8 Sep 2012. Overall, the performance of deterministic SEATG against the observed EDR data during this period is superior to any single turbulence metric. Finally, probabilistic SEATG is used as an example application of turbulence forecasting for air-traffic management. In this study, a simple Wind-Optimal Route (WOR) passing through the potential areas of probabilistic SEATG and a Lateral Turbulence Avoidance Route (LTAR) taking the SEATG into account are calculated at z = 35,000 ft (z = 12 km) from Los Angeles to John F. Kennedy international airports. As a result, the WOR takes a total of 239 minutes, including 16 minutes in SEATG areas with 40% moderate-turbulence potential, while the LTAR takes a total of 252 minutes of travel time, with about 5% additional fuel consumed to entirely avoid the moderate SEATG regions.
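    The deterministic/probabilistic pairing described above can be sketched as a simple ensemble-to-probability step: the probabilistic product is the fraction of ensemble members whose EDR-scaled metric exceeds a turbulence threshold at each grid point. The Python lines below are only a schematic of that step; the random fields standing in for the ten turbulence metrics, the grid size and the threshold value are all invented and are not the SEATG configuration.

        import numpy as np

        rng = np.random.default_rng(0)
        # Stand-in for ten EDR-scaled turbulence metrics on a small grid: (member, ny, nx).
        ensemble = rng.gamma(shape=2.0, scale=0.12, size=(10, 50, 60))

        moderate_edr = 0.22   # illustrative "moderate turbulence" threshold, not the SEATG value

        deterministic = ensemble.mean(axis=0)                  # deterministic product
        probability = (ensemble > moderate_edr).mean(axis=0)   # probabilistic product

        print("maximum ensemble-mean EDR:", round(float(deterministic.max()), 3))
        print("grid points with >= 40% chance of moderate turbulence:",
              int((probability >= 0.4).sum()))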
  247. Probabilistic constraints from existing and future radar imaging on volcanic activity on Venus

    NASA Astrophysics Data System (ADS)

    Lorenz, Ralph D.

    2015-11-01

    We explore the quantitative limits that may be placed on Venus' present-day volcanic activity by radar imaging of surface landforms. The apparent nondetection of new lava flows in the areas observed twice by Magellan suggests that there is a ~60% chance that the eruption rate is ~1 km3/yr or less, using the eruption history and area/volume flow geometry of terrestrial volcanoes (Etna, Mauna Loa and Merapi) as a guide. However, if the detection probability of an individual flow is low (e.g. ~10%) due to poor resolution or quality and unmodeled viewing geometry effects, the constraint (<10 km3/yr) is not useful. Imaging at Magellan resolution or better of only ~10% of the surface area of Venus on a new mission (30 years after Magellan) would yield a better than 99% chance of detecting a new lava flow, even if the volcanic activity is at the low end of predictions (~0.01 km3/yr) and is expressed through a single volcano with a stochastic eruption history. Closer re-examination of Magellan data may be worthwhile, both to search for new features and to establish formal (location-dependent) limits on activity against which data from future missions can be tested. While Magellan-to-future and future-to-future comparisons should offer much lower detection thresholds for erupted volumes, a probabilistic approach will be required to properly understand the implications.
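    The style of probability argument used in this abstract can be reproduced in a few lines. Assuming, for illustration, that flow-producing eruptions follow a Poisson process with rate lambda, that a fraction f of the surface is re-imaged, and that each new flow in the imaged area is recognised with probability p, the chance of detecting at least one flow over an interval T is 1 - exp(-lambda*T*f*p). The Python numbers below are placeholders chosen only to echo the orders of magnitude discussed above; they are not the paper's actual model or values.

        import math

        def p_detect(flows_per_yr, years, area_fraction, p_flow):
            """P(at least one new flow detected), assuming Poisson-distributed flows and
            independent per-flow detection within the re-imaged area (illustrative only)."""
            expected_detections = flows_per_yr * years * area_fraction * p_flow
            return 1.0 - math.exp(-expected_detections)

        # Placeholder numbers: ~1 mappable flow per year, 30 yr between missions,
        # 10% of the surface re-imaged, 90% per-flow detection probability.
        print(round(p_detect(flows_per_yr=1.0, years=30, area_fraction=0.10, p_flow=0.9), 3))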
  248. Uncertainty in weather and climate prediction

    PubMed Central

    Slingo, Julia; Palmer, Tim

    2011-01-01

    Following Lorenz's seminal work on chaos theory in the 1960s, probabilistic approaches to prediction have come to dominate the science of weather and climate forecasting. This paper gives a perspective on Lorenz's work and how it has influenced the ways in which we seek to represent uncertainty in forecasts on all lead times from hours to decades. It looks at how model uncertainty has been represented in probabilistic prediction systems and considers the challenges posed by a changing climate. Finally, the paper considers how the uncertainty in projections of climate change can be addressed to deliver more reliable and confident assessments that support decision-making on adaptation and mitigation. PMID:22042896

  249. Improving real-time inflow forecasting into hydropower reservoirs through a complementary modelling framework

    NASA Astrophysics Data System (ADS)

    Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.

    2015-08-01

    Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. A complementary modelling framework presents an approach for improving real-time forecasting without needing to modify the pre-existing forecasting model, but instead formulating an independent additive or complementary model that captures the structure the existing operational model may be missing. We present here the application of this principle for issuing improved hourly inflow forecasts into hydropower reservoirs over extended lead times, with the parameter estimation procedure reformulated to deal with bias, persistence and heteroscedasticity. The procedure presented comprises an error model added on top of an unalterable constant-parameter conceptual model. This procedure is applied in the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Besides improving the forecast skill of the operational model, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that contain suitable information for reducing uncertainty in decision-making processes in hydropower systems operation. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Evaluation of the percentage of observations bracketed in the forecasted 95% confidence interval indicated that the degree of success in containing 95% of the observations varies across seasons and hydrologic years.

  250. Improved Weather and Power Forecasts for Energy Operations - the German Research Project EWeLiNE

    NASA Astrophysics Data System (ADS)

    Lundgren, Kristina; Siefert, Malte; Hagedorn, Renate; Majewski, Detlev

    2014-05-01

    The German energy system is going through a fundamental change. Based on the energy plans of the German federal government, the share of electrical power production from renewables should increase to 35% by 2020. This means that, in the near future, renewable energies will at times provide a major part of Germany's power production. Operating a power supply system with a large share of weather-dependent power sources in a secure way requires improved power forecasts.
    One of the most promising strategies to improve the existing wind power and PV power forecasts is to optimize the underlying weather forecasts and to enhance the collaboration between the meteorology and energy sectors. Deutscher Wetterdienst addresses these challenges in collaboration with Fraunhofer IWES within the research project EWeLiNE. The overarching goal of the project is to improve the wind and PV power forecasts by combining improved power forecast models and optimized weather forecasts. During the project, the numerical weather prediction models COSMO-DE and COSMO-DE-EPS (Ensemble Prediction System) of Deutscher Wetterdienst will be optimized towards improved wind power and PV forecasts. For instance, it will be investigated whether the assimilation of new types of data, e.g. power production data, can lead to improved weather forecasts. With regard to the probabilistic forecasts, the focus is on the generation of ensembles and ensemble calibration. One important aspect of the project is to integrate the probabilistic information into decision-making processes by developing user-specified products. In this paper we give an overview of the project and present first results.

  251. Applications of the PUFF model to forecasts of volcanic clouds dispersal from Etna and Vesuvio

    NASA Astrophysics Data System (ADS)

    Daniele, P.; Lirer, L.; Petrosino, P.; Spinelli, N.; Peterson, R.

    2009-05-01

    PUFF is a numerical volcanic ash tracking model developed to simulate the behaviour of ash clouds in the atmosphere. The model uses wind field data provided by meteorological models and adds dispersion and sedimentation physics to predict the evolution of the cloud once it reaches thermodynamic equilibrium with the atmosphere. The software is intended for use in emergency response situations during an eruption to quickly forecast the position and trajectory of the ash cloud in the near (~1-72 h) future. In this paper, we describe the first application of the PUFF model to forecasting volcanic ash dispersion from the Etna and Vesuvio volcanoes. We simulated the daily occurrence of an eruptive event at Etna utilizing ash cloud parameters describing the paroxysm of 22nd July 1998 and wind field data for the 1st September 2005-31st December 2005 time span from the Global Forecast System (GFS) model at the approximate location of the Etna volcano (38N 15E). The results show that volcanic ash particles are dispersed in a range of directions in response to the changing wind field at various altitudes and that the ash clouds are mainly dispersed toward the east and southeast, although the exact trajectory is highly variable and can change within a few hours. We tested the sensitivity of the model to the mean particle grain size and found that an increased concentration of ash particles in the atmosphere results when the mean grain size is decreased. Similarly, a dramatic variation in dispersion results when the logarithmic standard deviation of the particle-size distribution is changed. Additionally, we simulated the occurrence of an eruptive event at both Etna and Vesuvio, using the same parameters describing the initial volcanic plume, and wind field data recorded for 1st September 2005, at approximately 38N 15E for Etna and 41N 14E for Vesuvio. The comparison of the two simulations indicates that identical eruptions occurring at the same time at the two volcanic centres display significantly different dispersal axes as a consequence of the different local wind fields acting at the respective eruptive vents. At Vesuvio, a plinian eruptive event with the dynamical parameters of the 79 A.D. eruption was simulated daily for one year, from 1st July 2005 to 30th June 2006. The statistical processing of the results points out that, although in most cases the ash cloud dispersal encompasses many different areas, an easterly-southeasterly direction is generally preferred. Our results highlight the significant role of wind field trends in influencing the distribution of ash particles from eruptive columns and show that the dynamical parameters that most influence the variability of plume dispersal are the duration of the eruption and the maximum column height. Finally, the possible use of cloud simulations for refining hazard maps of areas exposed to volcanic ash dispersal is proposed.
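    PUFF-type ash tracking is, at heart, a Lagrangian particle scheme: each particle is advected by the ambient wind, spread by a random-walk dispersion term, and lowered by a size-dependent settling velocity. The toy Python loop below shows only that structure; the uniform wind, dispersion coefficient and settling speeds are invented and bear no relation to the GFS-driven Etna and Vesuvio simulations described above.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 2000
        x = np.zeros(n)                         # downwind distance (km)
        y = np.zeros(n)                         # crosswind distance (km)
        z = rng.uniform(4.0, 9.0, n)            # particle altitudes in the initial column (km)
        settling = rng.uniform(0.001, 0.01, n)  # km per hour, faster for coarser grains

        u, v = 8.0, 3.0                         # uniform wind components (km/h), stand-in for GFS winds
        diffusion = 1.5                         # random-walk spread per step (km)

        for _ in range(72):                     # hourly steps over 72 h
            x += u + diffusion * rng.standard_normal(n)
            y += v + diffusion * rng.standard_normal(n)
            z -= settling

        print(f"cloud centre after 72 h: ({x.mean():.0f} km, {y.mean():.0f} km), "
              f"mean altitude {z.mean():.1f} km")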
  252. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors, such as antecedent streamflows, El Niño-Southern Oscillation indices and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability mean that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.
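    The central device of the BJP approach, a Box-Cox transform that makes skewed streamflows roughly normal so that a multivariate normal can carry the dependence between predictors and predictands, can be sketched compactly. The Python snippet below fits nothing by Markov chain Monte Carlo; it simply applies an assumed transform and invented joint-normal parameters, conditions one site's seasonal flow on an antecedent-flow predictor, and back-transforms the resulting ensemble.

        import numpy as np

        lam = 0.3  # assumed Box-Cox exponent (illustrative)

        def boxcox(q):
            return (q ** lam - 1.0) / lam

        def inv_boxcox(y):
            return (lam * y + 1.0) ** (1.0 / lam)

        # Invented joint-normal parameters in transformed space for
        # [antecedent flow (predictor), seasonal flow at one site (predictand)].
        mu = np.array([4.0, 6.0])
        cov = np.array([[1.0, 0.7],
                        [0.7, 1.5]])

        # Condition the predictand on an observed antecedent flow of 120 units.
        x_obs = boxcox(120.0)
        mu_cond = mu[1] + cov[1, 0] / cov[0, 0] * (x_obs - mu[0])
        var_cond = cov[1, 1] - cov[1, 0] ** 2 / cov[0, 0]

        rng = np.random.default_rng(2)
        ensemble = inv_boxcox(rng.normal(mu_cond, np.sqrt(var_cond), size=1000))
        print("median forecast:", round(float(np.median(ensemble)), 1),
              "| 90% interval:", np.round(np.percentile(ensemble, [5, 95]), 1))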
  253. Monitoring a restless volcano: The 2004 eruption of Mount St. Helens

    USGS Publications Warehouse

    Gardner, C.

    2005-01-01

    Although the precise course of volcanic activity is difficult to predict, volcanologists are quite adept at interpreting volcanic signals from well-monitored volcanoes in order to make short-term forecasts. Various monitoring tools record effects that give us warning before eruptions, changes in eruptive behavior during eruptions, or signals that an eruption is ending. Foremost among these tools is seismic monitoring. The character, size, depth and rate of earthquakes are all important to the interpretation of what is happening belowground. The first inkling of renewed activity at Mount St. Helens began in the early hours of Sept. 23, when a seismic swarm - tens to hundreds of earthquakes over days to a week - began beneath the volcano. This article details the observations made during the eruptive sequence.

  254. [Untitled report]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perry, F.V.; Valentine, G.A.; Crowe, B.M.

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). The objective of this project was to determine whether isotopic techniques can be used to assess the eruption potential and eruption volume of continental stratovolcanoes. Large-volume eruptions from stratovolcanoes pose significant hazards to population and infrastructure in many parts of the world. We are testing whether this technique will allow a short- to medium-term (decades to millennia) probabilistic hazard assessment of large-volume eruptions. If successful, the technique will be useful to countries or regions that must consider medium- to long-term volcanic hazards (e.g., nuclear waste facilities).
    We have begun sample acquisition and isotopic measurements at two stratovolcanoes, Pico de Orizaba in eastern Mexico and Daisen in western Japan.

  255. The psychology of intelligence analysis: drivers of prediction accuracy in world politics

    PubMed

    Mellers, Barbara; Stone, Eric; Atanasov, Pavel; Rohrbaugh, Nick; Metz, S Emlen; Ungar, Lyle; Bishop, Michael M; Horowitz, Michael; Merkle, Ed; Tetlock, Philip

    2015-03-01

    This article extends psychological methods and concepts into a domain that is as profoundly consequential as it is poorly understood: intelligence analysis. We report findings from a geopolitical forecasting tournament that assessed the accuracy of more than 150,000 forecasts of 743 participants on 199 events occurring over 2 years. Participants were above average in intelligence and political knowledge relative to the general population. Individual differences in performance emerged, and forecasting skills were surprisingly consistent over time. Key predictors were (a) dispositional variables of cognitive ability, political knowledge, and open-mindedness; (b) situational variables of training in probabilistic reasoning and participation in collaborative teams that shared information and discussed rationales (Mellers, Ungar, et al., 2014); and (c) behavioral variables of deliberation time and frequency of belief updating. We developed a profile of the best forecasters; they were better at inductive reasoning, pattern detection, cognitive flexibility, and open-mindedness. They had greater understanding of geopolitics, training in probabilistic reasoning, and opportunities to succeed in cognitively enriched team environments. Last but not least, they viewed forecasting as a skill that required deliberate practice, sustained effort, and constant monitoring of current affairs. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  256. Application of Medium and Seasonal Flood Forecasts for Agriculture Damage Assessment

    NASA Astrophysics Data System (ADS)

    Fakhruddin, Shamsul; Ballio, Francesco; Menoni, Scira

    2015-04-01

    Early warning is a key element for disaster risk reduction. In recent decades, major advancements have been made in medium-range and seasonal flood forecasting. This progress provides a great opportunity to reduce agricultural damage and improve advisories for early action and planning for flood hazards. This approach can facilitate proactive rather than reactive management of the adverse consequences of floods. In the agricultural sector, for instance, farmers can take a diversity of options such as changing cropping patterns, applying fertilizer, irrigating and changing planting timing. An experimental medium-range (1-10 day) and seasonal (20-25 days) flood forecasting model has been developed for Thailand and Bangladesh.
    It provides 51 sets of discharge ensemble forecasts for 1-10 days, with significant persistence and high certainty, and qualitative outlooks for 20-25 days. This type of forecast could assist farmers and other stakeholders in differentiated preparedness activities. These ensemble probabilistic flood forecasts have been customized based on user needs for community-level application focused on the agriculture system. The vulnerabilities of the agriculture system were calculated based on exposure, sensitivity and adaptive capacity. Indicators for risk and vulnerability assessment were developed through community consultations. The forecast lead-time requirements, user needs, impacts and management options for crops were identified through focus group discussions, informal interviews and community surveys. This paper illustrates potential applications of such ensembles for probabilistic medium-range and seasonal flood forecasts in a way that is not commonly practiced globally today.

  257. Time-varying loss forecast for an earthquake scenario in Basel, Switzerland

    NASA Astrophysics Data System (ADS)

    Herrmann, Marcus; Zechar, Jeremy D.; Wiemer, Stefan

    2014-05-01

    When an unexpected earthquake occurs, people suddenly want advice on how to cope with the situation. The 2009 L'Aquila earthquake highlighted the significance of public communication and pushed the use of scientific methods to drive alternative risk mitigation strategies. For instance, van Stiphout et al. (2010) suggested a new approach for objective short-term evacuation decisions: probabilistic risk forecasting combined with cost-benefit analysis. In the present work, we apply this approach to an earthquake sequence that simulated a repeat of the 1356 Basel earthquake, one of the most damaging events in Central Europe. A recent development to benefit society in case of an earthquake is probabilistic forecasting of aftershock occurrence. But seismic risk delivers a more direct expression of the socio-economic impact. To forecast the seismic risk on short time scales, we translate aftershock probabilities to time-varying seismic hazard and combine this with time-invariant loss estimation. Compared with van Stiphout et al. (2010), we use an advanced aftershock forecasting model and detailed settlement data that allow spatially resolved forecasts and settlement-specific decision-making. We quantify the risk forecast probabilistically in terms of human loss. For instance, one minute after the M6.6 mainshock, the probability for an individual to die within the next 24 hours is 41,000 times higher than the long-term average; but the absolute value remains minor, at 0.04%. The final cost-benefit analysis adds value beyond a pure statistical approach: it provides objective statements that may justify evacuations. To deliver supportive information in a simple form, we propose a warning approach in terms of alarm levels. Our results do not justify evacuations prior to the M6.6 mainshock, but do in certain districts afterwards. The ability to forecast the short-term seismic risk at any time - and with sufficient data anywhere - is the first step towards personal decision-making and raising risk awareness among the public. Reference: van Stiphout, T., S. Wiemer, and W. Marzocchi (2010). Are short-term evacuations warranted? Case of the 2009 L'Aquila earthquake. Geophysical Research Letters 37(6), 1-5. http://onlinelibrary.wiley.com/doi/10.1029/2009GL042352/abstract
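    The cost-benefit logic invoked above (after van Stiphout et al., 2010) reduces to comparing the forecast probability of a loss with the ratio of mitigation cost to the loss it would avert: acting is justified when P > C/L. The figures in the Python sketch below are placeholders, not the values used for the Basel scenario.

        def evacuation_warranted(p_fatality, cost_per_person, averted_loss_per_person):
            """Act when probability x averted loss exceeds cost, i.e. when P > C/L."""
            threshold = cost_per_person / averted_loss_per_person
            return p_fatality > threshold, threshold

        # Placeholder numbers only: a 0.04% 24-hour fatality probability against an
        # assumed evacuation cost and an assumed value placed on an averted fatality.
        decision, threshold = evacuation_warranted(p_fatality=4e-4,
                                                   cost_per_person=1_000,
                                                   averted_loss_per_person=5_000_000)
        print(f"probability threshold = {threshold:.3%}, evacuate = {decision}")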
  258. Extracting local information from crowds through betting markets

    NASA Astrophysics Data System (ADS)

    Weijs, Steven

    2015-04-01

    In this research, a set-up is considered in which users can bet against a forecasting agency to challenge their probabilistic forecasts. From an information theory standpoint, a reward structure is considered that either provides the forecasting agency with better information, paying the successful providers of information for their winning bets, or funds excellent forecasting agencies through users that think they know better. Especially for local forecasts, the approach may help to diagnose model biases and to identify local predictive information that can be incorporated in the models. The challenges and opportunities for implementing such a system in practice are also discussed.

  259. Observations of Eyjafjallajökull eruption's plume at Potenza EARLINET station

    NASA Astrophysics Data System (ADS)

    Mona, Lucia; Amodeo, Aldo; Boselli, Antonella; Cornacchia, Carmela; D'Amico, Giuseppe; Giunta, Aldo; Madonna, Fabio; Pappalardo, Gelsomina

    2010-05-01

    Eyjafjallajökull is one of the smallest glaciers in Iceland. After seismic activity recorded during December 2009, a first eruption started on 20 March, between 22:30 and 23:30 UT. After a brief pause, a new phase of the Eyjafjallajökull eruption started around midnight on 14 April, when melt penetrated to the central crater beneath the glacier. An eruption plume was observed in the early morning of 14 April. The ash-laden eruption plume rose to more than 8 km height, deflected to the east by westerly winds. Eruptive activity continued in the following days until 23 April with variable maximum height (between 8 and 2 km a.s.l.). Until 27 April, a plume was always visible in proximity of the volcano. On 15 April, the eruption plume reached continental Europe, with closure of airspace over a large part of Northern Europe. In the following days, airspace was closed also in some regions of Southern Europe. On 15 April, 10:00 UT, CNR-IMAA, Potenza distributed an alert to EARLINET stations informing them that a large amount of ash was heading towards north-western Europe. Although EARLINET is a research-oriented rather than an operational network, almost all the EARLINET stations followed the event, performing measurements whenever weather conditions allowed. Because of their proximity to the source, England and the Scandinavian countries were of course the most affected by the arrival of transported ash.
    According to the MetOffice forecasts, the ash plume was expected to reach Central Europe on 16 April. The transport toward the south was almost blocked by the Alps. A different scenario was forecast by the MetOffice for 20-21 April, when the arrival of the volcanic plume was forecast as far south as Southern Italy. At CNR-IMAA, the atmospheric observatory (CIAO) followed the event by means of all available instruments, including EARLINET multi-wavelength lidars, cloud radar, microwave profiler and AERONET sun photometer. Low clouds and rain did not permit measurements over Potenza for the period starting from the distributed alert on 15 April until the evening of 19 April. From 19 April, measurements were performed almost continuously, with breaks only for light rain and low clouds, until the evening of 22 April when intense rain started again. During the whole observation period the aerosol content was not negligible in the free troposphere, with sparse aerosol distributed between 3 and 8 km a.s.l. In addition, thin layers are distinguishable in the reported temporal evolution at different times and altitudes (e.g. a descending layer between 10 and 5 km on 21 April, 00:00 UT-14:00 UT). The most intense aerosol return above the PBL was observed on 20 April around 22:20 UT at about 4 km a.s.l. Ancillary information confirms the volcanic origin of the selected layer. According to the DREAM forecast, no dust should have been present over Italy on this day. HYSPLIT back-trajectories show that the observed layer came from Northern Europe, probably from Iceland. In the following hours, the volcanic layer descended in altitude, mixing with the underlying local aerosol layer. ACKNOWLEDGMENTS: The financial support for EARLINET by the European Commission under grant RICA-025991 is gratefully acknowledged. The authors thank the NOAA Air Resources Laboratory (ARL) for the provision of the HYSPLIT back-trajectory analysis, the Barcelona Supercomputing Center for DREAM forecasts, NASA for the MODIS image and the MetOffice for the forecast of volcano plume dispersion.

  260. Comparison of the performance and reliability of 18 lumped hydrological models driven by ECMWF rainfall ensemble forecasts: a case study on 29 French catchments

    NASA Astrophysics Data System (ADS)

    Velázquez, Juan Alberto; Anctil, François; Ramos, Maria-Helena; Perrin, Charles

    2010-05-01

    An ensemble forecasting system seeks to assess and to communicate the uncertainty of hydrological predictions by proposing, at each time step, an ensemble of forecasts from which one can estimate the probability distribution of the predictand (the probabilistic forecast), in contrast with a single estimate of the flow, for which no distribution is obtainable (the deterministic forecast). In the past years, efforts towards the development of probabilistic hydrological prediction systems were made with the adoption of ensembles of numerical weather predictions (NWPs). The additional information provided by the different available Ensemble Prediction Systems (EPS) was evaluated in a hydrological context on various case studies (see the review by Cloke and Pappenberger, 2009).
    For example, the European ECMWF-EPS was explored in case studies by Roulin et al. (2005), Bartholmes et al. (2005), Jaun et al. (2008), and Renner et al. (2009). The Canadian EC-EPS was also evaluated by Velázquez et al. (2009). Most of these case studies investigate the ensemble predictions of a given hydrological model, set up over a limited number of catchments. Uncertainty from weather predictions is assessed through the use of meteorological ensembles. However, uncertainty from the tested hydrological model and the statistical robustness of the forecasting system when coping with different hydro-meteorological conditions are less frequently evaluated. The aim of this study is to evaluate and compare the performance and the reliability of 18 lumped hydrological models applied to a large number of catchments in an operational ensemble forecasting context. Some of these models were evaluated in a previous study (Perrin et al., 2001) for their ability to simulate streamflow. Results demonstrated that very simple models can achieve a level of performance almost as high (sometimes higher) as models with more parameters. In the present study, we focus on the ability of the hydrological models to provide reliable probabilistic forecasts of streamflow, based on ensemble weather predictions. The models were therefore adapted to run in a forecasting mode, i.e., to update initial conditions according to the last observed discharge at the time of the forecast, and to cope with ensemble weather scenarios. All models are lumped, i.e., the hydrological behavior is integrated over the spatial scale of the catchment, and run at daily time steps. The complexity of the tested models varies between 3 and 13 parameters. The models are tested on 29 French catchments. Daily streamflow time series extend over 17 months, from March 2005 to July 2006. Catchment areas range between 1470 km2 and 9390 km2, and represent a variety of hydrological and meteorological conditions. The 12 UTC 10-day ECMWF rainfall ensemble (51 members) was used, which led to daily streamflow forecasts for a 9-day lead time. In order to assess the performance and reliability of the hydrological ensemble predictions, we computed the Continuous Ranked Probability Score (CRPS) (Matheson and Winkler, 1976), as well as the reliability diagram (e.g. Wilks, 1995) and the rank histogram (Talagrand et al., 1999). Since the ECMWF deterministic forecasts are also available, the performance of the hydrological forecasting systems was also evaluated by comparing the deterministic score (MAE) with the probabilistic score (CRPS). The results obtained for the 18 hydrological models and the 29 studied catchments are discussed in the perspective of improving the operational use of ensemble forecasting in hydrology. References: Bartholmes, J. and Todini, E.: Coupling meteorological and hydrological models for flood forecasting, Hydrol. Earth Syst. Sci., 9, 333-346, 2005. Cloke, H. and Pappenberger, F.: Ensemble flood forecasting: A review, Journal of Hydrology, 375 (3-4), 613-626, 2009. Jaun, S., Ahrens, B., Walser, A., Ewen, T., and Schär, C.: A probabilistic view on the August 2005 floods in the upper Rhine catchment, Nat. Hazards Earth Syst. Sci., 8, 281-291, 2008. Matheson, J. E. and Winkler, R. L.: Scoring rules for continuous probability distributions, Manage. Sci., 22, 1087-1096, 1976. Perrin, C., Michel, C. and Andréassian, V.: Does a large number of parameters enhance model performance?
    Comparative assessment of common catchment model structures on 429 catchments, J. Hydrol., 242, 275-301, 2001. Renner, M., Werner, M. G. F., Rademacher, S., and Sprokkereef, E.: Verification of ensemble flow forecasts for the River Rhine, J. Hydrol., 376, 463-475, 2009. Roulin, E. and Vannitsem, S.: Skill of medium-range hydrological ensemble predictions, J. Hydrometeorol., 6, 729-744, 2005. Talagrand, O., Vautard, R., and Strauss, B.: Evaluation of probabilistic prediction systems, in: Proceedings, ECMWF Workshop on Predictability, Shinfield Park, Reading, Berkshire, ECMWF, 1-25, 1999. Velázquez, J. A., Petit, T., Lavoie, A., Boucher, M.-A., Turcotte, R., Fortin, V., and Anctil, F.: An evaluation of the Canadian global meteorological ensemble prediction system for short-term hydrological forecasting, Hydrol. Earth Syst. Sci., 13, 2221-2231, 2009. Wilks, D. S.: Statistical Methods in the Atmospheric Sciences, Academic Press, San Diego, CA, 465 pp., 1995.
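    The CRPS used above to score the ensemble streamflow forecasts can be estimated directly from the members via CRPS = E|X - y| - 0.5 E|X - X'|, where X and X' are independent draws from the forecast ensemble and y is the observation. The short Python function below implements that sample estimator and, in the spirit of the comparison described in the abstract, scores a deterministic forecast of the same event with the MAE; the numbers are synthetic.

        import numpy as np

        def crps_ensemble(members, obs):
            """Sample estimator of the CRPS: E|X - y| - 0.5 * E|X - X'|."""
            members = np.asarray(members, dtype=float)
            term1 = np.abs(members - obs).mean()
            term2 = np.abs(members[:, None] - members[None, :]).mean()
            return term1 - 0.5 * term2

        rng = np.random.default_rng(3)
        obs = 85.0                                   # observed discharge (m3/s), synthetic
        ensemble = rng.normal(80.0, 12.0, size=51)   # 51-member streamflow forecast, synthetic
        deterministic = 80.0

        print("CRPS (ensemble):", round(crps_ensemble(ensemble, obs), 2))
        print("MAE (deterministic):", round(abs(deterministic - obs), 2))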
  261. A robust method to forecast volcanic ash clouds

    USGS Publications Warehouse

    Denlinger, Roger P.; Pavolonis, Mike; Sieglaff, Justin

    2012-01-01

    Ash clouds emanating from volcanic eruption columns often form trails of ash extending thousands of kilometers through the Earth's atmosphere, disrupting air traffic and posing a significant hazard to air travel. To mitigate such hazards, the community charged with reducing flight risk must accurately assess the risk of ash ingestion for any flight path and provide robust forecasts of volcanic ash dispersal. In response to this need, a number of different transport models have been developed for this purpose and applied to recent eruptions, providing a means to assess uncertainty in forecasts. Here we provide a framework for optimal forecasts and their uncertainties given any model and any observational data. This involves random sampling of the probability distributions of input (source) parameters to a transport model and iteratively running the model with different inputs, each time assessing the predictions that the model makes about ash dispersal by direct comparison with satellite data. The results of these comparisons are embodied in a likelihood function whose maximum corresponds to the minimum misfit between model output and observations. Bayes' theorem is then used to determine a normalized posterior probability distribution and, from that, a forecast of future uncertainty in ash dispersal. The nature of ash clouds in heterogeneous wind fields creates a strong maximum likelihood estimate in which most of the probability is localized to narrow ranges of model source parameters. This property is used here to accelerate probability assessment, producing a method to rapidly generate a prediction of future ash concentrations and their distribution based upon assimilation of satellite data as well as model and data uncertainties. Applying this method to the recent eruption of Eyjafjallajökull in Iceland, we show that the 3 and 6 h forecasts of ash cloud location probability encompassed the location of observed satellite-determined ash cloud loads, providing an efficient means to assess all of the hazards associated with these ash clouds.
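    The assimilation scheme described above can be caricatured in a few Python lines: draw source parameters from their prior, run the transport model for each draw, convert the data-model misfit into a likelihood, and normalise (Bayes' theorem) to obtain posterior weights from which a probabilistic forecast is built. The "transport model" below is a trivial stand-in for a real ash dispersal code, and the observation and error values are invented.

        import numpy as np

        rng = np.random.default_rng(4)

        def toy_transport_model(column_height_km, mass_rate_kgps):
            """Trivial stand-in for an ash dispersal model: fake downwind column load (g/m2)."""
            return 0.004 * mass_rate_kgps * (column_height_km / 10.0) ** 2

        # Prior samples of the uncertain source parameters.
        heights = rng.uniform(5.0, 15.0, 500)    # eruption column height (km)
        rates = rng.uniform(1e4, 1e6, 500)       # mass eruption rate (kg/s)

        observed_load, sigma = 1.2e3, 3.0e2      # invented satellite-retrieved load and error
        predicted = toy_transport_model(heights, rates)

        log_like = -0.5 * ((predicted - observed_load) / sigma) ** 2
        weights = np.exp(log_like - log_like.max())
        weights /= weights.sum()                 # normalised posterior weights (Bayes' theorem)

        print("posterior mean column height:",
              round(float(np.average(heights, weights=weights)), 1), "km")
        print("effective sample size:", round(1.0 / float(np.sum(weights ** 2)), 1))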
  262. Diagnosing Geospatial Uncertainty Visualization Challenges in Seasonal Temperature and Precipitation Forecasts

    NASA Astrophysics Data System (ADS)

    Speciale, A.; Kenney, M. A.; Gerst, M.; Baer, A. E.; DeWitt, D.; Gottschalk, J.; Handel, S.

    2017-12-01

    The uncertainty of future weather and climate conditions is important for many decisions made in communities and economic sectors. One tool that decision-makers use in gauging this uncertainty is forecasts, especially maps (or visualizations) of probabilistic forecast results. However, visualizing geospatial uncertainty is challenging because including probability introduces an extra variable to represent, and probability is often poorly understood by users. Using focus group and survey methods, this study seeks to understand the barriers to using probabilistic temperature and precipitation visualizations for specific decisions in the agriculture, energy, emergency management, and water resource sectors. Preliminary results shown here focus on findings about emergency manager needs. Our experimental design uses the National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Center (CPC) climate outlooks, which produce probabilistic temperature and precipitation forecast visualizations at the 6-10 day, 8-14 day, 3-4 week, and 1 and 3 month timeframes. Users were asked to complete questions related to how they use weather information, how uncertainty is represented, and design elements (e.g., color, contour lines) of the visualizations. Preliminary results from the emergency management sector indicate there is significant confusion about how "normal" weather is defined, the boundaries between probability ranges, and the meaning of the contour lines. After a complete understandability diagnosis is made using results from all sectors, we will collaborate with CPC to suggest modifications to the climate outlook visualizations. These modifications will then be retested in similar focus groups and web-based surveys to confirm they better meet the needs of users.

  263. How to pose the question matters: Behavioural Economics concepts in decision making on the basis of ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; van Andel, Schalk Jan

    2014-05-01

    Part of recent research in ensemble and probabilistic hydro-meteorological forecasting analyses which probabilistic information is required by decision makers and how it can be most effectively visualised. This work, in addition, analyses whether decision making in flood early warning is also influenced by the way the decision question is posed. For this purpose, the decision-making game "Do probabilistic forecasts lead to better decisions?", which Ramos et al. (2012) conducted at the EGU General Assembly 2012 in the city of Vienna, has been repeated with a small group and expanded. In that game decision makers had to decide whether or not to open a flood release gate, on the basis of flood forecasts, with and without uncertainty information. A conclusion of that game was that, in the absence of uncertainty information, decision makers are compelled towards a more risk-averse attitude. In order to explore to what extent the answers were driven by the way the questions were framed, in addition to the original experiment, a second variant was introduced where participants were asked to choose between a sure value (for either losing or winning with a given probability) and a gamble. This set-up is based on Kahneman and Tversky (1979). Results indicate that the way the questions are posed may play an important role in decision making and that Prospect Theory provides promising concepts to further understand how this works.

  264. Long-term multi-hazard assessment for El Misti volcano (Peru)

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Thouret, Jean-Claude; Constantinescu, Robert; Biass, Sébastien; Tonini, Roberto

    2014-02-01

    We propose a long-term probabilistic multi-hazard assessment for El Misti Volcano, a composite cone located <20 km from Arequipa. The second largest Peruvian city is a rapidly expanding economic centre and is classified by UNESCO as a World Heritage site. We apply the Bayesian Event Tree code for Volcanic Hazard (BET_VH) to produce probabilistic hazard maps for the predominant volcanic phenomena that may affect the c. 900,000 people living around the volcano. The methodology accounts for the natural variability displayed by volcanoes in their eruptive behaviour, such as different types/sizes of eruptions and possible vent locations.
    For this purpose, we treat probabilistically several model runs for some of the main hazardous phenomena (lahars, pyroclastic density currents (PDCs), tephra fall and ballistic ejecta) and data from past eruptions at El Misti (tephra fall, PDCs and lahars) and at other volcanoes (PDCs). The hazard maps, although neglecting possible interactions among phenomena or cascade effects, have been produced with a homogeneous method and refer to a common time window of 1 year. The probability maps reveal that only the north and east suburbs of Arequipa are exposed to all volcanic threats except for ballistic ejecta, which are limited to the uninhabited but touristic summit cone. The probability of pyroclastic density currents reaching recently expanding urban areas and the city along ravines is around 0.05%/year, similar to the probability obtained for roof-critical tephra loading during the rainy season. Lahars represent by far the most probable threat (around 10%/year), because at least four radial drainage channels can convey them approximately 20 km away from the volcano across the entire city area in heavy rain episodes, even without an eruption. The Río Chili Valley represents the major concern for city safety owing to the probable cascading effect of combined threats: PDCs and rockslides, dammed lake break-outs and subsequent lahars or floods. Although this study does not intend to replace the current El Misti hazard map, the quantitative results of this probabilistic multi-hazard assessment can be incorporated into a multi-risk analysis, to support decision makers in any future improvement of the current hazard evaluation, such as further land-use planning and possible emergency management.
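    A Bayesian event tree of the BET_VH kind chains conditional probabilities at successive nodes (eruption, vent location, eruption size, hazardous phenomenon, reaching a given area), so the absolute probability of a threat at a site is the product along each branch, summed over branches. The Python bookkeeping below uses invented probabilities, not the El Misti values, purely to show the calculation.

        # Event-tree bookkeeping sketch; all probabilities are invented, per 1-year window.
        p_eruption = 0.02   # node: probability of an eruption somewhere on the edifice
        branches = [
            # (P(vent sector | eruption), P(size class | vent), P(PDC | size), P(reaches suburb | PDC))
            (0.5, 0.30, 0.6, 0.15),
            (0.3, 0.10, 0.8, 0.40),
            (0.2, 0.05, 0.9, 0.60),
        ]
        p_site = p_eruption * sum(pv * ps * pp * pr for pv, ps, pp, pr in branches)
        print(f"P(PDC reaches the suburb within 1 yr) ~ {p_site:.4%}")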
  265. Time Relevance of Convective Weather Forecast for Air Traffic Automation

    NASA Technical Reports Server (NTRS)

    Chan, William N.

    2006-01-01

    The Federal Aviation Administration (FAA) is handling nearly 120,000 flights a day through its Air Traffic Management (ATM) system, and air traffic congestion is expected to increase substantially over the next 20 years. Weather-induced impacts to throughput and efficiency are the leading cause of flight delays, accounting for 70% of all delays, with convective weather accounting for 60% of all weather-related delays. To support the Next Generation Air Traffic System goal of operating at 3X current capacity in the NAS, ATC decision support tools are being developed to create advisories that assist controllers under all weather constraints. Initial development of these decision support tools did not integrate information regarding weather constraints such as thunderstorms and relied on an additional system to provide that information. Future Decision Support Tools should move towards an integrated system where weather constraints are factored into the advisory of a Decision Support Tool (DST). Several groups, including NASA-Ames, Lincoln Laboratory, and MITRE, are integrating convective weather data with DSTs. A survey of current convective weather forecast and observation data shows they span a wide range of temporal and spatial resolutions. Short-range convective observations can be obtained every 5 minutes, with longer-range forecasts out to several days updated every 6 hours. Today, short-range forecasts of less than 2 hours have a temporal resolution of 5 minutes. Beyond 2 hours, forecasts have a much lower temporal resolution of typically 1 hour. Spatial resolutions vary from 1 km for short-range to 40 km for longer-range forecasts. Improving the accuracy of long-range convective forecasts is a major challenge. A report published by the National Research Council states that improvements in convective forecasts for the 2 to 6 hour time frame will only be achieved for a limited set of convective phenomena in the next 5 to 10 years. Improved longer-range forecasts will be probabilistic, as opposed to the deterministic shorter-range forecasts. Despite the known low level of confidence in long-range convective forecasts, these data are still useful to a DST routing algorithm: it is better to develop an aircraft route using the best information available than no information. The temporally coarse long-range forecast data need to be interpolated to be useful to a DST. A DST uses aircraft trajectory predictions that need to be evaluated for impacts by convective storms. Each time-step of a trajectory prediction needs to be checked against weather data. For the case of coarse temporal data, there needs to be a method to fill in weather data where there is none. Simply using the coarse weather data without any interpolation can result in DST routes that are impacted by regions of strong convection. Increasing the temporal resolution of these data can be achieved but results in a large dataset that may prove to be an operational challenge to transmit and load into a DST. Currently, it takes about 7 minutes to retrieve a 7 MB RUC2 forecast file from NOAA at NASA-Ames Research Center. A prototype NCWF6 1-hour forecast is about 3 MB in size. A six-hour NCWF6 forecast with a 1-hour forecast time-step will be about 18 MB (6 x 3 MB), and a 6-hour NCWF6 forecast with a 15-minute forecast time-step will be about 72 MB (24 x 3 MB). Based on the time it takes to retrieve a 7 MB RUC2 forecast, it will take approximately 70 minutes to retrieve a 6-hour NCWF6 forecast with 15-minute time-steps. Until those issues are addressed, there is a need to develop an algorithm that interpolates between these temporally coarse long-range forecasts. This paper describes a method for using low-temporal-resolution probabilistic weather forecasts in a DST. The beginning of this paper is a description of some convective weather forecast and observation products, followed by an example of how weather data are used by a DST. The subsequent sections describe probabilistic forecasts, followed by a description of a method to use low-temporal-resolution probabilistic weather forecasts by assigning a relevance value to these data outside of their valid times.

  266. Probabilistic In Situ Stress Estimation and Forecasting using Sequential Data Assimilation

    NASA Astrophysics Data System (ADS)

    Fichtner, A.; van Dinther, Y.; Kuensch, H. R.
Probabilistic In Situ Stress Estimation and Forecasting using Sequential Data Assimilation

NASA Astrophysics Data System (ADS)

Fichtner, A.; van Dinther, Y.; Kuensch, H. R.

2017-12-01

Our physical understanding and forecasting ability of earthquakes, and other solid Earth dynamic processes, is significantly hampered by limited indications on the evolving state of stress and strength on faults. Integrating observations and physics-based numerical modeling to quantitatively estimate this evolution of a fault's state is crucial. However, systematic attempts are limited and tenuous, especially in light of the scarcity and uncertainty of natural data and the difficulty of modelling the physics governing earthquakes. We adopt the statistical framework of sequential data assimilation - extensively developed for weather forecasting - to efficiently integrate observations and prior knowledge in a forward model, while acknowledging errors in both. To prove this concept we perform a perfect model test in a simplified subduction zone setup, where we assimilate synthetic noised data on velocities and stresses from a single location. Using an Ensemble Kalman Filter, these data and their errors are assimilated to update 150 ensemble members from a Partial Differential Equation-driven seismic cycle model. Probabilistic estimates of fault stress and dynamic strength evolution capture the truth exceptionally well. This is possible because the sampled error covariance matrix contains prior information from the physics that relates velocities, stresses and pressure at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed such that fault coupling can be updated to either inhibit or trigger events. In the subsequent forecast step the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next event. At subsequent assimilation steps, the system's forecasting ability turns out to be significantly better than that of a periodic recurrence model (requiring an alarm 17% vs. 68% of the time). This thus provides distinct added value with respect to using observations or numerical models separately. Although several challenges for applications to a natural setting remain, these first results indicate the large potential of data assimilation techniques for probabilistic seismic hazard assessment and other challenges in dynamic solid earth systems.
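For readers unfamiliar with the machinery, the analysis step of an Ensemble Kalman Filter is compact enough to sketch. The snippet below is a generic, textbook-style stochastic EnKF update on a toy two-variable state (stress and strength, with only a stress proxy observed); it illustrates how the sampled covariance lets an observation of one variable update the other, but it is not the seismic-cycle implementation described above, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

def enkf_analysis(ensemble, obs, obs_err_std, H):
    """Stochastic EnKF analysis step.

    ensemble    : (n_members, n_state) prior state vectors
    obs         : (n_obs,) observation vector
    obs_err_std : observation error standard deviation
    H           : (n_obs, n_state) linear observation operator
    """
    n_members = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)          # state anomalies
    Y = X @ H.T                                   # anomalies mapped to observation space
    P_xy = X.T @ Y / (n_members - 1)              # state-observation covariance
    P_yy = Y.T @ Y / (n_members - 1) + obs_err_std**2 * np.eye(len(obs))
    K = P_xy @ np.linalg.inv(P_yy)                # Kalman gain
    perturbed_obs = obs + rng.normal(0.0, obs_err_std, size=(n_members, len(obs)))
    innovations = perturbed_obs - ensemble @ H.T
    return ensemble + innovations @ K.T

# Toy state: (fault stress, fault strength), with strength correlated to stress in the prior.
truth = np.array([10.0, 12.0])
stress = rng.normal(10.0, 2.0, size=150)
strength = 1.2 * stress + rng.normal(0.0, 0.5, size=150)
prior = np.column_stack([stress, strength])

H = np.array([[1.0, 0.0]])                        # we only observe a stress proxy
obs = H @ truth + rng.normal(0.0, 0.5, size=1)

posterior = enkf_analysis(prior, obs, obs_err_std=0.5, H=H)
print("prior mean     ", prior.mean(axis=0).round(2))
print("posterior mean ", posterior.mean(axis=0).round(2))
```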
The MIT Integrated Global System Model: A facility for Assessing and Communicating Climate Change Uncertainty (Invited)

NASA Astrophysics Data System (ADS)

Prinn, R. G.

2013-12-01

The world is facing major challenges that create tensions between human development and environmental sustenance. In facing these challenges, computer models are invaluable tools for addressing the need for probabilistic approaches to forecasting. To illustrate this, I use the MIT Integrated Global System Model framework (IGSM; http://globalchange.mit.edu ). The IGSM consists of a set of coupled sub-models of global economic and technological development and resultant emissions, and physical, dynamical and chemical processes in the atmosphere, land, ocean and ecosystems (natural and managed). Some of the sub-models have both complex and simplified versions available, with the choice of which version to use being guided by the questions being addressed. Some sub-models (e.g. urban air pollution) are reduced forms of complex ones created by probabilistic collocation with polynomial chaos bases. Given the significant uncertainties in the model components, it is highly desirable that forecasts be probabilistic. We achieve this by running 400-member ensembles (Latin hypercube sampling) with different choices for key uncertain variables and processes within the human and natural system model components (pdfs of inputs estimated by model-observation comparisons, literature surveys, or expert elicitation). The IGSM has recently been used for probabilistic forecasts of climate, each using 400-member ensembles: one ensemble assumes no explicit climate mitigation policy and others assume increasingly stringent policies involving stabilization of greenhouse gases at various levels. These forecasts indicate clearly that the greatest effect of these policies is to lower the probability of extreme changes. The value of such probability analyses for policy decision-making lies in their ability to compare relative (not just absolute) risks of various policies, which are less affected by the earth system model uncertainties. Given the uncertainties in forecasts, it is also clear that we need to evaluate policies based on their ability to lower risk, and to re-evaluate decisions over time as new knowledge is gained. Reference: R. G. Prinn, Development and Application of Earth System Models, Proceedings of the National Academy of Sciences, June 15, 2012, http://www.pnas.org/cgi/doi/10.1073/pnas.1107470109.
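The ensemble-design step mentioned here, Latin hypercube sampling over uncertain inputs, is simple to reproduce in principle. The sketch below draws a 400-member design for two illustrative parameters; the parameter names, distributions, and values are assumptions for the example, not the IGSM's actual input pdfs.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dims, rng):
    """(n_samples, n_dims) Latin hypercube of uniform(0, 1) draws: each dimension is
    split into n_samples equal strata with exactly one draw per stratum."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):                      # decouple the strata across dimensions
        u[:, j] = u[rng.permutation(n_samples), j]
    return u

n_members = 400
u = latin_hypercube(n_members, 2, rng)

# Map the uniform draws to assumed input distributions (names and pdfs are illustrative only).
climate_sensitivity = stats.lognorm(s=0.4, scale=3.0).ppf(u[:, 0])  # deg C per CO2 doubling
aerosol_scaling = stats.norm(loc=1.0, scale=0.3).ppf(u[:, 1])       # dimensionless factor

print("ensemble size:", n_members)
print("climate sensitivity 5th-95th percentile:",
      np.percentile(climate_sensitivity, [5, 95]).round(2))
```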
Assessing spatio-temporal eruption forecasts in a monogenetic volcanic field

NASA Astrophysics Data System (ADS)

Bebbington, Mark S.

2013-02-01

Many spatio-temporal models have been proposed for forecasting the location and timing of the next eruption in a monogenetic volcanic field. These have almost invariably been fitted retrospectively. That is, the model has been tuned to all of the data, and hence an assessment of the goodness of fit has not been carried out on independent data. The low rate of eruptions in monogenetic fields means that there is not the opportunity to carry out a purely prospective test, as thousands of years would be required to accumulate the necessary data. This leaves open the possibility of a retrospective sequential test, where the parameters are calculated only on the basis of prior events and the resulting forecast compared statistically with the location and time of the next eruption. In general, events in volcanic fields are not dated with sufficient accuracy and precision to pursue this line of investigation; an exception is the Auckland Volcanic Field (New Zealand), consisting of c. 50 centers formed during the last c. 250 kyr, for which an age-order model exists in the form of a Monte Carlo sampling algorithm, facilitating repeated sequential testing. I examine a suite of spatial, temporal and spatio-temporal hazard models, comparing the degree of fit, and attempt to draw lessons from how and where each model is particularly successful or unsuccessful. A relatively simple (independent) combination of a renewal model (temporal term) and a spatially uniform ellipse (spatial term) performs as well as any other model. Both avoid overfitting the data, and hence large errors, when the spatio-temporal occurrence pattern changes.
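To make the renewal-model component concrete: given an inter-event time distribution fitted to past onsets, the probability of a new eruption within the next Δt, conditional on the repose time already elapsed, follows directly from the survival function. The sketch below uses a Weibull renewal model with invented parameters; it is a generic illustration, not the fitted Auckland Volcanic Field model.

```python
from scipy import stats

def conditional_eruption_probability(t_elapsed, dt, shape, scale):
    """P(next onset within dt | t_elapsed since the last onset) for a Weibull renewal model."""
    dist = stats.weibull_min(c=shape, scale=scale)
    return (dist.sf(t_elapsed) - dist.sf(t_elapsed + dt)) / dist.sf(t_elapsed)

# Invented parameters: characteristic repose ~5000 yr, shape < 1 (slightly clustered onsets).
shape, scale = 0.9, 5000.0
for elapsed in (500.0, 5000.0, 20000.0):
    p = conditional_eruption_probability(elapsed, dt=100.0, shape=shape, scale=scale)
    print(f"elapsed repose {elapsed:7.0f} yr -> P(eruption in next 100 yr) = {p:.2%}")
```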
Magma transfer at Campi Flegrei caldera (Italy) before the 1538 AD eruption

NASA Astrophysics Data System (ADS)

Di Vito, Mauro A.; Acocella, Valerio; Aiello, Giuseppe; Barra, Diana; Battaglia, Maurizio; Carandente, Antonio; Del Gaudio, Carlo; de Vita, Sandro; Ricciardi, Giovanni; Rico, Ciro; Scandone, Roberto; Terrasi, Filippo

2017-04-01

Defining and understanding the shallow transfer of magma at volcanoes is crucial to forecast eruptions, possibly the ultimate goal of volcanology. This is particularly challenging at felsic calderas experiencing unrest, which typically includes significant changes in seismicity, deformation and degassing rates. Caldera unrest is particularly frequent, affects wide areas and often does not culminate in an eruption. Moreover, its evidence is usually complicated by the presence of a hydrothermal system. As a result, forecasting any eruption and vent-opening sites within a caldera is very difficult. The Campi Flegrei caldera (CFc), in the densely inhabited area of Naples (Italy), is commonly considered one of the most dangerous active volcanic systems. CFc is a 12 km wide depression hosting two nested calderas formed during the eruptions of the Campanian Ignimbrite (~39 ka) and the Neapolitan Yellow Tuff (~15 ka). In the last 5 ka, resurgence, with uplift >60 m close to the central part of the caldera, was accompanied by volcanism between 4.8 and 3.8 ka. After 3 ka of quiescence, increasing seismicity and uplift preceded the last eruption at Monte Nuovo in 1538 by several decades. The most recent activity culminated in four unrest episodes in 1950-1952, 1969-1972, 1982-1984 and 2005-present, with a cumulative uplift at Pozzuoli of 4.5 m; the present unrest episode has been interpreted as being magma-driven. These unrest episodes are considered the most evident expression of a longer-term (centuries or more) restless activity. The post-1980 deformation largely results from a magmatic oblate or sill-like source at 4 km depth below Pozzuoli. Despite the restless activity of CFc, the recent unrest episodes did not culminate in eruption, so that any possibility to define the pre-eruptive shallow transfer of magma remains elusive. Indeed, this definition is a crucial step in order to identify and understand pre-eruptive processes, and thus to make any forecast. To fill this gap, we focused on the last eruption of 1538, reconstructing its pre-eruptive deformation pattern. For this, we exploited the unique historical, archaeological, geological and long-term geodetic record of the caldera to carefully determine the height variations (and related errors) of 20 selected sites along its coastline. The integration of this large dataset permitted the first reconstruction of pre-eruptive short- and long-term ground deformation of the CFc and allowed us to model the magma transfer before the eruption. Our data suggest a progressive magma accumulation from 1251 to 1536 in a 4.6±0.9 km deep source below the caldera centre, and its transfer, between 1536 and 1538, to a 3.8±0.6 km deep magmatic source 4 km NW of the caldera centre, below Monte Nuovo; this peripheral source fed the eruption through a shallower source, 0.4±0.3 km deep. This reconstruction corroborates the existence of a stationary oblate source, below the caldera centre, that was feeding lateral eruptions for the last 5 ka, and suggests repeated emplacement of magma through intrusions below the caldera centre and occasional lateral transfer of magma feeding non-central eruptions within the caldera. Comparison with historical unrest at calderas worldwide suggests that this behavior is common.
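For a feel of how inferred source depth trades off against uplift amplitude in reconstructions like this, the simplest forward model is a Mogi point source in an elastic half-space. The sketch below is a deliberately simplified stand-in for the oblate/sill-like and spherical sources discussed above; the volume change is a round illustrative number, while the two depths echo the values quoted in the abstract.

```python
import numpy as np

def mogi_uplift(r, depth, dV, nu=0.25):
    """Vertical surface displacement (m) above a Mogi point source in an elastic half-space.

    r : radial distance from the source axis (m); depth : source depth (m);
    dV : source volume change (m^3), positive for inflation; nu : Poisson's ratio."""
    return (1.0 - nu) * dV * depth / (np.pi * (r**2 + depth**2) ** 1.5)

r = np.array([0.0, 2000.0, 4000.0, 8000.0])   # distances from the source axis (m)
dV = 1.0e7                                    # assumed volume change (m^3)
for depth in (4600.0, 3800.0):                # depths echoing the quoted sources
    uz = mogi_uplift(r, depth, dV)
    print(f"depth {depth/1000:.1f} km -> uplift (m) at r = 0, 2, 4, 8 km:", uz.round(3))
```

The same volume change produces a broader, lower-amplitude uplift pattern for the deeper source, which is the basic trade-off exploited when inverting shoreline height changes for source depth.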
Magma displacements under insular volcanic fields, applications to eruption forecasting: El Hierro, Canary Islands, 2011-2013

NASA Astrophysics Data System (ADS)

García, A.; Fernández-Ros, A.; Berrocoso, M.; Marrero, J. M.; Prates, G.; De la Cruz-Reyna, S.; Ortiz, R.

2014-04-01

Significant deformations, followed by increased seismicity detected since 2011 July at El Hierro, Canary Islands, Spain, prompted the deployment of additional monitoring equipment. The climax of this unrest was a submarine eruption first detected on 2011 October 10, and located at about 2 km SW of La Restinga, the southernmost village of El Hierro Island. The eruption ceased on 2012 March 5, after the volcanic tremor signals persistently weakened through 2012 February. However, the seismic activity did not end with the eruption, as several other seismic crises followed. The seismic episodes presented a characteristic pattern: over a few days the number and magnitude of seismic events increased persistently, culminating in seismic events severe enough to be felt all over the island. Those crises occurred in 2011 November, 2012 June and September, 2012 December to 2013 January and in 2013 March-April. In all cases the seismic unrest was preceded by significant deformations measured on the island's surface that continued during the whole episode. Analysis of the available GPS and seismic data suggests that several magma displacement processes occurred at depth from the beginning of the unrest. The first main magma movement or `injection' culminated with the 2011 October submarine eruption. A model combining the geometry of the magma injection process and the variations in seismic energy release has allowed successful forecasting of the new-vent opening.

Application of the LEPS technique for Quantitative Precipitation Forecasting (QPF) in Southern Italy: a preliminary study

NASA Astrophysics Data System (ADS)

Federico, S.; Avolio, E.; Bellecci, C.; Colacino, M.; Walko, R. L.

2006-03-01

This paper reports preliminary results for a Limited area model Ensemble Prediction System (LEPS), based on RAMS (Regional Atmospheric Modelling System), for eight case studies of moderate-intense precipitation over Calabria, the southernmost tip of the Italian peninsula. LEPS aims to transfer the benefits of a probabilistic forecast from global to regional scales in countries where local orographic forcing is a key factor in forcing convection. To accomplish this task and to limit computational time in an operational implementation of LEPS, we perform a cluster analysis of ECMWF-EPS runs. Starting from the 51 members that form the ECMWF-EPS, we generate five clusters. For each cluster a representative member is selected and used to provide initial and dynamic boundary conditions to RAMS, whose integrations generate LEPS. RAMS runs have 12-km horizontal resolution. To analyze the impact of enhanced horizontal resolution on quantitative precipitation forecasts, LEPS forecasts are compared to a full Brute Force (BF) ensemble. This ensemble is based on RAMS, has 36-km horizontal resolution and is generated by 51 members, nested in each ECMWF-EPS member. LEPS and BF results are compared subjectively and by objective scores. Subjective analysis is based on precipitation and probability maps of case studies, whereas objective analysis is made by deterministic and probabilistic scores. Scores and maps are calculated by comparing ensemble precipitation forecasts against reports from the Calabria regional raingauge network. Results show that LEPS provided better rainfall predictions than BF for all case studies selected. This strongly suggests the importance of the enhanced horizontal resolution, compared to ensemble population, for Calabria for these cases. To further explore the impact of local physiographic features on QPF (Quantitative Precipitation Forecasting), LEPS results are also compared with a 6-km horizontal resolution deterministic forecast. Due to local and mesoscale forcing, the high resolution forecast (Hi-Res) has better performance compared to the ensemble mean for rainfall thresholds larger than 10 mm but it tends to overestimate precipitation for lower amounts. This yields larger false alarms that have a detrimental effect on objective scores for lower thresholds. To exploit the advantages of a probabilistic forecast compared to a deterministic one, the relation between the ECMWF-EPS 700 hPa geopotential height spread and LEPS performance is analyzed.
Results are promising even if additional studies are required.
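The cluster-then-nest step at the heart of this LEPS setup has a simple computational core: group the 51 global ensemble members by similarity and, for each cluster, pick the member closest to the cluster centre as the representative to be downscaled. The sketch below applies k-means from scikit-learn to synthetic member fields; it is a generic illustration of that step under assumed data, not the clustering algorithm actually used in the study.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic stand-in for 51 EPS members: each member is a flattened coarse field
# (e.g. geopotential height over the target domain), generated around 5 latent regimes.
n_members, n_gridpoints, n_clusters = 51, 200, 5
regimes = rng.normal(size=(n_clusters, n_gridpoints))
members = (regimes[rng.integers(0, n_clusters, size=n_members)]
           + 0.3 * rng.normal(size=(n_members, n_gridpoints)))

kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(members)

representatives = []
for k in range(n_clusters):
    idx = np.where(kmeans.labels_ == k)[0]
    # Representative member = the member closest to its cluster centroid.
    dists = np.linalg.norm(members[idx] - kmeans.cluster_centers_[k], axis=1)
    representatives.append(int(idx[np.argmin(dists)]))

print("cluster sizes:", np.bincount(kmeans.labels_, minlength=n_clusters))
print("representative member indices:", representatives)
```

Only the representative members would then be run through the high-resolution limited-area model, which is what keeps the operational cost manageable.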
Probabilistic empirical prediction of seasonal climate: evaluation and potential applications

NASA Astrophysics Data System (ADS)

Dieppois, B.; Eden, J.; van Oldenborgh, G. J.

2017-12-01

Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a new evaluation of an established empirical system used to predict seasonal climate across the globe. Forecasts for surface air temperature, precipitation and sea level pressure are produced by the KNMI Probabilistic Empirical Prediction (K-PREP) system every month and disseminated via the KNMI Climate Explorer (climexp.knmi.nl). K-PREP is based on multiple linear regression and built on physical principles to the fullest extent, with predictive information taken from the global CO2-equivalent concentration, large-scale modes of variability in the climate system and regional-scale information. K-PREP seasonal forecasts for the period 1981-2016 will be compared with corresponding dynamically generated forecasts produced by operational forecast systems. While there are many regions of the world where empirical forecast skill is extremely limited, several areas are identified where K-PREP offers comparable skill to dynamical systems. We discuss two key points in the future development and application of the K-PREP system: (a) the potential for K-PREP to provide a more useful basis for reference forecasts than those based on persistence or climatology, and (b) the added value of including K-PREP forecast information in multi-model forecast products, at least for known regions of good skill. We also discuss the potential development of stakeholder-driven applications of the K-PREP system, including empirical forecasts for circumboreal fire activity.

Ash-plume dynamics and eruption source parameters by infrasound and thermal imagery: The 2010 Eyjafjallajökull eruption

NASA Astrophysics Data System (ADS)

Ripepe, M.; Bonadonna, C.; Folch, A.; Delle Donne, D.; Lacanna, G.; Marchetti, E.; Höskuldsson, A.

2013-03-01

During operational ash-cloud forecasting, prediction of ash concentration and total erupted mass directly depends on the determination of mass eruption rate (MER), which is typically inferred from plume height. Uncertainties for plume heights are large, especially for bent-over plumes in which the ascent dynamics are strongly affected by the surrounding wind field. Here we show how uncertainties can be reduced if MER is derived directly from geophysical observations of source dynamics. The combination of infrasound measurements and thermal camera imagery allows for the infrasonic type of source to be constrained (a dipole in this case) and for the plume exit velocity to be calculated (54-142 m/s) based on the acoustic signal recorded during the 2010 Eyjafjallajökull eruption from 4 to 21 May. Exit velocities are converted into MER using additional information on vent diameter (50±10 m) and mixture density (5.4±1.1 kg/m3), resulting in an average MER of ∼9×10⁵ kg/s during the considered period of the eruption. We validate our acoustic-derived MER by using independent measurements of plume heights (Icelandic Meteorological Office radar observations). Acoustically derived MER are converted into plume heights using field-based relationships and a 1D radially averaged buoyant plume theory model using a reconstructed total grain size distribution. We conclude that the use of infrasonic monitoring may lead to important understanding of the plume dynamics and allows for real-time determination of eruption source parameters. This could improve substantially the forecasting of volcano-related hazards, with important implications for civil aviation safety.
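The headline conversion in the Eyjafjallajökull study is easy to verify: mass eruption rate is mixture density times exit velocity times vent cross-sectional area. The sketch below recomputes the quoted figures from the stated vent diameter, mixture density, and exit-velocity range; ignoring the quoted uncertainties and picking a mid-range velocity are simplifications for illustration.

```python
import numpy as np

def mass_eruption_rate(density, velocity, vent_diameter):
    """MER (kg/s) = mixture density (kg/m^3) * exit velocity (m/s) * vent cross-section (m^2)."""
    area = np.pi * (vent_diameter / 2.0) ** 2
    return density * velocity * area

density = 5.4                                  # kg/m^3, quoted mixture density
vent_diameter = 50.0                           # m, quoted vent diameter
velocities = np.array([54.0, 85.0, 142.0])     # m/s: quoted range plus a mid-range value

for v, mer in zip(velocities, mass_eruption_rate(density, velocities, vent_diameter)):
    print(f"exit velocity {v:6.1f} m/s -> MER = {mer:.2e} kg/s")
# A mid-range exit velocity of ~85 m/s reproduces the quoted average MER of ~9e5 kg/s.
```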
Sequential assimilation of volcanic monitoring data to quantify eruption potential: Application to Kerinci volcano

NASA Astrophysics Data System (ADS)

Zhan, Yan; Gregg, Patricia M.; Chaussard, Estelle; Aoki, Yosuke

2017-12-01

Quantifying the eruption potential of a restless volcano requires the ability to model parameters such as overpressure and calculate the host rock stress state as the system evolves. A critical challenge is developing a model-data fusion framework to take advantage of observational data and provide updates of the volcanic system through time. The Ensemble Kalman Filter (EnKF) uses a Monte Carlo approach to assimilate volcanic monitoring data and update models of volcanic unrest, providing time-varying estimates of overpressure and stress. Although the EnKF has been proven effective to forecast volcanic deformation using synthetic InSAR and GPS data, until now, it has not been applied to assimilate data from an active volcanic system. In this investigation, the EnKF is used to provide a “hindcast” of the 2009 explosive eruption of Kerinci volcano, Indonesia. A two-source analytical model is used to simulate the surface deformation of Kerinci volcano observed by InSAR time-series data and to predict the system evolution. A deep, deflating dike-like source reproduces the subsiding signal on the flanks of the volcano, and a shallow spherical McTigue source reproduces the central uplift. EnKF-predicted parameters are used in finite element models to calculate the host-rock stress state prior to the 2009 eruption. Mohr-Coulomb failure models reveal that the shallow magma reservoir was trending towards tensile failure prior to 2009, which may have been the catalyst for the 2009 eruption. Our results illustrate that the EnKF shows significant promise for future applications to forecasting the eruption potential of restless volcanoes and hindcasting the triggering mechanisms of observed eruptions.
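The final failure check in that workflow reduces to two textbook criteria: Mohr-Coulomb shear failure when the effective maximum principal stress exceeds the Coulomb strength, and tensile failure when the effective least principal stress exceeds the tensile strength. The sketch below evaluates both for an assumed stress state near a pressurizing reservoir; the cohesion, friction angle, tensile strength, and stresses are invented for illustration and are not the Kerinci model values.

```python
import numpy as np

def failure_state(sigma1, sigma3, pore_pressure, cohesion, friction_angle_deg, tensile_strength):
    """Mohr-Coulomb shear and tensile failure checks (compression positive, stresses in MPa)."""
    phi = np.radians(friction_angle_deg)
    s1 = sigma1 - pore_pressure                    # effective maximum principal stress
    s3 = sigma3 - pore_pressure                    # effective minimum principal stress
    q = (1 + np.sin(phi)) / (1 - np.sin(phi))      # slope of the criterion in (s3, s1) space
    C0 = 2 * cohesion * np.cos(phi) / (1 - np.sin(phi))   # unconfined compressive strength
    shear_failure = s1 >= C0 + q * s3
    tensile_failure = s3 <= -tensile_strength
    return shear_failure, tensile_failure

# Invented host-rock stress state near a pressurizing shallow reservoir.
for overpressure in (0.0, 5.0, 10.0):              # MPa added to the ambient pore pressure
    shear, tensile = failure_state(sigma1=40.0, sigma3=12.0,
                                   pore_pressure=10.0 + overpressure,
                                   cohesion=10.0, friction_angle_deg=30.0,
                                   tensile_strength=5.0)
    print(f"overpressure {overpressure:4.1f} MPa -> shear failure: {bool(shear)}, "
          f"tensile failure: {bool(tensile)}")
```

In this toy setup, increasing overpressure drives the effective stresses toward both criteria, which mirrors the qualitative trend the abstract describes.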
A multidisciplinary approach for high-resolution reconstruction of the eruptive past of La Soufrière (Guadeloupe) over the last 12 000 years: Implications for hazards assessment

NASA Astrophysics Data System (ADS)

Legendre, Yoann; Komorowski, Jean-Christophe; Boudon, Georges

2010-05-01

La Soufrière de Guadeloupe is a dangerous andesitic composite volcano characterized over the last 12 000 years by numerous phreatic eruptions that alternate with few magmatic eruptions, including the last magmatic and best-studied "Soufrière" subplinian eruption in 1530 AD, and unusually numerous flank-collapse events. Field analysis of the deposits provides constraints on the physical input parameters for simple models that provide first-order simulations of eruptive phenomena, from which quantitative probabilistic hazard maps can be elaborated and in which epistemic and aleatory uncertainty can be incorporated and quantified. The study of yesterday's eruptions provides key insights for elaborating realistic simulations and describing potential eruptive scenarios for tomorrow's eruptions. However, hazard assessment is biased towards eruptions of significant magnitude that produce extensive and relatively thick deposits. Nevertheless, eruptions of moderate magnitude, which are often more frequent, can significantly affect vulnerable island communities living at short distances from the vent. However, their deposits are ephemeral in the geologic record on account of intense erosion from tropical rainfall, important soil development and erosion by the emplacement of recurrent pyroclastic density currents, debris avalanches, and mudflows. We have developed a novel approach by using a manual sediment corer to obtain undisturbed sedimentary eruptive archives in sheltered zones on the volcano where a longer eruption record has been preserved. We describe two such cores (6.32 and 6.64 m long) that extend over at least 8700 years and that contain several thin tephra layers missing at the outcrop scale. We combine these new data with the analysis of more than 120 stratigraphic sections on outcrops studied over the last decade to provide a new eruptive chronology for La Soufrière volcano over the last 12 000 years. This chronology is robustly constrained by 105 new 14C age dates of wood, charcoal, and paleosoil samples that complete the existing 14C database (total of about 261 dates). A multidisciplinary analysis (sedimentology, lithology, microtextures, magnetic susceptibility) of the sediment cores and field data has allowed us to identify hidden and missing eruptions, and to re-interpret mis-identified eruptions. For the last 12 000 years we have identified at least 5 distinct new pumice fallout deposits, some of which are associated with pumice pyroclastic flow deposits. We also identified several deposits formed by magmatic turbulent pyroclastic density currents (blasts), mostly associated with flank-collapse events. Thus, the number of Holocene magmatic eruptions has significantly increased compared to previous knowledge. Moreover, we have identified eruptive sequences that consist of a diverse range of phenomena including edifice collapse, associated laterally directed explosions (blasts), pumice fallout with column collapse, and dome growth similar to the AD 1530 most recent magmatic eruption. The magmatic eruptive rate could be twice as high as previously thought, with 11-13 magmatic eruptions in 12 000 years, a rate of about 0.92-1.08 magmatic eruptions per 1000 years. These new data will allow a better determination of the recurrence, magnitude, intensity, and the spatio-temporal evolution of deposit types that define different eruptive scenarios. Hence, this high-resolution reconstruction of the eruptive past will provide the basis for an improved probabilistic hazard and risk assessment for La Soufrière of Guadeloupe, a dangerous volcano currently experiencing prolonged unrest since 1992.

Ensemble sea ice forecast for predicting compressive situations in the Baltic Sea

NASA Astrophysics Data System (ADS)

Lehtiranta, Jonni; Lensu, Mikko; Kokkonen, Iiro; Haapala, Jari

2017-04-01

Forecasting of sea ice hazards is important for winter shipping in the Baltic Sea. In current numerical models the ice thickness distribution and drift are captured well, but compressive situations are often missing from forecast products. Their inclusion is requested by the shipping community, as compression poses a threat to ship operations; compressing ice is capable of stopping ships for days and even damaging them, so its inclusion in ice forecasts is vital. However, we have found that compression cannot be predicted well in a deterministic forecast, since it can be a local and quickly changing phenomenon.
It is also very sensitive to small changes in the wind speed and direction, the prevailing ice conditions, and the model parameters. Thus, a probabilistic ensemble simulation is needed to produce a meaningful compression forecast. An ensemble model setup was developed in the SafeWIN project for this purpose. It uses the HELMI multicategory ice model, which was amended for making simulations in parallel. The ensemble was built by perturbing the atmospheric forcing and the physical parameters of the ice pack. The model setup will provide probabilistic forecasts of compression in the Baltic Sea ice. Additionally, the model setup provides insight into the uncertainties related to different model parameters and their impact on the model results. We have completed several hindcast simulations for the Baltic Sea for verification purposes. These results are shown to match compression reports gathered from ships. In addition, an ensemble forecast is in a preoperational testing phase and its first evaluation will be presented in this work.

Incorporating probabilistic seasonal climate forecasts into river management using a risk-based framework

USGS Publications Warehouse

Sojda, Richard S.; Towler, Erin; Roberts, Mike; Rajagopalan, Balaji

2013-01-01

Despite the influence of hydroclimate on river ecosystems, most efforts to date have focused on using climate information to predict streamflow for water supply. However, as water demands intensify and river systems are increasingly stressed, research is needed to explicitly integrate climate into streamflow forecasts that are relevant to river ecosystem management. To this end, we present a five-step risk-based framework: (1) define risk tolerance, (2) develop a streamflow forecast model, (3) generate climate forecast ensembles, (4) estimate streamflow ensembles and associated risk, and (5) manage for climate risk. The framework is successfully demonstrated for an unregulated watershed in southwest Montana, where the combination of recent drought and water withdrawals has made it challenging to maintain flows needed for healthy fisheries. We put forth a generalized linear modeling (GLM) approach to develop a suite of tools that skillfully model decision-relevant low flow characteristics in terms of climate predictors. Probabilistic precipitation forecasts are used in conjunction with the GLMs, resulting in season-ahead prediction ensembles that provide the full risk profile. These tools are embedded in an end-to-end risk management framework that directly supports proactive fish conservation efforts. Results show that the use of forecasts can be beneficial to planning, especially in wet years, but historical precipitation forecasts are quite conservative (i.e., not very “sharp”). Synthetic forecasts show that a modest “sharpening” can strongly impact risk and improve skill.
We emphasize that use in management depends on defining relevant environmental flows and risk tolerance, requiring local stakeholder involvement.

Operational value of ensemble streamflow forecasts for hydropower production: A Canadian case study

NASA Astrophysics Data System (ADS)

Boucher, Marie-Amélie; Tremblay, Denis; Perreault, Luc; Anctil, François

2010-05-01

Ensemble and probabilistic forecasts have many advantages over deterministic ones, both in meteorology and hydrology (e.g. Krzysztofowicz, 2001). Mainly, they inform the user on the uncertainty linked to the forecast. It has been brought to attention that such additional information could lead to improved decision making (e.g. Wilks and Hamill, 1995; Mylne, 2002; Roulin, 2007), but very few studies concentrate on operational situations involving the use of such forecasts. In addition, many authors have demonstrated that ensemble forecasts outperform deterministic forecasts in terms of performance (e.g. Jaun et al., 2005; Velazquez et al., 2009; Laio and Tamea, 2007). However, such performance is mostly assessed on the basis of numerical scoring rules, which compare the forecasts to the observations, and seldom in terms of management gains. The proposed case study adopts an operational point of view, on the basis that a novel forecasting system has value only if it leads to increased monetary and societal gains (e.g. Murphy, 1994; Laio and Tamea, 2007). More specifically, Environment Canada operational ensemble precipitation forecasts are used to drive the HYDROTEL distributed hydrological model (Fortin et al., 1995), calibrated on the Gatineau watershed located in Québec, Canada. The resulting hydrological ensemble forecasts are then incorporated into Hydro-Québec's SOHO stochastic management optimization tool, which automatically searches for optimal operation decisions for all reservoirs and hydropower plants located on the basin. The timeline of the study is the fall season of year 2003. This period is especially relevant because of heavy precipitation that nearly caused a major spill and forced the preventive evacuation of a portion of the population located near one of the dams. We show that the use of the ensemble forecasts would have reduced the occurrence of spills and flooding, which is of particular importance for dams located in populous areas, and increased hydropower production. The ensemble precipitation forecasts extend from March 1st of 2002 to December 31st of 2003. They were obtained using two atmospheric models, SEF (8 members plus the control deterministic forecast) and GEM (8 members). The corresponding deterministic precipitation forecast issued by the SEF model is also used within HYDROTEL in order to compare ensemble streamflow forecasts with their deterministic counterparts. Although this study does not incorporate all the sources of uncertainty, precipitation is certainly the most important input for hydrological modeling and conveys a great portion of the total uncertainty.

References:
Fortin, J.P., Moussa, R., Bocquillon, C. and Villeneuve, J.P. 1995: HYDROTEL, un modèle hydrologique distribué pouvant bénéficier des données fournies par la télédétection et les systèmes d'information géographique, Revue des Sciences de l'Eau, 8(1), 94-124.
Jaun, S., Ahrens, B., Walser, A., Ewen, T. and Schaer, C. 2008: A probabilistic view on the August 2005 floods in the upper Rhine catchment, Natural Hazards and Earth System Sciences, 8(2), 281-291.
Krzysztofowicz, R. 2001: The case for probabilistic forecasting in hydrology, Journal of Hydrology, 249, 2-9.
Murphy, A.H. 1994: Assessing the economic value of weather forecasts: An overview of methods, results and issues, Meteorological Applications, 1, 69-73.
Mylne, K.R. 2002: Decision-making from probability forecasts based on forecast value, Meteorological Applications, 9, 307-315.
Laio, F. and Tamea, S. 2007: Verification tools for probabilistic forecasts of continuous hydrological variables, Hydrology and Earth System Sciences, 11, 1267-1277.
Roulin, E. 2007: Skill and relative economic value of medium-range hydrological ensemble predictions, Hydrology and Earth System Sciences, 11, 725-737.
Velazquez, J.-A., Petit, T., Lavoie, A., Boucher, M.-A., Turcotte, R., Fortin, V. and Anctil, F. 2009: An evaluation of the Canadian global meteorological ensemble prediction system for short-term hydrological forecasting, Hydrology and Earth System Sciences, 13(11), 2221-2231.
Wilks, D.S. and Hamill, T.M. 1995: Potential economic value of ensemble-based surface weather forecasts, Monthly Weather Review, 123(12), 3565-3575.

Resolution of Probabilistic Weather Forecasts with Application in Disease Management

PubMed

Hughes, G.; McRoberts, N.; Burnett, F. J.

2017-02-01

Predictive systems in disease management often incorporate weather data among the disease risk factors, and sometimes this comes in the form of forecast weather data rather than observed weather data. In such cases, it is useful to have an evaluation of the operational weather forecast, in addition to the evaluation of the disease forecasts provided by the predictive system. Typically, weather forecasts and disease forecasts are evaluated using different methodologies. However, the information theoretic quantity expected mutual information provides a basis for evaluating both kinds of forecast. Expected mutual information is an appropriate metric for the average performance of a predictive system over a set of forecasts.
Both relative entropy (a divergence, measuring information gain) and specific information (an entropy difference, measuring change in uncertainty) provide a basis for the assessment of individual forecasts.
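The two single-forecast quantities named here are straightforward to compute for a binary event. The sketch below evaluates the relative entropy (the Kullback-Leibler divergence of the issued forecast from the climatological base rate) and the specific information (the change in entropy) for a few illustrative forecast probabilities; the base rate and forecast values are invented.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a binary distribution with event probability p."""
    q = np.array([p, 1.0 - p])
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

def relative_entropy(p_forecast, p_climatology):
    """Kullback-Leibler divergence (bits) of the forecast distribution from climatology."""
    p = np.array([p_forecast, 1.0 - p_forecast])
    q = np.array([p_climatology, 1.0 - p_climatology])
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

base_rate = 0.2                      # climatological probability of the event (invented)
for p in (0.2, 0.6, 0.9):            # issued forecast probabilities (invented)
    dkl = relative_entropy(p, base_rate)
    spec_info = entropy(base_rate) - entropy(p)
    print(f"forecast p={p:.1f}  information gain={dkl:.3f} bits  "
          f"specific information={spec_info:+.3f} bits")
```

Note that the specific information can be negative (a forecast can increase uncertainty relative to climatology), whereas the relative entropy is always non-negative.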
Global integrated drought monitoring and prediction system

PubMed Central

Hao, Zengchao; AghaKouchak, Amir; Nakhjiri, Navid; Farahmand, Alireza

2014-01-01

Drought is by far the most costly natural disaster that can lead to widespread impacts, including water and food crises. Here we present data sets available from the Global Integrated Drought Monitoring and Prediction System (GIDMaPS), which provides drought information based on multiple drought indicators. The system provides meteorological and agricultural drought information based on multiple satellite- and model-based precipitation and soil moisture data sets. GIDMaPS includes a near real-time monitoring component and a seasonal probabilistic prediction module. The data sets include historical drought severity data from the monitoring component and probabilistic seasonal forecasts from the prediction module. The probabilistic forecasts provide essential information for early warning, taking preventive measures, and planning mitigation strategies. GIDMaPS data sets are a significant extension to current capabilities and data sets for global drought assessment and early warning. The presented data sets would be instrumental in reducing drought impacts, especially in developing countries. Our results indicate that GIDMaPS data sets reliably captured several major droughts from across the globe. PMID:25977759
A hydro-meteorological ensemble prediction system for real-time flood forecasting purposes in the Milano area

NASA Astrophysics Data System (ADS)

Ravazzani, Giovanni; Amengual, Arnau; Ceppi, Alessandro; Romero, Romualdo; Homar, Victor; Mancini, Marco

2015-04-01

Analysis of forecasting strategies that can provide a tangible basis for flood early warning procedures and mitigation measures over the Western Mediterranean region is one of the fundamental motivations of the European HyMeX programme. Here, we examine a set of hydro-meteorological episodes that affected the Milano urban area for which the complex flood protection system of the city did not completely succeed against the flash floods that occurred. Indeed, flood damages have increased exponentially in the area during the last 60 years, due to industrial and urban developments. Thus, the improvement of the Milano flood control system needs a synergism between structural and non-structural approaches. The flood forecasting system tested in this work comprises the Flash-flood Event-based Spatially distributed rainfall-runoff Transformation, including Water Balance (FEST-WB) and the Weather Research and Forecasting (WRF) models, in order to provide a hydrological ensemble prediction system (HEPS). Deterministic and probabilistic quantitative precipitation forecasts (QPFs) have been provided by the WRF model in a set of 48-hour experiments. HEPS has been generated by combining different physical parameterizations (i.e. cloud microphysics, moist convection and boundary-layer schemes) of the WRF model in order to better encompass the atmospheric processes leading to high precipitation amounts. We have been able to test the value of a probabilistic versus a deterministic framework when driving Quantitative Discharge Forecasts (QDFs). Results highlight (i) the benefits of using a high-resolution HEPS in conveying uncertainties for this complex orographic area and (ii) a better simulation of most of the extreme precipitation events, potentially enabling valuable probabilistic QDFs. Hence, the HEPS copes with the significant deficiencies found in the deterministic QPFs. These shortcomings would prevent correct forecasts of the location and timing of high precipitation rates and total amounts at the catchment scale, thus impacting heavily the deterministic QDFs.
In contrast, early warnings would have been possible within a HEPS context for the Milano area, proving the suitability of such a system for civil protection purposes.

Adjusting particle-size distributions to account for aggregation in tephra-deposit model forecasts

USGS Publications Warehouse

Mastin, Larry G.; Van Eaton, Alexa; Durant, A. J.

2016-01-01

Volcanic ash transport and dispersion (VATD) models are used to forecast tephra deposition during volcanic eruptions. Model accuracy is limited by the fact that fine ash aggregates (clumps into clusters), thus altering patterns of deposition. In most models this is accounted for by ad hoc changes to model input, representing fine ash as aggregates with density ρagg and a log-normal size distribution with median μagg and standard deviation σagg. Optimal values may vary between eruptions. To test the variance, we used the Ash3d tephra model to simulate four deposits: 18 May 1980 Mount St. Helens; 16-17 September 1992 Crater Peak (Mount Spurr); 17 June 1996 Ruapehu; and 23 March 2009 Mount Redoubt. In 192 simulations, we systematically varied μagg and σagg, holding ρagg constant at 600 kg m−3. We evaluated the fit using three indices that compare modeled versus measured (1) mass load at sample locations; (2) mass load versus distance along the dispersal axis; and (3) isomass area. For all deposits, under these inputs, the best-fit value of μagg ranged narrowly between ∼2.3 and 2.7φ (0.20-0.15 mm), despite large variations in erupted mass (0.25-50 Tg), plume height (8.5-25 km), mass fraction of fine (<0.063 mm) ash (3-59%), atmospheric temperature, and water content between these eruptions. This close agreement suggests that aggregation may be treated as a discrete process that is insensitive to eruptive style or magnitude. This result offers the potential for a simple, computationally efficient parameterization scheme for use in operational model forecasts. Further research may indicate whether this narrow range also reflects physical constraints on processes in the evolving cloud.

Using Analog Ensemble to generate spatially downscaled probabilistic wind power forecasts

NASA Astrophysics Data System (ADS)

Delle Monache, L.; Shahriari, M.; Cervone, G.

2017-12-01

We use the Analog Ensemble (AnEn) method to generate probabilistic 80-m wind power forecasts. We use data from the NCEP GFS (∼28 km resolution) and NCEP NAM (12 km resolution). We use forecast data from NAM and GFS, and analysis data from NAM, which enables us to: 1) use a lower-resolution model to create higher-resolution forecasts, and 2) use a higher-resolution model to create higher-resolution forecasts. The former essentially increases computing speed and the latter increases forecast accuracy.
An aggregated model of the former can be compared against the latter to measure the accuracy of the AnEn spatial downscaling. The AnEn works by taking a deterministic future forecast and comparing it with past forecasts. The model searches for the best matching estimates within the past forecasts and selects the predictand value corresponding to these past forecasts as the ensemble prediction for the future forecast. Our study is based on predicting wind speed and air density at more than 13,000 grid points in the continental US. We run the AnEn model twice: 1) estimating 80-m wind speed by using predictor variables such as temperature, pressure, geopotential height, U-component and V-component of wind, 2) estimating air density by using predictors such as temperature, pressure, and relative humidity. We use the air density values to correct the standard wind power curves for different values of air density. The standard deviation of the ensemble members (i.e. ensemble spread) will be used as the degree of difficulty to predict wind power at different locations. The value of the correlation coefficient between the ensemble spread and the forecast error determines the appropriateness of this measure. This measure is prominent for wind farm developers as building wind farms in regions with higher predictability will reduce the real-time risks of operating in the electricity markets.
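The analog search that the abstract describes can be written compactly: for a new deterministic forecast, rank the archived forecasts by a distance over the predictor variables and return the observations that verified the closest matches as the ensemble. The sketch below is a minimal single-station, single-lead-time illustration with synthetic data and an unweighted, spread-normalized distance; it is not the operational AnEn configuration used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def analog_ensemble(new_forecast, past_forecasts, past_observations, k=20, weights=None):
    """Return the k verifying observations whose past forecasts best match the new forecast.

    new_forecast      : (n_predictors,) current deterministic forecast
    past_forecasts    : (n_history, n_predictors) archive of past forecasts
    past_observations : (n_history,) observations that verified the archived forecasts
    """
    if weights is None:
        weights = np.ones(past_forecasts.shape[1])
    scale = past_forecasts.std(axis=0)             # normalize predictors by historical spread
    d = np.sqrt(np.sum(weights * ((past_forecasts - new_forecast) / scale) ** 2, axis=1))
    return past_observations[np.argsort(d)[:k]]

# Synthetic archive: predictors = (forecast 80-m wind speed, forecast temperature).
n_hist = 2000
past_forecasts = np.column_stack([rng.gamma(2.0, 4.0, n_hist), rng.normal(10.0, 8.0, n_hist)])
past_observations = 0.9 * past_forecasts[:, 0] + rng.normal(0.0, 1.5, n_hist)  # "observed" wind

new_forecast = np.array([9.0, 5.0])
ens = analog_ensemble(new_forecast, past_forecasts, past_observations, k=20)
print(f"analog ensemble mean {ens.mean():.1f} m/s, spread {ens.std():.1f} m/s")
print("10th-90th percentile: {:.1f}-{:.1f} m/s".format(*np.percentile(ens, [10, 90])))
```

The spread of the returned members is what the abstract proposes to use as a site-by-site measure of predictability.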
Tephra Fallout Hazard Assessment for VEI5 Plinian Eruption at Kuju Volcano, Japan, Using TEPHRA2

NASA Astrophysics Data System (ADS)

Tsuji, Tomohiro; Ikeda, Michiharu; Kishimoto, Hiroshi; Fujita, Koji; Nishizaka, Naoki; Onishi, Kozo

2017-06-01

Tephra fallout has a potential impact on engineered structures and systems at nuclear power plants. We provide the first report estimating potential accumulations of tephra fallout from an eruption as large as VEI 5 at Kuju Volcano, and we calculate hazard curves at the Ikata Power Plant using the TEPHRA2 computer program. We reconstructed the eruptive parameters of the Kj-P1 tephra fallout deposit based on geological survey and literature review. A series of parameter studies were carried out to determine the best values of empirical parameters, such as the diffusion coefficient and the fall time threshold. Based on such a reconstruction, we present probabilistic analyses which assess the variation in meteorological conditions, using wind profiles extracted from a 22-year-long wind dataset. The obtained hazard curves and probability maps of tephra fallout associated with a Plinian eruption were used to discuss the exceedance probability at the site and the implications of such a severe eruption scenario.
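The hazard-curve step itself is simple once a tephra load has been simulated at the site for each wind profile: the curve is the empirical probability that the load exceeds each threshold across all simulations. The sketch below builds such a curve from synthetic loads standing in for TEPHRA2 output; the lognormal generator and the thresholds are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for simulated tephra loads (kg/m^2) at one site, one value per wind profile
# (e.g. one TEPHRA2 run per profile of a multi-year wind archive). Lognormal is illustrative.
loads = rng.lognormal(mean=2.0, sigma=1.2, size=8000)

def exceedance_curve(loads, thresholds):
    """Empirical probability that the simulated load exceeds each threshold."""
    loads = np.asarray(loads)
    return np.array([(loads > t).mean() for t in thresholds])

thresholds = np.array([1.0, 10.0, 30.0, 100.0, 300.0])   # kg/m^2
for t, p in zip(thresholds, exceedance_curve(loads, thresholds)):
    print(f"P(load > {t:5.0f} kg/m^2 | eruption) = {p:.3f}")
# These probabilities are conditional on the eruption occurring; multiplying by an annual
# eruption probability would express the curve as an annual exceedance rate.
```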
The state of the art of flood forecasting - Hydrological Ensemble Prediction Systems

NASA Astrophysics Data System (ADS)

Thielen-Del Pozo, J.; Pappenberger, F.; Salamon, P.; Bogner, K.; Burek, P.; de Roo, A.

2010-09-01

Flood forecasting systems form a key part of 'preparedness' strategies for disastrous floods and provide hydrological services, civil protection authorities and the public with information on upcoming events. Provided the warning lead time is sufficiently long, adequate preparatory actions can be taken to efficiently reduce the impacts of the flooding. Because of the specific characteristics of each catchment, varying data availability and end-user demands, the design of the best flood forecasting system may differ from catchment to catchment. However, despite the differences in concept and data needs, there is one underlying issue that spans across all systems. There has been a growing awareness and acceptance that uncertainty is a fundamental issue of flood forecasting and needs to be dealt with at the different spatial and temporal scales as well as the different stages of the flood generating processes. Today, operational flood forecasting centres change increasingly from single deterministic forecasts to probabilistic forecasts with various representations of the different contributions of uncertainty. The move towards these so-called Hydrological Ensemble Prediction Systems (HEPS) in flood forecasting represents the state of the art in forecasting science, following on the success of the use of ensembles for weather forecasting (Buizza et al., 2005) and paralleling the move towards ensemble forecasting in other related disciplines such as climate change predictions. The use of HEPS has been internationally fostered by initiatives such as "The Hydrologic Ensemble Prediction Experiment" (HEPEX), created with the aim to investigate how best to produce, communicate and use hydrologic ensemble forecasts in hydrological short-, medium- and long-term prediction of hydrological processes. The advantages of quantifying the different contributions of uncertainty as well as the overall uncertainty to obtain reliable and useful flood forecasts, also for extreme events, have become evident. However, despite the demonstrated advantages, worldwide the incorporation of HEPS in operational flood forecasting is still limited. The applicability of HEPS for smaller river basins was tested in MAP D-Phase, an acronym for "Demonstration of Probabilistic Hydrological and Atmospheric Simulation of flood Events in the Alpine region", which was launched in 2005 as a Forecast Demonstration Project of the World Weather Research Programme of WMO, and entered a pre-operational and still active testing phase in 2007. In Europe, a comparatively high number of EPS-driven systems for medium-large rivers exist. National flood forecasting centres of Sweden, Finland and the Netherlands have already implemented HEPS in their operational forecasting chain, while in other countries including France, Germany, the Czech Republic and Hungary, hybrids or experimental chains have been installed. As an example of HEPS, the European Flood Alert System (EFAS) is presented. EFAS provides medium-range probabilistic flood forecasting information for large trans-national river basins. It incorporates multiple sets of weather forecasts including different types of EPS and deterministic forecasts from different providers. EFAS products are evaluated and visualised as exceedance of critical levels only - both in the form of maps and time series. The different sources of uncertainty and their impact on the flood forecasting performance for every grid cell have been tested offline but not yet incorporated operationally into the forecasting chain for computational reasons. However, at stations where real-time discharges are available, a hydrological uncertainty processor is being applied to estimate the total predictive uncertainty from the hydrological and input uncertainties. Research on long-term EFAS results has shown the need for complementing statistical analysis with case studies, for which examples will be shown.

HEPS4Power - Extended-range Hydrometeorological Ensemble Predictions for Improved Hydropower Operations and Revenues

NASA Astrophysics Data System (ADS)

Bogner, Konrad; Monhart, Samuel; Liniger, Mark; Spririg, Christoph; Jordan, Fred; Zappa, Massimiliano

2015-04-01

In recent years large progress has been achieved in the operational prediction of floods and hydrological droughts with lead times of up to ten days. Both the public and the private sectors are currently using probabilistic runoff forecasts to monitor water resources and to take action when critical conditions are to be expected. The use of extended-range predictions with lead times exceeding 10 days is not yet established. The hydropower sector in particular might benefit greatly from using hydro-meteorological forecasts for the next 15 to 60 days in order to optimize the operations and the revenues from their watersheds, dams, captions, turbines and pumps. The new Swiss Competence Centers in Energy Research (SCCER) target boosting research related to energy issues in Switzerland. The objective of HEPS4POWER is to demonstrate that operational extended-range hydro-meteorological forecasts have the potential to become very valuable tools for fine-tuning the production of energy from hydropower systems. The project team covers a specific system-oriented value chain starting from the collection and forecast of meteorological data (MeteoSwiss), leading to the operational application of state-of-the-art hydrological models (WSL) and terminating with the experience in data presentation and power production forecasts for end-users (e-dric.ch). The first task of HEPS4POWER will be the downscaling and post-processing of ensemble extended-range meteorological forecasts (EPS).
HEPS4Power - Extended-range Hydrometeorological Ensemble Predictions for Improved Hydropower Operations and Revenues

NASA Astrophysics Data System (ADS)

Bogner, Konrad; Monhart, Samuel; Liniger, Mark; Spririg, Christoph; Jordan, Fred; Zappa, Massimiliano

2015-04-01

In recent years, major progress has been achieved in the operational prediction of floods and hydrological droughts with up to ten days of lead time. Both the public and private sectors currently use probabilistic runoff forecasts to monitor water resources and take action when critical conditions are expected. The use of extended-range predictions with lead times exceeding ten days is not yet established. The hydropower sector in particular could benefit greatly from hydrometeorological forecasts for the next 15 to 60 days in order to optimize the operations and revenues of its watersheds, dams, water intakes, turbines and pumps. The new Swiss Competence Centers in Energy Research (SCCER) aim to boost research related to energy issues in Switzerland. The objective of HEPS4Power is to demonstrate that operational extended-range hydrometeorological forecasts have the potential to become very valuable tools for fine-tuning energy production from hydropower systems. The project team covers a specific system-oriented value chain, starting from the collection and forecasting of meteorological data (MeteoSwiss), leading to the operational application of state-of-the-art hydrological models (WSL) and ending with experience in data presentation and power-production forecasts for end-users (e-dric.ch). The first task of HEPS4Power will be the downscaling and post-processing of extended-range ensemble meteorological forecasts (EPS). The goal is to provide well-tailored probabilistic forecasts that are statistically reliable and localized at catchment or even station level. The hydrology-related task will consist of feeding the post-processed meteorological forecasts into a HEPS using a multi-model approach, implementing models of different complexity. For the hydrological ensemble predictions too, post-processing techniques need to be tested in order to improve the quality of the forecasts against observed discharge. The analysis should be specifically oriented towards maximising hydroelectricity production; thus, verification metrics should include economic measures such as cost-loss approaches. The final step will include the transfer of the HEPS system to several hydropower systems, the connection with energy market prices and the development of probabilistic multi-reservoir production and management optimization guidelines. The baseline model chain yielding three-day forecasts, established for a hydropower system in southern Switzerland, will be presented alongside the work plan to achieve seasonal ensemble predictions.

Assessment of the long-lead probabilistic prediction for the Asian summer monsoon precipitation (1983-2011) based on the APCC multimodel system and a statistical model

NASA Astrophysics Data System (ADS)

Sohn, Soo-Jin; Min, Young-Mi; Lee, June-Yi; Tam, Chi-Yung; Kang, In-Sik; Wang, Bin; Ahn, Joong-Bae; Yamagata, Toshio

2012-02-01

The performance of the probabilistic multimodel prediction (PMMP) system of the APEC Climate Center (APCC) in predicting the Asian summer monsoon (ASM) precipitation at a four-month lead (with February initial conditions) was compared with that of a statistical model, using hindcast data for 1983-2005 and real-time forecasts for 2006-2011. Particular attention was paid to probabilistic precipitation forecasts for the boreal summer after the mature phase of the El Niño-Southern Oscillation (ENSO). Taking into account the fact that the coupled models' skill for boreal spring and summer precipitation comes mainly from their ability to capture the ENSO teleconnection, we developed the statistical model using linear regression with the preceding winter ENSO condition as the predictor. Our results reveal several advantages and disadvantages in both forecast systems. First, the PMMP appears to have higher skill for both above- and below-normal categories in the six-year real-time forecast period, whereas the cross-validated statistical model has higher skill during the 23-year hindcast period. This implies that the cross-validated statistical skill may be overestimated. Second, the PMMP is the better tool for capturing atypical (non-canonical) ENSO-related teleconnections, which affected the ASM precipitation during the early 1990s and in the recent decade. Third, the statistical model is more sensitive to the ENSO phase and has an advantage in predicting the ASM precipitation after the mature phase of La Niña.

Cost-Loss Analysis of Ensemble Solar Wind Forecasting: Space Weather Use of Terrestrial Weather Tools

NASA Astrophysics Data System (ADS)

Henley, E. M.; Pope, E. C. D.

2017-12-01

This commentary concerns recent work on solar wind forecasting by Owens and Riley (2017). The approach taken makes effective use of tools commonly used in terrestrial weather, notably the use of a simple model to generate an "ensemble" forecast and the application of a "cost-loss" analysis to the resulting probabilistic information, to explore the benefit of this forecast to users with different risk appetites. This commentary aims to highlight these useful techniques to the wider space weather audience and to briefly discuss the general context of applying terrestrial weather approaches to space weather.
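The cost-loss framework mentioned in this commentary weighs the cost of acting on a probabilistic forecast against the loss avoided if the event occurs. A minimal sketch of the standard decision rule follows; the cost, loss and probability values are hypothetical and the code does not reproduce the analysis of Owens and Riley (2017).

```python
def expected_expense(p_event, cost, loss, act):
    """Expected expense per forecast for a user who either always acts
    (pays `cost` for mitigation) or does not act (risks `loss` with
    probability `p_event`)."""
    return cost if act else p_event * loss

def best_action(p_event, cost, loss):
    """Standard cost-loss rule: mitigate when the forecast probability
    exceeds the user's cost/loss ratio."""
    return p_event > cost / loss

# Hypothetical user: mitigation costs 1 unit, an unmitigated event costs 10.
cost, loss = 1.0, 10.0
for p in (0.02, 0.05, 0.1, 0.3):
    act = best_action(p, cost, loss)
    print(f"P(event)={p:.2f} -> act={act}, "
          f"expense if acting={expected_expense(p, cost, loss, True):.2f}, "
          f"if not acting={expected_expense(p, cost, loss, False):.2f}")
```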
The 2010 explosive eruption of Java's Merapi volcano - A '100-year' event

USGS Publications Warehouse

Surono; Jousset, Philippe; Pallister, John S.; Boichu, Marie; Buongiorno, M. Fabrizia; Budisantoso, Agus; Costa, Fidel; Andreastuti, Supriyati; Prata, Fred; Schneider, David; Clarisse, Lieven; Humaida, Hanik; Sumarti, Sri; Bignami, Christian; Griswold, Julia P.; Carn, Simon A.; Oppenheimer, Clive; Lavigne, Franck

2012-01-01

Merapi volcano (Indonesia) is one of the most active and hazardous volcanoes in the world. It is known for frequent small to moderate eruptions, pyroclastic flows produced by lava-dome collapse, and the large at-risk population settled on and around its flanks. Its usual behavior of the last decades changed abruptly in late October and early November 2010, when the volcano produced its largest and most explosive eruptions in more than a century, displacing at least a third of a million people and claiming nearly 400 lives. Despite the challenges involved in forecasting this 'hundred-year eruption', we show that the magnitude of precursory signals (seismicity, ground deformation, gas emissions) was proportional to the large size and intensity of the eruption. In addition, and for the first time, near-real-time satellite radar imagery played a role equal to that of seismic, geodetic, and gas observations in monitoring eruptive activity during a major volcanic crisis. The Indonesian Center of Volcanology and Geological Hazard Mitigation (CVGHM) issued timely forecasts of the magnitude of the eruption phases, saving 10,000-20,000 lives. In addition to reporting on aspects of the crisis management, we report the first synthesis of scientific observations of the eruption. Our monitoring and petrologic data show that the 2010 eruption was fed by rapid ascent of magma from depths ranging from 5 to 30 km. Magma reached the surface with variable gas content, resulting in alternating explosive and rapid effusive eruptions, and released a total of ~0.44 Tg of SO2. The eruptive behavior also seems related to seismicity along a tectonic fault more than 40 km from the volcano, highlighting both the complex stress pattern of the Merapi region of Java and the role of magmatic pressurization in activating regional faults. We suggest a dynamic triggering of the main explosions on 3 and 4 November by the passing seismic waves generated by regional earthquakes on those days.

Business Planning in the Light of Neuro-fuzzy and Predictive Forecasting

NASA Astrophysics Data System (ADS)

Chakrabarti, Prasun; Basu, Jayanta Kumar; Kim, Tai-Hoon

In this paper we point out gain sensing based on forecasting techniques. We present an idea of neural-network-based gain forecasting. The testing of gain-pattern sequences is also verified using statistical analysis of fuzzy value assignment. The paper further suggests realizing a stable gain condition using K-means clustering from data mining. A new concept of 3D-based gain sensing is pointed out, and the paper discusses what type of trend analysis can be observed for probabilistic gain prediction.

Prediction of the Arctic Oscillation in Boreal Winter by Dynamical Seasonal Forecasting Systems

NASA Technical Reports Server (NTRS)

Kang, Daehyun; Lee, Myong-In; Im, Jungho; Kim, Daehyun; Kim, Hye-Mi; Kang, Hyun-Suk; Shubert, Siegfried D.; Arriba, Albertom; MacLachlan, Craig

2013-01-01

This study assesses the prediction skill for the boreal winter Arctic Oscillation (AO) in state-of-the-art dynamical ensemble prediction systems (EPSs): the UKMO GloSea4, the NCEP CFSv2, and the NASA GEOS-5. Long-term reforecasts made with the EPSs are used to evaluate representations of the AO and to examine skill scores for deterministic and probabilistic forecasts of the AO index. The reforecasts reproduce the observed changes in the large-scale patterns of Northern Hemisphere surface temperature, upper-level wind, and precipitation according to the AO phase. Results demonstrate that all EPSs have better prediction skill than the persistence forecast for lead times of up to three months, suggesting great potential for skillful prediction of the AO and the associated climate anomalies on seasonal time scales. It is also found that the deterministic and probabilistic forecast skill for the AO in the recent period (1997-2010) is higher than in the earlier period (1983-1996).
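The probabilistic skill referred to above is commonly quantified with scores such as the Brier score and its skill score relative to climatology; the abstract does not state which metric was used, so the sketch below is only a generic illustration with made-up forecast probabilities for a positive AO phase.

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and the
    binary outcomes (1 if the event occurred, 0 otherwise)."""
    probs, outcomes = np.asarray(probs, float), np.asarray(outcomes, float)
    return np.mean((probs - outcomes) ** 2)

def brier_skill_score(probs, outcomes, climatology):
    """Skill relative to always forecasting the climatological frequency:
    1 is perfect, 0 matches climatology, negative is worse."""
    bs = brier_score(probs, outcomes)
    bs_ref = brier_score(np.full(len(probs), climatology), outcomes)
    return 1.0 - bs / bs_ref

# Synthetic example: probability that the winter AO index is in its positive phase.
forecast_probs = [0.7, 0.2, 0.6, 0.8, 0.4, 0.3, 0.9, 0.5]
observed       = [1,   0,   1,   1,   0,   0,   1,   1  ]
print("Brier score      :", round(brier_score(forecast_probs, observed), 3))
print("Brier skill score:", round(brier_skill_score(forecast_probs, observed, 0.5), 3))
```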
Recent Progress of Solar Weather Forecasting at NAOC

NASA Astrophysics Data System (ADS)

He, Han; Wang, Huaning; Du, Zhanle; Zhang, Liyun; Huang, Xin; Yan, Yan; Fan, Yuliang; Zhu, Xiaoshuai; Guo, Xiaobo; Dai, Xinghua

The history of solar weather forecasting services at the National Astronomical Observatories, Chinese Academy of Sciences (NAOC) can be traced back to the 1960s. Today, NAOC is the headquarters of the Regional Warning Center of China (RWC-China), one of the members of the International Space Environment Service (ISES), and is responsible for exchanging data, information and space weather forecasts of RWC-China with the other RWCs. The solar weather forecasting services at NAOC cover short-term prediction (within two or three days), medium-term prediction (within several weeks), and long-term prediction (on the time scale of the solar cycle) of solar activity. Most of the short-term prediction research concentrates on solar eruptive phenomena, such as flares, coronal mass ejections (CMEs) and solar proton events, which are the key driving sources of strong space weather disturbances. Based on high-quality observational data from the latest space-based and ground-based solar telescopes, and with the help of artificial intelligence techniques, new numerical models with quantitative analyses and physical considerations are being developed for the prediction of solar eruptive events. 3-D computer simulation technology is being introduced into the operational solar weather service platform to visualize the monitoring of solar activity, the running of the prediction models, and the presentation of the forecasting results. A new-generation operational solar weather monitoring and forecasting system is expected to be constructed at NAOC in the near future.

Abstract on the Effective validation of both new and existing methods for the observation and forecasting of volcanic emissions

NASA Astrophysics Data System (ADS)

Sathnur, Ashwini

2017-04-01

Validation of the existing products of remote sensing instruments. Review Comment Number 1: Ground-based and space-based instruments are available for remote sensing of volcanic eruptions. Review Comment Number 2: Sunlight illuminates the volcanic area, and the spectrum reflected from that area, carrying its image, reaches the satellite; the satellite captures this emitted spectrum and from it calculates the occurrence of a volcanic eruption. Review Comment Number 3: The computation system detects sulphur dioxide and volcanic ash in the emitted spectrum, and the temperature of the volcanic region is also measured. If these inputs indicate a possible eruption, the data are captured by the system for further use and hazard mitigation. Review Comment Number 4: The instrument is particularly important in capturing the volcanogenic signal, and this capture should be carried out at the time of day when the reflected spectrum is best available. Night-time acquisition is not advisable, since the sunlight spectrum is then at its minimum; with no sunlight reflected from the volcanic region, little of the emitted spectrum is captured and data interpretation becomes erroneous. Review Comment Number 5: Adequate area coverage by the spectrometer is mandatory in order to capture the right area of data and derive the occurrence of a volcanic eruption precisely; the coarser the spatial resolution, the larger the region covered but the less precise the data capture, leading to missing details. Review Comment Number 6: Ideal qualities for the remote sensing instrument are a minimum of false positives, cost-free data, minimal bandwidth problems and a rapid communication system. Validation and requirements of the new products of remote sensing instruments: the qualities of the existing products would also be present in the new products, together with newly devised additional qualities required to build an advanced remote sensing instrument. Review Comment Number 1: Improving the spatial resolution so that the plumes of an early-stage eruption are captured, which would require better video and camera facilities on the instrument. Review Comment Number 2: Capturing traces of carbon, carbonic acid and water vapour, in addition to the existing capture of sulphur dioxide and volcanic ash. Review Comment Number 3: Creating an additional module in the instrument for forecasting volcanic eruptions; this forecast model should be able to predict eruptions several months in advance, so that mechanisms for mitigating volcanic hazards can be put in place early. Review Comment Number 4: Creating additional features that enable the automatic transfer of forecast eruptions to the disaster relief operations team, performed without any request from the relief team, so that the information is received at the right time and errors during hazard management are avoided.
Sensitivity to volcanic field boundary

NASA Astrophysics Data System (ADS)

Runge, Melody; Bebbington, Mark; Cronin, Shane; Lindsay, Jan; Rashad Moufti, Mohammed

2016-04-01

Volcanic hazard analyses are desirable where there is potential for future volcanic activity to affect a proximal population. This is frequently the case for volcanic fields (regions of distributed volcanism), where low eruption rates, fertile soil, and attractive landscapes draw populations to live close by. Forecasting future activity in volcanic fields almost invariably uses spatial or spatio-temporal point processes, with model selection and development based on exploratory analyses of previous eruption data. For identifiability reasons, spatio-temporal processes (and, in practice, spatial processes as well) require the definition of a spatial region to which volcanism is confined. However, because the sub-surface processes driving volcanic eruptions are complex and largely unknown, defining such a region solely from geological information is currently impossible. The current approach is therefore to fit a shape to the known previous eruption sites. The class of boundary shape is an unavoidably subjective decision taken by the forecaster that is often overlooked during subsequent analysis of results. This study shows the substantial effect that this choice may have on even the simplest exploratory methods for hazard forecasting, illustrated using four commonly used exploratory statistical methods and two very different regions: the Auckland Volcanic Field, New Zealand, and Harrat Rahat, Kingdom of Saudi Arabia. For Harrat Rahat, the sensitivity of results to the boundary definition is substantial. For the Auckland Volcanic Field, the range of options resulted in similar shapes; nevertheless, some of the statistical tests still showed substantial variation in results. This work highlights that when carrying out any hazard analysis on volcanic fields, it is vital to specify how the volcanic field boundary has been defined, to assess the sensitivity of results to the boundary choice, and to carry these assumptions and related uncertainties through to estimates of future activity and hazard analyses.

Long-term volcanic hazard forecasts based on Somma-Vesuvio past eruptive activity

NASA Astrophysics Data System (ADS)

Lirer, Lucio; Petrosino, Paola; Alberico, Ines; Postiglione, Immacolata

2001-02-01

Distributions of pyroclastic deposits from the main explosive events at Somma-Vesuvio during the 8,000-year B.P. to A.D. 1906 time span have been analysed to provide maps of volcanic hazard for long-term eruption forecasting. In order to define hazard ratings, the spatial distributions and loads (kg/m2) exerted by the fall deposits on the roofs of buildings have been considered; a load higher than 300 kg/m2 is defined as destructive. The relationship between load and frequency (the latter defined as the number of times an area has been affected by the deposition of fall deposits) is considered a suitable parameter for differentiating among areas according to hazard rating. Using past fall-deposit distributions as the basis for future eruptive scenarios, the total area that could be affected by the products of a future explosive eruption of Vesuvio is 1,500 km2. The perivolcanic area (274 km2) has the greatest hazard rating because it could be buried by pyroclastic flow deposits from 0.5 m up to several tens of metres thick. Currently, the perivolcanic area also carries the highest risk because of the high exposed value, arising mainly from the high population density.
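A load threshold combined with an impact frequency, as used above, can be turned into a simple per-cell hazard rating. The sketch below is a hypothetical illustration of that idea: only the 300 kg/m2 destructive-load threshold comes from the abstract, while the class boundaries and example cells are invented.

```python
def hazard_rating(max_load_kg_m2, n_impacts, destructive_load=300.0):
    """Qualitative rating for a map cell from the maximum expected roof load
    (kg/m2) and the number of past fall deposits that reached the cell.

    The 300 kg/m2 destructive threshold follows the abstract above; the
    class boundaries themselves are hypothetical.
    """
    if max_load_kg_m2 >= destructive_load and n_impacts >= 3:
        return "very high"
    if max_load_kg_m2 >= destructive_load:
        return "high"
    if n_impacts >= 3:
        return "moderate"
    return "low"

# Hypothetical cells: (max load, number of past fall events reaching the cell)
cells = {"A": (450, 5), "B": (320, 1), "C": (120, 4), "D": (60, 1)}
for name, (load, hits) in cells.items():
    print(f"cell {name}: load={load} kg/m2, impacts={hits} -> {hazard_rating(load, hits)}")
```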
Modelling framework developed for managing and forecasting the El Hierro 2011-2014 unrest processes based on the analysis of the seismicity and deformation data rate

NASA Astrophysics Data System (ADS)

Garcia, Alicia; Fernandez-Ros, Alberto; Berrocoso, Manuel; Marrero, Jose Manuel; Prates, Gonçalo; De la Cruz-Reyna, Servando; Ortiz, Ramon

2014-05-01

In July 2011, volcanic unrest was detected at El Hierro (Canary Islands, Spain), with significant deformation followed by increased seismicity. A submarine eruption started on 10 October 2011 and ceased on 5 March 2012, after the volcanic tremor signals had persistently weakened through February 2012. However, the seismic activity did not end with the eruption, as several other seismic crises have followed since. The seismic episodes presented a characteristic pattern: over a few days the number and magnitude of seismic events increased persistently, culminating in events severe enough to be felt all over the island. In all cases the seismic activity was preceded by significant deformation measured at the island's surface that continued during the whole episode. Analysis of the available GNSS-GPS and seismic data suggests that several magma injection processes occurred at depth from the beginning of the unrest. A model combining the geometry of the magma injection process and the variations in released seismic energy allowed successful forecasting of the new-vent opening. The model presented here places special emphasis on phenomena associated with moderate eruptions, as well as on volcano-tectonic earthquakes and landslides, which in some cases, as at El Hierro, may be more destructive than the eruption itself.

“Points requiring elucidation” about Hawaiian volcanism: Chapter 24

USGS Publications Warehouse

Poland, Michael P.; Carey, Rebecca; Cayol, Valérie; Poland, Michael P.; Weis, Dominique

2015-01-01

Hawaiian volcanoes, which are easily accessed and observed at close range, are among the most studied on the planet and have spurred great advances in the geosciences, from understanding deep Earth processes to forecasting volcanic eruptions. More than a century of continuous observation and study of Hawai‘i's volcanoes has also sharpened focus on the questions that remain unanswered. Although there is good evidence that volcanism in Hawai‘i is the result of a high-temperature upwelling plume from the mantle, the source composition and dynamics of the plume are controversial. Eruptions at the surface build the volcanoes of Hawai‘i, but important topics, including how the volcanoes grow and collapse and how magma is stored and transported, continue to be subjects of intense research. Forecasting volcanic activity is based mostly on pattern recognition, but determining and predicting the nature of eruptions, especially in serving the critical needs of hazard mitigation, require more realistic models and a greater understanding of what drives eruptive activity. These needs may be addressed by better integration among disciplines as well as by developing dynamic physics- and chemistry-based models that more thoroughly relate the physiochemical behavior of Hawaiian volcanism, from the deep Earth to the surface, to geological, geochemical, and geophysical data.

New insights into Holocene eruption episodes from proximal deposit sequences at Mt. Taranaki (Egmont), New Zealand

NASA Astrophysics Data System (ADS)

Torres-Orozco, Rafael; Cronin, Shane J.; Pardo, Natalia; Palmer, Alan S.

2017-01-01

Upper stratovolcano flanks contain the most nuanced depositional record of long eruption episodes, but steep, irregular terrain makes these sequences difficult to correlate and interpret. This necessitates the development of a detailed and systematic approach to describing localized depositional facies and relating them to eruptive processes. In this work, the late-Holocene eruption history of Mt. Taranaki/Egmont, New Zealand, was re-assessed based on a study of proximal deposits spanning the 14C-dated age range of 5.0-0.3 cal ka B.P. Mt. Taranaki is a textbook-example stratovolcano, with geological evidence pointing to sudden switches in the scale, type and frequency of eruptions over its 130 ka history. The proximal stratigraphy presented here almost doubles the number of eruptions recognized from previous soil-stratigraphy studies. A total of 53 lithostratigraphic bed-sets record eruptions of the summit crater and of parasitic vents such as Fanthams Peak (the latter between 3.0 and 1.5 cal ka B.P.). At least 12 of the eruptions represented by these bed-sets comprise deposits comparable with, or thicker than, those of the latest sub-Plinian eruption of AD 1655. The largest eruption episode represented is the 4.6-4.7 cal ka B.P. Kokowai. Contrasting eruption styles were identified, from stable basaltic-andesite eruption columns at Fanthams Peak to andesitic lava-dome extrusion, blasts and partial collapse of unstable eruption columns at Mt. Taranaki's summit. The centimetre-scale proximal deposit descriptions were used to identify several previously unknown, smaller eruption events. These details are indispensable for building a comprehensive probabilistic event record and for developing realistic eruptive scenarios for complex eruption episodes prior to the re-awakening of a volcano.

Tsunamis generated by eruptions from Mount St. Augustine volcano, Alaska

PubMed

Kienle, J.; Kowalik, Z.; Murty, T. S.

1987-06-12

During an eruption of the Alaskan volcano Mount St. Augustine in the spring of 1986, there was concern about the possibility that a tsunami might be generated by the collapse of a portion of the volcano into the shallow water of Cook Inlet. A similar edifice collapse of the volcano and an ensuing sea wave occurred during an eruption in 1883. Other sea waves, resulting in great loss of life and property, have been generated by eruptions of coastal volcanoes around the world. Although Mount St. Augustine remained intact during this eruptive cycle, a possible recurrence of the 1883 events spurred a numerical simulation of the 1883 sea wave. This simulation, which yielded a forecast of potential wave heights and travel times, was based on a method that could be applied generally to other coastal volcanoes.

Velocity changes at Volcán de Colima: Seismic and Experimental observations

NASA Astrophysics Data System (ADS)

Lamb, Oliver; Lavallée, Yan; De Angelis, Silvio; Varley, Nick; Reyes-Dávila, Gabriel; Arámbula-Mendoza, Raúl; Hornby, Adrian; Wall, Richard; Kendrick, Jackie

2016-04-01

Immediately prior to dome-building eruptions, volcano-seismic swarms are a direct consequence of strain localisation in the ascending magma. A deformation-mechanism map of magma subjected to strain localisation will help develop accurate numerical models which, coupled with an understanding of the mechanics driving the geophysical signals monitored prior to lava eruption, will enhance forecasts. Here we present how seismic data from Volcán de Colima, Mexico, are combined with experimental work to give insights into fracturing in and around magma. Volcán de Colima is a dome-forming volcano that has been erupting almost continuously since November 1998. We use coda-wave interferometry to quantify small changes in the seismic velocity structure between pairs of similar earthquakes, employing waveforms from clusters of repeating earthquakes. The changes in all pairs of events were then used together to create a continuous function of velocity change at all stations within 7 km of the volcano from October to December 1998. We complement our seismic data with acoustic emission data from tensional experiments on samples collected at Volcán de Colima. Decreases in velocity and frequency reflect changes in the sample properties prior to failure. By comparing experimental and seismic observations, we may place constraints on the conditions of the natural seismogenic processes. Using a combination of field and experimental data promises a greater understanding of the processes affecting the rise of magma during an eruption. This will help with the challenge of forecasting and hazard mitigation during dome-forming eruptions worldwide.
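Coda-wave interferometry, as used in this study, measures small relative velocity changes between similar waveforms. One common implementation is the stretching method sketched below on a synthetic decaying coda; it is a generic illustration, not the authors' processing chain.

```python
import numpy as np

def stretching_dv_v(reference, current, t, trial_dvv=np.linspace(-0.02, 0.02, 401)):
    """Estimate the relative velocity change dv/v between two coda waveforms.

    A homogeneous change dv/v maps a phase arriving at lapse time t0 in the
    reference to roughly t0 / (1 + dv/v) in the current trace; the estimate
    is the trial dv/v that maximizes the correlation after compensating for
    this stretch of the lapse-time axis.
    """
    best_dvv, best_cc = 0.0, -np.inf
    for dvv in trial_dvv:
        stretched = np.interp(t, t * (1.0 + dvv), current)
        cc = np.corrcoef(reference, stretched)[0, 1]
        if cc > best_cc:
            best_dvv, best_cc = dvv, cc
    return best_dvv, best_cc

# Synthetic test: a decaying coda whose arrivals are delayed by a ~0.5% velocity drop.
t = np.linspace(0.0, 20.0, 4000)
ref = np.exp(-0.15 * t) * np.sin(2 * np.pi * 3.0 * t)
cur = np.exp(-0.15 * t / 1.005) * np.sin(2 * np.pi * 3.0 * t / 1.005)   # slower medium
dvv, cc = stretching_dv_v(ref, cur, t)
print(f"estimated dv/v = {dvv:+.4f} (correlation {cc:.3f})")
```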
NMME Monthly / Seasonal Forecasts for NASA SERVIR Applications Science

NASA Astrophysics Data System (ADS)

Robertson, F. R.; Roberts, J. B.

2014-12-01

This work details the use of the North American Multi-Model Ensemble (NMME) experimental forecasts as drivers for Decision Support Systems (DSSs) in the NASA / USAID initiative SERVIR (a Spanish acronym meaning "to serve"). SERVIR integrates satellite observations, ground-based data and forecast models to monitor and forecast environmental changes and to improve response to natural disasters. Through the use of DSSs whose "front ends" are physically based models, the SERVIR activity provides a natural testbed for determining the extent to which NMME monthly-to-seasonal projections enable scientists, educators, project managers and policy implementers in developing countries to make better use of probabilistic outlooks of seasonal hydrologic anomalies in assessing agricultural and food-security impacts, water availability, and risk to societal infrastructure. The multi-model NMME framework provides a "best practices" approach to probabilistic forecasting. The NMME forecasts are generated at a resolution coarser than that required to support the DSS models, so downscaling in both space and time is necessary. The methodology adopted here applies model output statistics (MOS): we use NMME ensemble monthly projections of sea-surface temperature (SST) and precipitation from 30 years of hindcasts, together with observations of precipitation and temperature for the target regions. Since raw model forecasts are well known to have structural biases, a cross-validated multivariate regression methodology (canonical correlation analysis, CCA) is used to link the model-projected states, as predictors, to the predictands of the target region. The target regions include a number of basins in East and South Africa as well as the Ganges / Brahmaputra / Meghna basin complex. The MOS approach addresses spatial downscaling; temporal disaggregation of the monthly seasonal forecasts is achieved through a tercile bootstrapping approach. We interpret the results of these studies, the levels of skill by several metrics, and the key uncertainties.
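The MOS step described above links coarse model predictors to local predictands through cross-validated regression and then expresses the forecast in tercile probabilities. The sketch below illustrates that idea with ordinary leave-one-out linear regression and error resampling on synthetic data; it does not implement CCA or use actual NMME hindcasts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "hindcasts": 30 years of a coarse-model predictor (e.g. a basin-mean
# SST index) and the station precipitation we want to downscale to.
years = 30
model_index = rng.normal(0.0, 1.0, years)
station_precip = 50.0 + 12.0 * model_index + rng.normal(0.0, 8.0, years)

def loo_mos_forecasts(x, y):
    """Leave-one-out cross-validated MOS: regress y on x with each year
    withheld in turn, then predict the withheld year."""
    preds = np.empty_like(y)
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        slope, intercept = np.polyfit(x[keep], y[keep], 1)
        preds[i] = slope * x[i] + intercept
    return preds

preds = loo_mos_forecasts(model_index, station_precip)
errors = station_precip - preds

# Tercile probabilities for a new forecast: centre the historical error
# distribution on the regression prediction and count members per category.
new_pred = np.polyval(np.polyfit(model_index, station_precip, 1), 1.2)
samples = new_pred + errors                     # simple error-resampling ensemble
lo, hi = np.percentile(station_precip, [100 / 3, 200 / 3])
p_below = np.mean(samples < lo)
p_above = np.mean(samples > hi)
print(f"P(below normal)={p_below:.2f}, P(normal)={1 - p_below - p_above:.2f}, P(above)={p_above:.2f}")
```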
The multi temporal/multi-model approach to predictive uncertainty assessment in real-time flood forecasting

NASA Astrophysics Data System (ADS)

Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Brocca, Luca; Todini, Ezio

2017-08-01

This work extends the multi-temporal approach of the Model Conditional Processor (MCP-MT) to the multi-model case and to the four Truncated Normal Distributions (TNDs) approach, demonstrating the improvement over the single-temporal approach. The study is framed in the context of probabilistic Bayesian decision-making, which is appropriate for taking rational decisions on uncertain future outcomes. As opposed to the direct use of deterministic forecasts, a probabilistic forecast identifies a predictive probability density function that represents fundamental knowledge about future occurrences. The added value of MCP-MT is the identification of the probability that a critical situation will happen within the forecast lead time and of when, most likely, it will occur. MCP-MT is thoroughly tested for both single-model and multi-model configurations at a gauged site on the Tiber River, central Italy. The stages forecasted by two operational deterministic models, STAFOM-RCM and MISDc, are considered for the study. The dataset used for the analysis consists of hourly data from 34 flood events selected from a six-year time series. MCP-MT improves on the original models' forecasts: the peak overestimation and the delayed forecast of the rising limb, characterizing MISDc and STAFOM-RCM respectively, are significantly mitigated, with the mean error on peak stage reduced from 45 to 5 cm and the coefficient of persistence increased from 0.53 up to 0.75. The results show that MCP-MT outperforms the single-temporal approach and is potentially useful for supporting decision-making, because the exceedance probability of hydrometric thresholds within a forecast horizon and the most probable flooding time can be estimated.
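The coefficient of persistence quoted above compares forecast errors against those of a naive persistence forecast. A minimal sketch of that verification metric follows, with made-up hourly stage data and an assumed 6-hour lead time.

```python
import numpy as np

def coefficient_of_persistence(obs, forecast, lead):
    """1 - SSE(forecast) / SSE(persistence), where the persistence forecast
    simply repeats the observation available `lead` steps earlier.
    1 is perfect; 0 means no better than persistence."""
    obs, forecast = np.asarray(obs, float), np.asarray(forecast, float)
    sse_model = np.sum((obs[lead:] - forecast[lead:]) ** 2)
    sse_persist = np.sum((obs[lead:] - obs[:-lead]) ** 2)
    return 1.0 - sse_model / sse_persist

# Made-up hourly stage observations (m) and a 6-hour-ahead forecast of them.
obs = np.array([2.0, 2.1, 2.4, 2.9, 3.6, 4.4, 5.0, 5.3, 5.2, 4.8, 4.3, 3.9])
fcst = np.array([2.0, 2.1, 2.3, 2.8, 3.5, 4.2, 4.9, 5.4, 5.3, 4.9, 4.4, 4.0])
print("CP at 6 h lead:", round(coefficient_of_persistence(obs, fcst, lead=6), 2))
```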
The Origin, Early Evolution and Predictability of Solar Eruptions

NASA Astrophysics Data System (ADS)

Green, Lucie M.; Török, Tibor; Vršnak, Bojan; Manchester, Ward; Veronig, Astrid

2018-02-01

Coronal mass ejections (CMEs) were discovered in the early 1970s when space-borne coronagraphs revealed that eruptions of plasma are ejected from the Sun. Today, it is known that the Sun produces eruptive flares, filament eruptions, coronal mass ejections and failed eruptions, all thought to be due to a release of energy stored in the coronal magnetic field during its drastic reconfiguration. This review discusses the observations and physical mechanisms behind this eruptive activity, with a view to assessing the current capability of forecasting these events for space weather risk and impact mitigation. Whilst a wealth of observations exists and detailed models have been developed, there is still a need to draw these approaches together. In particular, more realistic models are encouraged in order to assess the full complexity of the solar atmosphere and the criteria under which an eruption forms. From the observational side, a more detailed understanding of the role of photospheric flows and reconnection is needed in order to identify the evolutionary path that ultimately means a magnetic structure will erupt.

Time-resolved seismic tomography detects magma intrusions at Mount Etna

PubMed

Patanè, D.; Barberi, G.; Cocina, O.; De Gori, P.; Chiarabba, C.

2006-08-11

The continuous volcanic and seismic activity at Mount Etna makes this volcano an important laboratory for seismological and geophysical studies. We used repeated three-dimensional tomography to detect variations in elastic parameters during different volcanic cycles, before and during the October 2002-January 2003 flank eruption. Well-defined anomalous volumes with a low P- to S-wave velocity ratio were revealed. Absent during the pre-eruptive period, the anomalies trace the intrusion of volatile-rich (≥4 weight percent) basaltic magma, most of which rose up only a few months before the onset of the eruption. The observed time changes in the velocity anomalies suggest that four-dimensional tomography provides a basis for more efficient volcano monitoring and short- and mid-term eruption forecasting of explosive activity.

Development of a Probabilistic Decision-Support Model to Forecast Coastal Resilience

NASA Astrophysics Data System (ADS)

Wilson, K.; Safak, I.; Brenner, O.; Lentz, E. E.; Hapke, C. J.

2016-02-01

Site-specific forecasts of coastal change are a valuable management tool in preparing for and assessing storm-driven impacts in coastal areas. More specifically, understanding the likelihood of storm impacts, the recovery following events, and the alongshore variability of both is central to evaluating the vulnerability and resilience of barrier islands. We introduce a probabilistic modeling framework that integrates hydrodynamic, anthropogenic, and morphologic components of the barrier system to evaluate coastal change at Fire Island, New York. The model is structured on a Bayesian network (BN), which uses observations to learn statistical relationships between system variables. In addition to predictive ability, probabilistic models convey the level of confidence associated with a prediction, an important consideration for coastal managers. Our model predicts the likelihood of morphologic change on the upper beach based on several decades of beach monitoring data. A coupled hydrodynamic BN combines probabilistic and deterministic modeling approaches: by querying nearly two decades of nested-grid wave simulations that account for both distant swells and local seas, we produce scenarios of event and seasonal wave climates. The wave scenarios of total water level (a sum of runup, surge and tide) and anthropogenic modification are the primary drivers of morphologic change in our model structure. Preliminary results show that the hydrodynamic BN is able to reproduce time series of total water levels, a critical validation step before generating scenarios, and that forecasts of geomorphic change over three-month intervals are up to 70% accurate. Predictions of storm-induced change and recovery are linked to evaluate zones of persistent vulnerability or resilience and will help managers target restoration efforts, identify areas most vulnerable to habitat degradation, and highlight resilient zones that may best support the relocation of critical infrastructure.
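A discrete Bayesian network of the kind described above stores conditional probability tables and answers forecast and diagnostic queries by marginalization. The toy example below illustrates the mechanics with a single scenario-to-response link; all variables, states and probabilities are invented and are not those of the Fire Island model.

```python
# Toy two-node network: wave scenario -> morphologic change on the upper beach.
# All probabilities are invented for illustration.

p_scenario = {"calm": 0.6, "moderate": 0.3, "severe": 0.1}

# Conditional probability table: P(change | scenario)
p_change_given_scenario = {"calm": 0.05, "moderate": 0.35, "severe": 0.85}

# Forecast query: marginal probability of morphologic change over the interval.
p_change = sum(p_scenario[s] * p_change_given_scenario[s] for s in p_scenario)

# Diagnostic query by Bayes' rule: which scenario most likely caused an observed change?
posterior = {s: p_scenario[s] * p_change_given_scenario[s] / p_change for s in p_scenario}

print(f"P(morphologic change) = {p_change:.2f}")
print("P(scenario | change):", {s: round(p, 2) for s, p in posterior.items()})
```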
Great Balls of Fire: A probabilistic approach to quantify the hazard related to ballistics - A case study at La Fossa volcano, Vulcano Island, Italy

NASA Astrophysics Data System (ADS)

Biass, Sébastien; Falcone, Jean-Luc; Bonadonna, Costanza; Di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino

2016-10-01

We present a probabilistic approach to quantify the hazard posed by volcanic ballistic projectiles (VBPs) and their potential impact on the built environment. A model named Great Balls of Fire (GBF) is introduced to describe the ballistic trajectories of VBPs, accounting for a variable drag coefficient and topography. It relies on input parameters easily identifiable in the field and is designed to model large numbers of VBPs stochastically. Associated functions come with the GBF code to post-process model outputs into a comprehensive probabilistic hazard assessment for VBP impacts. Outcomes include probability maps of exceeding given thresholds of kinetic energy at impact, hazard curves and probabilistic isoenergy maps. Probabilities are calculated either on equally sized pixels or on zones of interest. The approach is calibrated, validated and applied to La Fossa volcano, Vulcano Island (Italy). We constructed a generic eruption scenario based on stratigraphic studies and numerical inversions of the 1888-1890 long-lasting Vulcanian cycle of La Fossa. Results suggest a ~10^-2 % probability of occurrence of VBP impacts with kinetic energies ≤ 10^4 J at the touristic locality of Porto. In parallel, the vulnerability to roof perforation was estimated by combining field observations and published literature, allowing a first estimate of the potential impact of VBPs during future Vulcanian eruptions. Results indicate a high physical vulnerability to the VBP hazard and, consequently, half of the building stock has a ≥ 2.5 × 10^-3 % probability of roof perforation.
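The GBF model itself is not reproduced here, but the sketch below illustrates the underlying idea: integrate point-mass ballistic trajectories with drag, then run a Monte Carlo over uncertain launch conditions to estimate the probability that the impact kinetic energy exceeds a threshold. The constant drag coefficient, flat topography and all launch distributions are simplifying assumptions for illustration only.

```python
import numpy as np

RHO_AIR, G = 1.2, 9.81   # air density (kg/m3), gravity (m/s2)

def impact_energy(mass, diameter, speed, angle_deg, cd=1.0, dt=0.02):
    """Kinetic energy (J) at ground impact of a spherical block launched from
    the vent, using a point-mass model with a constant drag coefficient."""
    area = np.pi * (diameter / 2.0) ** 2
    angle = np.radians(angle_deg)
    x, z = 0.0, 0.0
    vx, vz = speed * np.cos(angle), speed * np.sin(angle)
    while z >= 0.0:
        v = np.hypot(vx, vz)
        drag = 0.5 * RHO_AIR * cd * area * v / mass   # drag acceleration / speed
        vx -= drag * vx * dt
        vz -= (G + drag * vz) * dt
        x += vx * dt
        z += vz * dt
    return 0.5 * mass * (vx ** 2 + vz ** 2)

# Monte Carlo over uncertain launch conditions (all distributions hypothetical).
rng = np.random.default_rng(1)
n = 1000
speeds = rng.uniform(80.0, 200.0, n)        # ejection speed (m/s)
angles = rng.uniform(30.0, 80.0, n)         # launch angle (deg)
diams = rng.uniform(0.1, 0.5, n)            # block diameter (m)
masses = 2500.0 * (4 / 3) * np.pi * (diams / 2.0) ** 3   # dense-rock blocks

energies = np.array([impact_energy(m, d, s, a)
                     for m, d, s, a in zip(masses, diams, speeds, angles)])
threshold = 1.0e4
print(f"P(impact energy > {threshold:.0e} J) = {(energies > threshold).mean():.3f}")
```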
Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

NASA Astrophysics Data System (ADS)

Jordan, T. H.; the International Commission on Earthquake Forecasting for Civil Protection

2011-12-01

Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., that deterministic predictions with low error rates (false alarms and failures to predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi:10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated for reliability and skill by retrospective testing, and the models should be under continuous prospective testing against long-term forecasts and alternative time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in probabilistic seismic hazard analysis. (e) Alert procedures should be standardized to facilitate decisions at different levels of government, based in part on objective analysis of costs and benefits. (f) In establishing alert protocols, consideration should also be given to the less tangible aspects of the value of information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that lead to informal predictions and misinformation. Formal OEF procedures based on probabilistic forecasting appropriately separate hazard estimation by scientists from the decision-making role of civil protection authorities. The prosecution of seven Italian scientists on manslaughter charges stemming from their actions before the L'Aquila earthquake makes clear why this separation should be explicit in defining OEF protocols.
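The "low-probability environment" argument above can be made concrete with a short calculation: even a hundredfold probability gain applied to a typical long-term rate leaves the weekly probability near one percent. The long-term rate used below is hypothetical.

```python
import math

def weekly_probability(annual_rate, gain=1.0):
    """Poisson probability of at least one event in a one-week window,
    with the long-term annual rate scaled by a short-term probability gain."""
    rate_per_week = annual_rate * gain / 52.0
    return 1.0 - math.exp(-rate_per_week)

# Hypothetical long-term rate: one damaging earthquake per 200 years in the region.
annual_rate = 1.0 / 200.0
for gain in (1, 10, 100):
    p = weekly_probability(annual_rate, gain)
    print(f"probability gain {gain:>3}: P(event in next week) = {p:.3%}")
```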
Detailed mapping by USGS colleagues has shown that during the last 500,000 years MLV erupted >200 lava flows ranging from basalt to rhyolite, produced at least one ash-flow tuff, one caldera forming event, and at least 17 scoria cones. Underlying these units are 23 additional volcanic units that are considered to be pre-MLV in age. Despite the very high likelihood of future eruptions, fewer than 60 of 250 mapped volcanic units (MLV and pre-MLV) have been dated reliably. A robust set of eruptive ages is key to understanding the history of the MLV system and to forecasting the future behavior of the volcano. The goals of this study are to 1) obtain additional radiometric ages from stratigraphically strategic units; 2) recalculate recurrence rate of eruptions based on an augmented set of radiometric dates; and 3) use lava flow, PDC, ash fall-out, and lahar computational simulation models to assess the potential effects of discrete volcanic hazards locally and regionally. We identify undated target units (units in key stratigraphic positions to provide maximum chronological insight) and obtain field samples for radiometric dating (40Ar/39Ar and K/Ar) and petrology. Stratigraphic and radiometric data are then used together in the Volcano Event Age Model (VEAM) to identify changes in the rate and type of volcanic eruptions through time, with statistical uncertainty. These newly obtained datasets will be added to published data to build a conceptual model of volcanic hazards at MLV. Alternative conceptual models, for example, may be that the rate of MLV lava flow eruptions are nonstationary in time and/or space and/or volume. We explore the consequences of these alternative models on forecasting future eruptions. As different styles of activity have different impacts, we estimate these potential effects using simulation. The results of this study will improve the existing MLV hazard assessment in hopes of mitigating casualties and social impact should an eruption occur at MLV.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA617981','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA617981"><span>Long-term Acoustic Real-Time Sensor for Polar Areas (LARA)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2014-09-30</p> <p>volcanic eruptions forecast for the near future, and the LARA moorings will allow us to observe the accuracy of these models in real-time. TRANSITIONS...systems at AUTEC and SCORE. In addition LARA technology will be useful for real-time monitoring of deep-ocean seismic and volcanic activity (e.g...M.J., Matsumoto, H., and Butterfield, D.A. (2012): Seismic precursors and magma ascent before the April 2011 eruption at Axial Seamount. Nature</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/AD1014247','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/AD1014247"><span>Long-Term Acoustic Real-Time Sensor for Polar Areas (LARA)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2015-09-30</p> <p>segment in the northeast Pacific Ocean. 
Both areas have seafloor volcanic eruptions forecast for the near future, and the LARA moorings will allow us...time monitoring of deep-ocean seismic and volcanic activity (e.g., Dziak et al., 2012) - especially in areas where SOSUS coverage no longer exists...precursors and magma ascent before the April 2011 eruption at Axial Seamount. Nature Geoscience, 5, pp. 478-482. Klatt, O., Boebel, O., and Fahrbach, E</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA598250','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA598250"><span>Long-term Acoustic Real-Time Sensor for Polar Areas (LARA)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2013-09-30</p> <p>Volcano and the Middle Valley Ridge segment in the northeast Pacific Ocean. Both areas have seafloor volcanic eruptions forecast for the near future...Sensor for Polar Areas (LARA) for real-time monitoring of marine mammals, ambient noise levels, seismic activities (e.g., eruption of undersea volcanoes...LARA technology will be useful for real-time monitoring of deep-ocean seismic and volcanic activity (e.g., Dziak et al., 2011) - especially in areas</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010IEITI..91.1234K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010IEITI..91.1234K"><span>Hybrid Intrusion Forecasting Framework for Early Warning System</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kim, Sehun; Shin, Seong-Jun; Kim, Hyunwoo; Kwon, Ki Hoon; Han, Younggoo</p> <p></p> <p>Recently, cyber attacks have become a serious hindrance to the stability of Internet. These attacks exploit interconnectivity of networks, propagate in an instant, and have become more sophisticated and evolutionary. Traditional Internet security systems such as firewalls, IDS and IPS are limited in terms of detecting recent cyber attacks in advance as these systems respond to Internet attacks only after the attacks inflict serious damage. In this paper, we propose a hybrid intrusion forecasting system framework for an early warning system. The proposed system utilizes three types of forecasting methods: time-series analysis, probabilistic modeling, and data mining method. By combining these methods, it is possible to take advantage of the forecasting technique of each while overcoming their drawbacks. Experimental results show that the hybrid intrusion forecasting method outperforms each of three forecasting methods.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1031455','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1031455"><span></span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Mendes, J.; Bessa, R.J.; Keko, H.</p> <p></p> <p>Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. 
In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is the occurrence of sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood that ramp events will happen.
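The scenario-plus-filter idea for ramp detection just described can be sketched in a few lines. The synthetic scenarios, the two-hour window and the 30%-of-capacity threshold below are illustrative assumptions, not the report's actual scenario generator or ramp definitions.

```python
# Minimal sketch of scenario-based probabilistic ramp detection.
import numpy as np

rng = np.random.default_rng(0)
n_scen, n_steps = 100, 48           # 100 scenarios of hourly normalized power
noise = rng.normal(0, 0.05, size=(n_scen, n_steps))
scenarios = np.clip(0.5 + np.cumsum(noise, axis=1), 0.0, 1.0)

window = 2                          # hours over which a ramp is measured
threshold = 0.30                    # fraction of installed capacity

# "High-pass" the scenarios by differencing over the window, then flag ramps.
delta = scenarios[:, window:] - scenarios[:, :-window]
ramp_flags = np.abs(delta) >= threshold

# Probability of a ramp event at each lead time = fraction of scenarios flagged.
ramp_prob = ramp_flags.mean(axis=0)
print("lead times with ramp probability > 0.2:", np.where(ramp_prob > 0.2)[0] + window)
```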
The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.

Development of a flood early warning system and communication with end-users: the Vipava/Vipacco case study in the KULTURisk FP7 project

    NASA Astrophysics Data System (ADS)

    Grossi, Giovanna; Caronna, Paolo; Ranzi, Roberto

    2014-05-01

    Within the framework of risk communication, the goal of an early warning system is to support the interaction between technicians and authorities (and subsequently the population) as a prevention measure. The methodology proposed in the KULTURisk FP7 project aimed to build a closer collaboration between these actors, in the perspective of promoting pro-active actions to mitigate the effects of flood hazards. The transnational (Slovenia/Italy) Soča/Isonzo case study focused on this concept of cooperation between stakeholders and hydrological forecasters. The DIMOSHONG_VIP hydrological model was calibrated for the Vipava/Vipacco River (650 km2), a tributary of the Soča/Isonzo River, on the basis of flood events that occurred between 1998 and 2012. The European Centre for Medium-Range Weather Forecasts (ECMWF) provided the past meteorological forecasts, both deterministic (1 forecast) and probabilistic (51 ensemble members). The resolution of the ECMWF grid is currently about 15 km (Deterministic-DET) and 30 km (Ensemble Prediction System-EPS). A verification was conducted to validate the flood-forecast outputs of the DIMOSHONG_VIP+ECMWF early warning system. Basic descriptive statistics, such as event probability, probability of a forecast occurrence and frequency bias, were determined. Some performance measures were calculated, such as hit rate (probability of detection) and false alarm rate (probability of false detection). Relative Operating Characteristic (ROC) curves were generated for both deterministic and probabilistic forecasts. These analyses showed good performance of the early warning system, given the small size of the sample. Particular attention was paid to the design of flood-forecasting output charts, involving and surveying stakeholders (Alto Adriatico River Basin Authority), hydrology specialists in the field, and the general public. Graph types for both forecasted precipitation and discharge were defined. Three different risk thresholds were identified ("attention", "pre-alarm" or "alert", "alarm"), with an "icon-style" representation suitable for communication to civil protection stakeholders or the public. Aiming to show probabilistic representations in a user-friendly way, we opted for the visualization of the single deterministic forecasted hydrograph together with the 5%, 25%, 50%, 75% and 95% percentile bands of the Hydrological Ensemble Prediction System (HEPS).
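Deriving percentile bands like the ones just mentioned from an ensemble of forecast hydrographs is straightforward; the 51-member ensemble and discharge values below are synthetic, for illustration only.

```python
# Sketch: percentile bands from an ensemble of forecast hydrographs (m^3/s).
import numpy as np

rng = np.random.default_rng(1)
lead_hours = np.arange(0, 120, 3)                                  # 5-day forecast, 3-hourly
base = 50 + 80 * np.exp(-0.5 * ((lead_hours - 60) / 12.0) ** 2)    # idealized flood wave
ensemble = base * rng.lognormal(0.0, 0.25, size=(51, lead_hours.size))

bands = {p: np.percentile(ensemble, p, axis=0) for p in (5, 25, 50, 75, 95)}
for p, q in bands.items():
    print(f"{p:>2d}th percentile peak discharge: {q.max():.0f} m3/s")
```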
HEPS is generally used for 3-5 days hydrological forecasts, while the error due to incorrect initial data is comparable to the error due to the lower resolution with respect to the deterministic forecast. In the short term forecasting (12-48 hours) the HEPS-members show obviously a similar tendency; in this case, considering its higher resolution, the deterministic forecast is expected to be more effective. The plot of different forecasts in the same chart allows the use of model outputs from 4/5 days to few hours before a potential flood event. This framework was built to help a stakeholder, like a mayor, a civil protection authority, etc, in the flood control and management operations, and was designed to be included in a wider decision support system.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/AD1028704','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/AD1028704"><span>Coastal Foredune Evolution, Part 1: Environmental Factors and Forcing Processes Affecting Morphological Evolution</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2017-02-01</p> <p>ERDC/CHL CHETN-II-56 February 2017 Approved for public release; distribution is unlimited. Coastal Foredune Evolution, Part 1: Environmental... Coastal and Hydraulics Engineering Technical Note (CHETN) is the first of two CHETNs focused on improving technologies to forecast coastal foredune...morphodynamic evolution of coastal foredunes. Part 2 reviews modeling approaches to forecast these changes and develops a probabilistic modeling framework to</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018TCry...12..935R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018TCry...12..935R"><span>Impact of rheology on probabilistic forecasts of sea ice trajectories: application for search and rescue operations in the Arctic</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rabatel, Matthias; Rampal, Pierre; Carrassi, Alberto; Bertino, Laurent; Jones, Christopher K. R. T.</p> <p>2018-03-01</p> <p>We present a sensitivity analysis and discuss the probabilistic forecast capabilities of the novel sea ice model neXtSIM used in hindcast mode. The study pertains to the response of the model to the uncertainty on winds using probabilistic forecasts of ice trajectories. neXtSIM is a continuous Lagrangian numerical model that uses an elasto-brittle rheology to simulate the ice response to external forces. The sensitivity analysis is based on a Monte Carlo sampling of 12 members. The response of the model to the uncertainties is evaluated in terms of simulated ice drift distances from their initial positions, and from the mean position of the ensemble, over the mid-term forecast horizon of 10 days. The simulated ice drift is decomposed into advective and diffusive parts that are characterised separately both spatially and temporally and compared to what is obtained with a free-drift model, that is, when the ice rheology does not play any role in the modelled physics of the ice. The seasonal variability of the model sensitivity is presented and shows the role of the ice compactness and rheology in the ice drift response at both local and regional scales in the Arctic. 
Indeed, the ice drift simulated by neXtSIM in summer is close to the one obtained with the free-drift model, while the more compact and solid ice pack shows a significantly different mechanical and drift behaviour in winter. For the winter period analysed in this study, we also show that, in contrast to the free-drift model, neXtSIM reproduces the sea ice Lagrangian diffusion regimes as found from observed trajectories. The forecast capability of neXtSIM is also evaluated using a large set of real buoy's trajectories and compared to the capability of the free-drift model. We found that neXtSIM performs significantly better in simulating sea ice drift, both in terms of forecast error and as a tool to assist search and rescue operations, although the sources of uncertainties assumed for the present experiment are not sufficient for complete coverage of the observed IABP positions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.1613L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.1613L"><span>scoringRules - A software package for probabilistic model evaluation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian</p> <p>2016-04-01</p> <p>Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F , given that an outcome y was observed. As such, they allow to compare alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that is available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov Chain Monte Carlo take this form. Thereby, the scoringRules package provides a framework for generalized model evaluation that both includes Bayesian as well as classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state of the art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. 
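scoringRules itself is an R package; as a language-neutral illustration of the kind of closed-form expression it implements, the sketch below evaluates the standard analytic CRPS of a Gaussian predictive distribution in Python.

```python
# Closed-form continuous ranked probability score for a Gaussian forecast.
import numpy as np
from scipy.stats import norm

def crps_gaussian(y, mu, sigma):
    """CRPS of N(mu, sigma^2) evaluated at observation y (lower is better)."""
    z = (y - mu) / sigma
    return sigma * (z * (2.0 * norm.cdf(z) - 1.0)
                    + 2.0 * norm.pdf(z)
                    - 1.0 / np.sqrt(np.pi))

# Sharper, well-centred forecasts receive lower (better) scores.
print(crps_gaussian(y=1.0, mu=0.8, sigma=0.5))   # ~0.15
print(crps_gaussian(y=1.0, mu=0.0, sigma=2.0))   # ~0.66
```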
Whenever more than one implementation variant exists, we offer statistically principled default choices.

The effect of the sea on hazard assessment for tephra fallout at Campi Flegrei: a preliminary approach through the use of pyPHaz, an open tool to analyze and visualize probabilistic hazards

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Costa, Antonio; Selva, Jacopo

    2014-05-01

    Campi Flegrei (CF) is a large volcanic field located west of the Gulf of Naples, characterized by a wide and almost circular caldera which is partially submerged beneath the Gulf of Pozzuoli. It is known that magma-water interaction is a key element in determining the character of submarine eruptions and their impact on the surrounding areas, but this phenomenon is still not well understood and is rarely considered in hazard assessment. The aim of the present work is a preliminary study of the effect of the sea on the tephra fall hazard posed by CF to the municipality of Naples, introducing variability in the probability of tephra production according to the eruptive scale (defined on the basis of the erupted volume) and the depth of the opening submerged vents. Four different Probabilistic Volcanic Hazard Assessment (PVHA) models have been defined through the application of the BET_VH model at CF, accounting for different modeling procedures and assumptions for the submerged part of the caldera. In particular, we take into account: 1) the effect of the sea as null, i.e. as if the water were not present; 2) the effect of the sea as a cap that totally blocks the explosivity of eruptions and consequently the tephra production; 3) an ensemble model combining the two models described in points 1) and 2); 4) a variable probability of tephra production depending on the depth of the submerged vent.
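Merging alternative hazard models of this kind into an ensemble, as the pyPHaz workflow described next does, amounts to a weighted combination of their exceedance probabilities. The curves and equal weights below are synthetic illustrations, not BET_VH output.

```python
# Sketch: weighted ensemble of exceedance-probability curves from two models.
import numpy as np

tephra_load = np.array([0.1, 1.0, 10.0, 100.0, 300.0])       # kg/m^2 thresholds

# Hypothetical exceedance probabilities from two alternative assumptions, e.g.
# submerged vents fully explosive vs. fully suppressed by the sea.
model_sea_as_null = np.array([3e-3, 2e-3, 8e-4, 2e-4, 5e-5])
model_sea_as_cap  = np.array([2e-3, 1e-3, 3e-4, 5e-5, 1e-5])

weights = np.array([0.5, 0.5])                                # equal credibility
ensemble = weights @ np.vstack([model_sea_as_null, model_sea_as_cap])

for t, p in zip(tephra_load, ensemble):
    print(f"P(load > {t:>5.1f} kg/m2) = {p:.1e}")
```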
The PVHA models are then input to pyPHaz, a tool developed and designed at INGV to visualize, analyze and merge into ensemble models PVHA's results and, potentially, any other kind of probabilistic hazard assessment, both natural and anthropic, in order to evaluate the importance of considering a variability among subaerial and submerged vents on tephra fallout hazard from CF in Naples. The analysis is preliminary and does not pretend to be exhaustive, but on one hand it represents a starting point for future works; on the other hand, it is a good case study to show the potentiality of the pyPHaz tool that, thanks to a dedicated Graphical User Interface (GUI), allows to interactively manage and visualize results of probabilistic hazards (hazard curves together with probability and hazard maps for different levels of uncertainties), and to compare or merge different hazard models producing ensemble models. This work has been developed in the framework of two Italian projects, "ByMuR (Bayesian Multi-Risk Assessment: a case study for natural risks in the city of Naples)" funded by the Italian Ministry of Education, Universities and Research (MIUR), and "V1: Probabilistic Volcanic Hazard Assessments" funded by the Italian Department of Civil Protection (DPC).</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28863495','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28863495"><span>On the limits of probabilistic forecasting in nonlinear time series analysis II: Differential entropy.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Amigó, José M; Hirata, Yoshito; Aihara, Kazuyuki</p> <p>2017-08-01</p> <p>In a previous paper, the authors studied the limits of probabilistic prediction in nonlinear time series analysis in a perfect model scenario, i.e., in the ideal case that the uncertainty of an otherwise deterministic model is due to only the finite precision of the observations. The model consisted of the symbolic dynamics of a measure-preserving transformation with respect to a finite partition of the state space, and the quality of the predictions was measured by the so-called ignorance score, which is a conditional entropy. In practice, though, partitions are dispensed with by considering numerical and experimental data to be continuous, which prompts us to trade off in this paper the Shannon entropy for the differential entropy. Despite technical differences, we show that the core of the previous results also hold in this extended scenario for sufficiently high precision. The corresponding imperfect model scenario will be revisited too because it is relevant for the applications. 
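As a small numerical aside on the ignorance score discussed above: for a continuous predictive density the score is the negative log density at the outcome, and its expectation under a perfect forecast equals the forecast's differential entropy. The Gaussian example below (natural-log units) is illustrative only, not the paper's symbolic-dynamics models.

```python
# Ignorance (logarithmic) score vs. differential entropy for a Gaussian forecast.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
mu, sigma = 0.0, 1.0
obs = rng.normal(mu, sigma, size=10_000)        # outcomes drawn from the forecast

ignorance = -norm.logpdf(obs, mu, sigma)        # -log f(y) per observation
entropy_gauss = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

print("mean ignorance score :", round(float(ignorance.mean()), 3))
print("differential entropy :", round(entropy_gauss, 3))    # should nearly match
```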
The theoretical part and its application to probabilistic forecasting are illustrated with numerical simulations and a new prediction algorithm.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5479545','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5479545"><span>Probabilistic population aging</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p></p> <p>2017-01-01</p> <p>We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMSM11E..06B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMSM11E..06B"><span>Verification of space weather forecasts at the UK Met Office</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.</p> <p>2017-12-01</p> <p>The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users to understand the performance of these forecasts and also strengths and weaknesses to enable further development. Met Office terrestrial near real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves and Reliability diagrams, and rolling Ranked Probability Skill Scores (RPSSs) thus providing understanding of forecast performance and skill. Results suggest that the MOSWOC issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). 
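A minimal sketch of the ranked probability skill score used in this kind of verification, computed against a climatological benchmark for a synthetic three-category forecast (the categories and probabilities are invented, not MOSWOC products):

```python
# Ranked probability skill score (RPSS) against a climatological benchmark.
import numpy as np

def rps(prob, obs_cat, n_cat):
    """Ranked probability score for one forecast; prob sums to 1 over categories."""
    cum_f = np.cumsum(prob)
    cum_o = np.cumsum(np.eye(n_cat)[obs_cat])
    return np.sum((cum_f - cum_o) ** 2)

n_cat = 3
climatology = np.array([0.80, 0.15, 0.05])              # long-run category frequencies
forecasts = np.array([[0.6, 0.30, 0.10],
                      [0.9, 0.08, 0.02],
                      [0.5, 0.35, 0.15]])
observed = np.array([1, 0, 2])                          # category that occurred

rps_fcst = np.mean([rps(f, o, n_cat) for f, o in zip(forecasts, observed)])
rps_clim = np.mean([rps(climatology, o, n_cat) for o in observed])
print("RPSS vs climatology:", round(1.0 - rps_fcst / rps_clim, 3))
```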
By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.

Volcanic lightning and plume behavior reveal evolving hazards during the April 2015 eruption of Calbuco Volcano, Chile

    DOE PAGES

    Van Eaton, Alexa R.; Behnke, Sonja Ann; Amigo, Alvaro; ...

    2016-04-12

    Soon after the onset of an eruption, model forecasts of ash dispersal are used to mitigate the hazards to aircraft, infrastructure, and communities downwind. However, it is a significant challenge to constrain the model inputs during an evolving eruption. Here we demonstrate that volcanic lightning may be used in tandem with satellite detection to recognize and quantify changes in eruption style and intensity. Using the eruption of Calbuco volcano in southern Chile on 22 and 23 April 2015, we investigate rates of umbrella cloud expansion from satellite observations, occurrence of lightning, and mapped characteristics of the fall deposits. Our remote sensing analysis gives a total erupted volume that is within uncertainty of the mapped volume (0.56 ± 0.28 km3 bulk). Observations and volcanic plume modeling further suggest that electrical activity was enhanced both by ice formation in the ash clouds >10 km above sea level and by the development of a low-level charge layer from ground-hugging currents.

Volcanic lightning and plume behavior reveal evolving hazards during the April 2015 eruption of Calbuco volcano, Chile

    USGS Publications Warehouse

    Van Eaton, Alexa; Amigo, Álvaro; Bertin, Daniel; Mastin, Larry G.; Giacosa, Raúl E; González, Jerónimo; Valderrama, Oscar; Fontijn, Karen; Behnke, Sonja A

    2016-01-01

    Soon after the onset of an eruption, model forecasts of ash dispersal are used to mitigate the hazards to aircraft, infrastructure and communities downwind. However, it is a significant challenge to constrain the model inputs during an evolving eruption. Here we demonstrate that volcanic lightning may be used in tandem with satellite detection to recognize and quantify changes in eruption style and intensity. Using the eruption of Calbuco volcano in southern Chile on 22-23 April 2015, we investigate rates of umbrella cloud expansion from satellite observations, occurrence of lightning, and mapped characteristics of the fall deposits. Our remote-sensing analysis gives a total erupted volume that is within uncertainty of the mapped volume (0.56 ± 0.28 km3 bulk).
Observations and volcanic plume modeling further suggest that electrical activity was enhanced both by ice formation in the ash clouds >10 km asl and by the development of a low-level charge layer from ground-hugging currents.

Hail formation triggers rapid ash aggregation in volcanic plumes

    PubMed

    Van Eaton, Alexa R; Mastin, Larry G; Herzog, Michael; Schwaiger, Hans F; Schneider, David J; Wallace, Kristi L; Clarke, Amanda B

    2015-08-03

    During explosive eruptions, airborne particles collide and stick together, accelerating the fallout of volcanic ash and climate-forcing aerosols. This aggregation process remains a major source of uncertainty both in ash dispersal forecasting and in the interpretation of eruptions from the geological record. Here we illuminate the mechanisms and timescales of particle aggregation from a well-characterized 'wet' eruption. The 2009 eruption of Redoubt Volcano, Alaska, incorporated water from the surface (in this case, a glacier), which is a common occurrence during explosive volcanism worldwide. Observations from C-band weather radar, fall deposits and numerical modelling demonstrate that hail-forming processes in the eruption plume triggered aggregation of ∼95% of the fine ash and stripped much of the erupted mass out of the atmosphere within 30 min. Based on these findings, we propose a mechanism of hail-like ash aggregation that contributes to the anomalously rapid fallout of fine ash and the occurrence of concentrically layered aggregates in volcanic deposits.

Volcanic lightning and plume behavior reveal evolving hazards during the April 2015 eruption of Calbuco Volcano, Chile

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Eaton, Alexa R.; Behnke, Sonja Ann; Amigo, Alvaro

    Soon after the onset of an eruption, model forecasts of ash dispersal are used to mitigate the hazards to aircraft, infrastructure, and communities downwind. However, it is a significant challenge to constrain the model inputs during an evolving eruption. Here we demonstrate that volcanic lightning may be used in tandem with satellite detection to recognize and quantify changes in eruption style and intensity. Using the eruption of Calbuco volcano in southern Chile on 22 and 23 April 2015, we investigate rates of umbrella cloud expansion from satellite observations, occurrence of lightning, and mapped characteristics of the fall deposits. Our remote sensing analysis gives a total erupted volume that is within uncertainty of the mapped volume (0.56 ± 0.28 km3 bulk).
Observations and volcanic plume modeling further suggest that electrical activity was enhanced both by ice formation in the ash clouds >10 km above sea level and by the development of a low-level charge layer from ground-hugging currents.

Hail formation triggers rapid ash aggregation in volcanic plumes

    PubMed Central

    Van Eaton, Alexa R.; Mastin, Larry G.; Herzog, Michael; Schwaiger, Hans F.; Schneider, David J.; Wallace, Kristi L.; Clarke, Amanda B.

    2015-01-01

    During explosive eruptions, airborne particles collide and stick together, accelerating the fallout of volcanic ash and climate-forcing aerosols. This aggregation process remains a major source of uncertainty both in ash dispersal forecasting and in the interpretation of eruptions from the geological record. Here we illuminate the mechanisms and timescales of particle aggregation from a well-characterized 'wet' eruption. The 2009 eruption of Redoubt Volcano, Alaska, incorporated water from the surface (in this case, a glacier), which is a common occurrence during explosive volcanism worldwide. Observations from C-band weather radar, fall deposits and numerical modelling demonstrate that hail-forming processes in the eruption plume triggered aggregation of ∼95% of the fine ash and stripped much of the erupted mass out of the atmosphere within 30 min. Based on these findings, we propose a mechanism of hail-like ash aggregation that contributes to the anomalously rapid fallout of fine ash and the occurrence of concentrically layered aggregates in volcanic deposits. PMID:26235052

Repose time and cumulative moment magnitude: A new tool for forecasting eruptions?

    USGS Publications Warehouse

    Thelen, W.A.; Malone, S.D.; West, M.E.

    2010-01-01

    During earthquake swarms on active volcanoes, one of the primary challenges facing scientists is determining the likelihood of an eruption. Here we present the relation between repose time and the cumulative moment magnitude (CMM) as a tool to aid in differentiating between an eruption and a period of unrest. In several case studies, the CMM is lower at shorter repose times than it is at longer repose times. The relationship between repose time and CMM may be linear in log-log space, particularly at Mount St. Helens.
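As a brief aside, checking such a log-log relation on a catalogue reduces to a one-line fit; the repose times and CMM values below are invented for illustration and are not the study's data.

```python
# Fitting the log-log repose time vs. CMM relation on hypothetical values.
import numpy as np

repose_days = np.array([3.0, 10.0, 30.0, 90.0, 300.0, 1000.0])
cmm = np.array([1.2, 1.6, 2.1, 2.4, 2.9, 3.3])      # cumulative moment magnitude

slope, intercept = np.polyfit(np.log10(repose_days), np.log10(cmm), 1)
print(f"log10(CMM) ~ {intercept:.2f} + {slope:.2f} * log10(repose time in days)")
```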
We suggest that the volume and competence of the plug within the conduit drive the strength of the precursory CMM.

Atmospheric processes affecting the separation of volcanic ash and SO2 in volcanic eruptions: inferences from the May 2011 Grímsvötn eruption

    NASA Astrophysics Data System (ADS)

    Prata, Fred; Woodhouse, Mark; Huppert, Herbert E.; Prata, Andrew; Thordarson, Thor; Carn, Simon

    2017-09-01

    The separation of volcanic ash and sulfur dioxide (SO2) gas is sometimes observed during volcanic eruptions. The exact conditions under which separation occurs are not fully understood, but the phenomenon is of importance because of the effects volcanic emissions have on aviation, on the environment, and on the Earth's radiation balance. The eruption of Grímsvötn, a subglacial volcano under the Vatnajökull glacier in Iceland, during 21-28 May 2011 produced one of the most spectacular examples of ash and SO2 separation, which led to errors in the forecasting of ash in the atmosphere over northern Europe. Satellite data from several sources, coupled with meteorological wind data and photographic evidence, suggest that the eruption column was unable to sustain itself, resulting in a large deposition of ash, which left a low-level ash-rich atmospheric plume moving southwards and then eastwards towards the southern Scandinavian coast and a high-level, predominantly SO2 plume travelling northwards and then spreading eastwards and westwards. Here we provide observational and modelling perspectives on the separation of ash and SO2 and present quantitative estimates of the masses of ash and SO2 that erupted, the directions of transport, and the likely impacts. We hypothesise that a partial column collapse or "sloughing" fed with ash from pyroclastic density currents (PDCs) occurred during the early stage of the eruption, leading to an ash-laden gravity intrusion that was swept southwards, separated from the main column. Our model suggests that water-mediated aggregation caused enhanced ash removal because of the plentiful supply of source water from melted glacial ice and from entrained atmospheric water. The analysis also suggests that ash and SO2 should be treated with separate source terms, leading to improvements in forecasting the movement of both types of emissions.

Potential impacts of tephra fallout from a large-scale explosive eruption at Sakurajima volcano, Japan

    NASA Astrophysics Data System (ADS)

    Biass, S.; Todde, A.; Cioni, R.; Pistolesi, M.; Geshi, N.; Bonadonna, C.

    2017-10-01

    We present an exposure analysis of infrastructure and lifelines to tephra fallout for a future large-scale explosive eruption of Sakurajima volcano.
An eruption scenario is identified based on the field characterization of the last subplinian eruption at Sakurajima and a review of reports of the eruptions that occurred in the past six centuries. A scenario-based probabilistic hazard assessment is performed using the Tephra2 model, considering various eruption durations to reflect complex eruptive sequences of all considered reference eruptions. A quantitative exposure analysis of infrastructures and lifelines is presented primarily using open-access data. The post-event impact assessment of Magill et al. (Earth Planets Space 65:677-698, 2013) after the 2011 VEI 2 eruption of Shinmoedake is used to discuss the vulnerability and the resilience of infrastructures during a future large eruption of Sakurajima. Results indicate a main eastward dispersal, with longer eruption durations increasing the probability of tephra accumulation in proximal areas and reducing it in distal areas. The exposure analysis reveals that 2300 km of road network, 18 km2 of urban area, and 306 km2 of agricultural land have a 50% probability of being affected by an accumulation of tephra of 1 kg/m2. A simple qualitative exposure analysis suggests that the municipalities of Kagoshima, Kanoya, and Tarumizu are the most likely to suffer impacts. Finally, the 2011 VEI 2 eruption of Shinmoedake demonstrated that the already implemented mitigation strategies have increased resilience and improved recovery of affected infrastructures. Nevertheless, the extent to which these mitigation actions will perform during the VEI 4 eruption presented here is unclear and our hazard assessment points to possible damages on the Sakurajima peninsula and the neighboring municipality of Tarumizu.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1812024N','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1812024N"><span>Using seismic and tilt measurements simultaneously to forecast eruptions of silicic volcanoes</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Neuberg, Jurgen; Collinson, Amy; Mothes, Patricia</p> <p>2016-04-01</p> <p>Independent interpretations of seismic swarms and tilt measurement on active silicic volcanoes have been successfully used to assess their eruption potential. Swarms of low-frequency seismic events have been associated with brittle failure or stick-slip motion of magma during ascent and have been used to estimate qualitatively the magma ascent rate which typically accelerates before lava dome collapses. Tilt signals are extremely sensitive indicators for volcano deformation and have been often modelled and interpreted as inflation or deflation of a shallow magma reservoir. 
Here we show that tilt in many cases does not represent inflation or deflation but is directly linked to magma ascent rate. This talk aims to combine these two independent observations, seismicity and deformation, to design and implement a forecasting tool that can be deployed in volcano observatories on an operational level.

Linking petrology and seismology at an active volcano

    PubMed

    Saunders, Kate; Blundy, Jon; Dohmen, Ralf; Cashman, Kathy

    2012-05-25

    Many active volcanoes exhibit changes in seismicity, ground deformation, and gas emissions, which in some instances arise from magma movement in the crust before eruption. An enduring challenge in volcano monitoring is interpreting signs of unrest in terms of the causal subterranean magmatic processes. We examined over 300 zoned orthopyroxene crystals from the 1980-1986 eruption of Mount St. Helens that record pulsatory intrusions of new magma and volatiles into an existing larger reservoir before the eruption occurred. Diffusion chronometry applied to orthopyroxene crystal rims shows that episodes of magma intrusion correlate temporally with recorded seismicity, providing evidence that some seismic events are related to magma intrusion. These time scales are commensurate with monitoring signals at restless volcanoes, thus improving our ability to forecast volcanic eruptions by using petrology.

Probabilistic Forecasting of Surface Ozone with a Novel Statistical Approach

    NASA Technical Reports Server (NTRS)

    Balashov, Nikolay V.; Thompson, Anne M.; Young, George S.

    2017-01-01

    The recent change in the Environmental Protection Agency's surface ozone regulation, lowering the surface ozone daily maximum 8-h average (MDA8) exceedance threshold from 75 to 70 ppbv, poses significant challenges to U.S. air quality (AQ) forecasters responsible for ozone MDA8 forecasts. The forecasters, supplied with only a few AQ model products, end up relying heavily on self-developed tools. To help U.S. AQ forecasters, this study explores a surface ozone MDA8 forecasting tool that is based solely on statistical methods and standard meteorological variables from numerical weather prediction (NWP) models. The model combines the self-organizing map (SOM), which is a clustering technique, with a stepwise weighted quadratic regression using meteorological variables as predictors for ozone MDA8. The SOM method identifies different weather regimes, to distinguish between various modes of ozone variability, and groups them according to similarity.
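The regime-weighted regression idea can be sketched compactly; in the example below KMeans stands in for the SOM and an ordinary linear fit stands in for the stepwise weighted quadratic regression, with all data synthetic. It also anticipates the similarity weighting described in the next sentence of the abstract.

```python
# Sketch of regime clustering plus per-regime weighted regression (not REGiS itself).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 500
met = rng.normal(size=(n, 3))                      # e.g. temperature, wind, RH anomalies
mda8 = 40 + 12 * met[:, 0] - 6 * met[:, 1] + rng.normal(0, 5, n)   # synthetic ozone MDA8

regimes = KMeans(n_clusters=4, n_init=10, random_state=0).fit(met)

# One model per regime: every sample contributes, weighted by its similarity
# (here inverse distance) to that regime's centroid.
dist = regimes.transform(met)                      # (n_samples, n_regimes) distances
models = []
for k in range(4):
    w = 1.0 / (1.0 + dist[:, k])
    models.append(LinearRegression().fit(met, mda8, sample_weight=w))

# Forecast step: pick the regime predicted for tomorrow's weather and apply its model.
tomorrow = rng.normal(size=(1, 3))
k = regimes.predict(tomorrow)[0]
print("regime", k, "-> MDA8 forecast:", round(float(models[k].predict(tomorrow)[0]), 1), "ppb")
```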
In this way, when a regression is developed for a specific regime, data from the other regimes are also used, with weights that are based on their similarity to this specific regime. This approach, regression in SOM (REGiS), yields a distinct model for each regime taking into account both the training cases for that regime and other similar training cases. To produce probabilistic MDA8 ozone forecasts, REGiS weighs and combines all of the developed regression models on the basis of the weather patterns predicted by an NWP model. REGiS is evaluated over the San Joaquin Valley in California and the northeastern plains of Colorado. The results suggest that the model performs best when trained and adjusted separately for an individual AQ station and its corresponding meteorological site.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016HESS...20..505F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016HESS...20..505F"><span>Development and verification of a real-time stochastic precipitation nowcasting system for urban hydrology in Belgium</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.</p> <p>2016-01-01</p> <p>The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e., the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides in real-time 20-member ensemble precipitation nowcasts at 1 km and 5 min resolutions up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h-1 are only reliable up to 30 min. 
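A reliability check of the kind reported here compares binned forecast probabilities of exceeding a rain-rate threshold with the observed frequency in each bin; the 20-member "ensemble" below is synthetic, not STEPS-BE output.

```python
# Reliability table for ensemble-derived exceedance probabilities (synthetic data).
import numpy as np

rng = np.random.default_rng(11)
n_cases, n_members = 2000, 20
truth_rate = rng.uniform(0, 1, n_cases)                        # latent event probability
obs = rng.uniform(0, 1, n_cases) < truth_rate                  # event: rain > 0.5 mm/h
members = rng.uniform(0, 1, (n_cases, n_members)) < truth_rate[:, None]
fcst_prob = members.mean(axis=1)                               # ensemble exceedance probability

bins = np.linspace(0, 1, 11)
for lo, hi in zip(bins[:-1], bins[1:]):
    sel = (fcst_prob >= lo) & (fcst_prob < hi)
    if sel.any():
        print(f"forecast {lo:.1f}-{hi:.1f}: observed frequency {obs[sel].mean():.2f}  (n={sel.sum()})")
```

For a well-calibrated system the observed frequency in each bin sits close to the bin centre; systematic departures show up as the under-dispersion noted in the abstract.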
The STEPS ensembles are slightly under-dispersive and represent only 75-90 % of the forecast errors.

Development and verification of a real-time stochastic precipitation nowcasting system for urban hydrology in Belgium

    NASA Astrophysics Data System (ADS)

    Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.

    2015-07-01

    The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e. the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides in real-time 20-member ensemble precipitation nowcasts at 1 km and 5 min resolution up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h-1 are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 80-90 % of the forecast errors.

Understanding Solar Eruptions with SDO/HMI: Measuring Photospheric Flows, Testing Models, and Steps Towards Forecasting Solar Eruptions

    NASA Technical Reports Server (NTRS)

    Schuck, Peter W.; Linton, Mark; Muglach, Karin; Welsch, Brian; Hageman, Jacob

    2010-01-01

    The imminent launch of the Solar Dynamics Observatory (SDO) will carry the first full-disk imaging vector magnetograph, the Helioseismic and Magnetic Imager (HMI), into an inclined geosynchronous orbit. This magnetograph will provide nearly continuous measurements of photospheric vector magnetic fields at cadences of 90 seconds to 12 minutes with 1" resolution, precise pointing, and unfettered by atmospheric seeing.
The enormous data stream of 1.5 Terabytes per day from SDO will provide an unprecedented opportunity to understand the mysteries of solar eruptions. These ground-breaking observations will permit the application of a new technique, the differential affine velocity estimator for vector magnetograms (DAVE4VM), to measure photospheric plasma flows in active regions. These measurements will permit, for the first time, accurate assessments of the coronal free energy available for driving CMEs and flares. The details of photospheric plasma flows, particularly along magnetic neutral lines, are critical to testing models for initiating coronal mass ejections (CMEs) and flares. Assimilating flows and fields into state-of-the-art 3D MHD simulations that model the highly stratified solar atmosphere from the convection zone to the corona represents the next step towards achieving NASA's Living with a Star forecasting goals of predicting "when a solar eruption leading to a CME will occur." This talk will describe these major science and predictive advances that will be delivered by SDO/HMI.

Understanding Solar Eruptions with SDO/HMI: Measuring Photospheric Flows, Testing Models, and Steps Towards Forecasting Solar Eruptions

    NASA Technical Reports Server (NTRS)

    Schuck, Peter W.; Linton, M.; Muglach, K.; Hoeksema, T.

    2010-01-01

    The Solar Dynamics Observatory (SDO) is carrying the first full-disk imaging vector magnetograph, the Helioseismic and Magnetic Imager (HMI), into an inclined geosynchronous orbit. This magnetograph will provide nearly continuous measurements of photospheric vector magnetic fields at cadences of 90 seconds to 12 minutes with 1" resolution, precise pointing, and unfettered by atmospheric seeing. The enormous data stream of 1.5 Terabytes per day from SDO will provide an unprecedented opportunity to understand the mysteries of solar eruptions. These ground-breaking observations will permit the application of a new technique, the differential affine velocity estimator for vector magnetograms (DAVE4VM), to measure photospheric plasma flows in active regions. These measurements will permit, for the first time, accurate assessments of the coronal free energy available for driving CMEs and flares. The details of photospheric plasma flows, particularly along magnetic neutral lines, are critical to testing models for initiating coronal mass ejections (CMEs) and flares. Assimilating flows and fields into state-of-the-art 3D MHD simulations that model the highly stratified solar atmosphere from the convection zone to the corona represents the next step towards achieving NASA's Living with a Star forecasting goals of predicting "when a solar eruption leading to a CME will occur."
Our presentation will describe these major science and predictive advances that will be delivered by SDO/HMI.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AtmRe.194..245S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AtmRe.194..245S"><span>Probabilistic precipitation nowcasting based on an extrapolation of radar reflectivity and an ensemble approach</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sokol, Zbyněk; Mejsnar, Jan; Pop, Lukáš; Bližňák, Vojtěch</p> <p>2017-09-01</p> <p>A new method for the probabilistic nowcasting of instantaneous rain rates (ENS) based on the ensemble technique and extrapolation along Lagrangian trajectories of current radar reflectivity is presented. Assuming inaccurate forecasts of the trajectories, an ensemble of precipitation forecasts is calculated and used to estimate the probability that rain rates will exceed a given threshold in a given grid point. Although the extrapolation neglects the growth and decay of precipitation, their impact on the probability forecast is taken into account by the calibration of forecasts using the reliability component of the Brier score (BS). ENS forecasts the probability that the rain rates will exceed thresholds of 0.1, 1.0 and 3.0 mm/h in squares of 3 km by 3 km. The lead times were up to 60 min, and the forecast accuracy was measured by the BS. The ENS forecasts were compared with two other methods: combined method (COM) and neighbourhood method (NEI). NEI considered the extrapolated values in the square neighbourhood of 5 by 5 grid points of the point of interest as ensemble members, and the COM ensemble was comprised of united ensemble members of ENS and NEI. The results showed that the calibration technique significantly improves bias of the probability forecasts by including additional uncertainties that correspond to neglected processes during the extrapolation. In addition, the calibration can also be used for finding the limits of maximum lead times for which the forecasting method is useful. We found that ENS is useful for lead times up to 60 min for thresholds of 0.1 and 1 mm/h and approximately 30 to 40 min for a threshold of 3 mm/h. We also found that a reasonable size of the ensemble is 100 members, which provided better scores than ensembles with 10, 25 and 50 members. In terms of the BS, the best results were obtained by ENS and COM, which are comparable. 
  Analysis of the seismic activity associated with the 2010 eruption of Merapi Volcano, Java

    NASA Astrophysics Data System (ADS)

    Budi-Santoso, Agus; Lesage, Philippe; Dwiyono, Sapari; Sumarti, Sri; Subandriyo; Surono; Jousset, Philippe; Metaxian, Jean-Philippe

    2013-07-01

    The 2010 eruption of Merapi is the first large explosive eruption of the volcano that has been instrumentally observed. The main characteristics of the seismic activity during the pre-eruptive period and the crisis are presented and interpreted in this paper. The first seismic precursors were a series of four shallow swarms during the period between 12 and 4 months before the eruption. These swarms are interpreted as the result of perturbations of the hydrothermal system by increasing heat flow. Shorter-term and more continuous precursory seismic activity started about 6 weeks before the initial explosion on 26 October 2010. During this period, the rate of seismicity increased almost constantly, yielding a cumulative seismic energy release for volcano-tectonic (VT) and multiphase (MP) events of 7.5 × 10^10 J. This value is 3 times the maximum energy release preceding previous effusive eruptions of Merapi. The high level reached and the accelerated behavior of both the deformation of the summit and the seismic activity are distinct features of the 2010 eruption. The hypocenters of VT events in 2010 occur in two clusters, at 2.5 to 5 km and at less than 1.5 km depth below the summit. An aseismic zone was detected at 1.5-2.5 km depth, consistent with studies of previous eruptions, and indicating that this is a robust feature of Merapi's subsurface structure. Our analysis suggests that the aseismic zone is a poorly consolidated layer of altered material within the volcano. Deep VT events occurred mainly before 17 October 2010; subsequent to that time shallow activity strongly increased. The deep seismic activity is interpreted as associated with the enlargement of a narrow conduit by an unusually large volume of rapidly ascending magma. The shallow seismicity is interpreted as recording the final magma ascent and the rupture of a summit-dome plug, which triggered the eruption on 26 October 2010. Hindsight forecasting of the occurrence time of the eruption is performed by applying the Material Failure Forecast Method (FFM) using cumulative Real-time Seismic Amplitude Measurement (RSAM) calculated both from raw records and from signals classified according to their dominant frequency. Stable estimates of the eruption time with errors as small as ± 4 h are obtained within a 6-day lapse time before the eruption. This approach could therefore be useful to support decision making in the case of future large explosive episodes at Merapi.
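    The Material Failure Forecast Method cited above is, in its simplest and most common form (exponent alpha = 2), a linear extrapolation of the inverse event or RSAM rate to zero: the intercept with the time axis is the forecast failure (eruption) time. The sketch below shows only that generic step on synthetic data; it is not the authors' processing chain, and the rate model and noise level are assumptions.

        import numpy as np

        def ffm_forecast(times, rates):
            """Classical FFM with exponent alpha = 2: fit 1/rate = a + b*t and
            return the time where the fit crosses zero (forecast failure time)."""
            inv_rate = 1.0 / np.asarray(rates, dtype=float)
            b, a = np.polyfit(times, inv_rate, 1)      # slope, intercept
            if b >= 0:
                raise ValueError("Inverse rate is not decreasing; no acceleration detected.")
            return -a / b

        # Synthetic accelerating precursor: rate ~ 1 / (t_f - t), with t_f = 10 days
        t_f_true = 10.0
        t = np.linspace(0.0, 8.0, 50)                  # days of observation
        rate = 1.0 / (t_f_true - t)                    # idealized RSAM-like rate
        rate *= 1.0 + 0.05 * np.random.default_rng(1).normal(size=t.size)  # 5% noise

        print(f"Forecast eruption time: day {ffm_forecast(t, rate):.2f} (true: day {t_f_true})")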
  Monthly forecasting of agricultural pests in Switzerland

    NASA Astrophysics Data System (ADS)

    Hirschi, M.; Dubrovsky, M.; Spirig, C.; Samietz, J.; Calanca, P.; Weigel, A. P.; Fischer, A. M.; Rotach, M. W.

    2012-04-01

    Given the repercussions of pests and diseases on agricultural production, detailed forecasting tools have been developed to simulate the degree of infestation depending on actual weather conditions. The life cycle of pests is most successfully predicted if the micro-climate of the immediate environment (habitat) of the causative organisms can be simulated. Sub-seasonal pest forecasts therefore require weather information for the relevant habitats and at the appropriate time scale. The pest forecasting system SOPRA (www.sopra.info), currently in operation in Switzerland, relies on such detailed weather information, using hourly weather observations up to the day the forecast is issued, but only a climatology for the forecasting period. Here, we aim at improving the skill of SOPRA forecasts by transforming the weekly information provided by ECMWF monthly forecasts (MOFCs) into the hourly weather series required for the prediction of upcoming life phases of the codling moth, the major insect pest in apple orchards worldwide. Due to the probabilistic nature of operational monthly forecasts and their limited spatial and temporal resolution, their information needs to be post-processed for use in a pest model. In this study, we developed a statistical downscaling approach for MOFCs that includes the following steps: (i) application of a stochastic weather generator to generate a large pool of daily weather series consistent with the climate at a specific location, (ii) a subsequent re-sampling of weather series from this pool to optimally represent the evolution of the weekly MOFC anomalies, and (iii) a final extension to hourly weather series suitable for the pest forecasting model. Results show a clear improvement in the forecast skill for occurrences of upcoming codling moth life phases when incorporating MOFCs as compared to the operational pest forecasting system. This is true both in terms of the root mean squared errors and the continuous ranked probability scores of the probabilistic forecasts versus the mean absolute errors of the deterministic system.
    Also, the application of the climate conserving recalibration (CCR, Weigel et al. 2009) technique allows for a successful correction of the under-confidence in the forecast occurrences of codling moth life phases. Reference: Weigel, A. P.; Liniger, M. A. & Appenzeller, C. (2009). Seasonal Ensemble Forecasts: Are Recalibrated Single Models Better than Multimodels? Mon. Wea. Rev., 137, 1460-1479.

  [Forecast of costs of ecodependent cancer treatment for the development of management decisions].

    PubMed

    Krasovskiy, V O

    2014-01-01

    A methodical approach for the probabilistic forecasting and differentiation of the costs of treating ecodependent cancer cases has been elaborated. The approach is useful in organizing medical aid to cancer patients, in developing management decisions for reducing the occupational load on the population, and in solving problems of compensating the population for the economic and social losses caused by industrial plants.

  HEPEX - achievements and challenges!

    NASA Astrophysics Data System (ADS)

    Pappenberger, Florian; Ramos, Maria-Helena; Thielen, Jutta; Wood, Andy; Wang, Qj; Duan, Qingyun; Collischonn, Walter; Verkade, Jan; Voisin, Nathalie; Wetterhall, Fredrik; Vuillaume, Jean-Francois Emmanuel; Lucatero Villasenor, Diana; Cloke, Hannah L.; Schaake, John; van Andel, Schalk-Jan

    2014-05-01

    HEPEX is an international initiative bringing together hydrologists, meteorologists, researchers and end-users to develop advanced probabilistic hydrological forecast techniques for improved flood, drought and water management. HEPEX was launched in 2004 as an independent, cooperative international scientific activity. During the first meeting, the overarching goal was defined as: "to develop and test procedures to produce reliable hydrological ensemble forecasts, and to demonstrate their utility in decision making related to the water, environmental and emergency management sectors." The applications of hydrological ensemble predictions span large spatio-temporal scales, ranging from short-term and localized predictions to global climate change and regional modeling. Within the HEPEX community, information is shared through its blog (www.hepex.org), meetings, testbeds and intercomparison experiments, as well as project reports. Key questions of HEPEX are:
    * What adaptations are required for meteorological ensemble systems to be coupled with hydrological ensemble systems?
    * How should existing hydrological ensemble prediction systems be modified to account for all sources of uncertainty within a forecast?
    * What is the best way for the user community to take advantage of ensemble forecasts and to make better decisions based on them?
    This year HEPEX celebrates its 10th anniversary, and this poster will present a review of the main operational and research achievements and challenges prepared by HEPEX contributors on data assimilation, post-processing of hydrologic predictions, forecast verification, communication, and the use of probabilistic forecasts in decision-making. Additionally, we will present the most recent activities implemented by HEPEX and illustrate how everyone can join the community and participate in the development of new approaches in hydrologic ensemble prediction.

  Real-time Mainshock Forecast by Statistical Discrimination of Foreshock Clusters

    NASA Astrophysics Data System (ADS)

    Nomura, S.; Ogata, Y.

    2016-12-01

    Foreshock discrimination is one of the most effective ways of making short-term forecasts of large main shocks. Though many large earthquakes are accompanied by foreshocks, discriminating them from the enormous number of small earthquakes is difficult, and only a probabilistic evaluation based on their spatio-temporal features and magnitude evolution may be available. Logistic regression is the statistical learning method best suited to such binary pattern recognition problems, where estimates of the a-posteriori probability of class membership are required. Statistical learning methods can keep learning discriminating features from an updating catalog and give probabilistic recognition for forecasts in real time. We estimated a non-linear function of the foreshock proportion using smooth spline bases and evaluated the possibility of foreshocks through the logit function. In this study, we classified foreshocks from the earthquake catalog of the Japan Meteorological Agency using single-link clustering and learned the spatial and temporal features of foreshocks by probability density ratio estimation. We use the epicentral locations, time spans and differences in magnitude for learning and forecasting. Magnitudes of main shocks are also predicted by incorporating b-values into our method. We discuss the spatial pattern of foreshocks derived from the classifier composed by our model. We also implement a back test to validate the predictive performance of the model on this catalog.
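    For readers unfamiliar with the core technique named in the record above, the sketch below fits a plain logistic-regression classifier to synthetic cluster features and returns a foreshock probability for a new cluster. The feature names, coefficients and data are hypothetical placeholders, not the JMA catalog variables or the spline-based model of the authors.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 500
        # Hypothetical cluster features: [time span (days), magnitude difference, inter-event distance (km)]
        X = np.column_stack([
            rng.exponential(2.0, n),
            rng.normal(0.0, 0.6, n),
            rng.exponential(5.0, n),
        ])
        # Synthetic labels: 1 = cluster turned out to be a foreshock sequence
        logit = -1.5 - 0.3 * X[:, 0] + 1.2 * X[:, 1] - 0.1 * X[:, 2]
        y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))

        clf = LogisticRegression().fit(X, y)

        # Probability that a new cluster (0.5 days, +0.8 magnitude gap, 2 km) is a foreshock sequence
        p = clf.predict_proba([[0.5, 0.8, 2.0]])[0, 1]
        print(f"P(foreshock) = {p:.2f}")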
  Spatial forecasting of disease risk and uncertainty

    USGS Publications Warehouse

    De Cola, L.

    2002-01-01

    Because maps typically represent the value of a single variable over 2-dimensional space, cartographers must simplify the display of multiscale complexity, temporal dynamics, and underlying uncertainty. A choropleth disease risk map based on data for polygonal regions might depict incidence (cases per 100,000 people) within each polygon for a year but ignore the uncertainty that results from finer-scale variation, generalization, misreporting, small numbers, and future unknowns. In response to such limitations, this paper reports on the bivariate mapping of data "quantity" and "quality" for Lyme disease forecasts for states of the United States. Historical state data for 1990-2000 are used in an autoregressive model to forecast 2001-2010 disease incidence and a probability index of confidence, each of which is then kriged to provide two spatial grids representing continuous values over the nation. A single bivariate map is produced from the combination of the incidence grid (using a blue-to-red hue spectrum) and a probabilistic confidence grid (used to control the saturation of the hue at each grid cell). The resultant maps are easily interpretable, and the approach may be applied to such problems as detecting unusual disease occurrences, visualizing past and future incidence, and assembling a consistent regional disease atlas showing patterns of forecasted risks in light of probabilistic confidence.

  Regional crop yield forecasting: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    de Wit, A.; van Diepen, K.; Boogaard, H.

    2009-04-01

    Information on the outlook for yield and production of crops over large regions is essential for government services dealing with the import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate to monitor world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effects of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach in which uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) in regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.

  Toward the Probabilistic Forecasting of High-latitude GPS Phase Scintillation

    NASA Technical Reports Server (NTRS)

    Prikryl, P.; Jayachandran, P. T.; Mushini, S. C.; Richardson, I. G.
    2012-01-01

    The phase scintillation index was obtained from L1 GPS data collected with the Canadian High Arctic Ionospheric Network (CHAIN) during years of extended solar minimum 2008-2010. Phase scintillation occurs predominantly on the dayside in the cusp and in the nightside auroral oval. We set forth a probabilistic forecast method of phase scintillation in the cusp based on the arrival time of either solar wind corotating interaction regions (CIRs) or interplanetary coronal mass ejections (ICMEs). CIRs on the leading edge of high-speed streams (HSS) from coronal holes are known to cause recurrent geomagnetic and ionospheric disturbances that can be forecast one or several solar rotations in advance. Superposed epoch analysis of phase scintillation occurrence showed a sharp increase in scintillation occurrence just after the arrival of high-speed solar wind and a peak associated with weak to moderate CMEs during the solar minimum. Cumulative probability distribution functions for the phase scintillation occurrence in the cusp are obtained from statistical data for days before and after CIR and ICME arrivals. The probability curves are also specified for low and high (below and above median) values of various solar wind plasma parameters. The initial results are used to demonstrate a forecasting technique on two example periods of CIRs and ICMEs.
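    The conditional-probability step described above, occurrence statistics keyed to days before and after CIR/ICME arrival, can be illustrated with a generic superposed-epoch calculation. The sketch below uses entirely synthetic daily occurrence data and hypothetical arrival days; it is not the CHAIN analysis itself.

        import numpy as np

        def superposed_epoch_probability(obs_days, occurrence, arrival_days, window=(-3, 5)):
            """Fraction of days with scintillation as a function of epoch day
            (day 0 = CIR/ICME arrival). `occurrence` is a 0/1 array per observation day."""
            obs_days = np.asarray(obs_days)
            occurrence = np.asarray(occurrence, dtype=float)
            epochs = np.arange(window[0], window[1] + 1)
            prob = np.full(epochs.size, np.nan)
            for i, lag in enumerate(epochs):
                mask = np.isin(obs_days, np.asarray(arrival_days) + lag)
                if mask.any():
                    prob[i] = occurrence[mask].mean()
            return epochs, prob

        # Toy data: 200 days, scintillation more likely just after arrivals on days 50 and 120
        days = np.arange(200)
        rng = np.random.default_rng(3)
        occ = rng.uniform(size=200) < 0.1
        occ[[51, 52, 53, 121, 122]] = True
        epochs, prob = superposed_epoch_probability(days, occ, arrival_days=[50, 120])
        for e, p in zip(epochs, prob):
            print(f"day {e:+d}: P(scintillation) = {p:.2f}")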
  Global seasonal climate predictability in a two tiered forecast system: part I: boreal summer and fall seasons

    NASA Astrophysics Data System (ADS)

    Misra, Vasubandhu; Li, H.; Wu, Z.; DiNapoli, S.

    2014-03-01

    This paper shows demonstrable improvement in the global seasonal climate predictability of boreal summer (at zero lead) and fall (at one season lead) seasonal mean precipitation and surface temperature from a two-tiered seasonal hindcast forced with forecasted SST, relative to two other contemporary operational coupled ocean-atmosphere climate models. The results from an extensive set of seasonal hindcasts are analyzed to come to this conclusion. The improvement is attributed to: (1) the multi-model bias-corrected SST used to force the atmospheric model; (2) the global atmospheric model, which is run at a relatively high grid resolution of 50 km compared to the two other coupled ocean-atmosphere models; and (3) the physics of the atmospheric model, especially that related to the convective parameterization scheme. The results of the seasonal hindcast are analyzed for both deterministic and probabilistic skill. The probabilistic skill analysis shows that significant forecast skill can be harvested from these seasonal hindcasts relative to the deterministic skill analysis. The paper concludes that the coupled ocean-atmosphere seasonal hindcasts have reached a reasonable fidelity to exploit their SST anomaly forecasts to force such relatively higher resolution two-tier prediction experiments and glean further boreal summer and fall seasonal prediction skill.

  Community-based early warning systems for flood risk mitigation in Nepal

    NASA Astrophysics Data System (ADS)

    Smith, Paul J.; Brown, Sarah; Dugar, Sumit

    2017-03-01

    This paper focuses on the use of community-based early warning systems for flood resilience in Nepal. The first part of the work outlines the evolution and current status of these community-based systems, highlighting the limited lead times currently available for early warning. The second part of the paper focuses on the development of a robust operational flood forecasting methodology for use by the Nepal Department of Hydrology and Meteorology (DHM) to enhance early warning lead times. The methodology uses data-based, physically interpretable time series models and data assimilation to generate probabilistic forecasts, which are presented in a simple visual tool. The approach is designed to work in situations of limited data availability, with an emphasis on sustainability and appropriate technology. The successful application of the forecast methodology to the flood-prone Karnali River basin in western Nepal is outlined, increasing lead times from 2-3 to 7-8 h. The challenges faced in communicating probabilistic forecasts to the last mile of the existing community-based early warning systems across Nepal are discussed. The paper concludes with an assessment of the applicability of this approach in basins and countries beyond Karnali and Nepal and an overview of key lessons learnt from this initiative.

  Probabilistic model predicts dynamics of vegetation biomass in a desert ecosystem in NW China

    PubMed Central

    Wang, Xin-ping; Schaffer, Benjamin Eli; Yang, Zhenlei; Rodriguez-Iturbe, Ignacio

    2017-01-01

    The temporal dynamics of vegetation biomass are of key importance for evaluating the sustainability of arid and semiarid ecosystems. In these ecosystems, biomass and soil moisture are coupled stochastic variables externally driven, mainly, by the rainfall dynamics. Based on long-term field observations in northwestern (NW) China, we test a recently developed analytical scheme for the description of the leaf biomass dynamics undergoing seasonal cycles with different rainfall characteristics. The probabilistic characterization of such dynamics agrees remarkably well with the field measurements, providing a tool to forecast the changes to be expected in biomass for arid and semiarid ecosystems under climate change conditions.
    These changes will depend, for each season, on the forecasted rate of rainy days, the mean depth of rain in a rainy day, and the duration of the season. For the site in NW China, the current scenario of an increase of 10% in the rate of rainy days, 10% in the mean rain depth in a rainy day, and no change in the season duration leads to forecasted increases in mean leaf biomass near 25% in both seasons. PMID:28584097

  A Copula-Based Conditional Probabilistic Forecast Model for Wind Power Ramps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Krishnan, Venkat K; Zhang, Jie

    Efficient management of wind ramping characteristics can significantly reduce wind integration costs for balancing authorities. By considering the stochastic dependence of wind power ramp (WPR) features, this paper develops a conditional probabilistic wind power ramp forecast (cp-WPRF) model based on Copula theory. The WPRs dataset is constructed by extracting ramps from a large dataset of historical wind power. Each WPR feature (e.g., rate, magnitude, duration, and start time) is separately forecasted by considering the coupling effects among different ramp features. To accurately model the marginal distributions with a copula, a Gaussian mixture model (GMM) is adopted to characterize the WPR uncertainty and features. The Canonical Maximum Likelihood (CML) method is used to estimate the parameters of the multivariable copula. The optimal copula model is chosen based on the Bayesian information criterion (BIC) from each copula family. Finally, the best conditions-based cp-WPRF model is determined by predictive interval (PI) based evaluation metrics. Numerical simulations on publicly available wind power data show that the developed copula-based cp-WPRF model can predict WPRs with a high level of reliability and sharpness.
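    To make the copula step above concrete, the sketch below fits a two-dimensional Gaussian copula to synthetic ramp magnitudes and durations and then samples duration conditional on a given magnitude. It uses empirical margins rather than the GMM margins of the record, the data are fabricated for illustration, and none of it is the authors' cp-WPRF code.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # Hypothetical historical ramp features: magnitude (MW) and duration (h), positively dependent
        n = 1000
        mag = rng.gamma(3.0, 20.0, n)
        dur = 0.05 * mag + rng.gamma(2.0, 1.0, n)

        # 1. Map each margin to normal scores via its empirical CDF (the copula step)
        def to_normal_scores(x):
            ranks = stats.rankdata(x) / (len(x) + 1.0)
            return stats.norm.ppf(ranks)

        z = np.column_stack([to_normal_scores(mag), to_normal_scores(dur)])
        rho = np.corrcoef(z.T)[0, 1]                      # Gaussian-copula correlation

        # 2. Conditional sampling: given a forecast magnitude, draw plausible durations
        mag_given = 120.0
        u_mag = (mag < mag_given).mean()                  # empirical CDF value of the condition
        z_mag = stats.norm.ppf(np.clip(u_mag, 1e-3, 1 - 1e-3))
        cond_mean, cond_std = rho * z_mag, np.sqrt(1.0 - rho ** 2)
        z_dur = rng.normal(cond_mean, cond_std, 500)
        u_dur = stats.norm.cdf(z_dur)
        dur_samples = np.quantile(dur, u_dur)             # back-transform with the empirical margin

        print(f"rho = {rho:.2f}; conditional duration 10-90% range: "
              f"{np.quantile(dur_samples, 0.1):.1f}-{np.quantile(dur_samples, 0.9):.1f} h")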
  Forecasting the probability of future groundwater levels declining below specified low thresholds in the conterminous U.S.

    USGS Publications Warehouse

    Dudley, Robert W.; Hodgkins, Glenn A.; Dickinson, Jesse

    2017-01-01

    We present a logistic regression approach for forecasting the probability of future groundwater levels declining below, or remaining below, specific groundwater-level thresholds. We tested our approach on 102 groundwater wells in different climatic regions and aquifers of the United States that are part of the U.S. Geological Survey Groundwater Climate Response Network. We evaluated the importance of current groundwater levels, precipitation, streamflow, seasonal variability, the Palmer Drought Severity Index, and atmosphere/ocean indices for developing the logistic regression equations. Several diagnostics of model fit were used to evaluate the regression equations, including testing for autocorrelation of residuals, goodness-of-fit metrics, and bootstrap validation testing. The probabilistic predictions were most successful at wells with high persistence (low month-to-month variability) in their groundwater records and at wells where the groundwater level remained below the defined low threshold for sustained periods (generally three months or longer). The model fit was weakest at wells with strong seasonal variability in levels and with shorter-duration low-threshold events. We identified challenges in deriving probabilistic-forecasting models and possible approaches for addressing those challenges.

  A multi-source probabilistic hazard assessment of tephra dispersal in the Neapolitan area

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Costa, Antonio; Selva, Jacopo; Folch, Arnau; Macedonio, Giovanni; Tonini, Roberto

    2015-04-01

    In this study we present the results obtained from a long-term Probabilistic Hazard Assessment (PHA) of tephra dispersal in the Neapolitan area. The usual PHA for tephra dispersal needs the definition of eruptive scenarios (usually by grouping eruption sizes and possible vent positions into a limited number of classes) with associated probabilities, a meteorological dataset covering a representative time period, and a tephra dispersal model. The PHA then results from combining simulations considering different volcanological and meteorological conditions through weights associated with their specific probability of occurrence. However, volcanological parameters (i.e., erupted mass, eruption column height, eruption duration, bulk granulometry, fraction of aggregates) typically encompass a wide range of values. Because of such natural variability, single representative scenarios or size classes cannot be adequately defined using single values for the volcanological inputs. In the present study, we use a method that accounts for this within-size-class variability in the framework of Event Trees. The variability of each parameter is modeled with specific probability density functions, and meteorological and volcanological input values are chosen by using a stratified sampling method. This procedure allows for quantifying hazard without relying on the definition of scenarios, thus avoiding potential biases introduced by selecting single representative scenarios. Embedding this procedure into the Bayesian Event Tree scheme enables quantification of the tephra fall PHA and its epistemic uncertainties. We have applied this scheme to analyze the long-term tephra fall PHA from Vesuvius and Campi Flegrei in a multi-source paradigm. We integrate two tephra dispersal models (the analytical HAZMAP and the numerical FALL3D) into BET_VH. The ECMWF reanalysis dataset is used for exploring different meteorological conditions. The results obtained show that the PHA accounting for the whole natural variability is consistent with previous probability maps elaborated for Vesuvius and Campi Flegrei on the basis of single representative scenarios, but shows significant differences. In particular, the area characterized by a 300 kg/m2-load exceedance probability larger than 5%, accounting for the whole range of variability (that is, from small violent strombolian to plinian eruptions), is similar to that displayed in the maps based on the medium-magnitude reference eruption, but it is of a smaller extent. This is due to the relatively higher weight of the small-magnitude eruptions considered in this study but neglected in the reference scenario maps. On the other hand, in our new maps the area characterized by a 300 kg/m2-load exceedance probability larger than 1% is much larger than that of the medium-magnitude reference eruption, due to the contribution of plinian eruptions at lower probabilities, again neglected in the reference scenario maps.
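    The stratified-sampling idea above, drawing volcanological inputs from probability density functions rather than fixing a single representative scenario and then counting how often a load threshold is exceeded, can be sketched generically. The "dispersal model" below is a deliberately crude inverse-square placeholder standing in for HAZMAP or FALL3D, and every distribution and constant is an assumption for illustration only.

        import numpy as np

        rng = np.random.default_rng(11)
        n = 10000

        def stratified_uniform(n):
            """One stratified (Latin-hypercube style) draw per equal-probability bin."""
            return (np.arange(n) + rng.uniform(size=n)) / n

        # Volcanological input sampled from an assumed log-uniform PDF via stratified uniforms
        mass = 10 ** (9 + 3 * stratified_uniform(n))        # erupted mass, 1e9-1e12 kg (assumption)
        # Meteorological variability reduced to a single toy factor: does the wind blow toward the site?
        wind_toward_site = rng.uniform(size=n) < 0.3

        # Toy attenuation "dispersal model" (NOT HAZMAP or FALL3D): load ~ mass / distance^2
        distance_km = 10.0
        load = np.where(wind_toward_site, mass / (1e7 * distance_km ** 2), 0.0)   # kg/m2

        threshold = 300.0
        print(f"P(load > {threshold:.0f} kg/m2) = {(load > threshold).mean():.3f}")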
  Sequential Data Assimilation for Seismicity: a Proof of Concept

    NASA Astrophysics Data System (ADS)

    van Dinther, Ylona; Fichtner, Andreas; Kuensch, Hansruedi

    2016-04-01

    Our probabilistic forecasting ability and physical understanding of earthquakes are significantly hampered by limited indications of the current and evolving state of stress and strength on faults. This information is typically thought to be beyond our resolution capabilities based on surface data. We show that the state of stress and strength are actually obtainable for settings with one dominant fault. State variables and their uncertainties are obtained using Ensemble Kalman Filtering, a sequential data assimilation technique extensively developed for weather forecasting purposes. Through the least-squares solution of Bayes' theorem, erroneous data are for the first time assimilated to update a partial-differential-equation-driven seismic cycle model. This visco-elasto-plastic continuum forward model solves the Navier-Stokes equations with a rate-dependent friction coefficient (van Dinther et al., JGR, 2013). To prove the concept of this weather - earthquake forecasting bridge we perform a perfect model test. Synthetic numerical data from a single analogue borehole are assimilated into 20 ensemble models over 14 cycles of analogue earthquakes. Since we know the true state of the numerical data model, a quantitative and qualitative evaluation shows that meaningful information on the stress and strength of the unobserved fault is typically already available once data from a single, shallow borehole are assimilated over part of a seismic cycle. This is possible because the sampled error covariance matrix contains prior information on the physics that relates velocities, stresses, and pressures at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed in such a way that fault coupling can be updated to either inhibit or trigger events. In the subsequent forward propagation step, the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next analogue earthquake. At the subsequent assimilation steps, the system's forecasting ability turns out to be beyond expectations: 5 analogue events were forecast approximately on time, 5 had indications slightly too early, 3 were identified only during propagation, and 1 was missed. Otherwise, predominantly quiet interseismic periods were forecast, except on 3 occasions where smaller events triggered prolonged elevated probabilities until the larger event that came slightly later. Besides temporal forecasting, we also observe some magnitude forecasting skill for 59% of the events, while the other event sizes were underestimated. This new framework thus has the potential, in the long term, to assist with improving our probabilistic hazard assessment.
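    The Ensemble Kalman Filter analysis step at the heart of the record above has a compact generic form: covariances estimated from the forecast ensemble map a surface observation into updates of unobserved state variables. The sketch below is that textbook stochastic-EnKF update on a three-variable toy state, not the authors' seismo-thermo-mechanical model; all dimensions and numbers are assumptions.

        import numpy as np

        def enkf_update(ensemble, H, obs, obs_var, rng):
            """Stochastic Ensemble Kalman Filter analysis step.

            ensemble: (n_state, n_members) forecast ensemble
            H:        (n_obs, n_state) observation operator
            obs:      (n_obs,) observed values
            obs_var:  observation error variance (scalar)
            """
            n_obs, n_members = H.shape[0], ensemble.shape[1]
            X = ensemble - ensemble.mean(axis=1, keepdims=True)       # state anomalies
            HX = H @ ensemble
            HXp = HX - HX.mean(axis=1, keepdims=True)                 # observed-space anomalies
            P_hh = HXp @ HXp.T / (n_members - 1) + obs_var * np.eye(n_obs)
            P_xh = X @ HXp.T / (n_members - 1)                        # cross covariance
            K = P_xh @ np.linalg.inv(P_hh)                            # Kalman gain
            perturbed_obs = obs[:, None] + rng.normal(0.0, np.sqrt(obs_var), (n_obs, n_members))
            return ensemble + K @ (perturbed_obs - HX)

        # Toy example: a 3-variable "fault state", observed only through the first variable
        rng = np.random.default_rng(5)
        truth = np.array([1.0, -0.5, 2.0])
        common = rng.normal(0.0, 1.0, size=(1, 20))                   # correlated ensemble spread
        ens = truth[:, None] + common + rng.normal(0.0, 0.3, size=(3, 20))
        H = np.array([[1.0, 0.0, 0.0]])
        y = H @ truth + rng.normal(0.0, 0.1, 1)
        analysis = enkf_update(ens, H, y, obs_var=0.01, rng=rng)
        print("analysis mean:", analysis.mean(axis=1).round(2))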
  Uncertainty estimation of long-range ensemble forecasts of snowmelt flood characteristics

    NASA Astrophysics Data System (ADS)

    Kuchment, L.

    2012-04-01

    Long-range forecasts of snowmelt flood characteristics with lead times of 2-3 months have important significance for the regulation of flood runoff and the mitigation of flood damage on almost all large Russian rivers. At the same time, the application of current forecasting techniques based on regression relationships between the runoff volume and indexes of river basin conditions can lead to serious forecasting errors, resulting in large economic losses caused by wrong flood regulation. The forecast errors can be caused by complicated processes of soil freezing and soil moisture redistribution, too high a rate of snowmelt, large liquid precipitation before snowmelt, or by meteorological conditions during the lead-time period that differ greatly from climatological ones. Analysis of economic losses has shown that the largest damages could, to a significant extent, be avoided if the decision makers had an opportunity to take predictive uncertainty into account and could use more cautious strategies in runoff regulation. The development of a methodology for long-range ensemble forecasting of spring/summer floods based on distributed, physically-based runoff generation models has created, in principle, a new basis for improving hydrological predictions as well as for estimating their uncertainty. This approach is illustrated by forecasting of the spring-summer floods in the Vyatka River and Seim River basins. The application of the physically-based models of snowmelt runoff generation gives an essential improvement in the statistical estimates of the deterministic forecasts of flood volume in comparison with the forecasts obtained from the regression relationships. These models have also been used for probabilistic forecasts, assigning meteorological inputs during the lead-time period from the available historical daily series, and from series simulated by using a weather generator and a Monte Carlo procedure. The weather generator consists of stochastic models of daily temperature and precipitation. The performance of the probabilistic forecasts was estimated by ranked probability skill scores. The application of Monte Carlo simulations using the weather generator gave better results than using the historical meteorological series.
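    A weather generator of the kind mentioned above is, in its simplest form, a two-state Markov chain for precipitation occurrence with a gamma distribution for wet-day amounts and a stochastic model for temperature. The sketch below builds such a generator and draws a 100-member Monte Carlo ensemble of 90-day meteorological inputs; all parameter values are hypothetical, and it is not the generator used in the record.

        import numpy as np

        def generate_daily_weather(n_days, p_wet_given_dry, p_wet_given_wet,
                                   gamma_shape, gamma_scale, temp_mean, temp_std, rng):
            """Simple stochastic weather generator: two-state Markov chain for
            precipitation occurrence, gamma-distributed wet-day amounts,
            and normally distributed daily temperature."""
            precip = np.zeros(n_days)
            temp = rng.normal(temp_mean, temp_std, n_days)
            wet = False
            for d in range(n_days):
                p_wet = p_wet_given_wet if wet else p_wet_given_dry
                wet = rng.uniform() < p_wet
                if wet:
                    precip[d] = rng.gamma(gamma_shape, gamma_scale)
            return precip, temp

        # Monte Carlo ensemble of 90-day lead-time meteorology for a runoff model
        rng = np.random.default_rng(2)
        ensemble = [generate_daily_weather(90, 0.25, 0.6, 0.7, 8.0, -5.0, 6.0, rng)
                    for _ in range(100)]
        totals = np.array([p.sum() for p, _ in ensemble])
        print(f"90-day precipitation, 10-90% range: {np.quantile(totals, 0.1):.0f}-"
              f"{np.quantile(totals, 0.9):.0f} mm")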
  Moving beyond the cost-loss ratio: economic assessment of streamflow forecasts for a risk-averse decision maker

    NASA Astrophysics Data System (ADS)

    Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier Filion, Thomas-Charles

    2017-06-01

    A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. Numerous studies have shown that ensemble forecasts are of higher quality than deterministic ones. Many studies also conclude that decisions based on ensemble rather than deterministic forecasts lead to better decisions in the context of flood mitigation. Hence, it is believed that ensemble forecasts possess a greater economic and social value for both decision makers and the general population. However, the vast majority of, if not all, existing hydro-economic studies rely on a cost-loss ratio framework that assumes a risk-neutral decision maker. To overcome this important flaw, this study borrows from economics and evaluates the economic value of early warning flood systems using the well-known Constant Absolute Risk Aversion (CARA) utility function, which explicitly accounts for the level of risk aversion of the decision maker. This new framework allows for the full exploitation of the information related to a forecast's uncertainty, making it especially suited for the economic assessment of ensemble or probabilistic forecasts. Rather than comparing deterministic and ensemble forecasts, this study focuses on comparing different types of ensemble forecasts. There are multiple ways of assessing and representing forecast uncertainty and, consequently, many different means of building an ensemble forecasting system for future streamflow. One such possibility is to dress deterministic forecasts using the statistics of past forecast errors. Such dressing methods are popular among operational agencies because of their simplicity and intuitiveness. Another approach is the use of ensemble meteorological forecasts for precipitation and temperature, which are then provided as inputs to one or many hydrological models. In this study, three concurrent ensemble streamflow forecasting systems are compared: simple statistically dressed deterministic forecasts, forecasts based on meteorological ensembles, and a variant of the latter that also includes an estimation of state-variable uncertainty. This comparison takes place for the Montmorency River, a small flood-prone watershed in southern central Quebec, Canada. The assessment of forecasts is performed for lead times of 1 to 5 days, both in terms of forecast quality (relative to the corresponding record of observations) and in terms of economic value, using the newly proposed framework based on the CARA utility function. It is found that the economic value of a forecast for a risk-averse decision maker is closely linked to the forecast's reliability in predicting the upper tail of the streamflow distribution. Hence, post-processing forecasts to avoid over-forecasting could help improve both the quality and the value of forecasts.
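    A worked illustration of the CARA idea: with utility U(x) = (1 - exp(-a x)) / a, a risk-averse decision maker (a > 0) weighs an ensemble of possible flood losses differently from a risk-neutral one, and the preferred mitigation action can flip as a grows. The numbers and the simple act/do-nothing decision below are invented for illustration and are not taken from the study above.

        import numpy as np

        def cara_utility(x, a):
            """Constant Absolute Risk Aversion utility of a monetary outcome x
            (a > 0: risk averse; a -> 0 recovers the risk-neutral case)."""
            return (1.0 - np.exp(-a * x)) / a

        rng = np.random.default_rng(9)

        # Hypothetical ensemble of next-day damages (k$) if no action is taken:
        # usually no flood, occasionally a large loss.
        damage = np.where(rng.uniform(size=1000) < 0.05, rng.gamma(4.0, 50.0, 1000), 0.0)

        cost_of_action = 15.0   # k$, deterministic cost of pre-emptive mitigation
        for a in (1e-6, 0.02, 0.1):                      # increasing risk aversion
            eu_do_nothing = cara_utility(-damage, a).mean()
            eu_act = cara_utility(-cost_of_action, a)
            choice = "act" if eu_act > eu_do_nothing else "do nothing"
            print(f"a = {a:<5g} -> {choice}")

    With these invented numbers the expected damage is below the cost of action, so the nearly risk-neutral case prefers doing nothing, while sufficiently risk-averse settings prefer acting: the forecast's representation of the upper tail is what drives the decision.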
  Constructing probabilistic scenarios for wide-area solar power generation

    DOE PAGES

    Woodruff, David L.; Deride, Julio; Staid, Andrea; ...

    2017-12-22

    Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production. Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.

  Constructing probabilistic scenarios for wide-area solar power generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodruff, David L.; Deride, Julio; Staid, Andrea

    Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production.
    Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.

  A framework for probabilistic pluvial flood nowcasting for urban areas

    NASA Astrophysics Data System (ADS)

    Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick

    2016-04-01

    Pluvial flood nowcasting is gaining ground, not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1 km) and time (~5 min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and the Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) a sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1 km / 5 min) with 20 ensemble members and a lead time of up to 2 hours, using a 4 C-band radar composite as input. Forecast verification was performed over the cities of Leuven and Ghent, and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to simulate 2D urban surface inundations at high resolution. The surface components are categorized into three groups and each group is modelled using triangular meshes at different resolutions; these include streets (3.75-15 m2), high flood hazard areas (12.5-50 m2) and low flood hazard areas (75-300 m2).
    Functions describing urban flood damage and social consequences were empirically derived from questionnaires given to people in the region who were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation of the flood inundation results. The method has been implemented and tested for the villages of Oostakker and Sint-Amandsberg, which are part of the larger city of Ghent, Belgium. After each of the different above-mentioned components was evaluated, they were combined and tested for recent historical flood events. The rainfall nowcasting, hydraulic sewer and 2D inundation modelling, and socio-economic flood risk results could each be partly evaluated: the rainfall nowcasting results against radar data and rain gauges; the hydraulic sewer model results against water level and discharge data at pumping stations; the 2D inundation modelling results against limited data on some recent flood locations and inundation depths; and the results for the socio-economic flood consequences of the most extreme events against claims in the database of the national disaster agency. Different methods for visualization of the probabilistic inundation results are proposed and tested.

  Total probabilities of ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2017-04-01

    Ensemble forecasting has a long history in meteorological modelling, as an indication of the uncertainty of the forecasts. However, it is necessary to calibrate and post-process the ensembles, as they often exhibit both bias and dispersion errors. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005).
    There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters varying in space and time, while giving a spatially and temporally consistent output. However, their method is computationally complex for our larger number of stations, which makes it unsuitable for our purpose. Our post-processing method for the ensembles is developed in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu), where we are making forecasts for the whole of Europe based on observations from around 700 catchments. As the target is flood forecasting, we are also more interested in improving the forecast skill for high flows rather than in a good prediction of the entire flow regime. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different meteorological forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we now post-process all model outputs to estimate the total probability, the post-processed mean and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, but we add a spatial penalty in the calibration process to force a spatial correlation of the parameters. The penalty takes distance, stream connectivity and the size of the catchment areas into account. This can in some cases have a slight negative impact on the calibration error, but it avoids large differences between the parameters of nearby locations, whether stream-connected or not. The spatial calibration also makes it easier to interpolate the post-processing parameters to uncalibrated locations. We also look into different methods for handling the non-normal distributions of runoff data and the effect of different data transformations on forecast skill in general and for floods in particular. References: Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Engeland, K. and Steinsland, I.: Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times, Water Resour. Res., 50(1), 182-197, doi:10.1002/2012WR012757, 2014. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005.
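    Gaussian EMOS, one of the two post-processing families named in this record (Gneiting et al., 2005), fits a predictive normal distribution whose mean is an affine function of the ensemble mean and whose variance is an affine function of the ensemble variance, with parameters chosen by minimum CRPS estimation. The sketch below is that generic recipe on synthetic data; it is not the EFAS implementation, and a real runoff application would normally transform the data first, as the record notes.

        import numpy as np
        from scipy import optimize, stats

        def crps_normal(mu, sigma, y):
            """Closed-form CRPS of a normal predictive distribution N(mu, sigma^2) vs observation y."""
            z = (y - mu) / sigma
            return sigma * (z * (2 * stats.norm.cdf(z) - 1) + 2 * stats.norm.pdf(z) - 1 / np.sqrt(np.pi))

        def fit_emos(ens_mean, ens_var, obs):
            """Gaussian EMOS: predictive mean a + b*ens_mean, variance c + d*ens_var,
            with parameters chosen by minimum CRPS estimation."""
            def mean_crps(params):
                a, b, log_c, log_d = params
                mu = a + b * ens_mean
                sigma = np.sqrt(np.exp(log_c) + np.exp(log_d) * ens_var)
                return crps_normal(mu, sigma, obs).mean()
            res = optimize.minimize(mean_crps, x0=[0.0, 1.0, 0.0, 0.0],
                                    method="Nelder-Mead", options={"maxiter": 5000})
            a, b, log_c, log_d = res.x
            return a, b, np.exp(log_c), np.exp(log_d)

        # Synthetic training data: biased, under-dispersive 20-member runoff ensemble
        rng = np.random.default_rng(4)
        truth = 50.0 + 10.0 * rng.normal(size=300)
        ens = truth[:, None] + 5.0 + 2.0 * rng.normal(size=(300, 20))   # +5 bias, spread too small
        a, b, c, d = fit_emos(ens.mean(axis=1), ens.var(axis=1), truth)
        print(f"a={a:.2f} b={b:.2f} c={c:.2f} d={d:.2f}")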
  Total probabilities of ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2016-04-01

    Ensemble forecasting has for a long time been used as a method in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters which differ in space and time but can still give a spatially and temporally consistent output. However, their method is computationally complex for our larger number of stations, and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows rather than the forecast skill at lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we now post-process all model outputs to find a total probability, the post-processed mean and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while assuring that they have some spatial correlation by adding a spatial penalty in the calibration process. This can in some cases have a slight negative impact on the calibration error, but makes it easier to interpolate the post-processing parameters to uncalibrated locations. We also look into different methods for handling the non-normal distributions of runoff data and the effect of different data transformations on forecast skill in general and for floods in particular. References: Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Engeland, K. and Steinsland, I.: Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times, Water Resour. Res., 50(1), 182-197, doi:10.1002/2012WR012757, 2014. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon.
  363. The Origin of the "Seasons" in Space Weather

    NASA Astrophysics Data System (ADS)

    Dikpati, Mausumi; Cally, Paul S.; McIntosh, Scott W.; Heifetz, Eyal

    2017-11-01

    Powerful 'space weather' events caused by solar activity pose serious risks to human health, safety, economic activity and national security. Spikes in deaths due to heart attacks, strokes and other diseases occurred during prolonged power outages. Currently it is hard to prepare for and mitigate the impact of space weather because it is impossible to forecast the solar eruptions that can cause these terrestrial events until they are seen on the Sun. However, as recently reported in Nature, eruptive events like coronal mass ejections and solar flares are organized into quasi-periodic "seasons", which include enhanced bursts of eruptions for several months, followed by quiet periods. We explored the dynamics of sunspot-producing magnetic fields and discovered for the first time that bursty and quiet seasons, manifested in surface magnetic structures, can be caused by quasi-periodic energy exchange among magnetic fields, Rossby waves and the differential rotation of the solar interior shear layer (the tachocline). Our results for the first time provide a quantitative physical mechanism for forecasting the strength and duration of bursty seasons several months in advance, which can greatly enhance our ability to warn humans about dangerous solar bursts and prevent damage to satellites and power stations from space weather events.

  364. Sigmoid CME Source Regions at the Sun: Some Recent Results

    NASA Technical Reports Server (NTRS)

    Sterling, Alphonse C.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    Identifying Coronal Mass Ejection (CME) precursors in the solar corona would be an important step in space weather forecasting, as well as a vital key to understanding the physics of CMEs. Twisted magnetic field structures are suspected of being the source of at least some CMEs. These features can appear sigmoid (S or inverse-S) shaped in soft X-ray (SXR) images. We review recent observations of these structures and their relation to CMEs, using SXR data from the Soft X-ray Telescope (SXT) on the Yohkoh satellite and EUV data from the EUV Imaging Telescope (EIT) on the SOHO satellite. These observations indicate that the pre-eruption sigmoid patterns are more prominent in SXRs than in EUV, and that sigmoid precursors are present in over 50% of CMEs. These findings are important for CME research, and may potentially be a major component of space weather forecasting. So far, however, the studies have been subject to restrictions that will have to be relaxed before sigmoid morphology can be used as a reliable predictive tool. Moreover, some CMEs do not display an SXR sigmoid structure prior to eruption, and some others show no prominent SXR signature of any kind before or during eruption.
  365. Sigmoid CME Source Regions at The Sun: Some Recent Results

    NASA Technical Reports Server (NTRS)

    Sterling, Alphonse C.

    2000-01-01

    Identifying coronal mass ejection (CME) precursors in the solar corona would be an important step in space weather forecasting, as well as a vital key to understanding the physics of CMEs. Twisted magnetic field structures are suspected of being the source of at least some CMEs. These features can appear sigmoid (S or inverse-S) shaped in soft X-ray (SXR) images. We review recent observations of these structures and their relation to CMEs, using SXR data from the Soft X-ray Telescope (SXT) on the Yohkoh satellite and EUV data from the EUV Imaging Telescope (EIT) on the SOHO satellite. These observations indicate that the pre-eruption sigmoid patterns are more prominent in SXRs than in EUV, and that sigmoid precursors are present in over 50% of CMEs. These findings are important for CME research, and may potentially be a major component of space weather forecasting. So far, however, the studies have been subject to restrictions that will have to be relaxed before sigmoid morphology can be used as a reliable predictive tool. Moreover, some CMEs do not display an SXR sigmoid structure prior to eruption, and some others show no prominent SXR signature of any kind before or during eruption.

  366. The Origin of the "Seasons" in Space Weather

    PubMed

    Dikpati, Mausumi; Cally, Paul S; McIntosh, Scott W; Heifetz, Eyal

    2017-11-07

    Powerful 'space weather' events caused by solar activity pose serious risks to human health, safety, economic activity and national security. Spikes in deaths due to heart attacks, strokes and other diseases occurred during prolonged power outages. Currently it is hard to prepare for and mitigate the impact of space weather because it is impossible to forecast the solar eruptions that can cause these terrestrial events until they are seen on the Sun. However, as recently reported in Nature, eruptive events like coronal mass ejections and solar flares are organized into quasi-periodic "seasons", which include enhanced bursts of eruptions for several months, followed by quiet periods. We explored the dynamics of sunspot-producing magnetic fields and discovered for the first time that bursty and quiet seasons, manifested in surface magnetic structures, can be caused by quasi-periodic energy exchange among magnetic fields, Rossby waves and the differential rotation of the solar interior shear layer (the tachocline). Our results for the first time provide a quantitative physical mechanism for forecasting the strength and duration of bursty seasons several months in advance, which can greatly enhance our ability to warn humans about dangerous solar bursts and prevent damage to satellites and power stations from space weather events.
  367. Field Testing and Performance Evaluation of the Long-Range Acoustic Real-Time Sensor for Polar Areas (LARA)

    DTIC Science & Technology

    2015-09-30

    Valley Ridge segment in the northeast Pacific Ocean. Both areas have seafloor volcanic eruptions forecast for the near future, and the LARA moorings ... useful for real-time monitoring of deep-ocean seismic and volcanic activity (e.g., Dziak et al., 2012), especially in areas where SOSUS coverage no ... 2012): Seismic precursors and magma ascent before the April 2011 eruption at Axial Seamount. Nature Geoscience, 5, pp. 478-482. Klinck, H., and

  368. Using Science Data and Models for Space Weather Forecasting - Challenges and Opportunities

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Pulkkinen, Antti; Zheng, Yihua; Maddox, Marlo; Berrios, David; Taktakishvili, Sandro; Kuznetsova, Masha; Chulaki, Anna; Lee, Hyesook; Mullinix, Rick

    2012-01-01

    Space research and, consequently, space weather forecasting are immature disciplines. Scientific knowledge is accumulated frequently, which changes our understanding of how solar eruptions occur, and of how they impact targets near or on the Earth, or targets throughout the heliosphere. Along with continuous progress in understanding, space research and forecasting models are advancing rapidly in capability, often providing substantial increases in space weather value over time scales of less than a year. Furthermore, the majority of space environment information available today is, particularly in the solar and heliospheric domains, derived from research missions. An optimal forecasting environment needs to be flexible enough to benefit from this rapid development, and flexible enough to adapt to evolving data sources, many of which may also stem from non-US entities. This presentation will analyze the experiences obtained by developing and operating both a forecasting service for NASA and an experimental forecasting system for Geomagnetically Induced Currents.
  369. Satellite radar interferometry measures deformation at Okmok Volcano

    USGS Publications Warehouse

    Lu, Zhong; Mann, Dorte; Freymueller, Jeff

    1998-01-01

    The center of the Okmok caldera in Alaska subsided 140 cm as a result of its February-April 1997 eruption, according to satellite data from ERS-1 and ERS-2 synthetic aperture radar (SAR) interferometry. The inferred deflationary source was located 2.7 km beneath the approximate center of the caldera using a point source deflation model. Researchers believe this source is a magma chamber about 5 km from the eruptive source vent. During the 3 years before the eruption, the center of the caldera uplifted by about 23 cm, which researchers believe was a pre-eruptive inflation of the magma chamber. Scientists say such measurements demonstrate that radar interferometry is a promising spaceborne technique for monitoring remote volcanoes. Frequent, routine acquisition of images with SAR interferometry could make near real-time monitoring at such volcanoes the rule, aiding in eruption forecasting.
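    For context, a point-source ("Mogi") deflation model of the kind mentioned above relates surface displacement to a source depth and volume change. The sketch below evaluates the commonly used axisymmetric form for the vertical component, assuming a Poisson ratio of 0.25; the volume change is an arbitrary illustrative number, not the Okmok inversion result.

```python
import numpy as np

def mogi_uz(r, depth, dV, nu=0.25):
    """Vertical surface displacement (m) of a Mogi point source:
    uz = (1 - nu) * dV * d / (pi * (r^2 + d^2)^(3/2)),
    with r the radial distance from the source axis (m), d the source depth (m),
    and dV the source volume change (m^3, negative for deflation)."""
    return (1 - nu) * dV * depth / (np.pi * (r**2 + depth**2) ** 1.5)

depth = 2700.0               # ~2.7 km source depth, as inferred for Okmok
dV = -0.045e9                # assumed volume decrease (m^3), illustrative only
r = np.linspace(0.0, 10e3, 6)   # radial distances from 0 to 10 km
for ri, uz in zip(r, mogi_uz(r, depth, dV)):
    print(f"r = {ri/1e3:4.1f} km   uz = {uz*100:7.1f} cm")
```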
  370. Alaska Volcano Observatory

    USGS Publications Warehouse

    Venezky, Dina Y.; Murray, Tom; Read, Cyrus

    2008-01-01

    Steam plume from the 2006 eruption of Augustine volcano in Cook Inlet, Alaska. Explosive ash-producing eruptions from Alaska's 40+ historically active volcanoes pose hazards to aviation, including commercial aircraft flying the busy North Pacific routes between North America and Asia. The Alaska Volcano Observatory (AVO) monitors these volcanoes to provide forecasts of eruptive activity. AVO is a joint program of the U.S. Geological Survey (USGS), the Geophysical Institute of the University of Alaska Fairbanks (UAFGI), and the State of Alaska Division of Geological and Geophysical Surveys (ADGGS). AVO is one of five USGS Volcano Hazards Program observatories that monitor U.S. volcanoes for science and public safety. Learn more about Augustine volcano and AVO at http://www.avo.alaska.edu.

  371. Outstanding challenges in the seismological study of volcanic processes: Results from recent U.S. and European community-wide discussion workshops

    NASA Astrophysics Data System (ADS)

    Roman, D. C.; Rodgers, M.; Mather, T. A.; Power, J. A.; Pyle, D. M.

    2014-12-01

    Observations of volcanically induced seismicity are essential for eruption forecasting and for real-time and near-real-time warnings of hazardous volcanic activity. Studies of volcanic seismicity and of seismic wave propagation also provide critical understanding of subsurface magmatic systems and the physical processes associated with magma genesis, transport, and eruption. However, despite significant advances in recent years, our ability to successfully forecast volcanic eruptions and fully understand subsurface volcanic processes is limited by our current understanding of the source processes of volcano-seismic events, the effects on seismic wave propagation within volcanic structures, limited data, and even the non-standardized terminology used to describe seismic waveforms. Progress in volcano seismology is further hampered by inconsistent data formats and standards, a lack of state-of-the-art hardware and professional technical staff, as well as a lack of widely adopted analysis techniques and software. Addressing these challenges will not only advance scientific understanding of volcanoes, but will also lead to more accurate forecasts and warnings of hazardous volcanic eruptions that would ultimately save lives and property worldwide. Two recent workshops held in Anchorage, Alaska, and Oxford, UK, represent important steps towards developing a relationship among members of the academic community and government agencies, focused around a shared, long-term vision for volcano seismology. Recommendations arising from the two workshops fall into six categories: 1) ongoing and enhanced community-wide discussions, 2) data and code curation and dissemination, 3) code development, 4) development of resources for more comprehensive data mining, 5) enhanced strategic seismic data collection, and 6) enhanced integration of multiple datasets (including seismicity) to understand all states of volcano activity through space and time. As presented sequentially above, these steps can be regarded as a road map for galvanizing and strengthening the volcano seismological community to drive new scientific and technical progress over the next 5-10 years.
  372. Volcano Deformation and Eruption Forecasting using Data Assimilation: Case of Grimsvötn volcano in Iceland

    NASA Astrophysics Data System (ADS)

    Bato, Mary Grace; Pinel, Virginie; Yan, Yajing

    2016-04-01

    The recent advances in Interferometric Synthetic Aperture Radar (InSAR) imaging and the increasing number of continuous Global Positioning System (GPS) networks recorded on volcanoes provide continuous and spatially extensive records of the evolution of surface displacements during inter-eruptive periods. For basaltic volcanoes, these measurements combined with simple dynamical models (Lengliné et al., 2008 [1]; Pinel et al., 2010 [2]; Reverso et al., 2014 [3]) can be exploited to characterise and constrain parameters of one or several magmatic reservoirs using inversion methods. On the other hand, data assimilation - a time-stepping process that best combines models and observations, sometimes with a priori information based on error statistics, to predict the state of a dynamical system - has gained popularity in various fields of geoscience (e.g. ocean-weather forecasting, geomagnetism and natural resources exploration). In this work, we aim to first test the applicability and benefit of data assimilation, in particular the Ensemble Kalman Filter [4], in the field of volcanology. We predict the temporal behaviours of the overpressures and deformations by applying the two-magma-chamber model of Reverso et al., 2014 [3] and by using synthetic deformation data in order to establish our forecasting strategy. GPS time-series data of the recent eruptions at Grimsvötn volcano are used for the real-case applicability of the method. [1] Lengliné, O., D. Marsan, J. Got, V. Pinel, V. Ferrazzini, P. Okubo, Seismicity and deformation induced by magma accumulation at three basaltic volcanoes, J. Geophys. Res., 113, B12305, 2008. [2] V. Pinel, C. Jaupart and F. Albino, On the relationship between cycles of eruptive activity and volcanic edifice growth, J. Volc. Geotherm. Res., 194, 150-164, 2010. [3] T. Reverso, J. Vandemeulebrouck, F. Jouanne, V. Pinel, T. Villemin, E. Sturkell, A two-magma chamber as a source of deformation at Grimsvötn volcano, Iceland, JGR, 2014. [4] Evensen, G., The Ensemble Kalman Filter: theoretical formulation and practical implementation, Ocean Dyn., 53, 343-367, 2003.
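    The analysis step of an Ensemble Kalman Filter, the tool named above, can be written in a few lines. The sketch below applies a stochastic (perturbed-observation) EnKF update to a generic state ensemble with synthetic observations; the state dimension, observation operator and error levels are placeholders and do not reproduce the two-chamber Grimsvötn setup.

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """Stochastic EnKF analysis step.
    X : (n_state, n_ens) forecast ensemble
    y : (n_obs,) observation vector
    H : (n_obs, n_state) linear observation operator
    R : (n_obs, n_obs) observation error covariance
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                      # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)                     # perturbed-observation update

rng = np.random.default_rng(1)
n_state, n_ens = 4, 50                    # e.g. two overpressures plus two geometric terms (illustrative)
X = rng.normal(0.0, 1.0, (n_state, n_ens))
H = np.array([[1.0, 0.0, 0.0, 0.0],       # observe only the first two state components
              [0.0, 1.0, 0.0, 0.0]])
R = 0.1 * np.eye(2)
y = np.array([0.5, -0.2])                 # synthetic "deformation" observations
Xa = enkf_update(X, y, H, R, rng)
print(Xa.mean(axis=1))
```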
  373. Reference dataset of volcanic ash physicochemical and optical properties for atmospheric measurement retrievals and transport modelling

    NASA Astrophysics Data System (ADS)

    Vogel, Andreas; Durant, Adam; Sytchkova, Anna; Diplas, Spyros; Bonadonna, Costanza; Scarnato, Barbara; Krüger, Kirstin; Kylling, Arve; Kristiansen, Nina; Stohl, Andreas

    2016-04-01

    Explosive volcanic eruptions emit up to 50 wt.% (total erupted mass) of fine ash particles (<63 microns), which individually can have theoretical atmospheric lifetimes that span hours to days. Depending on the injection height, fine ash may be subsequently transported and dispersed by the atmosphere over 100s-1000s of km and can pose a major threat for aviation operations. Recent volcanic eruptions, such as the 2010 Icelandic Eyjafjallajökull event, illustrated how volcanic ash can severely impact commercial air traffic. In order to manage the threat, it is important to have accurate forecast information on the spatial extent and absolute quantity of airborne volcanic ash. Such forecasts are constrained by empirically-derived estimates of the volcanic source term and the nature of the constituent volcanic ash properties. Consequently, it is important to include a quantitative assessment of measurement uncertainties of ash properties to provide realistic ash forecast uncertainty. Currently, information on volcanic ash physicochemical and optical properties is derived from a small number of somewhat dated publications. In this study, we provide a reference dataset for physical (size distribution and shape), chemical (bulk vs. surface chemistry) and optical properties (complex refractive index in the UV-vis-NIR range) of a representative selection of volcanic ash samples from 10 different volcanic eruptions covering the full variability in silica content (40-75 wt.% SiO2). Through the combination of empirical analytical methods (e.g., image analysis, Energy Dispersive Spectroscopy, X-ray Photoelectron Spectroscopy, Transmission Electron Microscopy and UV/Vis/NIR/FTIR Spectroscopy) and theoretical models (e.g., Bruggeman effective medium approach), it was possible to fully capture the natural variability of ash physicochemical and optical characteristics. The dataset will be applied in atmospheric measurement retrievals and atmospheric transport modelling to determine the sensitivity to uncertainty in ash particle characteristics.
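    As an illustration of the two-component Bruggeman effective-medium mixing referred to above, the sketch below solves the mixing rule as a quadratic for an assumed volume fraction of a glass-like ash matrix and air-filled vesicles; the optical constants are round illustrative numbers, not values from the dataset.

```python
import numpy as np

def bruggeman_two_phase(eps1, eps2, f1):
    """Effective permittivity of a two-component Bruggeman mixture:
    f1*(eps1-eps)/(eps1+2*eps) + (1-f1)*(eps2-eps)/(eps2+2*eps) = 0,
    rearranged to 2*eps^2 - b*eps - eps1*eps2 = 0 and solved as a quadratic;
    the root with non-negative imaginary part is kept as the physical branch."""
    f2 = 1.0 - f1
    b = f1 * (2 * eps1 - eps2) + f2 * (2 * eps2 - eps1)
    roots = np.roots([2.0, -b, -eps1 * eps2])
    return roots[np.argmax(roots.imag)]

# illustrative inputs: a weakly absorbing glass matrix (n ~ 1.55) and air vesicles
n_ash, n_air = 1.55 + 0.001j, 1.0 + 0.0j
eps_eff = bruggeman_two_phase(n_ash**2, n_air**2, f1=0.7)   # 70% glass by volume (assumed)
n_eff = np.sqrt(eps_eff)
print("effective refractive index ~", np.round(n_eff, 4))
```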
  374. Low-frequency seismic events in a wider volcanological context

    NASA Astrophysics Data System (ADS)

    Neuberg, J. W.; Collombet, M.

    2006-12-01

    Low-frequency seismic events have been at the centre of attention for several years, particularly on volcanoes with highly viscous magmas. The ultimate aim is to detect changes in volcanic activity by identifying changes in the seismic behaviour in order to forecast an eruption or, in the case of an ongoing eruption, to forecast the short- and long-term behaviour of the volcanic system. A major boost in recent years arose through several attempts at multi-parameter volcanic monitoring and modelling programs, which allowed multi-disciplinary groups of volcanologists to interpret seismic signals together with, e.g., ground deformation, stress field analysis and petrological information. This talk will give several examples of such multi-disciplinary projects, focussing on the joint modelling of seismic source processes for low-frequency events together with advanced magma flow models, and on the signs of magma movement in the deformation and stress field at the surface.

  375. Lava lake level as a gauge of magma reservoir pressure and eruptive hazard

    USGS Publications Warehouse

    Patrick, Matthew R.; Anderson, Kyle R.; Poland, Michael P.; Orr, Tim R.; Swanson, Donald A.

    2015-01-01

    Forecasting volcanic activity relies fundamentally on tracking magma pressure through the use of proxies, such as ground surface deformation and earthquake rates. Lava lakes at open-vent basaltic volcanoes provide a window into the uppermost magma system for gauging reservoir pressure changes more directly. At Kīlauea Volcano (Hawaiʻi, USA) the surface height of the summit lava lake in Halemaʻumaʻu Crater fluctuates with surface deformation over short (hours to days) and long (weeks to months) time scales. This correlation implies that the lake behaves as a simple piezometer of the subsurface magma reservoir. Changes in lava level and summit deformation scale with (and shortly precede) changes in eruption rate from Kīlauea's East Rift Zone, indicating that summit lava level can be used for short-term forecasting of rift zone activity and associated hazards at Kīlauea.
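    The piezometer analogy above amounts to reading a reservoir pressure change from a lake-level change through a hydrostatic relation, Δp = ρ g Δh. The numbers below are purely illustrative (an assumed bulk density for gas-rich lake lava), not Kīlauea-specific results.

```python
# Hydrostatic reading of a lava lake "piezometer": dp = rho * g * dh
rho = 1500.0   # assumed bulk density of gas-rich lake lava, kg/m^3 (illustrative)
g = 9.81       # gravitational acceleration, m/s^2
for dh in (1.0, 5.0, 20.0):                 # lake-level changes in metres
    dp = rho * g * dh                       # implied pressure change, Pa
    print(f"dh = {dh:5.1f} m  ->  dp ~ {dp/1e3:7.1f} kPa ({dp/1e6:.3f} MPa)")
```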
  376. Developing a NASA strategy for sampling a major Pinatubo-like volcanic eruption

    NASA Astrophysics Data System (ADS)

    Newman, P. A.; Jucks, K. W.; Maring, H. B.

    2016-12-01

    Based on history, it is reasonable to expect a major volcanic eruption in the foreseeable future. By "major volcanic eruption", we mean an eruption that injects a substantial amount of material, gases and particles, into the stratosphere as a result of one eruption event. Such a volcanic eruption can impact weather, climate, and atmospheric chemistry on regional, hemispheric and global scales over significant time periods. Further, such an eruption can be an unintended analog for a number of geo-engineering schemes for mitigating greenhouse warming of the Earth. In order to understand and project the consequences of a major eruption, it is necessary to make a number of observations from a variety of perspectives. Such an eruption will occur, in the immediate sense, unexpectedly. Therefore, it is wise to have a thoughtfully developed plan for executing a rapid response that makes useful observations. A workshop was held on 17-18 May 2016 at NASA GSFC to develop a NASA observation strategy that could be quickly implemented in response to a major volcanic eruption, and would characterize the changes to atmospheric (especially stratospheric) composition following a large volcanic eruption. In this presentation we will provide an overview of the elements of this strategy with respect to satellite, balloon, ground, and aircraft observations. In addition, model simulations and forecasts will play a key role in any response strategy. Results will also be shown from a spectrum of simulations of volcanic eruptions that support this NASA strategy.

  377. Advances in Monitoring, Modelling and Forecasting Volcanic Ash Plumes over the Past 5 Years and the Impact on Preparedness from the London VAAC Perspective

    NASA Astrophysics Data System (ADS)

    Lee, D. S.; Lisk, I.

    2015-12-01

    Hosted and run by the Met Office, the London VAAC (Volcanic Ash Advisory Centre) is responsible for issuing advisories on the location and likely dispersion of ash clouds originating from volcanoes in the North East Atlantic, primarily from Iceland. These advisories and additional guidance products are used by the civil aviation community to make decisions on airspace flight management. London VAAC has specialist forecasters who use a combination of volcano source data, satellite-based, ground-based and aircraft observations, weather forecast models and dispersion models. Since the eruption of the Icelandic volcano Eyjafjallajökull in 2010, which resulted in the decision by many northern European countries to impose significant restrictions on the use of their airspace, London VAAC has been active in further developing its volcanic ash monitoring, modelling and forecasting capabilities, collaborating with research organisations, industry, other VAACs, Meteorological Services and the Volcano Observatory in Iceland. It has been necessary to advance operational capabilities to address evolving requirements, including for more quantitative assessments of volcanic ash in the atmosphere. Here we summarise advances in monitoring, modelling and forecasting of volcanic ash plumes over the past 5 years from the London VAAC perspective, and the realization of science into operations. We also highlight the importance of collaborative activities, such as the 'VAAC Best Practice' Workshop, where information is exchanged between all nine VAACs worldwide on the operational practices in monitoring and forecasting volcanic ash, with the aim of working toward a more harmonized service for decision makers in the aviation community. We conclude with an evaluation of how much better we are prepared for the next significant ash-rich Icelandic eruption, and the challenges still remaining.
  378. Impact of Probabilistic Weather on Flight Routing Decisions

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil; Sridhar, Banavar; Mulfinger, Daniel

    2006-01-01

    Flight delays in the United States have been found to increase year after year, along with the increase in air traffic. During the four-month period from May through August of 2005, weather-related delays accounted for roughly 70% of all reported delays. Weather prediction in the tactical (within 2 hours) timeframe is currently at manageable levels; however, the state of forecasting weather for the strategic (2-6 hours) timeframe is still not dependable for long-term planning. In the absence of reliable severe weather forecasts, decision-making for flights longer than two hours is challenging. This paper deals with an approach of using probabilistic weather prediction for Traffic Flow Management, and a general method of using this prediction for estimating expected values of flight length and delays in the National Airspace System (NAS). The current state-of-the-art convective weather forecasting is employed to aid decision makers in arriving at decisions for traffic flow and flight planning. The six-agency effort working on the Next Generation Air Transportation System (NGATS) has considered weather-assimilated decision-making as one of the principal foci out of a list of eight. The Weather Integrated Product Team has considered integrated weather information and improved aviation weather forecasts as two of the main efforts (Ref. 1, 2). Recently, research has focused on the concept of operations for strategic traffic flow management (Ref. 3) and how weather data can be integrated for improved decision-making for efficient traffic management initiatives (Ref. 4, 5). An overview of the weather data needs and benefits of various participants in the air traffic system, along with available products, can be found in Ref. 6. Previous work related to the use of weather data in identifying and categorizing pilot intrusions into severe weather regions (Ref. 7, 8) has demonstrated a need for better forecasting in the strategic planning timeframes and a move towards a probabilistic description of weather (Ref. 9). This paper focuses on a specified probability in a local region for flight intrusion/deviation decision-making. The process uses a probabilistic weather description and implements it in an air traffic assessment system to study trajectories of aircraft crossing a cut-off probability contour. This value would be useful for meteorologists in creating optimum distribution profiles for severe weather. Once available, the expected values of flight path and aggregate delays are calculated for efficient operations. The current research, however, does not deal with the issues of multiple cell encounters and echo tops, which will be topics of future work.
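    The expected-value reasoning in the abstract above can be made concrete with a toy route decision: given a probability that a convective region along the direct route turns out to be impassable, compare the expected flight time of filing the direct plan (deviating tactically only if blocked) against filing a longer deterministic reroute. All times and probabilities below are made-up illustrative inputs, not NAS data.

```python
def expected_time_direct(p_blocked, t_direct, t_deviation):
    """Expected flight time if the direct route is filed and the aircraft
    deviates tactically only when the weather region turns out to be blocked."""
    return (1 - p_blocked) * t_direct + p_blocked * t_deviation

t_direct, t_deviation, t_reroute = 180.0, 215.0, 200.0   # minutes (illustrative)
for p in (0.2, 0.4, 0.6, 0.8):
    e_direct = expected_time_direct(p, t_direct, t_deviation)
    best = "file direct" if e_direct < t_reroute else "file reroute"
    print(f"P(blocked)={p:.1f}: E[direct]={e_direct:5.1f} min "
          f"vs reroute={t_reroute:.0f} min -> {best}")
```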
  379. The 2011-2012 eruption of Cordón Caulle volcano (Southern Andes): Evolution, crisis management and current hazards

    NASA Astrophysics Data System (ADS)

    Silva Parejas, C.; Lara, L. E.; Bertin, D.; Amigo, A.; Orozco, G.

    2012-04-01

    A new kind of integrated approach was for the first time achieved during the eruptive crisis of Cordón Caulle volcano (Southern Andes, 40.59°S, 72.12°W) in Chile. The monitoring network of SERNAGEOMIN around the volcano detected the increasing precursory seismicity, warning of the imminence of an eruption about 5 hours before its onset on June 4, 2011. In addition, SERNAGEOMIN generated daily forecasts of tephra dispersal and fall (ASHFALL advection-diffusion model), and prepared simulations of areas affected by the possible occurrence of lahars and pyroclastic flows. Models were improved with observed effects in the field and satellite imagery, resulting in a good correlation. The information was supplied to the authorities in a timely manner, together with recommendations to better delineate the vulnerable areas. The eruption initially occurred from a couple of overlapping cones located along the eastern fault scarp of the Pleistocene-Holocene extensional graben of Cordón Caulle. Eruptive products have virtually the same bulk composition as those of the historical 1921 and 1960 eruptions, corresponding to phenocryst-poor rhyodacites (67-70% SiO2). During the first eruptive stage, a strong, ca. 15-km-high Plinian column lasting 27 hours emitted 0.2-0.4 km3 of magma (DRE). Thick tephra deposits have accumulated in Chile and Argentina, whereas fine particle and aerosol dispersion disrupted air navigation across the Southern Hemisphere. The second, ongoing eruptive stage, which started in mid-June, has been characterized by lava emission already covering a total area comparable to the 1960 lava flows, with a total estimated volume <0.25 km3 (at the end of December 2011). Weak but persistent plumes have caused preventive flight suspensions in Chile and Argentina until the end of the year. The main current hazards at Cordón Caulle volcano are fine tephra fallout, secondary lahars, minor explosions and lava flow front collapse. Even if this case can be considered successful from the point of view of eruption forecasting and hazard assessment, a new protocol of volcanic alerts has recently been signed between SERNAGEOMIN and the National Emergency Agency (ONEMI) in order to improve the communication, information transfer and roles of these institutions during volcanic crises.
  380. Long-term volcanic hazard assessment on El Hierro (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Bartolini, S.; Sobradelo, R.; Martí, J.; Morales, J. M.; Galindo, I.

    2014-07-01

    Long-term hazard assessment, one of the bastions of risk-mitigation programs, is required for land-use planning and for developing emergency plans. To ensure quality and representative results, long-term volcanic hazard assessment requires several sequential steps to be completed, which include the compilation of geological and volcanological information, the characterisation of past eruptions, spatial and temporal probabilistic studies, and the simulation of different eruptive scenarios. Although the Canary Islands are a densely populated active volcanic region that receives millions of visitors per year, no systematic hazard assessment has ever been conducted there. In this paper we focus our attention on El Hierro, the youngest of the Canary Islands and the most recently affected by an eruption. We analyse the past eruptive activity to determine the spatial and temporal probability, and likely style, of a future eruption on the island, i.e. the where, when and how. By studying the past eruptive behaviour of the island and assuming that future eruptive patterns will be similar, we aim to identify the most likely volcanic scenarios and corresponding hazards, which include lava flows, pyroclastic fallout and pyroclastic density currents (PDCs). Finally, we estimate their probability of occurrence. The end result, through the combination of the most probable scenarios (lava flows, pyroclastic density currents and ashfall), is the first qualitative integrated volcanic hazard map of the island.
  381. Modeling and forecasting tephra hazards at Redoubt Volcano, Alaska, during 2009 unrest and eruption

    NASA Astrophysics Data System (ADS)

    Mastin, L. G.; Denlinger, R. P.; Wallace, K. L.; Schaefer, J. R.

    2009-12-01

    In late 2008, Redoubt Volcano, on the west coast of Alaska's Cook Inlet, began a period of unrest that culminated in more than 19 small tephra-producing events between March 19 and April 4, 2009, followed by growth of a lava dome whose volume now exceeds 70 million cubic meters. The explosive events lasted from <1 to 31 minutes, sent tephra columns to heights of 19 km asl, and emitted dense-rock (DRE) tephra volumes up to several million cubic meters. Tephra fall affected transportation and infrastructure throughout Cook Inlet, including the Anchorage metropolitan area. The months of unrest that preceded the first explosive event allowed us to develop tools to forecast tephra hazards. As described in an accompanying abstract, colleagues at the University of Pisa produced automated, daily tephra-fall forecast maps using the 3-D VOL-CALPUFF model with input scenarios that represented likely event sizes and durations. Tephra-fall forecast maps were also generated every six hours for hypothetical events of 10M m3 volume DRE using the 2-D model ASHFALL, and relationships between hypothetical plume height and eruption rate were evaluated four times daily under then-current atmospheric conditions using the program PLUMERIA. Eruptive deposits were mapped and isomass contours constructed for the two largest events, March 24 (0340-0355Z) and April 4 (1358-1429Z), which produced radar-determined plume heights of 18.3 and 15.2 km asl (~15.6 and 12.5 km above the vent), and tephra volumes (DRE) of 6.3M and 3.1M m3, respectively.
    For the volumetric eruption rates calculated from mapped erupted volume and seismic duration (V = 6.2×10^3 and 1.7×10^3 m3/s DRE), measured plume heights H above the vent fall within 10% of the empirical best-fit curve H = 1.67 V^0.259 published in the book Volcanic Plumes by Sparks et al. (1997, eq. 5.1). The plume heights are slightly higher than (but still within 13% of) the 14.6 and 11.1 km predicted by PLUMERIA under the existing atmospheric conditions. We have also modeled these two events using the 3-D transient model FALL3D, which considers topographic effects on wind and tephra dispersal. Using the eruption rates and plume heights constrained by deposit mapping, seismic data, and Doppler radar, and an archived wind field obtained from the NOAA GDAS model for these dates, modeled isomass contours from the April 4 event closely resemble measured values, but modeled contours from the March 24 event extend only about half to three fourths as far from the volcano as measured. This discrepancy may result from inaccuracies in the modeled wind pattern, the grain-size distribution, or turbulent entrainment algorithms. The deposit pattern may also have been affected by a lateral blast which is thought to have accompanied this event.
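    The empirical relation quoted above can be checked directly from the numbers in the abstract; the short script below evaluates H = 1.67 V^0.259 (H in km above the vent, V in m3/s DRE) for the two eruption rates and compares against the reported plume heights.

```python
# Empirical plume-height relation H = 1.67 * V**0.259 (km above vent, V in m^3/s DRE),
# as cited from Sparks et al. (1997, eq. 5.1) in the abstract above.
events = {"2009-03-24": (6.2e3, 15.6),   # (eruption rate m^3/s DRE, observed height km above vent)
          "2009-04-04": (1.7e3, 12.5)}
for name, (V, H_obs) in events.items():
    H_fit = 1.67 * V ** 0.259
    print(f"{name}: H_fit = {H_fit:4.1f} km, observed = {H_obs:4.1f} km, "
          f"misfit = {100 * abs(H_fit - H_obs) / H_obs:.0f}%")
```

    Both misfits come out well under 10%, consistent with the statement in the abstract.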
  382. When mechanism matters: Bayesian forecasting using models of ecological diffusion

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.

    2017-01-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework for making statistical inference and probabilistic forecasts using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.

  383. Quasi-most unstable modes: a window to 'À la carte' ensemble diversity?

    NASA Astrophysics Data System (ADS)

    Homar Santaner, Victor; Stensrud, David J.

    2010-05-01

    The atmospheric scientific community is nowadays facing the ambitious challenge of providing useful forecasts of atmospheric events that produce high societal impact. The low level of social resilience to false alarms creates tremendous pressure on forecasting offices to issue accurate, timely and reliable warnings. Currently, no operational numerical forecasting system is able to respond to the societal demand for high-resolution (in time and space) predictions in the 12-72 h time span. The main reasons for such deficiencies are the lack of adequate observations and the high non-linearity of the numerical models that are currently used. The whole weather forecasting problem is intrinsically probabilistic, and current methods aim at coping with the various sources of uncertainty and the error propagation throughout the forecasting system. This probabilistic perspective is often created by generating ensembles of deterministic predictions that are aimed at sampling the most important sources of uncertainty in the forecasting system. The ensemble generation/sampling strategy is a crucial aspect of their performance, and various methods have been proposed. Although global forecasting offices have been using ensembles of perturbed initial conditions for medium-range operational forecasts since 1994, no consensus exists regarding the optimum sampling strategy for high-resolution short-range ensemble forecasts. Bred vectors, however, have been hypothesized to better capture the growing modes in the highly nonlinear mesoscale dynamics of severe episodes than singular vectors or observation perturbations. Yet even this technique is not able to produce enough diversity in the ensembles to accurately and routinely predict extreme phenomena such as severe weather. Thus, we propose a new method to generate ensembles of initial-condition perturbations that is based on the breeding technique. Given a standard bred mode, a set of customized perturbations is derived with specified amplitudes and horizontal scales. This allows the ensemble to excite growing modes across a wider range of scales. Results show that this approach produces significantly more spread in the ensemble prediction than standard bred modes alone. Several examples that illustrate the benefits of this approach for severe weather forecasts will be provided.
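    A standard bred vector of the kind referenced above is grown by repeatedly integrating a control and a perturbed run and rescaling their difference to a fixed amplitude. The toy below breeds a perturbation on the Lorenz-63 system as a stand-in for a forecast model; the rescaling amplitude, cycle length and initial state are arbitrary choices, and the "customized perturbation" step of the proposed method is not reproduced here.

```python
import numpy as np

def lorenz63_step(x, dt=0.01, s=10.0, r=28.0, b=8.0/3.0):
    """One forward-Euler step of the Lorenz-63 system (toy stand-in for an NWP model)."""
    dx = np.array([s * (x[1] - x[0]),
                   x[0] * (r - x[2]) - x[1],
                   x[0] * x[1] - b * x[2]])
    return x + dt * dx

def breed(x0, amplitude=0.5, cycles=20, steps_per_cycle=50, rng=None):
    """Classic breeding: run control and perturbed states, rescale the difference."""
    rng = rng or np.random.default_rng(0)
    ctrl = x0.copy()
    pert = x0 + amplitude * rng.standard_normal(3)
    for _ in range(cycles):
        for _ in range(steps_per_cycle):
            ctrl = lorenz63_step(ctrl)
            pert = lorenz63_step(pert)
        diff = pert - ctrl
        diff *= amplitude / np.linalg.norm(diff)   # rescale to the fixed breeding amplitude
        pert = ctrl + diff
    return diff                                     # the bred vector

bv = breed(np.array([1.0, 1.0, 20.0]))
print("bred vector:", np.round(bv, 3))
```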
  384. The total probabilities from high-resolution ensemble forecasting of floods

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2015-04-01

    Ensemble forecasting has for a long time been used in meteorological modelling to give an indication of the uncertainty of the forecasts. As meteorological ensemble forecasts often show some bias and dispersion errors, there is a need for calibration and post-processing of the ensembles. Typical methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). To make optimal predictions of floods along the stream network in hydrology, we can easily use the ensemble members as input to the hydrological models. However, some of the post-processing methods will need modifications when regionalizing the forecasts outside the calibration locations, as done by Hemri et al. (2013). We present a method for spatial regionalization of the post-processed forecasts based on EMOS and top-kriging (Skøien et al., 2006). We will also look into different methods for handling the non-normality of runoff and the effect on forecast skills in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005. Skøien, J. O., Merz, R. and Blöschl, G.: Top-kriging - Geostatistics on stream networks, Hydrol. Earth Syst. Sci., 10(2), 277-287, 2006.
  385. An operational hydrological ensemble prediction system for the city of Zurich (Switzerland): skill, case studies and scenarios

    NASA Astrophysics Data System (ADS)

    Addor, N.; Jaun, S.; Zappa, M.

    2011-01-01

    The Sihl River flows through Zurich, Switzerland's most populated city, for which it represents the largest flood threat. To anticipate extreme discharge events and provide decision support in case of flood risk, a hydrometeorological ensemble prediction system (HEPS) was launched operationally in 2008. This model chain relies on limited-area atmospheric forecasts provided by the deterministic model COSMO-7 and the probabilistic model COSMO-LEPS. These atmospheric forecasts are used to force a semi-distributed hydrological model (PREVAH), coupled to a hydraulic model (FLORIS). The resulting hydrological forecasts are eventually communicated to the stakeholders involved in the management of Sihl discharge. This fully operational setting provides a real framework in which to compare the potential of deterministic and probabilistic discharge forecasts for flood mitigation. To study the suitability of HEPS for small-scale basins and to quantify the added value conveyed by the probability information, a reforecast was made for the period June 2007 to December 2009 for the Sihl catchment (336 km2). Several metrics support the conclusion that the performance gain can be of up to 2 days of lead time for the catchment considered. Brier skill scores show that COSMO-LEPS-based hydrological forecasts overall outperform their COSMO-7-based counterparts for all the lead times and event intensities considered. The small size of the Sihl catchment does not prevent skillful discharge forecasts, but makes them particularly dependent on correct precipitation forecasts, as shown by comparisons with a reference run driven by observed meteorological parameters. Our evaluation stresses that the capacity of the model to provide confident and reliable mid-term probability forecasts for high discharges is limited. The two most intense events of the study period are investigated utilising a novel graphical representation of probability forecasts and used to generate high-discharge scenarios. They highlight challenges for making decisions on the basis of hydrological predictions, and indicate the need for a tool to be used in addition to forecasts to compare the different mitigation actions possible in the Sihl catchment.
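    The Brier skill scores used in the evaluation above compare a probabilistic forecast against a reference forecast, typically climatology. A minimal computation, with made-up exceedance probabilities and binary outcomes, looks like this:

```python
import numpy as np

def brier_score(p, o):
    """Brier score for event probabilities p and binary outcomes o (0/1)."""
    return np.mean((np.asarray(p) - np.asarray(o)) ** 2)

# illustrative forecasts of "discharge exceeds the flood threshold"
obs       = np.array([0, 0, 1, 1, 0, 1, 0, 0])
p_eps     = np.array([0.1, 0.2, 0.7, 0.9, 0.3, 0.6, 0.1, 0.2])  # ensemble-based probabilities
p_climate = np.full(8, obs.mean())                               # climatological reference

bs, bs_ref = brier_score(p_eps, obs), brier_score(p_climate, obs)
bss = 1.0 - bs / bs_ref        # BSS > 0 means skill relative to the reference
print(f"BS = {bs:.3f}, BS_ref = {bs_ref:.3f}, BSS = {bss:.3f}")
```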
  386. Applying Fractal Dimensions and Energy-Budget Analysis to Characterize Fracturing Processes During Magma Migration and Eruption: 2011-2012 El Hierro (Canary Islands) Submarine Eruption

    NASA Astrophysics Data System (ADS)

    López, Carmen; Martí, Joan; Abella, Rafael; Tarraga, Marta

    2014-07-01

    The impossibility of observing magma migration inside the crust obliges us to rely on geophysical data and mathematical modelling to interpret precursors and to forecast volcanic eruptions. Of the geophysical signals that may be recorded before and during an eruption, deformation and seismicity are two of the most relevant, as they are directly related to its dynamics. The final phase of the unrest episode that preceded the 2011-2012 eruption on El Hierro (Canary Islands) was characterized by local and accelerated deformation and seismic energy release, indicating increasing fracturing and migration of the magma. Application of time-varying fractal analysis to the seismic data and the characterization of the seismicity pattern and the strain and stress rates allow us to identify different stages in the source mechanism and to infer the geometry of the path used by the magma and associated fluids to reach the Earth's surface. The results obtained illustrate the relevance of such studies to understanding volcanic unrest and the causes that govern the initiation of volcanic eruptions.
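    One common way of attaching a fractal dimension to a cloud of hypocentres, which could then be tracked through time, is the Grassberger-Procaccia correlation dimension, estimated from the slope of the correlation integral. The sketch below applies it to synthetic hypocentre coordinates; the distance range and the synthetic cloud are arbitrary illustrative choices, not the El Hierro analysis.

```python
import numpy as np

def correlation_dimension(points, r_values):
    """Correlation integral C(r) and a least-squares estimate of the
    correlation dimension D2 as the slope of log C(r) versus log r."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)
    dists = d[iu]                                       # all pairwise distances
    C = np.array([(dists < r).mean() for r in r_values])
    mask = C > 0                                        # keep radii with non-empty counts
    slope, _ = np.polyfit(np.log(r_values[mask]), np.log(C[mask]), 1)
    return slope, C

rng = np.random.default_rng(2)
hypocentres = rng.normal(0.0, 1.0, size=(500, 3))   # synthetic epicentre/depth cloud (km)
r_values = np.logspace(-1, 0.5, 12)
D2, _ = correlation_dimension(hypocentres, r_values)
print(f"estimated correlation dimension D2 ~ {D2:.2f}")
```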
  387. Probabilistic flood warning using grand ensemble weather forecasts

    NASA Astrophysics Data System (ADS)

    He, Y.; Wetterhall, F.; Cloke, H.; Pappenberger, F.; Wilson, M.; Freer, J.; McGregor, G.

    2009-04-01

    As the severity of floods increases, possibly due to climate and land-use change, there is an urgent need for more effective and reliable warning systems. The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. An ensemble of weather forecasts from one Ensemble Prediction System (EPS), when used on catchment hydrology, can provide improved early flood warning as some of the uncertainties can be quantified. EPS forecasts from a single weather centre only account for part of the uncertainties, originating from initial conditions and stochastic physics. Other sources of uncertainty, including numerical implementations and/or data assimilation, can only be assessed if a grand ensemble of EPSs from different weather centres is used. When various models that produce EPS forecasts at different weather centres are aggregated, the probabilistic nature of the ensemble precipitation forecasts can be better retained and accounted for. The availability of twelve global EPSs through the 'THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a new opportunity for the design of an improved probabilistic flood forecasting framework. This work presents a case study using the TIGGE database for flood warning on a meso-scale catchment. The upper reach of the River Severn catchment, located in the Midlands Region of England, is selected due to its abundant data for investigation and its relatively small size (4062 km2) compared to the resolution of the NWPs. This choice was deliberate, as we hypothesize that the uncertainty in the forcing of smaller catchments cannot be represented by a single EPS with a very limited number of ensemble members, but only through the variance given by a large number of ensembles and ensemble systems. A coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts is set up to study the potential benefits of using the TIGGE database in early flood warning. The physically based and fully distributed LISFLOOD suite of models is selected to simulate discharge and flood inundation consecutively. The results show the TIGGE database is a promising tool to produce forecasts of discharge and flood inundation comparable with the observed discharge and the simulated inundation driven by the observed discharge. The spread of discharge forecasts varies from centre to centre, but it is generally large, implying a significant level of uncertainty. Precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial variability of precipitation on a comparatively small catchment. This perhaps indicates the need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. It is not necessarily true that early flood warning becomes more reliable when more ensemble forecasts are employed. It is difficult to identify the best forecast centre(s), but in general the chance of detecting floods is increased by using the TIGGE database. Only one flood event was studied because most of the TIGGE data became available after October 2007. It is necessary to test the TIGGE ensemble forecasts with other flood events in other catchments with different hydrological and climatic regimes before general conclusions can be made on its robustness and applicability.
Improved prediction and tracking of volcanic ash clouds

USGS Publications Warehouse

Mastin, Larry G.; Webley, Peter

2009-01-01

During the past 30 years, more than 100 airplanes have inadvertently flown through clouds of volcanic ash from erupting volcanoes. Such encounters have caused millions of dollars in damage to the aircraft and have endangered the lives of tens of thousands of passengers. In a few severe cases, total engine failure resulted when ash was ingested into turbines and coated turbine blades. These incidents have prompted the establishment of cooperative efforts by the International Civil Aviation Organization and the volcanological community to provide rapid notification of eruptive activity, and to monitor and forecast the trajectories of ash clouds so that they can be avoided by air traffic. Ash-cloud properties such as plume height, ash concentration, and three-dimensional ash distribution have been monitored through non-conventional remote sensing techniques that are under active development. Forecasting the trajectories of ash clouds has required the development of volcanic ash transport and dispersion models that can calculate the path of an ash cloud over the scale of a continent or a hemisphere. Volcanological inputs to these models, such as plume height, mass eruption rate, eruption duration, ash distribution with altitude, and grain-size distribution, must be assigned in real time during an event, often with limited observations. Databases and protocols are currently being developed that allow for rapid assignment of such source parameters.
In this paper, we summarize how an interdisciplinary working group on eruption source parameters has been instigating research to improve upon the current understanding of volcanic ash cloud characterization and predictions. Improved predictions of ash cloud movement and air fall will aid in making better hazard assessments for aviation and for public health and air quality. © 2008 Elsevier B.V.

Hail formation triggers rapid ash aggregation in volcanic plumes

USGS Publications Warehouse

Van Eaton, Alexa R.; Mastin, Larry G.; Herzog, M.; Schwaiger, Hans F.; Schneider, David J.; Wallace, Kristi; Clarke, Amanda B.

2015-01-01

During explosive eruptions, airborne particles collide and stick together, accelerating the fallout of volcanic ash and climate-forcing aerosols. This aggregation process remains a major source of uncertainty both in ash dispersal forecasting and in interpretation of eruptions from the geological record. Here we illuminate the mechanisms and timescales of particle aggregation from a well-characterized 'wet' eruption. The 2009 eruption of Redoubt Volcano in Alaska incorporated water from the surface (in this case, a glacier), which is a common occurrence during explosive volcanism worldwide. Observations from C-band weather radar, fall deposits, and numerical modeling demonstrate that volcanic hail formed rapidly in the eruption plume, leading to mixed-phase aggregation of ~95% of the fine ash and stripping much of the cloud out of the atmosphere within 30 minutes. Based on these findings, we propose a mechanism of hail-like aggregation that contributes to the anomalously rapid fallout of fine ash and the occurrence of concentrically layered aggregates in volcanic deposits.

Legendary Mount Vesuvius is subject of intensive volcanological study

NASA Astrophysics Data System (ADS)

Spera, Frank

The Roman population centers of Pompeii and Herculaneum (circa 15,000 inhabitants) were destroyed when Mount Vesuvius erupted in 79 A.D. after centuries of repose. Many times since then its eruptions have claimed human lives; basaltic lava flows from an eruption in 1631 killed 3,000. Vesuvius' location near the heart of the Roman empire, a center of learning in the ancient world, led it to become the site of some of the earliest volcanological studies on record. In letters to Tacitus, Pliny the Younger documented the sequence of events of the 79 A.D. Plinian eruption. Geophysical studies of volcanoes were pioneered by Italian volcanologists, who installed seismographs in an observatory on the flanks of Vesuvius to study volcano seismology and to forecast and monitor eruptions early this century.
It is easy to understand why interest in Vesuvius has been so keen: it is accessible, persistently active, and a large population resides nearby. Today, around 1 million people live within the shadow of this potentially explosive and dangerous volcano.

Thermomechanical controls on magma supply and volcanic deformation: application to Aira caldera, Japan

PubMed

Hickey, James; Gottsmann, Joachim; Nakamichi, Haruhisa; Iguchi, Masato

2016-09-13

Ground deformation often precedes volcanic eruptions, and results from complex interactions between source processes and the thermomechanical behaviour of surrounding rocks. Previous models aiming to constrain source processes were unable to include realistic mechanical and thermal rock properties, and the role of thermomechanical heterogeneity in magma accumulation was unclear. Here we show how spatio-temporal deformation and magma reservoir evolution are fundamentally controlled by three-dimensional thermomechanical heterogeneity. Using the example of continued inflation at Aira caldera, Japan, we demonstrate that magma is accumulating faster than it can be erupted, and the current uplift is approaching the level inferred prior to the violent 1914 Plinian eruption. Magma storage conditions coincide with estimates for the caldera-forming reservoir ~29,000 years ago, and the inferred magma supply rate indicates a ~130-year timeframe to amass enough magma to feed a future 1914-sized eruption. These new inferences are important for eruption forecasting and risk mitigation, and have significant implications for the interpretations of volcanic deformation worldwide.

Thermomechanical controls on magma supply and volcanic deformation: application to Aira caldera, Japan

PubMed Central

Hickey, James; Gottsmann, Joachim; Nakamichi, Haruhisa; Iguchi, Masato

2016-01-01

Ground deformation often precedes volcanic eruptions, and results from complex interactions between source processes and the thermomechanical behaviour of surrounding rocks. Previous models aiming to constrain source processes were unable to include realistic mechanical and thermal rock properties, and the role of thermomechanical heterogeneity in magma accumulation was unclear. Here we show how spatio-temporal deformation and magma reservoir evolution are fundamentally controlled by three-dimensional thermomechanical heterogeneity. Using the example of continued inflation at Aira caldera, Japan, we demonstrate that magma is accumulating faster than it can be erupted, and the current uplift is approaching the level inferred prior to the violent 1914 Plinian eruption.
Magma storage conditions coincide with estimates for the caldera-forming reservoir ~29,000 years ago, and the inferred magma supply rate indicates a ~130-year timeframe to amass enough magma to feed a future 1914-sized eruption. These new inferences are important for eruption forecasting and risk mitigation, and have significant implications for the interpretations of volcanic deformation worldwide. PMID: 27619897

On the fate of pumice rafts formed during the 2012 Havre submarine eruption

PubMed Central

Jutzeler, Martin; Marsh, Robert; Carey, Rebecca J.; White, James D. L.; Talling, Peter J.; Karlstrom, Leif

2014-01-01

Pumice rafts are floating mobile accumulations of low-density pumice clasts generated by silicic volcanic eruptions. Pumice in rafts can drift for years, become waterlogged and sink, or become stranded on shorelines. Here we show that the pumice raft formed by the impressive, deep submarine eruption of the Havre caldera volcano (Southwest Pacific) in July 2012 can be mapped by satellite imagery augmented by sailing crew observations. Far from coastal interference, the eruption produced a single >400 km2 raft in 1 day, thus initiating a gigantic, high-precision, natural experiment relevant to both modern and prehistoric oceanic surface dispersal dynamics. Observed raft dispersal can be accurately reproduced by simulating drift and dispersal patterns using currents from an eddy-resolving ocean model hindcast. For future eruptions that produce potentially hazardous pumice rafts, our technique allows real-time forecasts of dispersal routes, in addition to inference of ash/pumice deposit distribution in the deep ocean. PMID: 24755668

Exploiting teleconnection indices for probabilistic forecasting of drought class transitions in Sicily region (Italy)

NASA Astrophysics Data System (ADS)

Bonaccorso, Brunella; Cancelliere, Antonino

2015-04-01

In the present study, two probabilistic models for short- to medium-term drought forecasting able to include information provided by teleconnection indices are proposed and applied to the Sicily region (Italy). Drought conditions are expressed in terms of the Standardized Precipitation-Evapotranspiration Index (SPEI) at different aggregation time scales. More specifically, a multivariate approach based on the normal distribution is developed in order to estimate: 1) transition probabilities to future SPEI drought classes and 2) SPEI forecasts at a generic time horizon M, as functions of past values of SPEI and the selected teleconnection index. To this end, SPEI series at 3, 4 and 6 aggregation time scales for the Sicily region are extracted from the global SPEI database, SPEIbase, available at the Web repository of the Spanish National Research Council (http://sac.csic.es/spei/database.html), and averaged over the study area. In particular, SPEIbase v2.3, with a spatial resolution of 0.5° lat/lon and temporal coverage between January 1901 and December 2013, is used. A preliminary correlation analysis is carried out to investigate the link between the drought index and different teleconnection patterns, namely the North Atlantic Oscillation (NAO), the Scandinavian (SCA) and the East Atlantic-West Russia (EA-WR) patterns. Results of this analysis indicate a stronger influence of NAO on drought conditions in Sicily than of the other teleconnection indices. The proposed forecasting methodology is then applied, and the forecasting skill of the proposed models is quantitatively assessed through a simple score approach and performance indices. Results indicate that inclusion of the NAO index generally enhances model performance, confirming the suitability of the models for short- to medium-term forecasting of drought conditions.
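The multivariate-normal machinery referred to in the Bonaccorso and Cancelliere entry above rests on the standard conditional-Gaussian result; the partitioned notation below is ours, and the paper's exact parameterization may differ. Writing Y for the SPEI value at lead time M and X for the vector of current predictors (past SPEI values and the NAO index), joint normality

    \begin{pmatrix} Y \\ \mathbf{X} \end{pmatrix} \sim
    \mathcal{N}\!\left( \begin{pmatrix} \mu_Y \\ \boldsymbol{\mu}_X \end{pmatrix},
    \begin{pmatrix} \sigma_Y^2 & \boldsymbol{\Sigma}_{YX} \\ \boldsymbol{\Sigma}_{XY} & \boldsymbol{\Sigma}_{XX} \end{pmatrix} \right)

implies the Gaussian forecast distribution

    Y \mid \mathbf{X} = \mathbf{x} \;\sim\; \mathcal{N}\!\left( \mu_Y + \boldsymbol{\Sigma}_{YX}\boldsymbol{\Sigma}_{XX}^{-1}(\mathbf{x} - \boldsymbol{\mu}_X),\;
    \sigma_Y^2 - \boldsymbol{\Sigma}_{YX}\boldsymbol{\Sigma}_{XX}^{-1}\boldsymbol{\Sigma}_{XY} \right).

Transition probabilities to SPEI drought classes then follow by integrating this conditional density between the class thresholds.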
A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

NASA Astrophysics Data System (ADS)

Zhang, Shaojie; Zhao, Luqiang; Delgado-Tellez, Ricardo; Bao, Hongjun

2018-03-01

Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle, which are affected by a high degree of uncertainty, especially at a regional scale, resulting in unacceptable uncertainties in Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations, which are integrated in a unique parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfalls on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. These testing results indicate that the new model can be operated in a highly efficient way and shows reliable results with high prediction accuracy. Accordingly, the new model can potentially be packaged into a forecasting system for shallow landslides, providing technological support for the mitigation of these disasters at regional scale.
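A minimal sketch of the Monte Carlo step described in the entry above: soil cohesion and friction angle are sampled from assumed intervals, a factor of safety is computed for each draw, and the fraction of draws with Fs < 1 is taken as the landslide probability for a pixel. The infinite-slope formula and all parameter ranges below are illustrative assumptions, not values from the paper.

    import numpy as np

    def landslide_probability(slope_deg, soil_depth_m, n=10_000, seed=0):
        """Fraction of Monte Carlo draws with factor of safety Fs < 1.

        Uses a simple dry infinite-slope model (our assumption):
            Fs = c / (gamma * z * sin(b) * cos(b)) + tan(phi) / tan(b)
        with cohesion c and friction angle phi sampled from assumed uniform intervals.
        """
        rng = np.random.default_rng(seed)
        beta = np.radians(slope_deg)
        gamma = 18.0e3                                  # unit weight of soil, N/m^3 (assumed)
        c = rng.uniform(2.0e3, 10.0e3, n)               # cohesion, Pa (assumed interval)
        phi = np.radians(rng.uniform(20.0, 35.0, n))    # friction angle, deg (assumed interval)

        fs = c / (gamma * soil_depth_m * np.sin(beta) * np.cos(beta)) \
             + np.tan(phi) / np.tan(beta)
        return np.mean(fs < 1.0)

    # Example: one pixel with a 40-degree slope and 1.5 m of soil
    print(landslide_probability(slope_deg=40.0, soil_depth_m=1.5))

In the paper the same test is run pixel by pixel with rainfall-driven pore-pressure changes included; the sketch above omits those terms entirely.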
Dynamical-statistical seasonal prediction for western North Pacific typhoons based on APCC multi-models

NASA Astrophysics Data System (ADS)

Kim, Ok-Yeon; Kim, Hye-Mi; Lee, Myong-In; Min, Young-Mi

2017-01-01

This study aims at predicting the seasonal number of typhoons (TY) over the western North Pacific with an Asia-Pacific Climate Center (APCC) multi-model ensemble (MME)-based dynamical-statistical hybrid model. The hybrid model uses the statistical relationship between the number of TY during the typhoon season (July-October) and the large-scale key predictors forecasted by the APCC MME for the same season. The cross-validation result from the MME hybrid model demonstrates high prediction skill, with a correlation of 0.67 between the hindcasts and observations for 1982-2008. The cross-validation from the hybrid model with the individual models participating in the MME indicates that there is no single model which consistently outperforms the other models in predicting typhoon number. Although the forecast skill of the MME is not always the highest compared to that of each individual model, the MME presents rather higher averaged correlations and a smaller variance of correlations. Given the large set of ensemble members from multi-models, a relative operating characteristic score reveals an 82 % (above-normal) and 78 % (below-normal) improvement for the probabilistic prediction of the number of TY. This implies that there is an 82 % (78 %) probability that the forecasts can successfully discriminate above-normal (below-normal) years from other years. The hybrid model forecasts for the past 7 years (2002-2008) are more skillful than the forecast from the Tropical Storm Risk consortium. Using a large set of ensemble members from multi-models, the APCC MME could provide useful deterministic and probabilistic seasonal typhoon forecasts to end-users, in particular the residents of tropical cyclone-prone areas in the Asia-Pacific region.

Analysis of surface deformation during the eruptive process of El Hierro Island (Canary Islands, Spain): Detection, Evolution and Forecasting

NASA Astrophysics Data System (ADS)

Berrocoso, M.; Fernandez-Ros, A.; Prates, G.; Martin, M.; Hurtado, R.; Pereda, J.; Garcia, M. J.; Garcia-Cañada, L.; Ortiz, R.; Garcia, A.

2012-04-01

Surface deformation has been an essential parameter for detecting the onset and following the evolution of the eruptive process of the island of El Hierro (October 2011), as well as for forecasting changes in seismic and volcanic activity during the crisis period.
From GNSS-GPS observations, the reactivation was detected early by analysing changes in the deformation of El Hierro Island relative to its regional geodynamics. The surface deformation changes were detected before the occurrence of seismic activity using station FRON (GRAFCAN). The evolution of the process has been studied through the analysis of time series of topocentric coordinates and of the variation of the distance between stations on the island of El Hierro (GRAFCAN station; IGN network; and UCA-CSIC points) and the LPAL-IGS station on the island of La Palma. In this work the main methodologies and their results are shown:
• The location (and its changes) of the lithospheric pressure source, obtained by applying the Mogi model.
• Kalman filtering of high-frequency time series, used to make the forecasts issued for volcanic emergency management.
• Correlations between the deformation at the different GPS stations and their relationship with seismovolcanic settings.

Chronology and impact of the 2011 Cordón Caulle eruption, Chile

NASA Astrophysics Data System (ADS)

Elissondo, Manuela; Baumann, Valérie; Bonadonna, Costanza; Pistolesi, Marco; Cioni, Raffaello; Bertagnini, Antonella; Biass, Sébastien; Herrero, Juan-Carlos; Gonzalez, Rafael

2016-03-01

We present a detailed chronological reconstruction of the 2011 eruption of the Cordón Caulle volcano (Chile) based on information derived from newspapers, scientific reports and satellite images. The chronology of associated volcanic processes and their local and regional effects (i.e. precursory activity, tephra fallout, lahars, pyroclastic density currents, lava flows) is also presented. The eruption had a severe impact on the ecosystem and on various economic sectors, including aviation, tourism, agriculture and the fishing industry. Urban areas and critical infrastructure, such as airports, hospitals and roads, were also impacted. The concentration of PM10 (particulate matter ≤ 10 µm) was measured during and after the eruption, showing that maximum safety threshold levels of daily and annual exposure were surpassed on several occasions. Probabilistic analyses suggest that this combination of atmospheric and eruptive conditions has a probability of occurrence of about 1 %. The management of the crisis, including the evacuation of people, is discussed, as well as the comparison with the impact associated with other recent eruptions located in similar areas and having similar characteristics (i.e. Quizapu, Hudson and Chaitén volcanoes). This comparison shows that the regions downwind and very close to the erupting volcanoes suffered very similar problems, without a clear relation to the intensity of the eruption (e.g. health problems, damage to vegetation, death of animals, roof collapse, air traffic disruptions, road closure, lahars and flooding).
This suggests that a detailed collection of impact data can be largely beneficial for the development of plans for the management of an eruptive crisis and the mitigation of the associated risk in the Andean region.

On the reliability of seasonal climate forecasts

PubMed

Weisheimer, A.; Palmer, T. N.

2014-07-06

Seasonal climate forecasts are being used increasingly across a range of application sectors. A recent UK governmental report asked: how good are seasonal forecasts on a scale of 1-5 (where 5 is very good), and how good can we expect them to be in 30 years' time? Seasonal forecasts are made from ensembles of integrations of numerical models of climate. We argue that 'goodness' should be assessed first and foremost in terms of the probabilistic reliability of these ensemble-based forecasts; reliable inputs are essential for any forecast-based decision-making. We propose that a '5' should be reserved for systems that are not only reliable overall, but where, in particular, small ensemble spread is a reliable indicator of low ensemble forecast error. We study the reliability of regional temperature and precipitation forecasts of the current operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts, universally regarded as one of the world-leading operational institutes producing seasonal climate forecasts. A wide range of 'goodness' rankings, depending on region and variable (with summer forecasts of rainfall over Northern Europe performing exceptionally poorly), is found.
Finally, we discuss the prospects of reaching '5' across all regions and variables in 30 years' time.

Acceleration to failure in geophysical signals prior to laboratory rock failure and volcanic eruptions (Invited)

NASA Astrophysics Data System (ADS)

Main, I. G.; Bell, A. F.; Greenhough, J.; Heap, M. J.; Meredith, P. G.

2010-12-01

The nucleation processes that ultimately lead to earthquakes, volcanic eruptions, rock bursts in mines, and landslides from cliff slopes are likely to be controlled at some scale by brittle failure of the Earth's crust. In laboratory brittle deformation experiments, geophysical signals commonly exhibit an accelerating trend prior to dynamic failure. Similar signals have been observed prior to volcanic eruptions, including volcano-tectonic earthquake event and moment release rates. Despite a large amount of effort in the search, no such statistically robust systematic trend has been found prior to natural earthquakes. Here we describe the results of a suite of laboratory tests on Mount Etna basalt and other rocks to examine the nature of the non-linear scaling from laboratory to field conditions, notably using laboratory 'creep' tests to reduce the boundary strain rate to conditions more similar to those in the field. Seismic event rate, seismic moment release rate and rate of porosity change show a classic 'bathtub' graph that can be derived from a simple damage model based on separate transient and accelerating sub-critical crack growth mechanisms, resulting from separate processes of negative and positive feedback in the population dynamics. The signals exhibit clear precursors based on formal statistical model tests using maximum likelihood techniques with Poisson errors.
After correcting for the finite loading time of the signal, the results show a transient creep rate that decays as a classic Omori law for earthquake aftershocks, remarkably with an exponent near unity, as commonly observed for natural earthquake sequences. The accelerating trend follows an inverse power law when fitted in retrospect, i.e. with prior knowledge of the failure time. In contrast, the strain measured on the sample boundary shows a less obvious but still accelerating signal that is often absent altogether in natural strain data prior to volcanic eruptions. To test the forecasting power of such constitutive rules in prospective mode, we examine the forecast quality of several synthetic trials, by adding representative statistical fluctuations, due to finite real-time sampling effects, to an underlying accelerating trend. Metrics of forecast quality change systematically and dramatically with time. In particular, the model accuracy increases, and the forecast bias decreases, as the failure time approaches.

Using Seismic Signals to Forecast Volcanic Processes

NASA Astrophysics Data System (ADS)

Salvage, R.; Neuberg, J. W.

2012-04-01

Understanding the seismic signals generated during volcanic unrest allows scientists to predict and understand active volcanoes more accurately, since these signals are intrinsically linked to rock failure at depth (Voight, 1988). In particular, low-frequency long-period signals (LP events) have been related to the movement of fluid and the brittle failure of magma at depth due to high strain rates (Hammer and Neuberg, 2009), which fundamentally relates to surface processes. However, there is currently no physical quantitative model for determining the likelihood of an eruption following precursory seismic signals, or the timing or type of eruption that will ensue (Benson et al., 2010). Since the beginning of its current eruptive phase, accelerating LP swarms (< 10 events per hour) have been a common feature at Soufriere Hills volcano, Montserrat, prior to surface expressions such as dome collapse or eruptions (Miller et al., 1998). The dynamical behaviour of such swarms can be related to accelerated magma ascent rates, since the seismicity is thought to be a consequence of magma deformation as it rises to the surface. In particular, acceleration rates can be used successfully, in combination with the inverse material failure law, a linear relationship against time (Voight, 1988), to accurately predict the timing of volcanic eruptions. Currently, this has only been investigated for retrospective events (Hammer and Neuberg, 2009). The identification of LP swarms on Montserrat and analysis of their dynamical characteristics allow a better understanding of the nature of the seismic signals themselves, as well as their relationship to surface processes such as magma extrusion rates. Acceleration and deceleration rates of seismic swarms provide insights into the plumbing system of the volcano at depth. The application of the material failure law to multiple LP swarms allows a critical evaluation of the accuracy of the method, which further refines current understanding of the relationship between seismic signals and volcanic eruptions. It is hoped that such analysis will assist the development of real-time forecasting models.
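Both entries above lean on the same material-failure-law idea (Voight, 1988): a precursory rate accelerates toward failure, and in the commonly used special case the inverse rate falls roughly linearly to zero at the failure time, so extrapolating a straight-line fit of 1/rate against time to its zero crossing gives a forecast of that time. The sketch below illustrates only that special case on synthetic event rates; it is our illustration, not the authors' code.

    import numpy as np

    def forecast_failure_time(times, rates):
        """Failure-forecast method, linear inverse-rate special case.

        Fits 1/rate = a + b*t by least squares and returns the time at which
        the fitted inverse rate reaches zero (the forecast failure time).
        """
        inv_rate = 1.0 / np.asarray(rates, dtype=float)
        b, a = np.polyfit(times, inv_rate, 1)   # slope, intercept
        return -a / b                           # zero crossing of the fit

    # Synthetic accelerating precursor: rate grows as 1/(t_f - t) with t_f = 10 days
    t_f_true = 10.0
    t = np.linspace(0.0, 8.0, 40)               # observation window, days
    rate = 1.0 / (t_f_true - t)                 # events per day (noise-free)
    print(forecast_failure_time(t, rate))       # ~10.0

In practice the rates come from counting LP events or summing seismic moment in sliding windows, the data are noisy, and the underlying power-law exponent need not be exactly the value that makes the inverse rate linear, which is why both abstracts stress the difference between retrospective and prospective testing.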
The development of a probabilistic approach to forecast coastal change

USGS Publications Warehouse

Lentz, Erika E.; Hapke, Cheryl J.; Rosati, Julie D.; Wang, Ping; Roberts, Tiffany M.

2011-01-01

This study demonstrates the applicability of a Bayesian probabilistic model as an effective tool in predicting post-storm beach changes along sandy coastlines. Volume change and net shoreline movement are modeled for two study sites at Fire Island, New York in response to two extratropical storms in 2007 and 2009. Both study areas include modified areas adjacent to unmodified areas in morphologically different segments of coast. Predicted outcomes are evaluated against observed changes to test model accuracy and uncertainty along 163 cross-shore transects. Results show strong agreement in the cross validation of predictions vs. observations, with 70-82% accuracies reported. Although no consistent spatial pattern in inaccurate predictions could be determined, the highest prediction uncertainties appeared in locations that had been recently replenished. Further testing and model refinement are needed; however, these initial results show that Bayesian networks have the potential to serve as important decision-support tools in forecasting coastal change.

Forecasting risk along a river basin using a probabilistic and deterministic model for environmental risk assessment of effluents through ecotoxicological evaluation and GIS

PubMed

Gutiérrez, Simón; Fernandez, Carlos; Barata, Carlos; Tarazona, José Vicente

2009-12-20

This work presents a computer model for Risk Assessment of Basins by Ecotoxicological Evaluation (RABETOX). The model is based on whole effluent toxicity testing and water flows along a specific river basin. It is capable of estimating the risk along a river segment using deterministic and probabilistic approaches. The Henares River Basin was selected as a case study to demonstrate the importance of seasonal hydrological variations in Mediterranean regions. As model inputs, two different ecotoxicity tests (the miniaturized Daphnia magna acute test and the D. magna feeding test) were performed on grab samples from 5 waste water treatment plant effluents. Also used as model inputs were flow data from the past 25 years, water velocity measurements and precise distance measurements using Geographical Information Systems (GIS).
The model was implemented in a spreadsheet and the results were interpreted and represented using GIS in order to facilitate risk communication. To better understand the bioassay results, the effluents were screened through SPME-GC/MS analysis. The deterministic model, run for each month of one calendar year, showed a significant seasonal variation of risk and revealed that September represents the worst-case scenario, with values up to 950 Risk Units. This classifies the entire area of study for the month of September as "sublethal significant risk for standard species". The probabilistic approach, using Monte Carlo analysis, was performed on 7 different forecast points distributed along the Henares River. A 0% probability of finding "low risk" was found at all forecast points, with a more than 50% probability of finding "potential risk for sensitive species". The values obtained through both the deterministic and probabilistic approximations reveal the presence of certain substances which might be causing sublethal effects in the aquatic species present in the Henares River.

Automated statistical matching of multiple tephra records exemplified using five long maar sequences younger than 75 ka, Auckland, New Zealand

NASA Astrophysics Data System (ADS)

Green, Rebecca M.; Bebbington, Mark S.; Cronin, Shane J.; Jones, Geoff

2014-09-01

Detailed tephrochronologies are built to underpin probabilistic volcanic hazard forecasting, and to understand the dynamics and history of diverse geomorphic, climatic, soil-forming and environmental processes. Complicating factors include highly variable tephra distribution over time; difficulty in correlating tephras from site to site based on physical and chemical properties; and uncertain age determinations. Multiple sites permit construction of more accurate composite tephra records, but correctly merging individual site records by recognizing common events and site-specific gaps is complex. We present an automated procedure for matching tephra sequences between multiple deposition sites using stochastic local optimization techniques. If individual tephra age determinations are not significantly different between sites, they are matched and a more precise age is assigned. Known stratigraphy and mineralogical or geochemical compositions are used to constrain tephra matches. We apply this method to match tephra records from five long sediment cores (≤ 75 cal ka BP) in Auckland, New Zealand. Sediments at these sites preserve basaltic tephras from local eruptions of the Auckland Volcanic Field as well as distal rhyolitic and andesitic tephras from the Okataina, Taupo, Egmont, Tongariro, and Tuhua (Mayor Island) volcanic centers.
The newly compiled correlated record is statistically more likely than previously published arrangements from this area.

Integrating predictive information into an agro-economic model to guide agricultural planning

NASA Astrophysics Data System (ADS)

Block, Paul; Zhang, Ying; You, Liangzhi

2017-04-01

Seasonal climate forecasts can inform long-range planning, including water resources utilization and allocation; however, quantifying the value of this information for the economy is often challenging. For rain-fed farmers, skillful season-ahead predictions may lead to superior planning, as compared to business-as-usual strategies, resulting in additional benefits or reduced losses. In this study, regional-level probabilistic precipitation forecasts of the major rainy season in Ethiopia are fed into an agro-economic model, adapted from the International Food Policy Research Institute, to evaluate economic outcomes (GDP, poverty rates, etc.) as compared with a no-forecast approach. Based on forecasted conditions, farmers can select various actions: adjusting crop area and crop type, purchasing drought-resistant seed, or applying additional fertilizer. Preliminary results favor the forecast-based approach, particularly through crop area reallocation.

Operational 0-3 h probabilistic quantitative precipitation forecasts: Recent performance and potential enhancements

NASA Astrophysics Data System (ADS)

Sokol, Z.; Kitzmiller, D.; Pešice, P.; Guan, S.

2009-05-01

The NOAA National Weather Service has maintained an automated, centralized 0-3 h prediction system for probabilistic quantitative precipitation forecasts since 2001. This advective-statistical system (ADSTAT) produces probabilities that rainfall will exceed multiple threshold values up to 50 mm at some location within a 40-km grid box. Operational characteristics and development methods for the system are described. Although development data were stratified by season and time of day, ADSTAT utilizes only a single set of nation-wide equations that relate predictor variables derived from radar reflectivity, lightning, satellite infrared temperatures, and numerical prediction model output to rainfall occurrence. A verification study documented herein showed that the operational ADSTAT reliably models regional variations in the relative frequency of heavy rain events. This was true even in the western United States, where no regional-scale, gridded hourly precipitation data were available during the development period in the 1990s. An effort was recently launched to improve the quality of ADSTAT forecasts by regionalizing the prediction equations and to adapt the model for application in the Czech Republic. We have experimented with incorporating various levels of regional specificity in the probability equations.
The geographic localization study showed that in the warm season, regional climate differences and variations in the diurnal temperature cycle have a marked effect on the predictor-predictand relationships, and thus regionalization would lead to better statistical reliability in the forecasts.

Chronology and impact of the 2011 Puyehue-Cordón Caulle eruption, Chile

NASA Astrophysics Data System (ADS)

Elissondo, M.; Baumann, V.; Bonadonna, C.; Pistolesi, M.; Cioni, R.; Bertagnini, A.; Biass, S.; Herrero, J. C.; Gonzalez, R.

2015-09-01

We present a detailed chronological reconstruction of the 2011 eruption of Puyehue-Cordón Caulle volcano (Chile) based on information derived from newspapers, scientific reports and satellite images. The chronology of associated volcanic processes and their local and regional effects (i.e. precursory activity, tephra fallout, lahars, pyroclastic density currents, lava flows) is also presented. The eruption had a severe impact on the ecosystem and on various economic sectors, including aviation, tourism, agriculture, and the fishing industry. Urban areas and critical infrastructures, such as airports, hospitals and roads, were also impacted. The concentration of PM10 (particulate matter ≤ 10 μm) was measured during and after the eruption, showing that maximum safety threshold levels of daily and annual exposure were surpassed on several occasions. Probabilistic analysis of atmospheric and eruptive conditions has shown that the main direction of dispersal is directly towards the east of the volcano and that the climactic phase of the eruption, dispersed toward the south-east, has a probability of occurrence within 1 %. The management of the crisis, including the evacuation of people, is discussed, as well as the comparison with the impact associated with other recent eruptions located in similar areas and having similar characteristics (i.e. Quizapu, Hudson, and Chaitén volcanoes). This comparison shows that the regions downwind and very close to the erupting volcanoes suffered very similar problems, without a clear relation to the intensity of the eruption (e.g. health problems, damage to vegetation, death of animals, roof collapse, air traffic disruptions, road closure, lahars and flooding). This suggests that a detailed collection of impact data can be largely beneficial for the development of plans for the management of an eruptive crisis and the mitigation of the associated risk in the Andean region.

How might Model-based Probabilities Extracted from Imperfect Models Guide Rational Decisions: The Case for non-probabilistic odds

NASA Astrophysics Data System (ADS)

Smith, Leonard A.

2010-05-01

This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves.
It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given the fact that we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run, in other words to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether or not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds on an event and the odds against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied probabilities" that sum to more than one, and/or the set of odds may be incomplete, not covering all events. We ask whether or not probabilities based on imperfect models can be expected to yield probabilistic odds which are sustainable. Evidence is provided to suggest this is not the case. Even with very good models (good in a root-mean-square sense), the risk of ruin with probabilistic odds is significantly higher than might be expected. Methods for constructing model-based non-probabilistic odds which are sustainable are discussed. The aim here is to be relevant to real-world decision support, and so unrealistic assumptions of equal knowledge, equal compute power, or equal access to information are to be avoided. Finally, the use of non-probabilistic odds as a method for communicating deep uncertainty (uncertainty in a probability forecast itself) is discussed in the context of other methods, such as stating one's subjective probability that the models will prove inadequate in each particular instance (that is, the probability of a "Big Surprise").
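A small numerical illustration of the "odds-for" bookkeeping used in the entry above (our example, with made-up numbers): probabilistic odds are simply the reciprocal of the forecast probability, so their implied probabilities sum to exactly one, whereas one simple way to obtain odds whose implied probabilities sum to more than one, the flavour of non-probabilistic odds the abstract alludes to, is to shorten every price by a margin. Whether this particular construction matches the author's proposal is not specified in the abstract.

    # "Odds-for": the total returned per unit staked, i.e. odds_for = 1 / p.
    forecast_probs = {"eruption": 0.2, "no eruption": 0.8}   # made-up model output

    probabilistic_odds = {k: 1.0 / p for k, p in forecast_probs.items()}
    # -> eruption 5.0-for-1, no eruption 1.25-for-1; implied probabilities sum to 1.0

    margin = 0.8  # shorten every price by 20% to hedge against model imperfection
    hedged_odds = {k: margin / p for k, p in forecast_probs.items()}
    implied = {k: 1.0 / o for k, o in hedged_odds.items()}

    print(probabilistic_odds, sum(1.0 / o for o in probabilistic_odds.values()))  # 1.0
    print(hedged_odds, sum(implied.values()))                                     # 1.25

Whether such a uniform margin is "sustainable" in the paper's sense depends on how wrong the model probabilities actually are, which the abstract argues has to be assessed empirically.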
Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics

NASA Astrophysics Data System (ADS)

Lazarus, S. M.; Holman, B. P.; Splitt, M. E.

2017-12-01

A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An expansive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid utilizing flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withdrawal and 28 east central Florida stations, the method is applied to one year of 24 h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both the deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates that the post-processed forecasts are calibrated. Two downscaling case studies are presented, a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
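For readers unfamiliar with EMOS, the entry above builds on the standard idea of regressing the parameters of a predictive distribution on the ensemble: in the common Gaussian form, the calibrated forecast is normal with a mean that is an affine function of the ensemble mean and a variance that is an affine function of the ensemble variance, with coefficients trained on past forecast-observation pairs. The sketch below shows that generic single-site, single-variable form with a plain maximum-likelihood fit on toy data; it is our schematic, not the authors' implementation, which calibrates wind vectors and spreads the parameters across a grid.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def fit_emos(ens_mean, ens_var, obs):
        """Fit the Gaussian EMOS model N(a + b*ens_mean, c + d*ens_var) by maximum likelihood."""
        def nll(params):
            a, b, c, d = params
            mu = a + b * ens_mean
            sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))
            return -np.sum(norm.logpdf(obs, loc=mu, scale=sigma))
        return minimize(nll, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead").x

    # Toy data: a biased, under-dispersive 20-member ensemble of wind speeds
    rng = np.random.default_rng(1)
    signal = rng.normal(8.0, 3.0, 500)                             # predictable part of the wind
    obs = signal + rng.normal(0.0, 1.0, 500)                       # true forecast error ~1 m/s
    ens = signal[:, None] + 1.0 + rng.normal(0.0, 0.3, (500, 20))  # +1 m/s bias, spread too small

    a, b, c, d = fit_emos(ens.mean(axis=1), ens.var(axis=1), obs)
    print(a, b, c, d)   # a near -1 removes the bias; c and d widen the predictive spread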
Inflation-predictable behavior and co-eruption deformation at Axial Seamount

PubMed

Nooner, Scott L.; Chadwick, William W.

2016-12-16

Deformation of the ground surface at active volcanoes provides information about magma movements at depth. Improved seafloor deformation measurements between 2011 and 2015 documented a fourfold increase in magma supply and confirmed that Axial Seamount's eruptive behavior is inflation-predictable, probably triggered by a critical level of magmatic pressure. A 2015 eruption was successfully forecast on the basis of this deformation pattern and marked the first time that deflation and tilt were captured in real time by a new seafloor cabled observatory, revealing the timing, location, and volume of eruption-related magma movements. Improved modeling of the deformation suggests a steeply dipping prolate-spheroid pressure source beneath the eastern caldera that is consistent with the location of the zone of highest melt within the subcaldera magma reservoir determined from multichannel seismic results. Copyright © 2016, American Association for the Advancement of Science.

The use of belief-based probabilistic methods in volcanology: Scientists' views and implications for risk assessments

NASA Astrophysics Data System (ADS)

Donovan, Amy; Oppenheimer, Clive; Bravo, Michael

2012-12-01

This paper constitutes a philosophical and social scientific study of expert elicitation in the assessment and management of volcanic risk on Montserrat during the 1995-present volcanic activity. It outlines the broader context of subjective probabilistic methods and then uses a mixed-method approach to analyse the use of these methods in volcanic crises. Data from a global survey of volcanologists regarding the use of statistical methods in hazard assessment are presented. Detailed qualitative data from Montserrat are then discussed, particularly concerning the expert elicitation procedure that was pioneered during the eruptions. These data are analysed and conclusions about the use of these methods in volcanology are drawn. The paper finds that while many volcanologists are open to the use of these methods, there are still some concerns, similar to those encountered in the literature on probabilistic and deterministic approaches to seismic hazard analysis.

Deterministic and Probabilistic Metrics of Surface Air Temperature and Precipitation in the MiKlip Decadal Prediction System

NASA Astrophysics Data System (ADS)

Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Pohlmann, Holger; Müller, Wolfgang; Cubasch, Ulrich

2014-05-01

Decadal forecasting of climate variability is a growing need for different parts of society, industry and the economy. The German initiative MiKlip (www.fona-miklip.de) focuses on the ongoing processes of medium-term climate prediction. This major scientific project, funded by the Federal Ministry of Education and Research in Germany (BMBF), develops a forecast system that aims for reliable predictions on decadal timescales. Using a single earth system model from the Max Planck Institute (MPI-ESM) and moving from uninitialized runs on to the first initialized 'Coupled Model Intercomparison Project Phase 5' (CMIP5) hindcast experiments identified possibilities and open scientific tasks. The MiKlip decadal prediction system was improved in different respects through new initialization techniques and datasets for the ocean and atmosphere. To accompany and substantiate such improvements of a forecast system, a standardized evaluation system designed by the MiKlip sub-project 'Integrated data and evaluation system for decadal scale prediction' (INTEGRATION) analyzes every step of its evolution. This study aims at combining deterministic and probabilistic skill scores of this prediction system from its uninitialized state to anomaly and then full-field oceanic initialization. The improved forecast skill in these different decadal hindcast experiments for surface air temperature and precipitation in the Pacific region and the complex area of the North Atlantic illustrates potential sources of skill. A standardized evaluation helps a prediction system, as it develops, to find its way towards producing reliable forecasts. Different aspects of these research dependencies, e.g. ensemble size, resolution, initializations, etc.,
will be discussed.

Spatial Ensemble Postprocessing of Precipitation Forecasts Using High Resolution Analyses

NASA Astrophysics Data System (ADS)

Lang, Moritz N.; Schicker, Irene; Kann, Alexander; Wang, Yong

2017-04-01

Ensemble prediction systems are designed to account for errors or uncertainties in the initial and boundary conditions, imperfect parameterizations, etc. However, due to sampling errors and underestimation of the model errors, these ensemble forecasts tend to be underdispersive and to lack both reliability and sharpness. To overcome such limitations, statistical postprocessing methods are commonly applied to these forecasts. In this study, a full-distributional spatial postprocessing method is applied to short-range precipitation forecasts over Austria using Standardized Anomaly Model Output Statistics (SAMOS). Following Stauffer et al. (2016), observation and forecast fields are transformed into standardized anomalies by subtracting a site-specific climatological mean and dividing by the climatological standard deviation. Because only a single regression model needs to be fitted for the whole domain, the SAMOS framework provides a computationally inexpensive method to create operationally calibrated probabilistic forecasts for any arbitrary location or for all grid points in the domain simultaneously. Taking advantage of the INCA system (Integrated Nowcasting through Comprehensive Analysis), high-resolution analyses are used for the computation of the observed climatology and for model training. The INCA system operationally combines station measurements and remote sensing data into real-time objective analysis fields at 1 km horizontal resolution and 1 h temporal resolution. The precipitation forecast used in this study is obtained from a limited-area model ensemble prediction system also operated by ZAMG. The so-called ALADIN-LAEF provides, by applying a multi-physics approach, a 17-member forecast at a horizontal resolution of 10.9 km and a temporal resolution of 1 hour. The SAMOS approach applied here statistically combines the in-house developed high-resolution analysis and ensemble prediction system. The station-based validation of 6-hour precipitation sums shows a mean improvement of more than 40% in CRPS when compared to bilinearly interpolated, uncalibrated ensemble forecasts. The validation on randomly selected grid points, representing the true height distribution over Austria, still indicates a mean improvement of 35%. The applied statistical model is currently set up for 6-hourly and daily accumulation periods, but will be extended to a temporal resolution of 1-3 hours within a new probabilistic nowcasting system operated by ZAMG.
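The core transformation behind SAMOS, as described in the entry above, is simple enough to show in a few lines: both observations and forecasts are converted to standardized anomalies with respect to their own climatologies, a single regression is fitted in anomaly space for the whole domain, and the prediction is transformed back using the local observed climatology. The code below is our schematic of that idea for a deterministic forecast with toy climatological values; it is not the full censored, distribution-based implementation used operationally.

    import numpy as np

    def to_anomaly(x, clim_mean, clim_std):
        """Standardized anomaly: subtract climatological mean, divide by climatological std."""
        return (x - clim_mean) / clim_std

    # Toy data: forecasts and observations at one site, plus rough (toy) climatologies
    rng = np.random.default_rng(0)
    obs_clim_mean, obs_clim_std = 2.0, 1.5    # observed climatology (toy values)
    fc_clim_mean, fc_clim_std = 2.5, 2.0      # raw-forecast climatology (toy values)
    obs = rng.gamma(2.0, 1.0, 1000)
    fcst = 1.2 * obs + rng.normal(0.0, 0.5, 1000) + 0.3   # biased raw forecast

    # One regression in anomaly space can serve every site in the domain
    y = to_anomaly(obs, obs_clim_mean, obs_clim_std)
    x = to_anomaly(fcst, fc_clim_mean, fc_clim_std)
    b, a = np.polyfit(x, y, 1)

    # Calibrate a new raw forecast of 3.0 and map back to physical units at the target site
    new_anom = a + b * to_anomaly(3.0, fc_clim_mean, fc_clim_std)
    print(new_anom * obs_clim_std + obs_clim_mean)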
415. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, a risk-based method that defines whether or not to issue a warning or implement mitigation measures. Under the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. The cost-loss method is not widely used, because it is motivated by economic values alone and is relatively static (no reasoning, a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore how decision making based on probabilities with the cost-loss method can be made more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty, from known uncertainty to deep uncertainty. Based on the types of uncertainty, concepts for dealing with such situations and responses were analysed and applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the final decision to implement is made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead time there is in flood event management, the more damage can be reduced. With decisions based on probabilistic forecasts, partial decisions can be made earlier in time (at a lower probability) and can be scaled up or down later, when there is more certainty about whether the event will take place or not. Partial decisions are often cheaper, or they shorten the final mitigation time at the moment when there is more certainty. The proposed method is tested on Stonehaven, on the Carron River in Scotland. Decisions to implement demountable defences in the town are currently made with a very short lead time due to the absence of certainty. Application showed that staged decision making is possible and gives the decision maker more time to respond to a situation. The decision maker is able to take a lower-regret decision under higher uncertainty, with fewer associated negative consequences. Although it is not possible to quantify intangible effects, reducing these effects is part of the analysis. Above all, the proposed approach has shown itself to be a possible improvement in economic terms and opens up possibilities for more flexible and robust decision making.
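A minimal sketch of the cost-loss warning rule summarized in the record above. The cost and avoidable-damage figures are purely illustrative assumptions, not values from the Stonehaven case.

```python
# Cost-loss rule: act when the cost/loss ratio does not exceed the forecast
# probability of the flood event. All numbers below are illustrative.

def issue_warning(cost: float, loss: float, event_probability: float) -> bool:
    """Return True when mitigation is worthwhile under the cost-loss rule."""
    if loss <= 0:
        raise ValueError("loss (avoidable damage) must be positive")
    return (cost / loss) <= event_probability

response_cost = 50_000.0        # cost of deploying demountable defences (hypothetical)
avoidable_damage = 400_000.0    # damage avoided if the flood occurs (hypothetical)

for p in (0.05, 0.125, 0.4):
    print(f"P(flood) = {p:.3f} -> warn: {issue_warning(response_cost, avoidable_damage, p)}")
# cost/loss = 0.125, so the rule triggers once the forecast probability reaches 12.5%.
```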
416. Volcano-tectonic earthquakes: A new tool for estimating intrusive volumes and forecasting eruptions

    NASA Astrophysics Data System (ADS)

    White, Randall; McCausland, Wendy

    2016-01-01

    We present data on 136 high-frequency earthquakes and swarms, termed volcano-tectonic (VT) seismicity, which preceded 111 eruptions at 83 volcanoes, plus data on VT swarms that preceded intrusions at 21 other volcanoes. We find that VT seismicity is usually the earliest reported seismic precursor for eruptions at volcanoes that have been dormant for decades or more, and that it precedes eruptions of all magma types from basaltic to rhyolitic and all explosivities from VEI 0 to ultraplinian VEI 6 at such previously long-dormant volcanoes. Because large eruptions occur most commonly during resumption of activity at long-dormant volcanoes, VT seismicity is an important precursor for the Earth's most dangerous eruptions. VT seismicity precedes all explosive eruptions of VEI ≥ 5 and most if not all VEI 4 eruptions in our data set. Surprisingly, we find that the VT seismicity originates at distal locations on tectonic fault structures, at distances of one or two up to tens of kilometers laterally from the site of the eventual eruption, and rarely if ever starts beneath the eruption site itself. The distal VT swarms generally occur at depths almost equal to the horizontal distance of the swarm from the summit out to about 15 km distance, beyond which hypocenter depths level out. We summarize several important characteristics of this distal VT seismicity, including its swarm-like nature, onset days to years prior to the beginning of magmatic eruptions, peaking of activity at the time of the initial eruption whether phreatic or magmatic, and a large non-double-couple component to focal mechanisms. Most importantly, we show that the intruded magma volume can be estimated simply from the cumulative seismic moment of the VT seismicity via log10 V = 0.77 log10 ΣMoment − 5.32, with volume V in cubic meters and seismic moment in newton meters. Because the cumulative seismic moment can be approximated from the size of just the few largest events, and is quite insensitive to precise locations, the intruded magma volume can be estimated quickly and easily with a few short-period seismic stations. Notable cases in which distal VT events preceded eruptions at long-dormant volcanoes include Nevado del Ruiz (1984-1985), Pinatubo (1991), Unzen (1989-1995), Soufriere Hills (1995), Shishaldin (1989-1999), Tacaná (1985-1986), Pacaya (1980-1984), Rabaul (1994), and Cotopaxi (2001). Additional cases are recognized at frequently active volcanoes, including Popocatépetl (2001-2003) and Mauna Loa (1984). We present four case studies (Pinatubo, Soufriere Hills, Unzen, and Tacaná) in which we demonstrate the above-mentioned VT characteristics prior to eruptions. Using regional data recorded by NEIC, we recognized in near-real time that a huge distal VT swarm was occurring, deduced that a proportionately huge magmatic intrusion was taking place beneath the long-dormant Sulu Range, New Britain Island, Papua New Guinea, and that it was likely to lead to eruptive activity, and warned Rabaul Volcano Observatory days before a phreatic eruption occurred. This confirms the value of this technique for eruption forecasting. We also present a counter-example in which we deduced that a VT swarm at Volcan Cosiguina, Nicaragua, indicated a small intrusion, insufficient to reach the surface and erupt. Finally, we discuss limitations of the method and propose a mechanism by which this distal VT seismicity is triggered by magmatic intrusion.
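A direct transcription of the scaling relation quoted in the record above into a small helper; the cumulative-moment value used in the example is illustrative, not taken from any of the case studies.

```python
import math

def intruded_volume_m3(cumulative_moment_nm: float) -> float:
    """Intruded magma volume (m^3) from cumulative VT seismic moment (N·m),
    using the relation quoted above: log10 V = 0.77 log10 ΣMoment - 5.32."""
    return 10 ** (0.77 * math.log10(cumulative_moment_nm) - 5.32)

# Illustrative value only: a cumulative moment of 1e16 N·m
# (roughly a handful of magnitude ~4.5-5 events).
sigma_m0 = 1.0e16
print(f"estimated intruded volume ≈ {intruded_volume_m3(sigma_m0):.2e} m^3")
```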
417. Novel methodology for pharmaceutical expenditure forecast

    PubMed

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    The value appreciation of new drugs across countries today features a disruption that makes the historical data used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology for pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings from generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Expected clinical benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and of new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. This methodology proved effective by 1) identifying the main parameters driving the variation in pharmaceutical expenditure forecasts across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration and discount price of biosimilars, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. The methodology is independent of historical data, and it proved highly flexible and well suited to testing robustness and to providing probabilistic analysis to support policy decision making.
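A minimal illustration of the probabilistic sensitivity analysis step mentioned in the record above, not the Commission's model itself: the input distributions and the originator sales figure are made-up assumptions used only to show how uncertainty propagates to a budget-impact distribution.

```python
import numpy as np

# Minimal probabilistic sensitivity analysis sketch: propagate uncertainty in a
# few inputs (hypothetical generic price discount and penetration rate) to the
# distribution of savings after patent loss. All figures are illustrative.

rng = np.random.default_rng(42)
n_draws = 10_000

originator_sales = 100.0e6                  # annual originator sales (EUR), hypothetical
discount = rng.beta(8, 12, n_draws)         # generic price discount vs. originator
penetration = rng.beta(10, 5, n_draws)      # generic volume share after expiry

savings = originator_sales * penetration * discount

print(f"mean savings : {savings.mean() / 1e6:6.1f} M EUR")
print(f"5th-95th pct : {np.percentile(savings, 5) / 1e6:6.1f} - "
      f"{np.percentile(savings, 95) / 1e6:6.1f} M EUR")
```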
418. An operational hydrological ensemble prediction system for the city of Zurich (Switzerland): skill, case studies and scenarios

    NASA Astrophysics Data System (ADS)

    Addor, N.; Jaun, S.; Fundel, F.; Zappa, M.

    2011-07-01

    The Sihl River flows through Zurich, Switzerland's most populated city, for which it represents the largest flood threat. To anticipate extreme discharge events and provide decision support in case of flood risk, a hydrometeorological ensemble prediction system (HEPS) was launched operationally in 2008. This model chain relies on limited-area atmospheric forecasts provided by the deterministic model COSMO-7 and the probabilistic model COSMO-LEPS. These atmospheric forecasts are used to force a semi-distributed hydrological model (PREVAH), coupled to a hydraulic model (FLORIS). The resulting hydrological forecasts are then communicated to the stakeholders involved in managing the Sihl discharge. This fully operational setting provides a real framework in which to compare the potential of deterministic and probabilistic discharge forecasts for flood mitigation. To study the suitability of HEPS for small-scale basins and to quantify the added value conveyed by the probability information, a reforecast was made for the period June 2007 to December 2009 for the Sihl catchment (336 km2). Several metrics support the conclusion that the performance gain can be of up to 2 days of lead time for the catchment considered. Brier skill scores show that, overall, COSMO-LEPS-based hydrological forecasts outperform their COSMO-7-based counterparts for all lead times and event intensities considered. The small size of the Sihl catchment does not prevent skillful discharge forecasts, but it makes them particularly dependent on correct precipitation forecasts, as shown by comparisons with a reference run driven by observed meteorological parameters. Our evaluation stresses that the capacity of the model to provide confident and reliable mid-term probability forecasts for high discharges is limited. The two most intense events of the study period are investigated using a novel graphical representation of probability forecasts, and they are used to generate high-discharge scenarios. They highlight challenges in making decisions on the basis of hydrological predictions, and they indicate the need for a tool, to be used in addition to forecasts, for comparing the different mitigation actions possible in the Sihl catchment. No definitive conclusion on the capacity of the model chain to forecast flooding events endangering the city of Zurich could be drawn because of the under-sampling of extreme events. Further research on the form of the reforecasts needed to draw inferences about floods associated with return periods of several decades to centuries is encouraged.
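A minimal sketch of the Brier skill score used in the record above to compare a probabilistic forecast system against a reference; the event definitions, probabilities, and outcomes below are synthetic, not Sihl data.

```python
import numpy as np

# Brier skill score sketch: compare a probabilistic system against a
# climatological reference on a binary exceedance event. Data are synthetic.

def brier_score(prob: np.ndarray, obs: np.ndarray) -> float:
    """Mean squared difference between forecast probability and 0/1 outcome."""
    return float(np.mean((prob - obs) ** 2))

rng = np.random.default_rng(1)
n = 500
obs = (rng.random(n) < 0.2).astype(float)                # threshold exceedances
p_eps = np.clip(0.2 + 0.6 * (obs - 0.2) + rng.normal(0, 0.15, n), 0, 1)  # sharper system
p_ref = np.full(n, obs.mean())                           # climatological reference

bs_eps, bs_ref = brier_score(p_eps, obs), brier_score(p_ref, obs)
print(f"BS(system) = {bs_eps:.3f}, BS(reference) = {bs_ref:.3f}, "
      f"BSS = {1.0 - bs_eps / bs_ref:.2f}")
```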
419. Volcanic hazard assessment for the Canary Islands (Spain) using extreme value theory, and the recent volcanic eruption of El Hierro

    NASA Astrophysics Data System (ADS)

    Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.

    2012-04-01

    The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported through written chronicles in the last 600 years, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately to contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step in the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions that is most appropriate given the nature of the documented historical eruptive data. We first characterise the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time dependence of the series and includes rare or extreme events, in the form of a few data points for large eruptions, since these data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of magnitude greater than one in the upcoming years. Shortly after the publication of this method, an eruption on the island of El Hierro took place for the first time in historical times, supporting our method and contributing towards the validation of our results.
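A minimal sketch of the "at least one event" probability under a non-homogeneous Poisson process, as used conceptually in the record above: P = 1 − exp(−∫λ(t)dt). The intensity function below is an illustrative assumption, not the fitted Canary Islands model.

```python
import numpy as np
from scipy.integrate import quad

# Non-homogeneous Poisson process sketch: probability of at least one event in
# a forecast window. The intensity function is hypothetical.

def intensity(t_years: float) -> float:
    """Hypothetical, slowly increasing eruption rate (events per year)."""
    return 0.03 * (1.0 + 0.002 * t_years)

def prob_at_least_one(t_start: float, horizon: float) -> float:
    expected_events, _ = quad(intensity, t_start, t_start + horizon)
    return 1.0 - np.exp(-expected_events)

for horizon in (10, 20, 50):
    print(f"P(>=1 eruption in next {horizon:2d} yr) = "
          f"{prob_at_least_one(0.0, horizon):.2f}")
```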
420. The critical role of volcano monitoring in risk reduction

    USGS Publications Warehouse

    Tilling, R.I.

    2008-01-01

    Data from volcano-monitoring studies constitute the only scientifically valid basis for short-term forecasts of a future eruption, or of possible changes during an ongoing eruption. Thus, in any effective hazards-mitigation program, a basic strategy for reducing volcano risk is the initiation or augmentation of volcano monitoring at historically active volcanoes, and also at geologically young, but presently dormant, volcanoes with potential for reactivation. Beginning in the 1980s, substantial progress in volcano-monitoring techniques and networks, ground-based as well as space-based, has been achieved. Although some geochemical monitoring techniques (e.g., remote measurement of volcanic gas emissions) are being increasingly applied and show considerable promise, seismic and geodetic methods to date remain the techniques of choice and are the most widely used. Availability of comprehensive volcano-monitoring data was a decisive factor in the successful scientific and governmental responses to the reawakening of Mount St. Helens (Washington, USA) in 1980 and, more recently, to the powerful explosive eruptions at Mount Pinatubo (Luzon, Philippines) in 1991. However, even with the ever-improving state of the art in volcano monitoring and predictive capability, the Mount St. Helens and Pinatubo case histories unfortunately still represent the exceptions, rather than the rule, in successfully forecasting the most likely outcome of volcano unrest.
421. Technical note: Combining quantile forecasts and predictive distributions of streamflows

    NASA Astrophysics Data System (ADS)

    Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano

    2017-11-01

    The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to optimally combine this wealth of information. In particular, the use of deterministic and probabilistic forecasts with sometimes widely divergent predicted future streamflow values makes it even more complicated for decision makers to sift out the relevant information. In this study, multiple streamflow forecast products are aggregated based on several different predictive distributions and quantile forecasts. For this combination, the Bayesian model averaging (BMA) approach, non-homogeneous Gaussian regression (NGR, also known as the ensemble model output statistics, EMOS, technique), and a novel method called Beta-transformed linear pooling (BLP) are applied. With the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland, with about 5 years of forecast data, are compared, and the differences between the raw and optimally combined forecasts are highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
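A minimal sketch of the structure of linear pooling and the beta transform that BLP applies on top of the pooled predictive CDF, as named in the record above. The component distributions, weights, and beta parameters are illustrative assumptions; in practice they would be estimated from training data.

```python
import numpy as np
from scipy.stats import norm, beta

# Combining two predictive distributions by linear pooling, plus the beta
# transform used by BLP to recalibrate the pooled CDF. Values are illustrative.

def pooled_cdf(x, weights, components):
    """Weighted mixture ('linear pool') of component predictive CDFs."""
    return sum(w * c.cdf(x) for w, c in zip(weights, components))

def blp_cdf(x, weights, components, a, b):
    """Beta-transformed linear pool: recalibrates the pooled CDF."""
    return beta(a, b).cdf(pooled_cdf(x, weights, components))

components = [norm(loc=120.0, scale=15.0), norm(loc=140.0, scale=25.0)]  # streamflow (m^3/s)
weights = [0.6, 0.4]

x = 150.0
print(f"P(Q <= {x}) linear pool      : {pooled_cdf(x, weights, components):.3f}")
print(f"P(Q <= {x}) BLP (a=1.2,b=0.9): {blp_cdf(x, weights, components, 1.2, 0.9):.3f}")
```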
422. Retrospective Evaluation of the Long-Term CSEP-Italy Earthquake Forecasts

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-12-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. Here, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration.

423. Using Meteorological Analogues for Reordering Postprocessed Precipitation Ensembles in Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Bellier, Joseph; Bontron, Guillaume; Zin, Isabella

    2017-12-01

    Meteorological ensemble forecasts are nowadays widely used as input to hydrological models for probabilistic streamflow forecasting. These forcings are frequently biased and have to be statistically postprocessed, most of the time using univariate techniques that apply independently to individual locations, lead times and weather variables. Postprocessed ensemble forecasts therefore need to be reordered so as to reconstruct suitable multivariate dependence structures. The Schaake shuffle and ensemble copula coupling are the two most popular methods for this purpose. This paper proposes two adaptations of them that make use of meteorological analogues for reconstructing spatiotemporal dependence structures of precipitation forecasts. Performances of the original and adapted techniques are compared through a multistep verification experiment using real forecasts from the European Centre for Medium-Range Weather Forecasts. This experiment evaluates not only multivariate precipitation forecasts but also the corresponding streamflow forecasts derived from hydrological modeling. Results show that the relative performances of the different reordering methods vary depending on the verification step. In particular, the standard Schaake shuffle is found to perform poorly when evaluated on streamflow. This emphasizes the crucial role of the precipitation spatiotemporal dependence structure in hydrological ensemble forecasting.
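A minimal sketch of the standard Schaake shuffle referred to in the record above (not the analogue-based adaptations the paper proposes). The sites, ensembles, and historical template are synthetic assumptions.

```python
import numpy as np

def schaake_shuffle(samples: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Reorder calibrated samples at one site/lead time so their ranks match the
    ranks of a historical template (one template value per ensemble member)."""
    ranks = np.argsort(np.argsort(template))   # rank of each template value
    return np.sort(samples)[ranks]             # k-th smallest sample goes where the
                                               # k-th smallest template value sits

rng = np.random.default_rng(3)
n_members = 10

# Calibrated (univariate) precipitation ensembles at two neighbouring sites.
site_a = rng.gamma(2.0, 2.0, n_members)
site_b = rng.gamma(2.0, 2.0, n_members)

# Historical observations on the same past dates; the two sites co-vary.
hist_a = rng.gamma(2.0, 2.0, n_members)
hist_b = hist_a + rng.normal(0.0, 0.5, n_members)

shuf_a = schaake_shuffle(site_a, hist_a)
shuf_b = schaake_shuffle(site_b, hist_b)

# The shuffled ensemble inherits the inter-site rank correlation of the template.
rank = lambda v: np.argsort(np.argsort(v))
print("template rank corr :", np.corrcoef(rank(hist_a), rank(hist_b))[0, 1])
print("shuffled rank corr :", np.corrcoef(rank(shuf_a), rank(shuf_b))[0, 1])
```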
424. Empirical seasonal forecasts of the NAO

    NASA Astrophysics Data System (ADS)

    Sanchezgomez, E.; Ortizbevia, M.

    2003-04-01

    We present here seasonal forecasts of the North Atlantic Oscillation (NAO) issued from ocean predictors with an empirical procedure. The Singular Value Decomposition (SVD) of the cross-correlation matrix between the predictor and predictand fields, at the lag used for the forecast lead, is at the core of the empirical model. The main predictor field is sea surface temperature anomalies, although sea ice cover anomalies are also used. Forecasts are issued in probabilistic form. The model is an improvement over a previous version (1), in which sea level pressure anomalies were first forecast and the NAO index was then built from this forecast field. Both the correlation skill between forecast and observed fields and the number of forecasts that hit the correct NAO sign are used to assess forecast performance, which is usually above the values found for forecasts issued assuming persistence. For certain seasons and/or leads, values of the skill are above the 0.7 usefulness threshold. References: (1) SanchezGomez, E. and Ortiz Bevia, M., 2002, Estimacion de la evolucion pluviometrica de la Espana Seca atendiendo a diversos pronosticos empiricos de la NAO, in 'El Agua y el Clima', Publicaciones de la AEC, Serie A, N 3, pp 63-73, Palma de Mallorca, Spain.

425. On the predictability of outliers in ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Siegert, S.; Bröcker, J.; Kantz, H.

    2012-03-01

    In numerical weather prediction, ensembles are used to retrieve probabilistic forecasts of future weather conditions. We consider events where the verification is smaller than the smallest, or larger than the largest, ensemble member of a scalar ensemble forecast. These events are called outliers. In a statistically consistent K-member ensemble, outliers should occur with a base rate of 2/(K+1). In operational ensembles this base rate tends to be higher. We study the predictability of outlier events in terms of the Brier skill score and find that forecast probabilities can be calculated which are more skillful than the unconditional base rate. This is shown analytically for statistically consistent ensembles. Using logistic regression, forecast probabilities for outlier events in an operational ensemble are calculated. These probabilities exhibit positive skill, which is quantitatively similar to the analytical results. Possible causes of these results, as well as their consequences for ensemble interpretation, are discussed.
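A quick numerical check of the 2/(K+1) outlier base rate quoted in the record above for a statistically consistent ensemble, using synthetic exchangeable draws (verification and members from the same distribution).

```python
import numpy as np

# In a consistent K-member ensemble, the verification falls outside the
# ensemble range (below the minimum or above the maximum) with probability
# 2/(K+1). Data here are synthetic standard-normal draws.

rng = np.random.default_rng(7)
K, n_cases = 50, 200_000

ensemble = rng.normal(size=(n_cases, K))
verification = rng.normal(size=n_cases)

outlier = (verification < ensemble.min(axis=1)) | (verification > ensemble.max(axis=1))
print(f"empirical outlier rate : {outlier.mean():.4f}")
print(f"theoretical 2/(K+1)    : {2 / (K + 1):.4f}")
```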
426. Uncertainty in forecasts of long-run economic growth

    PubMed

    Christensen, P; Gillingham, K; Nordhaus, W

    2018-05-22

    Forecasts of long-run economic growth are critical inputs into policy decisions being made today on the economy and the environment. Despite its importance, there is a sparse literature on long-run forecasts of economic growth and the uncertainty in such forecasts. This study presents comprehensive probabilistic long-run projections of global and regional per-capita economic growth rates, comparing estimates from an expert survey and a low-frequency econometric approach. Our primary results suggest a median 2010-2100 global growth rate in per-capita gross domestic product of 2.1% per year, with a standard deviation (SD) of 1.1 percentage points, indicating substantially higher uncertainty than is implied in existing forecasts. The larger range of growth rates implies a greater likelihood of extreme climate change outcomes than is currently assumed and has important implications for social insurance programs in the United States.

427. Relationship Between Distribution of Magnetic Decay Index and Filament Eruptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, H.; Liu, Y.; Elmhamdi, A.

    2016-10-20

    The decay index n of the horizontal magnetic field is considered an important parameter in judging the stability of a flux rope. However, the spatial distribution of this parameter has not been extensively explored so far. In this paper, we present a delineative study of the three-dimensional maps of n for two eruptive events in which filaments underwent asymmetrical eruptions. The corresponding n-distributions both show that the filaments tend to erupt at abnormal regions (dubbed ABN regions) of n. These ABN regions appear to be divided into two subregions, with larger and smaller n. Moreover, an analysis of the magnetic topological configuration of the ABN regions has also been performed. The results indicate that these ABN regions are associated with a kind of special quasi-separatrix layer across which the connectivity of the magnetic field is discontinuous. The presented observations and analyses strongly suggest that the torus instability in ABN regions may play a crucial role in triggering an asymmetrical eruption. Additionally, our investigation provides a way of forecasting how a filament might erupt, and of predicting the location at which an asymmetrically erupting filament will be split, through analysis of the spatial structure of n.
428. Bayesian analysis of volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Ho, Chih-Hsiang

    1990-10-01

    The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time periods tends to be more variable than a simple Poisson process with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value, as assumed in the simple Poisson model. Bayesian analysis is performed to link the two distributions together and give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model, and comparisons between the generalized model and the simple Poisson model, are discussed based on the historical eruptive count data of the volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use, both in space and time.
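A minimal numerical check of the gamma-Poisson mixture described in the record above: with a gamma prior on the eruptive rate, counts per period follow a negative binomial distribution. The gamma hyperparameters are illustrative assumptions, not values fitted to Mauna Loa or Etna.

```python
import numpy as np
from scipy.stats import nbinom

# If lambda ~ Gamma(shape=a, scale=s) and N | lambda ~ Poisson(lambda), then
# N ~ NegativeBinomial(n=a, p=1/(1+s)). Hyperparameters below are illustrative.

rng = np.random.default_rng(11)
a, s = 3.0, 0.5                      # gamma shape and scale for lambda (per period)
n_sim = 200_000

lam = rng.gamma(shape=a, scale=s, size=n_sim)
counts = rng.poisson(lam)            # mixed (compound) Poisson draws

k = np.arange(6)
empirical = np.array([(counts == ki).mean() for ki in k])
analytic = nbinom.pmf(k, a, 1.0 / (1.0 + s))

for ki, e, t in zip(k, empirical, analytic):
    print(f"P(N={ki}): simulated {e:.4f}   negative binomial {t:.4f}")
```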
429. Monitoring super-volcanoes: geophysical and geochemical signals at Yellowstone and other large caldera systems

    PubMed

    Lowenstern, Jacob B; Smith, Robert B; Hill, David P

    2006-08-15

    Earth's largest calderas form as the ground collapses during immense volcanic eruptions, when hundreds to thousands of cubic kilometres of magma are explosively withdrawn from the Earth's crust over a period of days to weeks. Continuing long after such great eruptions, the resulting calderas often exhibit pronounced unrest, with frequent earthquakes, alternating uplift and subsidence of the ground, and considerable heat and mass flux. Because many active and extinct calderas show evidence for repetition of large eruptions, such systems demand detailed scientific study and monitoring. Two calderas in North America, Yellowstone (Wyoming) and Long Valley (California), are in areas of youthful tectonic complexity. Scientists strive to understand the signals generated when tectonic, volcanic and hydrothermal (hot ground water) processes intersect. One obstacle to accurate forecasting of large volcanic events is humanity's lack of familiarity with the signals leading up to the largest class of volcanic eruptions. Accordingly, it may be difficult to recognize the difference between smaller and larger eruptions. To prepare ourselves and society, scientists must scrutinize a spectrum of volcanic signals and assess the many factors contributing to unrest and leading toward diverse modes of eruption.

430. Wind power forecasting: IEA Wind Task 36 & future research issues

    NASA Astrophysics Data System (ADS)

    Giebel, G.; Cline, J.; Frank, H.; Shaw, W.; Pinson, P.; Hodge, B.-M.; Kariniotakis, G.; Madsen, J.; Möhrlen, C.

    2016-09-01

    This paper presents the new International Energy Agency (IEA) Wind Task 36 on Forecasting and invites collaboration within the group. Wind power forecasts have been used operationally for over 20 years. Despite this, there are still several possibilities to improve the forecasts, both on the weather prediction side and in the usage of the forecasts. The new IEA Task on Forecasting for Wind Energy organises international collaboration among national meteorological centres with an interest and/or large projects on wind forecast improvements (NOAA, DWD, MetOffice, met.no, DMI, ...), operational forecasters, and forecast users. The Task is divided into three work packages. First, a collaboration on improving the scientific basis for the wind predictions themselves; this includes numerical weather prediction model physics, but also widely distributed information on accessible datasets. Second, work toward an international pre-standard (an IEA Recommended Practice) on benchmarking and comparing wind power forecasts, including probabilistic forecasts; this work package will also organise benchmarks, in cooperation with the IEA Task WakeBench. Third, engagement with end users aimed at disseminating best practice in the usage of wind power predictions. As a first result, an overview of current issues for research in short-term forecasting of wind power is presented.

431. On the reliability of seasonal climate forecasts

    PubMed Central

    Weisheimer, A.; Palmer, T. N.

    2014-01-01

    Seasonal climate forecasts are being used increasingly across a range of application sectors. A recent UK governmental report asked: how good are seasonal forecasts on a scale of 1-5 (where 5 is very good), and how good can we expect them to be in 30 years' time? Seasonal forecasts are made from ensembles of integrations of numerical models of climate. We argue that 'goodness' should be assessed first and foremost in terms of the probabilistic reliability of these ensemble-based forecasts; reliable inputs are essential for any forecast-based decision-making. We propose that a '5' should be reserved for systems that are not only reliable overall, but where, in particular, small ensemble spread is a reliable indicator of low ensemble forecast error. We study the reliability of regional temperature and precipitation forecasts of the current operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts, universally regarded as one of the world-leading operational institutes producing seasonal climate forecasts. A wide range of 'goodness' rankings, depending on region and variable (with summer forecasts of rainfall over Northern Europe performing exceptionally poorly), is found. Finally, we discuss the prospects of reaching '5' across all regions and variables in 30 years' time. PMID:24789559
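A minimal sketch of the kind of reliability check emphasised in the record above: bin forecast probabilities and compare each bin's mean probability with the observed event frequency. The data are synthetic, not from the ECMWF seasonal system.

```python
import numpy as np

# Reliability (attributes) check for event probabilities: in a reliable system,
# the observed frequency in each bin is close to the mean forecast probability.
# All data below are synthetic.

rng = np.random.default_rng(5)
n = 20_000
true_p = rng.beta(2, 5, n)                                 # underlying event probabilities
obs = (rng.random(n) < true_p).astype(float)               # 0/1 outcomes
fcst_p = np.clip(true_p + rng.normal(0, 0.08, n), 0, 1)    # slightly noisy forecasts

bins = np.linspace(0.0, 1.0, 11)
idx = np.digitize(fcst_p, bins) - 1
for b in range(10):
    sel = idx == b
    if sel.any():
        print(f"forecast {fcst_p[sel].mean():.2f}  observed {obs[sel].mean():.2f}  "
              f"(n={sel.sum()})")
```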
432. Probabilistic computer model of optimal runway turnoffs

    NASA Technical Reports Server (NTRS)

    Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.

    1985-01-01

    Landing delays are currently a problem at major air carrier airports, and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model is defined which locates exits and defines path geometry for a selected maximum occupancy time appropriate for each TERPS aircraft category. The model includes an algorithm for lateral ride comfort limits.

433. Building Reliable Forecasts of Solar Activity

    NASA Technical Reports Server (NTRS)

    Kitiashvili, Irina; Wray, Alan; Mansour, Nagi

    2017-01-01

    Solar ionizing radiation critically depends on the level of the Sun's magnetic activity. For robust physics-based forecasts, we employ the procedure of data assimilation, which combines theoretical modeling and observational data such that uncertainties in both the model and the observations are taken into account. Currently we are working in two major directions: 1) development of a new long-term forecast procedure on the time scale of the 11-year solar cycle, using a 2-dimensional mean-field dynamo model and synoptic magnetograms; and 2) development of 3-dimensional radiative MHD (magnetohydrodynamic) simulations to investigate the origin and precursors of local manifestations of magnetic activity, such as the formation of magnetic structures and eruptive dynamics.
434. How do I know if my forecasts are better? Using benchmarks in hydrological ensemble prediction

    NASA Astrophysics Data System (ADS)

    Pappenberger, F.; Ramos, M. H.; Cloke, H. L.; Wetterhall, F.; Alfieri, L.; Bogner, K.; Mueller, A.; Salamon, P.

    2015-03-01

    The skill of a forecast can be assessed by comparing the relative proximity of both the forecast and a benchmark to the observations. Example benchmarks include climatology or a naïve forecast. Hydrological ensemble prediction systems (HEPS) are currently transforming the hydrological forecasting environment, but in this new field there is little information to guide researchers and operational forecasters on how benchmarks can best be used to evaluate their probabilistic forecasts. In this study, it is shown that the calculated forecast skill can vary depending on the benchmark selected, and that the selection of a benchmark for determining forecasting system skill is sensitive to a number of hydrological and system factors. A benchmark intercomparison experiment is then undertaken using the continuous ranked probability score (CRPS), a reference forecasting system, and a suite of 23 different methods to derive benchmarks. The benchmarks are assessed within the operational set-up of the European Flood Awareness System (EFAS) to determine those that are 'toughest to beat' and so give the most robust discrimination of forecast skill, particularly for the spatial average fields that EFAS relies upon. Evaluating against an observed discharge proxy, the benchmark that has most utility for EFAS and avoids the most naïve skill across different hydrological situations is found to be meteorological persistency. This benchmark uses the latest meteorological observations of precipitation and temperature to drive the hydrological model. Hydrological long-term average benchmarks, which are currently used in EFAS, are very easily beaten by the forecasting system, and their use produces much naïve skill. When decomposed into seasons, the advanced meteorological benchmarks, which make use of meteorological observations from the past 20 years at the same calendar date, have the most skill discrimination. They are also good at discriminating skill in low flows and for all catchment sizes. Simpler meteorological benchmarks are particularly useful for high flows. Recommendations for EFAS are to move to routine use of meteorological persistency, an advanced meteorological benchmark, and a simple meteorological benchmark in order to provide a robust evaluation of forecast skill. This work provides the first comprehensive evidence on how benchmarks can be used in the evaluation of skill in probabilistic hydrological forecasts and on which benchmarks are most useful for skill discrimination and avoidance of naïve skill in a large-scale HEPS. It is recommended that all HEPS use the evidence and methodology provided here to evaluate which benchmarks to employ, so that forecasters can have trust in their skill evaluation and confidence that their forecasts are indeed better.
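A minimal sketch of benchmark-relative skill as used in the record above, CRPSS = 1 − CRPS(forecast)/CRPS(benchmark), with the ensemble CRPS computed by the standard estimator. The discharge data, forecast system, and climatology-like benchmark below are synthetic assumptions; which benchmark to use is exactly the question the study addresses.

```python
import numpy as np

# CRPS-based skill against an arbitrary benchmark, using the estimator
#   CRPS = mean|x_i - y| - 0.5 * mean|x_i - x_j|  over ensemble members x_i.
# Data are synthetic.

def crps_ensemble(members: np.ndarray, obs: float) -> float:
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

rng = np.random.default_rng(13)
n_days, n_members = 300, 20
obs = rng.gamma(3.0, 20.0, n_days)                                  # discharge (m^3/s)
system = obs[:, None] + rng.normal(0, 15, (n_days, n_members))      # skilful forecast
benchmark = rng.gamma(3.0, 20.0, (n_days, n_members))               # climatology-like benchmark

crps_sys = np.mean([crps_ensemble(system[t], obs[t]) for t in range(n_days)])
crps_ben = np.mean([crps_ensemble(benchmark[t], obs[t]) for t in range(n_days)])
print(f"CRPS system {crps_sys:.2f}, benchmark {crps_ben:.2f}, "
      f"CRPSS = {1 - crps_sys / crps_ben:.2f}")
```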
435. Evaluation of Kilauea Eruptions By Using Stable Isotope Analysis

    NASA Astrophysics Data System (ADS)

    Rahimi, K. E.; Bursik, M. I.

    2016-12-01

    Kilauea, on the island of Hawaii, is a large volcanic edifice with numerous named vents scattered across its surface. Halema`uma`u crater sits within Kilauea caldera, above the magma reservoir that is the main source of lava feeding most vents on Kilauea volcano. Halema`uma`u crater produces basaltic explosive activity ranging from weak emission to sub-Plinian. Changes in eruption style are thought to be due to the interplay between external water and magma (phreatomagmatic/phreatic), or to segregation of gas from magma (magmatic) at shallow depths. Since there are three different eruption mechanisms (phreatomagmatic, phreatic, and magmatic), each eruption carries its own isotopic signature. The aim of this study is to evaluate the eruption mechanism using stable isotope analysis. Studying the isotope ratios D/H and δ18O within fluid inclusions and volcanic glass will provide evidence of what drove an eruption. The source of water that drove an eruption can be determined by correlating the measured values with those of candidate water sources (groundwater, rainwater, and magmatic water), since each water source has diagnostic D/H and δ18O values. These results will clarify the roles of volatiles in eruptions. The broader application of this research is that these methods could help volcanologists forecast and predict current volcanic activity by monitoring changes in volatile concentrations within deposits.

436. Emulation for probabilistic weather forecasting

    NASA Astrophysics Data System (ADS)

    Cornford, Dan; Barillec, Remi

    2010-05-01

    Numerical weather prediction models are typically very expensive to run due to their complexity and resolution. Characterising the sensitivity of a model to its initial condition and/or to its parameters requires numerous runs of the model, which is impractical for all but the simplest models. Producing probabilistic forecasts requires knowledge of the distribution of the model outputs given the distribution over the inputs, where the inputs include the initial conditions, boundary conditions and model parameters. Such uncertainty analysis for complex weather prediction models seems a long way off given current computing power, with ensembles providing only a partial answer. One possible way forward, which we develop in this work, is the use of statistical emulators. Emulators provide an efficient statistical approximation to the model (or simulator) while quantifying the uncertainty introduced. In the emulator framework, a Gaussian process is fitted to the simulator response as a function of the simulator inputs using some training data. The emulator is essentially an interpolator of the simulator output, and the response in unobserved areas is dictated by the choice of covariance structure and parameters in the Gaussian process. Suitable parameters are inferred from the data in a maximum likelihood or Bayesian framework. Once trained, the emulator allows operations such as sensitivity analysis or uncertainty analysis to be performed at a much lower computational cost. The efficiency of emulators can be further improved by exploiting the redundancy in the simulator output through appropriate dimension reduction techniques. We demonstrate this using both principal component analysis on the model output and a new reduced-rank emulator in which an optimal linear projection operator is estimated jointly with other parameters, in the context of simple low-order models such as the Lorenz 40D system. We present the application of emulators to probabilistic weather forecasting, where the construction of the emulator training set replaces the traditional ensemble model runs. Thus the actual forecast distributions are computed using the emulator conditioned on the 'ensemble runs', which are chosen to explore the plausible input space using relatively crude experimental design methods. One benefit here is that the ensemble does not need to be a sample from the true distribution of the input space; rather, it should cover that input space in some sense. The probabilistic forecasts are computed using Monte Carlo methods, sampling from the input distribution and using the emulator to produce the output distribution. Finally, we discuss the limitations of this approach and briefly mention how we might use similar methods to learn the model error within a framework that incorporates a data-assimilation-like aspect, using emulators and learning complex model error representations. We suggest future directions for research in the area that will be necessary to apply the method to more realistic numerical weather prediction models.
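A minimal sketch of a Gaussian-process emulator in the spirit of the framework described above, not the authors' implementation: the "simulator" is a cheap stand-in function, and the design points, kernel, and input distribution are illustrative assumptions.

```python
import numpy as np

# Gaussian-process emulation of an expensive simulator, followed by Monte Carlo
# propagation of input uncertainty through the emulator. Everything here is a
# toy: a one-dimensional stand-in simulator and fixed, illustrative kernel
# parameters rather than values inferred by maximum likelihood.

def simulator(x):
    return np.sin(3.0 * x) + 0.5 * x              # stand-in for an expensive model

def rbf(a, b, length=0.5, variance=1.0):
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

# Small experimental design covering the plausible input space.
x_train = np.linspace(-2.0, 2.0, 8)
y_train = simulator(x_train)

jitter = 1e-8                                     # numerical stabilisation
K = rbf(x_train, x_train) + jitter * np.eye(len(x_train))
K_inv = np.linalg.inv(K)

def emulate(x_new):
    """Posterior mean and standard deviation of the emulator at new inputs."""
    k_star = rbf(x_new, x_train)
    mean = k_star @ K_inv @ y_train
    var = rbf(x_new, x_new).diagonal() - np.einsum("ij,jk,ik->i", k_star, K_inv, k_star)
    return mean, np.sqrt(np.maximum(var, 0.0))

# Monte Carlo: sample the input distribution, query the emulator, collect outputs.
rng = np.random.default_rng(21)
inputs = rng.normal(0.0, 0.7, 5000)
mean, std = emulate(inputs)
samples = rng.normal(mean, std)
print(f"forecast distribution: mean {samples.mean():.3f}, std {samples.std():.3f}")
```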
437. Preliminary Spreadsheet of Eruption Source Parameters for Volcanoes of the World

    USGS Publications Warehouse

    Mastin, Larry G.; Guffanti, Marianne; Ewert, John W.; Spiegel, Jessica

    2009-01-01

    Volcanic eruptions that spew tephra into the atmosphere pose a hazard to jet aircraft. For this reason, the International Civil Aviation Organization (ICAO) has designated nine Volcanic Ash Advisory Centers (VAACs) around the world whose purpose is to track ash clouds from eruptions and notify aircraft so that they may avoid them. During eruptions, VAACs and their collaborators run volcanic-ash transport-and-dispersion (VATD) models that forecast the location and movement of ash clouds. These models require as input parameters the plume height H, the mass eruption rate, the duration D, the erupted volume V (in cubic kilometers of bubble-free or 'dense rock equivalent' [DRE] magma), and the mass fraction of erupted tephra with a particle size smaller than 63 µm (m63). Some parameters, such as mass eruption rate and the mass fraction of fine debris, are not obtainable by direct observation; others, such as plume height or duration, are obtainable from observations but may be unavailable in the early hours of an eruption when VATD models are being initiated. For this reason, ash-cloud modelers need to have at their disposal source parameters for a particular volcano that are based on its recent eruptive history and that represent the most likely anticipated eruption. They also need source parameters that encompass the range of uncertainty in eruption size or characteristics. In spring 2007, a workshop was held at the U.S. Geological Survey (USGS) Cascades Volcano Observatory to derive a protocol for assigning eruption source parameters to ash-cloud models during eruptions. The protocol derived from this effort was published by Mastin and others (in press), along with a world map displaying the assigned eruption type for each of the world's volcanoes. Their report, however, did not include the assigned eruption types in tabular form. Therefore, this Open-File Report presents that table in the form of an Excel spreadsheet. These assignments are preliminary and will be modified to follow upcoming recommendations by the volcanological and aviation communities.
438. A Study on Management Standards and Manual of Water Supply System for the Response of Mt. Baekdu Volcanic Eruption in South Korea

    NASA Astrophysics Data System (ADS)

    Lee, G.; Jee, Y.; Kim, J.

    2013-12-01

    Korea has been regarded as a safe area with respect to volcanic disasters; however, countermeasures for a Mt. Baekdu volcanic eruption have been discussed because the perceived possibility of an eruption has heightened and various experimental results indicate a risk of Mt. Baekdu erupting. The purpose of this study is to establish management standards and a response manual for the water supply system through an analysis of the effects of volcanic ash on water supply systems. In this study, similar cases of volcanic ash damage to water supply systems were investigated. The present status of the water supply system and the existing response manual for water supply systems were also reviewed, and problems with using the present response manual were assessed. As a result, the damage that a Mt. Baekdu eruption could cause to the water supply system could be forecast, and the direction of the management standard and response manual has been established. Acknowledgments: This research was supported by a grant [NEMA-BAEKDUSAN-2012-2-2] from the Volcanic Disaster Preparedness Research Center sponsored by the National Emergency Management Agency of Korea.
439. Forecasting Effusive Dynamics and Decompression Rates by Magmastatic Model at Open-vent Volcanoes

    PubMed

    Ripepe, Maurizio; Pistolesi, Marco; Coppola, Diego; Delle Donne, Dario; Genco, Riccardo; Lacanna, Giorgio; Laiolo, Marco; Marchetti, Emanuele; Ulivieri, Giacomo; Valade, Sébastien

    2017-06-20

    Effusive eruptions at open-conduit volcanoes are interpreted as reactions to a disequilibrium induced by an increase in magma supply. By comparing four of the most recent effusive eruptions at Stromboli volcano (Italy), we show how the volumes of lava discharged during each eruption are linearly correlated with the topographic positions of the effusive vents. This correlation cannot be explained by an excess of pressure within a deep magma chamber, and it raises questions about the actual contribution of deep magma dynamics. We derive a general model, based on the discharge of a shallow reservoir and the magmastatic crustal load above the vent, to explain the linear link. In addition, we show how the drastic transition from effusive activity to violent explosions can be related to different decompression rates. We suggest that a gravity-driven model can shed light on similar cases of lateral effusive eruptions in other volcanic systems and can provide evidence of the role of slow decompression rates in triggering the violent paroxysmal explosive eruptions that occasionally punctuate the effusive phases at basaltic volcanoes.

440. Preliminary investigation of the effects of eruption source parameters on volcanic ash transport and dispersion modeling using HYSPLIT

    NASA Astrophysics Data System (ADS)

    Stunder, B.

    2009-12-01

    Atmospheric transport and dispersion (ATD) models are used in real time at Volcanic Ash Advisory Centers to predict the location of airborne volcanic ash at a future time, because of the hazardous nature of volcanic ash. Transport and dispersion models usually do not include eruption column physics, but start instead with an idealized eruption column. Eruption source parameters (ESP) input to the models typically include column top, eruption start time and duration, volcano latitude and longitude, ash particle size distribution, and total mass emission. An example based on the Okmok, Alaska, eruption of July 12-14, 2008, was used to qualitatively estimate the effect of various model inputs on transport and dispersion simulations using the NOAA HYSPLIT model. Variations included changing the ash column top and bottom, the eruption start time and duration, and the particle size specifications, as well as simulations with and without gravitational settling, and the effect of different meteorological model data. Graphical ATD model output of ash concentration from the various runs was qualitatively compared. Some parameters, such as eruption duration and ash column depth, had a large effect, while simulations using only small particles or changing the particle shape factor had much less of an effect. Some other variations, such as using only large particles, had a small effect for the first day or so after the eruption and a larger effect on subsequent days. Example probabilistic output will be shown for an ensemble of dispersion model runs with various model inputs. Model output such as this may be useful as a means of accounting for some of the uncertainties in the model input. To improve volcanic ash ATD models, a reference database for volcanic eruptions is needed, covering many volcanoes. The database should include three major components: (1) eruption source, (2) ash observations, and (3) analysis meteorology. In addition, information on aggregation or other ash particle transformation processes would be useful.
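A minimal sketch of turning an ensemble of dispersion runs into probabilistic output, as mentioned in the record above: the fraction of members exceeding an ash-concentration threshold in each grid cell. The concentration fields and the threshold value are synthetic assumptions, not HYSPLIT output.

```python
import numpy as np

# Exceedance probability from an ensemble of dispersion runs: per grid cell,
# the fraction of members whose ash concentration exceeds a threshold.
# Fields and threshold are illustrative.

rng = np.random.default_rng(17)
n_members, ny, nx = 12, 50, 80
threshold = 2.0e-3                                   # g/m^3, illustrative threshold

# Synthetic member concentration fields (lognormal-ish values).
conc = rng.lognormal(mean=-7.0, sigma=1.5, size=(n_members, ny, nx))

exceedance_prob = (conc > threshold).mean(axis=0)    # probability per grid cell
print("max exceedance probability:", exceedance_prob.max())
print("cells with P > 0.5        :", int((exceedance_prob > 0.5).sum()))
```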
Variations included changing the ash column top and bottom, eruption start time and duration, particle size specifications, simulations with and without gravitational settling, and the effect of different meteorological model data. Graphical ATD model output of ash concentration from the various runs was qualitatively compared. Some parameters, such as eruption duration and ash column depth, had a large effect, while simulations using only small particles or changing the particle shape factor had much less of an effect. Some other variations, such as using only large particles, had a small effect for the first day or so after the eruption, then a larger effect on subsequent days. Example probabilistic output will be shown for an ensemble of dispersion model runs with various model inputs. Model output such as this may be useful as a means to account for some of the uncertainties in the model input. To improve volcanic ash ATD models, a reference database for volcanic eruptions is needed, covering many volcanoes. The database should include three major components: (1) eruption source, (2) ash observations, and (3) meteorological analyses. In addition, information on aggregation or other ash particle transformation processes would be useful.
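One simple way to turn an ensemble of dispersion runs into a probabilistic product, in the spirit of the output described above though not necessarily the product shown in that study, is to compute at each grid cell the fraction of members whose ash concentration exceeds a threshold. A minimal sketch with synthetic member fields follows; the grid and threshold are assumptions for illustration.

```python
# Minimal sketch of a probabilistic product from an ensemble of dispersion runs:
# at each grid cell, the fraction of ensemble members whose simulated ash
# concentration exceeds a threshold. Member fields here are synthetic; in
# practice they would come from runs with perturbed eruption source parameters
# and/or different meteorological driving data.
import numpy as np

rng = np.random.default_rng(0)

n_members, ny, nx = 20, 50, 80
# Synthetic stand-in for gridded ash concentrations (mg/m**3) from 20 runs.
members = rng.lognormal(mean=-1.0, sigma=1.0, size=(n_members, ny, nx))

THRESHOLD = 2.0  # mg/m**3, an assumed concentration of concern

# Probability of exceedance = fraction of members above the threshold per cell.
prob_exceed = (members > THRESHOLD).mean(axis=0)

print("max exceedance probability:", prob_exceed.max())
print("area fraction with P > 0.5:", (prob_exceed > 0.5).mean())
```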
Exploring the full natural variability of eruption sizes within probabilistic hazard assessment of tephra dispersal

NASA Astrophysics Data System (ADS)

Selva, Jacopo; Sandri, Laura; Costa, Antonio; Tonini, Roberto; Folch, Arnau; Macedonio, Giovanni

2014-05-01

The intrinsic uncertainty and variability associated with the size of the next eruption strongly affect short- to long-term tephra hazard assessment. Often, emergency plans are established accounting for the effects of one or a few representative scenarios (meant as specific combinations of eruptive size and vent position) selected with subjective criteria. Probabilistic hazard assessments (PHA), on the other hand, consistently explore the natural variability of such scenarios. PHA for tephra dispersal requires the definition of eruptive scenarios (usually by grouping possible eruption sizes and vent positions into classes) with associated probabilities, a meteorological dataset covering a representative time period, and a tephra dispersal model. PHA results combine simulations for different volcanological and meteorological conditions, each weighted by its specific probability of occurrence. However, volcanological parameters such as erupted mass, eruption column height and duration, bulk granulometry, and the fraction of aggregates typically span wide ranges of values. Because of this variability, single representative scenarios or size classes cannot be adequately defined using single values for the volcanological inputs. Here we propose a method that accounts for this within-size-class variability in the framework of Event Trees. The variability of each parameter is modeled with specific probability density functions, and meteorological and volcanological inputs are chosen using a stratified sampling method. This procedure avoids the bias introduced by selecting single representative scenarios and thus neglecting most of the intrinsic eruptive variability. When considering within-size-class variability, attention must be paid to appropriately weighting events that fall within the same size class. Assigning a uniform weight to all events belonging to a size class is the most straightforward idea, but it implies a strong dependence on the thresholds dividing classes: under this choice, the largest event of a size class receives a much larger weight than the smallest event of the subsequent size class. To overcome this problem, we propose an innovative solution that smoothly links the weight variability within each size class to the variability among size classes through a common power law, while simultaneously respecting the probability of the different size classes conditional on the occurrence of an eruption. Embedding this procedure into the Bayesian Event Tree scheme enables tephra fall PHA, quantified through hazard curves and maps that provide readable results applicable to planning risk mitigation actions, together with the quantification of its epistemic uncertainties. As examples, we analyze long-term tephra fall PHA at Vesuvius and Campi Flegrei. We integrate two tephra dispersal models (the analytical HAZMAP and the numerical FALL3D) into BET_VH. The ECMWF reanalysis dataset is used to explore different meteorological conditions. The results clearly show that PHA accounting for the whole natural variability differs significantly from assessments based on representative scenarios, as is common practice in volcanic hazard analysis.
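As a generic illustration of the within-class sampling idea, not the specific weighting scheme proposed in the study above, the sketch below treats erupted mass inside one size class as a truncated power law, cuts the class into equal-probability strata, and draws one simulation input per stratum so that every sample carries a known weight. The exponent, bounds, and class probability are placeholder assumptions.

```python
# Sketch of stratified sampling of erupted mass within one eruption size class.
# The mass PDF is taken as a truncated power law; the class is split into
# equal-probability strata and one value is drawn per stratum, so each sample
# carries the weight P(class) / n_strata. Exponent, bounds and class
# probability are placeholder assumptions.
import numpy as np

rng = np.random.default_rng(42)

ALPHA = 1.8                 # assumed power-law exponent of the mass PDF
M_MIN, M_MAX = 1e10, 1e11   # kg, assumed bounds of the size class
P_CLASS = 0.25              # assumed probability of this class given an eruption
N_STRATA = 10


def inverse_cdf(u: np.ndarray) -> np.ndarray:
    """Inverse CDF of a power law p(m) ~ m**-ALPHA truncated to [M_MIN, M_MAX]."""
    a, b = M_MIN ** (1.0 - ALPHA), M_MAX ** (1.0 - ALPHA)
    return (a + u * (b - a)) ** (1.0 / (1.0 - ALPHA))


# One uniform draw inside each of the N_STRATA equal-probability bins.
u = (np.arange(N_STRATA) + rng.random(N_STRATA)) / N_STRATA
masses = inverse_cdf(u)
weights = np.full(N_STRATA, P_CLASS / N_STRATA)

for m, w in zip(masses, weights):
    print(f"erupted mass {m:.2e} kg, weight {w:.3f}")
```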
National-level long-term eruption forecasts by expert elicitation

NASA Astrophysics Data System (ADS)

Bebbington, Mark S.; Stirling, Mark W.; Cronin, Shane; Wang, Ting; Jolly, Gill

2018-06-01

Volcanic hazard estimation is becoming increasingly quantitative, creating the potential for land-use decisions and engineering design to use volcanic information in a manner analogous to seismic codes. The initial requirement is to characterize the possible hazard sources, quantifying the likely timing, magnitude and location of the next eruption in each case. This is complicated by the extremely different driving processes at individual volcanoes, and by incomplete and uneven records of past activity. To address these issues, we carried out an expert elicitation to estimate future eruption potential for 12 volcanoes of interest in New Zealand. A total of 28 New Zealand experts provided estimates that were combined using Cooke's classical method to arrive at a hazard estimate. In 11 of the 12 cases, the elicited eruption duration increased with VEI and was correlated with expected repose, differing little between volcanoes. Most of the andesitic volcanoes had very similar elicited distributions for the VEI of a future eruption, except that Taranaki was expected to produce a larger eruption owing to its current long repose. Elicited future vent locations for Tongariro and Okataina strongly reflect the most recent eruptions. In the poorly studied Bay of Islands volcanic field, the estimated vent location distribution was centred on the centroid of the previous vent locations, while in the Auckland field it was focused on regions within the field without past eruptions. The elicited median dates for the next eruptions ranged from AD 2022 (Whakaari/White Island) to AD 4390 (Tuhua/Mayor Island).

ICE CONTROL - Towards optimizing wind energy production during icing events

NASA Astrophysics Data System (ADS)

Dorninger, Manfred; Strauss, Lukas; Serafin, Stefano; Beck, Alexander; Wittmann, Christoph; Weidle, Florian; Meier, Florian; Bourgeois, Saskia; Cattin, René; Burchhart, Thomas; Fink, Martin

2017-04-01

Forecasts of wind power production loss caused by icing weather conditions are produced by a chain of physical models. The model chain consists of a numerical weather prediction model, an icing model and a production loss model. Each element of the model chain is affected by significant uncertainty, which can be quantified using targeted observations and a probabilistic forecasting approach.
In this contribution, we present preliminary results from the recently launched project ICE CONTROL, an Austrian research initiative on measurements, probabilistic forecasting, and verification of icing on wind turbine blades. ICE CONTROL includes an experimental field phase consisting of measurement campaigns in a wind park in Rhineland-Palatinate, Germany, in the winters of 2016/17 and 2017/18. Instruments deployed during the campaigns consist of a conventional icing detector on the turbine hub and newly devised ice sensors (eologix Sensor System) on the turbine blades, as well as meteorological sensors for wind, temperature, humidity, visibility, and precipitation type and spectra. Liquid water content and the spectral characteristics of super-cooled water droplets are measured using a Fog Monitor FM-120. Three cameras document the icing conditions on the instruments and on the blades. Different modelling approaches are used to quantify the components of the model-chain uncertainty. The uncertainty related to the initial conditions of the weather prediction is evaluated using the existing global ensemble prediction system (EPS) of the European Centre for Medium-Range Weather Forecasts (ECMWF). Furthermore, observation system experiments are conducted with the AROME model and its 3D-Var data assimilation to investigate the impact of additional observations (such as Mode-S aircraft data, SCADA data and MSG cloud mask initialization) on the numerical icing forecast. The uncertainty related to model formulation is estimated from multi-physics ensembles based on the Weather Research and Forecasting model (WRF) by perturbing parameters in the physical parameterization schemes. In addition, uncertainties of the icing model and of its adaptation to the rotating turbine blade are addressed. The model forecasts, combined with the suite of instruments and their measurements, make it possible to conduct a step-wise verification of all components of the model chain - a novel aspect compared with similar ongoing and completed forecasting projects.

The impact of the characteristics of volcanic ash on forecasting.

NASA Astrophysics Data System (ADS)

Beckett, Frances; Hort, Matthew; Millington, Sarah; Stevenson, John; Witham, Claire

2013-04-01

The eruptions of Eyjafjallajökull during April-May 2010 and Grímsvötn in May 2011, Iceland, caused widespread dispersion of volcanic ash across the NE Atlantic and ultimately into UK and European airspace. As a result, thousands of flights to and from affected countries across Europe were cancelled. The Met Office, UK, is the home of the London VAAC, a Volcanic Ash Advisory Centre, and as such is responsible for providing reports and forecasts of the movement of volcanic ash clouds covering the UK, Iceland and the north-eastern part of the North Atlantic Ocean. Forecasting the dispersion of volcanic ash requires that the sedimentation of ash particles through the atmosphere be modelled effectively. The settling velocity of an ash particle is a function of its size, shape and density, plus the density and viscosity of the air through which it is falling.
We consider the importance of characterising the physical properties of ash when modelling the long-range dispersion of ash particles through the atmosphere. Using the Reynolds-number-dependent scheme employed by NAME, the Lagrangian particle model used operationally by the Met Office, we calculate the settling velocity, and thus the maximum travel distance, of an ash particle through an idealised atmosphere as a function of its size, shape and density. The results are compared to measured particle sizes from deposits across Europe following the eruption of Eyjafjallajökull in 2010. Further, the evolution of the particle size distribution (PSD) of ash in a volcanic cloud with time is modelled using NAME: the particle density distribution and particle shape factor are varied, and the modelled PSD is compared to the PSD measured in the ash cloud during the 2010 eruption of Eyjafjallajökull by the FAAM research aircraft. The influence of the weather on the PSD is also considered by comparing model output using an idealised atmosphere to output using NWP-driven meteorological fields. We discuss the sensitivity of forecasts of volcanic ash dispersion to the representation of particle characteristics in NAME, the importance of representing the weather in ash fall models, and the implications of these results for the operational forecasting of volcanic ash dispersion at the London VAAC.
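For readers unfamiliar with Reynolds-number-dependent settling, the sketch below shows a generic fixed-point iteration for the terminal fall speed of a spherical particle using the Schiller-Naumann drag correction. It is not the scheme used operationally in NAME; the air properties, particle density, and drag formulation are assumptions for illustration.

```python
# Generic sketch of a Reynolds-number-dependent terminal settling velocity for
# a spherical ash particle (not the scheme used operationally in NAME). Drag
# follows the Schiller-Naumann correction for Re < ~1000 and a constant
# Cd = 0.44 above that; air density and viscosity are assumed constants.
G = 9.81          # m/s**2
RHO_AIR = 1.0     # kg/m**3, assumed
MU_AIR = 1.8e-5   # Pa s, assumed dynamic viscosity


def drag_coefficient(re: float) -> float:
    re = max(re, 1e-6)
    if re < 1000.0:
        return 24.0 / re * (1.0 + 0.15 * re ** 0.687)  # Schiller-Naumann
    return 0.44


def settling_velocity(diameter_m: float, rho_particle: float = 2300.0) -> float:
    """Terminal fall speed (m/s) from a damped fixed-point iteration on Cd(Re)."""
    v = 0.01  # initial guess, m/s
    for _ in range(200):
        re = RHO_AIR * v * diameter_m / MU_AIR
        cd = drag_coefficient(re)
        v_new = (4.0 * G * diameter_m * (rho_particle - RHO_AIR)
                 / (3.0 * cd * RHO_AIR)) ** 0.5
        if abs(v_new - v) < 1e-8:
            break
        v = 0.5 * (v + v_new)  # damped update for stability
    return v


if __name__ == "__main__":
    for d_um in (10, 63, 125, 500):
        print(f"{d_um:4d} um -> {settling_velocity(d_um * 1e-6):.3f} m/s")
```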
Quantifying volcanic hazard at Campi Flegrei caldera (Italy) with uncertainty assessment: 2. Pyroclastic density current invasion maps

NASA Astrophysics Data System (ADS)

Neri, Augusto; Bevilacqua, Andrea; Esposti Ongaro, Tomaso; Isaia, Roberto; Aspinall, Willy P.; Bisson, Marina; Flandoli, Franco; Baxter, Peter J.; Bertagnini, Antonella; Iannuzzi, Enrico; Orsucci, Simone; Pistolesi, Marco; Rosi, Mauro; Vitale, Stefano

2015-04-01

Campi Flegrei (CF) is an example of an active caldera containing densely populated settlements at very high risk from pyroclastic density currents (PDCs). We present here an innovative method for assessing background spatial PDC hazard in a caldera setting with probabilistic invasion maps conditional on the occurrence of an explosive event. The method encompasses the probabilistic assessment of potential vent opening positions, derived in the companion paper, combined with inferences about the spatial density distribution of PDC invasion areas from a simplified flow model, informed by reconstruction of deposits from eruptions in the last 15 ka. The flow model describes the PDC kinematics and accounts for the main effects of topography on flow propagation. Structured expert elicitation is used to incorporate certain sources of epistemic uncertainty, and a Monte Carlo approach is adopted to produce a set of probabilistic hazard maps for the whole CF area. Our findings show that, in case of an eruption, almost the entire caldera is exposed to invasion with a mean probability of at least 5%, with peaks greater than 50% in some central areas. Some areas outside the caldera are also exposed to this danger, with mean probabilities of invasion of the order of 5-10%. Our analysis suggests that these probability estimates have location-specific uncertainties which can be substantial. The results prove to be robust with respect to alternative elicitation models and allow the influence on hazard mapping of different sources of uncertainty, and of theoretical and numerical assumptions, to be quantified.

A strategy for the observation of volcanism on Earth from space.

PubMed

Wadge, G.

2003-01-15

Heat, strain, topography and atmospheric emissions associated with volcanism are well observed by satellites orbiting the Earth. Gravity and electromagnetic transients from volcanoes may also prove to be measurable from space. The nature of eruptions means that the best strategy for measuring their dynamic properties remotely from space is to employ two modes with different spatial and temporal samplings: eruption mode and background mode. Such observational programmes are best carried out at local or regional volcano observatories by coupling them with numerical models of volcanic processes. Eventually, such models could become multi-process, operational forecast models that assimilate the remote and other observables to constrain their uncertainties. The threat posed by very large magnitude explosive eruptions is global and best addressed by a spaceborne observational programme with a global remit.

Exploring the interactions between forecast accuracy, risk perception and perceived forecast reliability in reservoir operator's decision to use forecast

NASA Astrophysics Data System (ADS)

Shafiee-Jood, M.; Cai, X.

2017-12-01

Advances in streamflow forecasts at different time scales offer promise for proactive flood management and improved risk management. Despite this potential, previous studies have found that water resources managers are often unwilling to incorporate streamflow forecast information into decision making, particularly in risky situations. While the low accuracy of forecast information is often cited as the main reason, some studies have found that the implementation of streamflow forecasts is sometimes impeded by institutional obstacles and behavioral factors (e.g., risk perception). In fact, a seminal study by O'Connor et al. (2005) found that risk perception is the strongest determinant of forecast use, while managers' perception of forecast reliability is not significant. In this study, we aim to address this issue again; however, instead of using survey data and regression analysis, we develop a theoretical framework to assess the user-perceived value of streamflow forecasts. The framework includes a novel behavioral component that incorporates both risk perception and perceived forecast reliability.
The framework is then applied to a hypothetical problem in which a reservoir operator must react to probabilistic flood forecasts of different reliabilities. The framework allows us to explore the interactions between risk perception and perceived forecast reliability, and between the behavioral components and information accuracy. The findings will provide insights for improving the usability of flood forecast information through better communication and education.

Action-based flood forecasting for triggering humanitarian action

NASA Astrophysics Data System (ADS)

Coughlan de Perez, Erin; van den Hurk, Bart; van Aalst, Maarten K.; Amuron, Irene; Bamanya, Deus; Hauser, Tristan; Jongma, Brenden; Lopez, Ana; Mason, Simon; Mendler de Suarez, Janot; Pappenberger, Florian; Rueth, Alexandra; Stephens, Elisabeth; Suarez, Pablo; Wagemaker, Jurjen; Zsoter, Ervin

2016-09-01

Too often, credible scientific early warning information of increased disaster risk does not result in humanitarian action. With financial resources tilted heavily towards response after a disaster, disaster managers have limited incentive and ability to process complex scientific data, including uncertainties. These incentives are beginning to change with the advent of several new forecast-based financing systems that provide funding based on a forecast of an extreme event. Given the changing landscape, here we demonstrate a method to select and use appropriate forecasts for specific humanitarian disaster prevention actions, even in a data-scarce location. This action-based forecasting methodology takes into account the parameters of each action, such as action lifetime, when verifying a forecast. Forecasts are linked with actions based on an understanding of (1) the magnitude of previous flooding events and (2) the willingness to act "in vain" for specific actions. This is applied in the context of the Uganda Red Cross Society forecast-based financing pilot project, with forecasts from the Global Flood Awareness System (GloFAS). Using this method, we define the "danger level" of flooding, and we select the probabilistic forecast triggers that are appropriate for specific actions. Results from this methodology can be applied globally across hazards and fed into a financing system that ensures that automatic, pre-funded early action will be triggered by forecasts.
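The "willingness to act in vain" described above maps onto the classic cost-loss argument: acting is worthwhile whenever the forecast probability of exceeding the danger level is at least the ratio of the action's cost to the loss it avoids. The sketch below illustrates that reasoning; the numbers are invented and are not parameters from the Uganda pilot.

```python
# Sketch of a cost-loss trigger for forecast-based action: act when the
# forecast probability of exceeding the danger level is at least C/L, the
# ratio of the cost of acting to the loss the action avoids. The numbers are
# illustrative only, not parameters from the pilot project described above.

def probability_trigger(cost: float, loss: float) -> float:
    """Probability above which acting has lower expected cost than not acting."""
    return cost / loss


def should_act(p_exceed_danger: float, cost: float, loss: float) -> bool:
    return p_exceed_danger >= probability_trigger(cost, loss)


if __name__ == "__main__":
    cost, loss = 10_000.0, 80_000.0      # assumed: action cost vs. avoidable loss
    trigger = probability_trigger(cost, loss)
    print(f"act whenever P(flood > danger level) >= {trigger:.2f}")
    for p in (0.05, 0.15, 0.40):
        print(f"  forecast P = {p:.2f} -> act: {should_act(p, cost, loss)}")
```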
Performance and robustness of probabilistic river forecasts computed with quantile regression based on multiple independent variables in the North Central USA

NASA Astrophysics Data System (ADS)

Hoss, F.; Fischbeck, P. S.

2014-10-01

This study further develops the method of quantile regression (QR) to predict exceedance probabilities of flood stages by post-processing forecasts. Using data from the 82 river gages for which the National Weather Service's North Central River Forecast Center issues forecasts daily, this is the first QR application to US river gages. Archived forecasts for lead times up to six days from 2001-2013 were analyzed. Earlier implementations of QR used the forecast itself as the only independent variable (Weerts et al., 2011; López López et al., 2014). This study adds the rise rate of the river stage in the last 24 and 48 h and the forecast error 24 and 48 h ago to the QR model. Including those four variables significantly improved the forecasts, as measured by the Brier Skill Score (BSS). Mainly, the resolution increases, as the original QR implementation already delivered high reliability. Combining the forecast with the other four variables results in much less favorable BSSs. Lastly, the forecast performance does not depend on the size of the training dataset, but on the year, the river gage, the lead time and the event threshold being forecast. We find that each event threshold requires a separate model configuration, or at least a separate calibration.
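A hedged sketch of quantile-regression post-processing in the spirit described above: the predictors are the raw forecast, the 24 h and 48 h rise rates, and the forecast errors 24 h and 48 h earlier, and one model is fit per quantile. The data are synthetic and the column names and quantiles are illustrative choices, not the study's configuration.

```python
# Hedged sketch of quantile-regression post-processing of stage forecasts.
# Predictors mirror those described above; data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500

df = pd.DataFrame({
    "forecast":    rng.normal(5.0, 1.5, n),   # ft, raw stage forecast
    "rise_24h":    rng.normal(0.0, 0.4, n),
    "rise_48h":    rng.normal(0.0, 0.6, n),
    "err_24h_ago": rng.normal(0.0, 0.3, n),
    "err_48h_ago": rng.normal(0.0, 0.3, n),
})
# Synthetic "observed" stage: forecast plus an error that depends on the predictors.
df["observed"] = (df["forecast"] + 0.5 * df["rise_24h"] + 0.3 * df["err_24h_ago"]
                  + rng.normal(0.0, 0.5, n))

X = sm.add_constant(df[["forecast", "rise_24h", "rise_48h",
                        "err_24h_ago", "err_48h_ago"]])
model = sm.QuantReg(df["observed"], X)

# One model per quantile; together the fitted quantiles trace out an
# exceedance-probability curve for the future stage.
for q in (0.1, 0.5, 0.9):
    res = model.fit(q=q)
    print(f"q = {q:.1f}  coefficients:", np.round(res.params.values, 3))
```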
Emergency preparedness: community-based short-term eruption forecasting at Campi Flegrei

NASA Astrophysics Data System (ADS)

Selva, Jacopo; Marzocchi, Warner; Civetta, Lucia; Del Pezzo, Edoardo; Papale, Paolo

2010-05-01

A key element in emergency preparedness is to define, in advance, tools that assist decision makers and emergency management groups during crises. Such tools must be prepared ahead of time, accounting for all of the expertise and scientific knowledge accumulated through time. During a pre-eruptive phase, the key to sound short-term eruption forecasting is the analysis of monitoring signals. This requires the capability (i) to recognize anomalous signals and to relate single or combined anomalies to physical processes, assigning them probability values, and (ii) to quickly provide an answer to observed phenomena even when they are unexpected. Here we present a more than four-year-long process devoted to defining the pre-eruptive Event Tree (ET) for Campi Flegrei. A community of about 40 experts in volcanology and volcano monitoring, participating in two Italian projects on Campi Flegrei funded by the Italian Civil Protection, was constituted and trained during periodic meetings on the statistical methods and on the BET_EF model (Marzocchi et al., 2008) that forms the statistical package for ET definition. Model calibration was carried out through public elicitation sessions, preceded and followed by dedicated meetings and web forum discussions on the monitoring parameters, their accuracy and relevance, and their potential meanings. The calibrated ET allows anomalies in the monitored parameters to be recognized and interpreted, assigning probability values to each set of data. This process de-personalizes the difficult task of interpreting multi-parametric sets of data during ongoing emergencies, and provides a view of the observed variations that reflects the averaged, weighted opinion of the scientific community. An additional positive outcome of the ET calibration process is that it provides a picture of the expert community's degree of confidence in the capability of the many different monitored quantities to reveal significant variations in the state of the volcano. This picture is particularly useful because it can be used to guide future improvements to the monitoring network, as well as research investments aimed at substantially improving the capability to forecast short-term volcanic hazard.

Stochastic and Perturbed Parameter Representations of Model Uncertainty in Convection Parameterization

NASA Astrophysics Data System (ADS)
Christensen, H. M.; Moroz, I.; Palmer, T.

2015-12-01

It is now acknowledged that representing model uncertainty in atmospheric simulators is essential for the production of reliable probabilistic ensemble forecasts, and a number of different techniques have been proposed for this purpose. Stochastic convection parameterization schemes use random numbers to represent the difference between a deterministic parameterization scheme and the true atmosphere, accounting for the unresolved sub-grid-scale variability associated with convective clouds. An alternative approach varies the values of poorly constrained physical parameters in the model to represent the uncertainty in these parameters. This study presents new perturbed parameter schemes for use in the European Centre for Medium-Range Weather Forecasts (ECMWF) convection scheme. Two types of scheme are developed and implemented. Both represent the joint uncertainty in four of the parameters of the convection parametrisation scheme, which was estimated using the Ensemble Prediction and Parameter Estimation System (EPPES). The first is a fixed perturbed parameter scheme, in which the values of the uncertain parameters are changed between ensemble members but held constant over the duration of the forecast. The second is a stochastically varying perturbed parameter scheme. The performance of these schemes was compared to the ECMWF operational stochastic scheme, Stochastically Perturbed Parametrisation Tendencies (SPPT), and to a model that does not represent uncertainty in convection. The skill of probabilistic forecasts made using the different models was evaluated. While the perturbed parameter schemes improve on the stochastic parametrisation in some regards, the SPPT scheme outperforms the perturbed parameter approaches for forecast variables that are particularly sensitive to convection. Overall, SPPT schemes are the most skilful representations of model uncertainty due to convection parametrisation. Reference: H. M. Christensen, I. M. Moroz, and T. N. Palmer, 2015: Stochastic and Perturbed Parameter Representations of Model Uncertainty in Convection Parameterization. J. Atmos. Sci., 72, 2525-2544.
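A toy illustration of the two perturbed-parameter strategies compared above: each ensemble member either freezes a perturbed parameter for the whole forecast or lets it wander as an AR(1) process about its default value. The parameter name, its spread, and the AR(1) settings below are invented for illustration and are not the ECMWF implementation.

```python
# Toy illustration of (a) fixed perturbed parameters, drawn once per ensemble
# member and held for the whole forecast, versus (b) stochastically varying
# parameters that follow an AR(1) process about the default value. The
# parameter, its spread, and the AR(1) settings are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)

N_MEMBERS, N_STEPS = 5, 48
DEFAULT = 1.0e-3       # assumed default parameter value
SPREAD = 0.3           # fractional 1-sigma uncertainty
PHI = 0.95             # AR(1) autocorrelation per time step

# (a) Fixed perturbed parameters: one multiplicative factor per member.
fixed = DEFAULT * (1.0 + SPREAD * rng.standard_normal(N_MEMBERS))

# (b) Stochastically varying parameters: AR(1) in the multiplicative factor.
stochastic = np.empty((N_MEMBERS, N_STEPS))
noise_sd = SPREAD * np.sqrt(1.0 - PHI ** 2)   # keeps the stationary spread = SPREAD
factor = SPREAD * rng.standard_normal(N_MEMBERS)
for t in range(N_STEPS):
    factor = PHI * factor + noise_sd * rng.standard_normal(N_MEMBERS)
    stochastic[:, t] = DEFAULT * (1.0 + factor)

print("fixed values per member:     ", np.round(fixed, 6))
print("stochastic spread at last step:", np.round(stochastic[:, -1].std(), 6))
```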
Probabilistic accounting of uncertainty in forecasts of species distributions under climate change

USGS Publications Warehouse

Wenger, Seth J.; Som, Nicholas A.; Dauwalter, Daniel C.; Isaak, Daniel J.; Neville, Helen M.; Luce, Charles H.; Dunham, Jason B.; Young, Michael K.; Fausch, Kurt D.; Rieman, Bruce E.

2013-01-01

Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing models (model uncertainty), and uncertainty in future climate conditions (climate uncertainty) to produce site-specific frequency distributions of occurrence probabilities across a species' range. We illustrated the method by forecasting suitable habitat for bull trout (Salvelinus confluentus) in the Interior Columbia River Basin, USA, under recent and projected 2040s and 2080s climate conditions. The 95% interval of total suitable habitat under recent conditions was estimated at 30.1-42.5 thousand km; this was predicted to decline to 0.5-7.9 thousand km by the 2080s. Projections for the 2080s showed that the great majority of stream segments would be unsuitable with high certainty, regardless of the climate data set or bull trout model employed. The largest contributor to uncertainty in total suitable habitat was climate uncertainty, followed by parameter uncertainty and model uncertainty. Our approach makes it possible to calculate a full distribution of possible outcomes for a species, and permits ready graphical display of uncertainty for individual locations and of total habitat.
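A hedged sketch of the Monte Carlo idea described above: each draw samples a competing model, a coefficient vector from that model's uncertainty, and a climate scenario, then sums the predicted suitable habitat. The two "models", their coefficients, and the warming scenarios are synthetic placeholders, not the bull trout models from the study.

```python
# Hedged sketch: propagate parameter, model, and climate uncertainty into a
# distribution of total suitable habitat. All models and data are synthetic.
import numpy as np

rng = np.random.default_rng(3)

n_sites, n_draws = 1000, 2000
site_length_km = rng.uniform(0.5, 3.0, n_sites)       # stream length per site
temperature = rng.normal(10.0, 2.0, n_sites)          # recent mean temperature, deg C

# Two competing logistic habitat models: (intercept, temperature slope) and covariance.
models = [
    {"beta": np.array([6.0, -0.6]),  "cov": np.diag([0.4, 0.010]), "weight": 0.6},
    {"beta": np.array([4.5, -0.45]), "cov": np.diag([0.5, 0.012]), "weight": 0.4},
]
climate_warming = [1.5, 2.5, 4.0]                     # candidate future warming, deg C

totals = np.empty(n_draws)
for i in range(n_draws):
    m = models[rng.choice(len(models), p=[mm["weight"] for mm in models])]
    beta = rng.multivariate_normal(m["beta"], m["cov"])      # parameter uncertainty
    future_temp = temperature + rng.choice(climate_warming)  # climate uncertainty
    p_occ = 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * future_temp)))
    totals[i] = np.sum(p_occ * site_length_km)               # expected suitable km

lo, hi = np.percentile(totals, [2.5, 97.5])
print(f"95% interval of total suitable habitat: {lo:.0f}-{hi:.0f} km")
```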
From the Slab to the Surface: Origin, Storage, Ascent, and Eruption of Volatile-Bearing Magmas in the Aleutian arc

NASA Astrophysics Data System (ADS)

Roman, D.; Plank, T. A.; Hauri, E. H.; Rasmussen, D. J.; Power, J. A.; Lyons, J. J.; Haney, M. M.; Werner, C. A.; Kern, C.; Lopez, T. M.; Izbekov, P. E.; Stelling, P. L.

2016-12-01

We present initial results from an integrated geochemical-geophysical study of the Unimak-Cleveland corridor of the Aleutian volcanic arc, which encompasses six volcanoes spanning 450 km of the arc that have erupted in the past 25 years with a wide range of magmatic water contents. This relatively small corridor also exhibits a range of deep and upper-crustal seismicity, apparent magma storage depths, and depths to the subducting tectonic plate. The ultimate goal of this study is to link two normally disconnected big-picture problems: (1) the deep origin of magmas and volatiles, and (2) the formation and eruption of crustal magma reservoirs, which we will do by establishing the depth(s) of crustal magma reservoirs and pre-eruptive volatile contents throughout the corridor. Our preliminary work focuses on the geographic end members: Shishaldin Volcano, which last erupted in 2014-2015, and Cleveland Volcano, which last erupted in April-May 2016. Both systems are persistently degassing, open-vent volcanoes whose frequent eruptions are typically characterized by minimal precursory seismicity, making eruption forecasting challenging. At Cleveland, we analyze data from a 12-station broadband seismic network deployed from August 2015 to July 2016, complemented by two permanent seismo-acoustic stations operated by the Alaska Volcano Observatory (AVO). We also analyze tephras from recent eruptions (including 2016) and conducted ground- and helicopter-based gas emission surveys. At Shishaldin, we analyze data from the permanent AVO network, which is composed mainly of short-period, single-component seismic stations. We also present preliminary analyses of samples of recent eruptive deposits and gas emission data. Through integration of these various datasets, we present preliminary interpretations related to the origin, storage, ascent and eruption of volatile-bearing magmas at Cleveland and Shishaldin volcanoes.

Divergent responses of tropical cyclone genesis factors to strong volcanic eruptions at different latitudes

NASA Astrophysics Data System (ADS)

Yan, Qing; Zhang, Zhongshi; Wang, Huijun

2018-03-01

To understand the behavior of tropical cyclones (TCs), it is important to explore how TCs respond to anthropogenic greenhouse gases and natural forcings. Volcanic eruptions are a major natural forcing mechanism because they inject sulphate aerosols into the stratosphere, which modulate the global climate by absorbing and scattering solar radiation. The number of Atlantic hurricanes is thought to be reduced following strong tropical eruptions, but whether the response of TCs varies with the locations of the volcanoes and across different ocean basins remains unknown. Here, we use the Community Earth System Model Last Millennium Ensemble to investigate the response of the large-scale environmental factors that spawn TCs to strong volcanic eruptions at different latitudes. A composite analysis indicates that tropical and northern hemisphere volcanic eruptions lead to significantly unfavorable conditions for TC genesis over the whole Pacific basin and the North Atlantic during the 3 years post-eruption, relative to the preceding 3 years. Southern hemisphere volcanic eruptions result in clearly unfavorable conditions for TC formation over the southwestern Pacific, but more favorable conditions over the North Atlantic. The mean response over the Indian Ocean is generally muted and insignificant. It should be noted that volcanic eruptions affect environmental conditions through both a direct effect (radiative forcing) and an indirect effect (on the El Niño-Southern Oscillation), which are not differentiated in this study. In addition, the spread of the TC genesis response is considerably large for each category of eruptions over each ocean basin, which is also seen in observational/proxy-based records.
This large spread is attributed to differences in stratospheric aerosol distributions, initial states and eruption intensities, and it makes the short-term forecast of TC activity following the next large eruption challenging.

Volcanic hazard maps of the Nevado del Ruiz volcano, Colombia

NASA Astrophysics Data System (ADS)

Parra, Eduardo; Cepeda, Hector

1990-07-01

Although the potential hazards associated with an eruption of Nevado del Ruiz volcano were known to civil authorities before the catastrophic eruption there in November 1985, their low perception of risk and the long quiescent period since the last eruption (140 years) caused them to wait for stronger activity before developing an eruption alert system. Unfortunately, the eruption occurred suddenly after a period of relative quiet, and as a result more than 25,000 people were killed. Although it was accurate and reasonably comprehensive, the hazard map that existed before the eruption was poorly understood by the authorities and even less so by the general population, because the scientific terminology and probabilistic approach to natural hazards were unfamiliar to many of them. This confusion was shared by the communication media, which at critical times placed undue emphasis on the possibility of lava flows rather than on the more imminent threat from mudflows, in keeping with the popular but often inaccurate perception of volcanic eruptions. This work presents an updated hazard map of Nevado del Ruiz that combines information on various hazardous phenomena with their relative probability of occurrence in order to depict numerical "hazard levels" that are easily comprehensible to nonspecialists and therefore less susceptible to misinterpretation. The scale of relative risk is arbitrary, ranging from five to one, and is intended to provide an intuitive indication of danger to people, property and crops. The map is meant to facilitate emergency preparedness and management by political and civil authorities, to educate the public concerning volcanic hazards and to assist in land-use planning decisions.

Can we use Earth Observations to improve monthly water level forecasts?

NASA Astrophysics Data System (ADS)

Slater, L. J.; Villarini, G.

2017-12-01

Dynamical-statistical hydrologic forecasting approaches benefit from different strengths in comparison with traditional hydrologic forecasting systems: they are computationally efficient, can integrate and 'learn' from a broad selection of input data (e.g., General Circulation Model (GCM) forecasts, Earth Observation time series, teleconnection patterns), and can take advantage of recent progress in machine learning (e.g., multi-model blending, post-processing and ensembling techniques).
Recent efforts to develop a dynamical-statistical ensemble approach for forecasting seasonal streamflow using both GCM forecasts and changing land cover have shown promising results over the U.S. Midwest. Here, we use climate forecasts from several GCMs of the North American Multi-Model Ensemble (NMME) alongside 15-minute stage time series from the National River Flow Archive (NRFA) and land cover classes extracted from the European Space Agency's Climate Change Initiative 300 m annual Global Land Cover time series. With these data, we conduct systematic long-range probabilistic forecasting of monthly water levels in UK catchments over timescales ranging from one to twelve months ahead. We evaluate the improvement in model fit and forecasting skill that comes from using land cover classes as predictors in the models. This work opens up new possibilities for combining Earth Observation time series with GCM forecasts to predict a variety of hazards from space using data science techniques.

Towards the Olympic Games: Guanabara Bay Forecasting System and its Application on the Floating Debris Cleaning Actions.

NASA Astrophysics Data System (ADS)

Pimentel, F. P.; Marques Da Cruz, L.; Cabral, M. M.; Miranda, T. C.; Garção, H. F.; Oliveira, A. L. S. C.; Carvalho, G. V.; Soares, F.; São Tiago, P. M.; Barmak, R. B.; Rinaldi, F.; dos Santos, F. A.; Da Rocha Fragoso, M.; Pellegrini, J. C.

2016-02-01

Marine debris is a widespread pollution issue that affects almost all water bodies and is particularly relevant in estuaries and bays. Rio de Janeiro will host the 2016 Olympic Games, and Guanabara Bay will be the venue for the sailing competitions. Having historically served as a deposit for all types of waste, this water body suffers from major environmental problems, one of them being the massive presence of floating garbage. It is therefore of great importance to count on effective contingency actions to address this issue. To that end, an operational ocean forecasting system was designed and is presently being used by the Rio de Janeiro State Government to manage and control the cleaning actions on the bay. The forecasting system makes use of high-resolution hydrodynamic and atmospheric models and a Lagrangian particle transport model in order to provide probabilistic forecast maps of the areas where debris is most likely to accumulate. All results are displayed on an interactive GIS web platform along with the tracks of the boats that collect the garbage, so that decision makers can easily direct the operations, enhancing their efficiency. The integration of in situ data and advanced techniques such as Lyapunov exponent analysis is also being developed in the system to increase its forecast reliability. Additionally, the system gathers and compiles in its database all information on debris collection, including quantity, type, locations, accumulation areas and their correlation with the environmental factors that drive the runoff and surface drift.
Combining probabilistic, deterministic and statistical approaches, the Guanabara Bay forecasting system has proven to be a powerful tool for environmental management and will be of great importance in helping to secure the safety and fairness of the Olympic sailing competitions. The system design, its components and main results are presented in this paper.
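A toy sketch of the Lagrangian particle idea behind a floating-debris forecast of the kind described above: particles are advected by a surface current plus a wind-drift factor and spread by a random walk, and the accumulation map is the resulting particle density. The velocity fields, drift factor, and diffusivity below are invented and bear no relation to the operational Guanabara Bay system.

```python
# Toy Lagrangian particle drift: advection by current + wind drift + random walk.
# All fields and constants are placeholders for illustration only.
import numpy as np

rng = np.random.default_rng(11)

N_PART, N_STEPS, DT = 5000, 96, 900.0        # particles, steps, seconds per step
WIND_DRIFT = 0.03                            # fraction of wind speed, assumed
DIFFUSIVITY = 5.0                            # m**2/s, assumed horizontal diffusivity

x = rng.uniform(0.0, 2000.0, N_PART)         # initial positions (m)
y = rng.uniform(0.0, 2000.0, N_PART)


def current(xp, yp, t):
    """Placeholder surface current (m/s); a real system interpolates a model field."""
    u = 0.15 * np.ones_like(xp)
    v = 0.05 * np.sin(2 * np.pi * t / 43200.0) * np.ones_like(yp)
    return u, v


def wind(t):
    return 4.0, -2.0                          # m/s, placeholder wind


for step in range(N_STEPS):
    t = step * DT
    u, v = current(x, y, t)
    wu, wv = wind(t)
    sigma = np.sqrt(2.0 * DIFFUSIVITY * DT)   # random-walk step size
    x += (u + WIND_DRIFT * wu) * DT + sigma * rng.standard_normal(N_PART)
    y += (v + WIND_DRIFT * wv) * DT + sigma * rng.standard_normal(N_PART)

# Particle density on a coarse grid approximates where debris is likely to accumulate.
hist, _, _ = np.histogram2d(x, y, bins=20)
print("cell with highest expected debris density:", np.unravel_index(hist.argmax(), hist.shape))
```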
2018 one-year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes

USGS Publications Warehouse

Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Rukstales, Kenneth S.; McNamara, Daniel E.; Williams, Robert A.; Shumway, Allison; Powers, Peter; Earle, Paul; Llenos, Andrea L.; Michael, Andrew J.; Rubinstein, Justin L.; Norbeck, Jack; Cochran, Elizabeth S.

2018-01-01

This article describes the U.S. Geological Survey (USGS) 2018 one-year probabilistic seismic hazard forecast for the central and eastern United States from induced and natural earthquakes. For consistency, the updated 2018 forecast is developed using the same probabilistic seismicity-based methodology as applied in the two previous forecasts. Rates of M≥3.0 earthquakes across the United States grew rapidly between 2008 and 2015 but have steadily declined over the past 3 years, especially in areas of Oklahoma and southern Kansas where fluid injection has decreased. The seismicity pattern in 2017 was complex, with earthquakes more spatially dispersed than in previous years. Some areas of west-central Oklahoma experienced increased activity rates where industrial activity increased. Earthquake rates in Oklahoma (429 earthquakes of M≥3 and 4 of M≥4), the Raton basin (Colorado/New Mexico border, six earthquakes of M≥3), and the New Madrid seismic zone (11 earthquakes of M≥3) continue to be higher than historical levels. Almost all of these earthquakes occurred within the highest-hazard regions of the 2017 forecast. Even though rates declined over the past 3 years, the short-term hazard for damaging ground shaking across much of Oklahoma remains high because of continuing high rates of smaller earthquakes that are still hundreds of times higher than at any time in the state's history. Fine details and variability between the 2016-2018 forecasts are obscured by significant uncertainties in the input model. These short-term hazard levels are similar to those of active regions in California. During 2017, M≥3 earthquakes also occurred in or near Ohio, West Virginia, Missouri, Kentucky, Tennessee, Arkansas, Illinois, Oklahoma, Kansas, Colorado, New Mexico, Utah, and Wyoming.

Evaluations of Extended-Range Tropical Cyclone Forecasts in the Western North Pacific by using the Ensemble Reforecasts: Preliminary Results

NASA Astrophysics Data System (ADS)

Tsai, Hsiao-Chung; Chen, Pang-Cheng; Elsberry, Russell L.

2017-04-01

The objective of this study is to evaluate the predictability of extended-range forecasts of tropical cyclones (TCs) in the western North Pacific using reforecasts from the National Centers for Environmental Prediction (NCEP) Global Ensemble Forecast System (GEFS) during 1996-2015 and from the Climate Forecast System (CFS) during 1999-2010. Tsai and Elsberry have demonstrated that an opportunity exists to support hydrological operations by using extended-range TC formation and track forecasts in the western North Pacific from the ECMWF 32-day ensemble. To demonstrate this potential for decision-making processes regarding water resource management and hydrological operations in Taiwan reservoir watershed areas, special attention is given to the skill of the NCEP GEFS and CFS models in predicting TCs affecting the Taiwan area. The first objective of this study is to analyze the skill of NCEP GEFS and CFS TC forecasts and to quantify the forecast uncertainties via verification of categorical binary forecasts and probabilistic forecasts. The second objective is to investigate the relationships among large-scale environmental factors [e.g., El Niño-Southern Oscillation (ENSO), Madden-Julian Oscillation (MJO), etc.] and the model forecast errors by using the reforecasts.
Preliminary results indicate that the skill of TC activity forecasts based on the raw forecasts can be further improved if the model biases are minimized by utilizing these reforecasts.
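Verification of probabilistic forecasts of the kind evaluated above is often summarized with the Brier score and its skill score relative to climatology. The small sketch below uses the standard definitions on synthetic forecast-outcome pairs; the data and the occurrence rate are invented.

```python
# Brier score and Brier skill score for probabilistic event forecasts
# (e.g., "a TC affects the area of interest in this window": yes/no).
# Forecasts and outcomes here are synthetic.
import numpy as np


def brier_score(prob: np.ndarray, outcome: np.ndarray) -> float:
    """Mean squared difference between forecast probability and the 0/1 outcome."""
    return float(np.mean((prob - outcome) ** 2))


def brier_skill_score(prob: np.ndarray, outcome: np.ndarray) -> float:
    """BSS > 0 means the forecast beats a constant climatological probability."""
    climatology = np.full_like(prob, outcome.mean())
    return 1.0 - brier_score(prob, outcome) / brier_score(climatology, outcome)


if __name__ == "__main__":
    rng = np.random.default_rng(5)
    outcome = (rng.random(500) < 0.2).astype(float)   # synthetic event occurrences
    forecast = np.clip(0.2 + 0.5 * (outcome - 0.2) + rng.normal(0.0, 0.1, 500), 0.0, 1.0)
    print("Brier score:", round(brier_score(forecast, outcome), 4))
    print("Brier skill score vs climatology:", round(brier_skill_score(forecast, outcome), 3))
```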
Wind power forecasting: IEA Wind Task 36 & future research issues

DOE PAGES

Giebel, G.; Cline, J.; Frank, H.; ...

2016-10-03

This paper presents the new International Energy Agency (IEA) Wind Task 36 on Forecasting and invites collaboration within the group. Wind power forecasts have been used operationally for over 20 years. Despite this, there are still several opportunities to improve the forecasts, both on the weather prediction side and in the usage of the forecasts. The new IEA Task on Forecasting for Wind Energy aims to organise international collaboration among national meteorological centres with an interest in and/or large projects on wind forecast improvements (NOAA, DWD, MetOffice, met.no, DMI, ...), operational forecasters and forecast users. The Task is divided into three work packages. Firstly, a collaboration on the improvement of the scientific basis for the wind predictions themselves; this includes numerical weather prediction model physics, but also widely distributed information on accessible datasets. Secondly, work towards an international pre-standard (an IEA Recommended Practice) on benchmarking and comparing wind power forecasts, including probabilistic forecasts; this work package will also organise benchmarks, in cooperation with the IEA Task WakeBench. Thirdly, engagement with end users aimed at disseminating best practice in the usage of wind power predictions. As first results, an overview of current issues for research in short-term forecasting of wind power is presented.

75 FR 55846 - Public Meeting/Working Group With Industry on Volcanic Ash

Federal Register 2010, 2011, 2012, 2013, 2014

2010-09-14

... operational requirements for the reporting and forecasting of volcanic eruptions and the associated ash cloud... Industry on Volcanic Ash AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of public... operational needs for Volcanic Ash information in support of aviation from stakeholders. DATES: The meeting...

Prior Flaring as a Complement to Free Magnetic Energy for Forecasting Solar Eruptions

NASA Technical Reports Server (NTRS)

Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor

2012-01-01

From a large database of (1) 40,000 SOHO/MDI line-of-sight magnetograms covering the passage of 1,300 sunspot active regions across the 30-degree-radius central disk of the Sun, (2) a proxy of each active region's free magnetic energy measured from each of the active region's central-disk-passage magnetograms, and (3) each active region's full-disk-passage history of production of major flares and fast coronal mass ejections (CMEs), we find new statistical evidence that (1) there are aspects of an active region's magnetic field other than the free energy that are strong determinants of the active region's productivity of major flares and fast CMEs in the coming few days, (2) an active region's recent productivity of major flares, in addition to reflecting the amount of free energy in the active region, also reflects these other determinants of coming productivity of major eruptions, and (3) consequently, knowledge of whether an active region has recently had a major flare, used in combination with the active region's free-energy proxy measured from a magnetogram, can greatly alter the forecast chance that the active region will have a major eruption in the next few days after the time of the magnetogram.
    The active-region magnetic conditions that, in addition to the free energy, are reflected by recent major flaring are presumably the complexity and evolution of the field.

464. Risk assessment for tephra dispersal and sedimentation: the example of four Icelandic volcanoes

    NASA Astrophysics Data System (ADS)

    Biass, Sebastien; Scaini, Chiara; Bonadonna, Costanza; Smith, Kate; Folch, Arnau; Höskuldsson, Armann; Galderisi, Adriana

    2014-05-01

    In order to assist the elaboration of proactive measures for the management of future Icelandic volcanic eruptions, we developed a new approach to assess the impact associated with tephra dispersal and sedimentation at various scales and for multiple sources. Target volcanoes are Hekla, Katla, Eyjafjallajökull and Askja, selected for their high probabilities of eruption and/or their high potential impact. We combined stratigraphic studies, probabilistic strategies and numerical modelling to develop comprehensive eruption scenarios and compile hazard maps for local ground deposition and regional atmospheric concentration using both TEPHRA2 and FALL3D models. New algorithms for the identification of comprehensive probability density functions of eruptive source parameters were developed for both short and long-lasting activity scenarios. A vulnerability assessment of socioeconomic and territorial aspects was also performed at both national and continental scales. The identification of relevant vulnerability indicators allowed for the identification of the most critical areas and territorial nodes. At a national scale, the vulnerability of economic activities and the accessibility to critical infrastructures was assessed. At a continental scale, we assessed the vulnerability of the main airline routes and airports. Resulting impact and risk were finally assessed by combining hazard and vulnerability analysis.

465. Pushing the Volcanic Explosivity Index to its limit and beyond: Constraints from exceptionally weak explosive eruptions at Kīlauea in 2008

    USGS Publications Warehouse

    Houghton, Bruce F.; Swanson, Don; Rausch, J.; Carey, R.J.; Fagents, S.A.; Orr, Tim R.

    2013-01-01

    Estimating the mass, volume, and dispersal of the deposits of very small and/or extremely weak explosive eruptions is difficult, unless they can be sampled on eruption. During explosive eruptions of Halema'uma'u Crater (Kīlauea, Hawaii) in 2008, we constrained for the first time deposits of bulk volumes as small as 9-300 m³ (1 × 10⁴ to 8 × 10⁵ kg) and can demonstrate that they show simple exponential thinning with distance from the vent. There is no simple fit for such products within classifications such as the Volcanic Explosivity Index (VEI).
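The "simple exponential thinning with distance from the vent" reported above is what makes such tiny deposits tractable at all. As a purely illustrative sketch (made-up numbers, radial symmetry assumed, and not the authors' actual workflow, which follows standard tephra-volume practice), an exponential law T(r) = T0·exp(-k·r) can be fitted to a few thickness measurements and integrated to give a bulk volume:

```python
# Minimal sketch: fit an exponential thinning law T(r) = T0 * exp(-k r) to
# (distance, thickness) samples and integrate it for a bulk volume, assuming
# radial symmetry about the vent. All numbers are invented for illustration.
import numpy as np

distance_m = np.array([50.0, 100.0, 200.0, 400.0, 800.0])       # hypothetical sample sites
thickness_m = np.array([4e-3, 2.5e-3, 1.1e-3, 2.0e-4, 1.0e-5])  # hypothetical deposit thickness

# Linear fit of ln(thickness) against distance gives ln(T0) and -k.
slope, intercept = np.polyfit(distance_m, np.log(thickness_m), 1)
T0, k = np.exp(intercept), -slope

# Volume of a radially symmetric exponential sheet:
# V = integral of T0 * exp(-k r) * 2*pi*r dr from 0 to infinity = 2*pi*T0 / k^2
volume_m3 = 2.0 * np.pi * T0 / k**2
print(f"T0 ~ {T0:.4f} m, k ~ {k:.5f} 1/m, bulk volume ~ {volume_m3:.0f} m^3")
```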
    The VEI is being increasingly used as the measure of magnitude of explosive eruptions, and as an input for both hazard modeling and forecasting of atmospheric dispersal of tephra. The 2008 deposits demonstrate a problem for the use of the VEI, as originally defined, which classifies small, yet ballistic-producing, explosive eruptions at Kīlauea and other basaltic volcanoes as nonexplosive. We suggest a simple change to extend the scale in a fashion inclusive of such very small deposits, and to make the VEI more consistent with other magnitude scales such as the Richter scale for earthquakes. Eruptions of this magnitude constitute a significant risk at Kīlauea and elsewhere because of their high frequency and the growing number of "volcano tourists" visiting basaltic volcanoes.

466. VHub - Cyberinfrastructure for volcano eruption and hazards modeling and simulation

    NASA Astrophysics Data System (ADS)

    Valentine, G. A.; Jones, M. D.; Bursik, M. I.; Calder, E. S.; Gallo, S. M.; Connor, C.; Carn, S. A.; Rose, W. I.; Moore-Russo, D. A.; Renschler, C. S.; Pitman, B.; Sheridan, M. F.

    2009-12-01

    Volcanic risk is increasing as populations grow in active volcanic regions, and as national economies become increasingly intertwined. In addition to their significance to risk, volcanic eruption processes form a class of multiphase fluid dynamics with rich physics on many length and time scales. Risk significance, physics complexity, and the coupling of models to complex dynamic spatial datasets all demand the development of advanced computational techniques and interdisciplinary approaches to understand and forecast eruption dynamics. Innovative cyberinfrastructure is needed to enable global collaboration and novel scientific creativity, while simultaneously enabling computational thinking in real-world risk mitigation decisions - an environment where quality control, documentation, and traceability are key factors. Supported by NSF, we are developing a virtual organization, referred to as VHub, to address this need. Overarching goals of the VHub project are: Dissemination. Make advanced modeling and simulation capabilities and key data sets readily available to researchers, students, and practitioners around the world. Collaboration. Provide a mechanism for participants not only to be users but also co-developers of modeling capabilities, and contributors of experimental and observational data sets for use in modeling and simulation, in a collaborative environment that reaches far beyond local work groups. Comparison. Facilitate comparison between different models in order to provide the practitioners with guidance for choosing the "right" model, depending upon the intended use, and provide a platform for multi-model analysis of specific problems and incorporation into probabilistic assessments. Application. Greatly accelerate access and application of a wide range of modeling tools and related data sets to agencies around the world that are charged with hazard planning, mitigation, and response. Education.
    Provide resources that will promote the training of the next generation of volcanologists and hazards specialists such that modeling and simulation form part of a tripartite foundation of approaches, alongside observational data and experimentation. Adaptation. Conduct ongoing, rigorous self-assessment to study the impact of the virtual organization and promote continual adaptation to optimize its impact, as well as to understand emergent collective learning and collaborative patterns. VHub development is just beginning and we are very interested in input from the community and the addition of new partners to the effort. Current partners include A. Costa, A. Neri, W. Marzocchi, R.S.J. Sparks, S.J. Cronin, S. Takarada, Joan Marti, J.-C. Komorowski, T.H. Druitt, T. Koyaguchi, J.L. Macias, and S. Dartevelle.

467. Model-Data Fusion and Adaptive Sensing for Large Scale Systems: Applications to Atmospheric Release Incidents

    NASA Astrophysics Data System (ADS)

    Madankan, Reza

    All across the world, toxic material clouds emitted from sources such as industrial plants, vehicular traffic, and volcanic eruptions can contain chemical, biological or radiological material. With the growing fear of natural, accidental or deliberate release of toxic agents, there is tremendous interest in precise source characterization and in generating accurate hazard maps of toxic material dispersion for appropriate disaster management. In this dissertation, an end-to-end framework has been developed for probabilistic source characterization and forecasting of atmospheric release incidents. The proposed methodology consists of three major components, which are combined to perform the task of source characterization and forecasting: uncertainty quantification, optimal information collection, and data assimilation. Precise approximation of prior statistics is crucial to ensure the performance of the source characterization process. In this work, an efficient quadrature-based method has been utilized for quantification of uncertainty in plume dispersion models that are subject to uncertain source parameters. In addition, a fast and accurate approach is utilized for the approximation of probabilistic hazard maps, based on a combination of polynomial chaos theory and the method of quadrature points. Besides precise quantification of uncertainty, having useful measurement data is also highly important to ensure accurate source parameter estimation. The performance of source characterization is strongly affected by where the observation sensors are placed. Hence, a general framework has been developed for the optimal allocation of data observation sensors to improve the performance of the source characterization process. The key goal of this framework is to optimally locate a set of mobile sensors such that the measurement of better data is guaranteed. This is achieved by maximizing the mutual information between model predictions and observed data, given a set of kinetic constraints on the mobile sensors. A dynamic programming method has been utilized to solve the resulting optimal control problem.
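The sensor-allocation step in the paragraph above (place sensors where the measurement carries the most mutual information about the uncertain source) can be illustrated with a deliberately tiny example. Everything below is hypothetical: a toy Gaussian-plume function, an invented list of candidate sites, and a Gaussian approximation of the mutual information; the dissertation's quadrature-based, mobile-sensor, dynamic-programming formulation is far more elaborate.

```python
# Very simplified sketch of information-driven sensor placement: among candidate
# sites, pick the one whose noisy concentration measurement carries the most
# mutual information about the uncertain source strength, using the Gaussian
# approximation MI ~ 0.5 * ln(1 + Var[model prediction] / noise variance).
# Plume model, numbers and site list are all hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def plume_concentration(source_strength, x, y, u=5.0, sigma=50.0):
    # Toy 2-D plume centred on the x-axis with wind speed u; not a real model.
    return (source_strength / (2 * np.pi * sigma**2 * u)
            * np.exp(-(y**2) / (2 * sigma**2)) * np.exp(-x / 1000.0))

# Uncertain source strength (kg/s) represented by a Monte Carlo ensemble.
source_samples = rng.lognormal(mean=2.0, sigma=0.5, size=2000)

candidate_sites = [(200.0, 0.0), (500.0, 100.0), (1000.0, 0.0), (300.0, 300.0)]
noise_var = 1e-8   # measurement noise variance (hypothetical)

def mutual_information(site):
    preds = plume_concentration(source_samples, *site)
    return 0.5 * np.log(1.0 + preds.var() / noise_var)

best = max(candidate_sites, key=mutual_information)
print("most informative site:", best)
```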
    To complete the loop of the source characterization process, two estimation techniques, a minimum variance estimation framework and a Bayesian inference method, have been developed to fuse model forecasts with measurement data. Incomplete information regarding the distribution of the noise associated with measurement data is another major challenge in the source characterization of plume dispersion incidents. This frequently happens when assimilating atmospheric data from satellite imagery, because satellite imagery can be polluted with noise depending on weather conditions, clouds, humidity, etc. Unfortunately, there is no accurate procedure to quantify the error in recorded satellite data. Hence, using classical data assimilation methods in this situation is not straightforward. In this dissertation, the basic idea of a novel approach has been proposed to tackle these types of real-world problems with more accuracy and robustness. A simple example demonstrating the real-world scenario is presented to validate the developed methodology.

468. Understanding the environmental impacts of large fissure eruptions: Aerosol and gas emissions from the 2014-2015 Holuhraun eruption (Iceland)

    NASA Astrophysics Data System (ADS)

    Ilyinskaya, Evgenia; Schmidt, Anja; Mather, Tamsin A.; Pope, Francis D.; Witham, Claire; Baxter, Peter; Jóhannsson, Thorsteinn; Pfeffer, Melissa; Barsotti, Sara; Singh, Ajit; Sanderson, Paul; Bergsson, Baldur; McCormick Kilbride, Brendan; Donovan, Amy; Peters, Nial; Oppenheimer, Clive; Edmonds, Marie

    2017-08-01

    The 2014-2015 Holuhraun eruption in Iceland emitted ∼11 Tg of SO2 into the troposphere over 6 months and caused one of the most intense and widespread volcanogenic air pollution events in centuries. This study provides a number of source terms for the characterisation of plumes in large fissure eruptions, in Iceland and elsewhere. We characterised the chemistry of aerosol particle matter (PM) and gas in the Holuhraun plume, and its evolution as the plume dispersed, both via measurements and modelling. The plume was sampled at the eruptive vent and in two populated areas in Iceland. The plume caused repeated air pollution events, exceeding the hourly air quality standard for SO2 (350 μg/m³) on 88 occasions in Reykjahlíð town (100 km distance) and on 34 occasions in the Reykjavík capital area (250 km distance). The average daily concentration of volcanogenic PM sulphate exceeded 5 μg/m³ (the maximum concentration measured during non-eruptive background intervals) on 30 days in the Reykjavík capital area. There are currently no established air quality standards for sulphate. Combining the results from direct sampling and dispersion modelling, we identified two types of plume impacting the downwind populated areas. The first type was characterised by high concentrations of both SO2 and S-bearing PM, with a high Sgas/SPM mass ratio (SO2(g)/SO4²⁻(PM) > 10). The second type had a low Sgas/SPM ratio (<10). We suggest that this second type was a mature plume in which sulphur had undergone significant gas-to-aerosol conversion in the atmosphere.
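A minimal sketch of the two-type classification just described, using the sulphur gas-to-particulate mass ratio and the dividing value of 10 quoted in the abstract (the input numbers below are invented):

```python
# Tiny illustrative classifier for the two plume types described above:
# type 1 has a high SO2(g)/SO4(PM) mass ratio (>10), type 2 a low one (<10).
# Example inputs are hypothetical.
def classify_plume(so2_gas_ug_m3: float, sulphate_pm_ug_m3: float) -> str:
    ratio = so2_gas_ug_m3 / sulphate_pm_ug_m3
    return ("type 1 (young, gas-rich plume)" if ratio > 10
            else "type 2 (mature, aerosol-rich plume)")

print(classify_plume(350.0, 5.0))   # high ratio  -> type 1
print(classify_plume(40.0, 12.0))   # low ratio   -> type 2
```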
    Both types of plume were rich in fine aerosol (predominantly PM1 and PM2.5), sulphate (on average ∼90% of the PM mass) and various trace species, including heavy metals. The fine size of the volcanic PM mass (75-80% in PM2.5) and the high environmental lability of its chemical components have potential adverse implications for environmental and health impacts. However, only the dispersion of volcanic SO2 was forecast in public warnings and operationally monitored during the eruption. We recommend that sulphur gas-to-aerosol conversion processes, and a model domain large enough to contain the transport of a tropospheric plume on the timescale of days, be included in public health and environmental impact forecasting for future eruptions in Iceland and elsewhere in the world.

469. Influence of metabolic-linked early life factors on the eruption timing of the first primary tooth.

    PubMed

    Un Lam, Carolina; Hsu, Chin-Ying Stephen; Yee, Robert; Koh, David; Lee, Yung Seng; Chong, Mary Foong-Fong; Cai, Meijin; Kwek, Kenneth; Saw, Seang Mei; Godfrey, Keith; Gluckman, Peter; Chong, Yap Seng

    2016-11-01

    Early eruption of permanent teeth has been associated with childhood obesity and diabetes mellitus, suggesting links between tooth eruption and metabolic conditions. This longitudinal study aimed to identify pre-, peri- and postnatal factors with metabolic consequences during infancy that may affect the eruption timing of the first primary tooth (ETFT) in children from an ethnically heterogeneous population residing within the same community. Participants were recruited (n = 1033) through the GUSTO (Growing Up in Singapore Towards healthy Outcomes) birth cohort (n = 1237). Oral examinations were performed at 3-month intervals from 6 to 18 months of age. Crude and adjusted analyses, with generalized linear modelling, were conducted to link ETFT to potential determinants occurring during pregnancy, delivery/birth and early infancy. The overall mean eruption age of the first primary tooth was 8.5 (SD 2.6) months. Earlier tooth eruption was significantly associated with the infant's rate of weight gain during the first 3 months of life and with increased maternal childbearing age. Compared with their Chinese counterparts, Malay and Indian children experienced significantly delayed tooth eruption, by 1.2 and 1.7 months respectively. Infant weight gain from birth to 3 months, ethnicity and maternal childbearing age were significant determinants of first tooth eruption timing. Early life influences can affect primary tooth development, possibly via metabolic pathways. Timing of tooth eruption is linked to general growth and metabolic function.
    It therefore has potential for forecasting oral and systemic conditions such as caries and obesity.

470. Simulation of the dispersion of the Eyjafjallajökull plume over Europe with COSMO-ART in the operational mode

    NASA Astrophysics Data System (ADS)

    Vogel, H.; Förstner, J.; Vogel, B.; Hanisch, T.; Mühr, B.; Schättler, U.; Schad, T.

    2013-05-01

    An extended version of the German operational weather forecast model was used to simulate the ash dispersion during the eruption of Eyjafjallajökull. Sensitivity runs show the ability of the model to simulate thin ash layers when an increased vertical resolution is used. Calibration of the model results with measured data allows for a quantitative forecast of the ash concentration. An independent comparison of the simulated number concentration of 3 μm particles with observations reveals nearly perfect agreement. However, this agreement could only be reached after modification of the emissions. As an operational forecast was launched every six hours, a time-lagged ensemble was obtained. Hence, the probability of exceeding a certain threshold can be calculated. This is valuable information for the forecasters advising the organizations responsible for the closing of the airspace.

471. Volcanic hazard assessment for the Canary Islands (Spain) using extreme value theory

    NASA Astrophysics Data System (ADS)

    Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.

    2011-10-01

    The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported through written chronicles in the last 600 yr, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions which is most appropriate given the nature of the documented historical eruptive data. We first characterize the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process.
    In this paper we will use a method that accounts for the time dependence of the series and includes rare or extreme events, in the form of few data of large eruptions, since these data require special methods of analysis. Hence, we will use a statistical method from extreme value theory. In particular, we will apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. This is done in three steps: first, we analyze the historical eruptive series to assess independence and homogeneity of the process; second, we perform a Weibull analysis of the distribution of repose times between successive eruptions; third, we analyze the non-homogeneous Poisson process with a generalized Pareto distribution as the intensity function.

472. MrLavaLoba: A new probabilistic model for the simulation of lava flows as a settling process

    NASA Astrophysics Data System (ADS)

    de'Michieli Vitturi, Mattia; Tarquini, Simone

    2018-01-01

    A new code to simulate lava flow spread, MrLavaLoba, is presented. In the code, erupted lava is itemized in parcels having an elliptical shape and prescribed volume. New parcels bud from existing ones according to a probabilistic law influenced by the local steepest-slope direction and by tunable input settings. MrLavaLoba must be counted among the probabilistic codes for the simulation of lava flows, because it is not intended to mimic the actual process of flowing or to provide directly the progression with time of the flow field, but rather to estimate the most probable inundated area and the final thickness of the lava deposit. The code's flexibility allows it to produce variable lava flow spread and emplacement according to different dynamics (e.g. pahoehoe or channelized 'a'ā). For a given scenario, it is shown that model outputs converge, in probabilistic terms, towards a single solution. The code is applied to real cases in Hawaii and at Mt. Etna, and the resulting maps are shown. The model is written in Python and the source code is available at http://demichie.github.io/MrLavaLoba/.

473. Analysis of five years of continuous GPS recording at Piton de La Fournaise (R

    NASA Astrophysics Data System (ADS)

    Peltier, A.; Staudacher, T.; Boissier, P.; Lauret, F.; Kowalski, P.

    2009-04-01

    A network of twelve permanent GPS stations has been operating since 2004 at Piton de La Fournaise (a hot-spot basaltic volcano on La Réunion Island, Indian Ocean) to follow the ground deformation associated with its high eruptive activity. During the period covered by the continuous GPS recording, 12 eruptions occurred.
    The compilation of the data recorded between 2004 and 2008 allows us to define two time scales of ground deformation systematically associated with this eruptive activity: (1) large short-term displacements, reaching up to 14 mm/min, monitored a few minutes to hours prior to each eruption during magma injections toward the surface (co-eruptive deformation); and (2) small long-term ground displacements recorded during inter-eruptive periods. Between 2 weeks and 5 months before each eruption a slight summit inflation occurs (0.4-0.7 mm/day), whereas a post-eruptive summit deflation lasting 1 to 3 months is only recorded after the largest distal eruptions (0.3-1.3 mm/day). These two time scales of ground deformation precursors allowed us to forecast all eruptions up to five months in advance, and real-time follow-up of the large short-term displacements allowed us to evaluate the approximate location of the eruptive fissure a few minutes to hours before its opening (i.e. inside the summit crater, on the northern flank or on the southern flank). The large short-term ground displacements have been attributed to dyke propagation toward the surface, whereas the long-term ground displacements, which have also been recorded by the extensometer network since 2000, have been attributed to continuous over-pressurization of the shallow magma reservoir located at about 2300 m depth. The continuous over-pressurization of the shallow magma reservoir would explain the high eruptive activity observed since 1998: 27 eruptions in 10 years.

474. Propagation of radar rainfall uncertainty in urban flood simulations

    NASA Astrophysics Data System (ADS)

    Liguori, Sara; Rico-Ramirez, Miguel

    2013-04-01

    This work discusses the results of the implementation of a novel probabilistic system designed to improve ensemble sewer flow predictions for the drainage network of a small urban area in the North of England. The probabilistic system has been developed to model the uncertainty associated with radar rainfall estimates and propagate it through radar-based ensemble sewer flow predictions. The assessment of this system aims at outlining the benefits of addressing the uncertainty associated with radar rainfall estimates in a probabilistic framework, to be potentially implemented in the real-time management of the sewer network in the study area. Radar rainfall estimates are affected by uncertainty due to various factors [1-3], and quality control and correction techniques have been developed in order to improve their accuracy. However, the hydrological use of radar rainfall estimates and forecasts remains challenging. A significant effort has been devoted by the international research community to the assessment of uncertainty propagation through probabilistic hydro-meteorological forecast systems [4-5], and various approaches have been implemented for the purpose of characterizing the uncertainty in radar rainfall estimates and forecasts [6-11]. A radar-based ensemble stochastic approach, similar to the one implemented for use in the Southern Alps by the REAL system [6], has been developed for the purpose of this work.
    An ensemble generator has been calibrated on the basis of the spatial-temporal characteristics of the residual error in radar estimates, assessed with reference to rainfall records from around 200 rain gauges available for the year 2007, previously post-processed and corrected by the UK Met Office [12-13]. Each ensemble member is determined by summing a perturbation field to the unperturbed radar rainfall field. The perturbations are generated by imposing the radar error spatial and temporal correlation structure on purely stochastic fields. A hydrodynamic sewer network model implemented in the Infoworks software was used to model the rainfall-runoff process in the urban area. The software calculates the flow through the sewer conduits of the urban model using rainfall as the primary input. The sewer network is covered by 25 radar pixels with a spatial resolution of 1 km². The majority of the sewer system is combined, carrying both urban rainfall runoff as well as domestic and trade waste water [11]. The urban model was configured to receive the probabilistic radar rainfall fields. The results showed that the radar rainfall ensembles provide additional information about the uncertainty in the radar rainfall measurements that can be propagated in urban flood modelling. The peaks of the measured flow hydrographs are often bounded within the uncertainty area produced by using the radar rainfall ensembles. This is in fact one of the benefits of using radar rainfall ensembles in urban flood modelling. More work needs to be done in improving the urban models, but this is outside the scope of this research. The rainfall uncertainty cannot explain the whole uncertainty shown in the flow simulations, and additional sources of uncertainty will come from the structure of the urban models as well as the large number of parameters required by these models. Acknowledgements: The authors would like to acknowledge the BADC, the UK Met Office and the UK Environment Agency for providing the various data sets. We also thank Yorkshire Water Services Ltd for providing the urban model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1. References: [1] Browning KA, 1978. Meteorological applications of radar. Reports on Progress in Physics 41, 761. doi:10.1088/0034-4885/41/5/003. [2] Rico-Ramirez MA, Cluckie ID, Shepherd G, Pallot A, 2007. A high-resolution radar experiment on the island of Jersey. Meteorological Applications 14: 117-129. [3] Villarini G, Krajewski WF, 2010. Review of the different sources of uncertainty in single polarization radar-based estimates of rainfall. Surveys in Geophysics 31: 107-129. [4] Rossa A, Liechti K, Zappa M, Bruen M, Germann U, Haase G, Keil C, Krahe P, 2011. The COST 731 Action: A review on uncertainty propagation in advanced hydrometeorological forecast systems. Atmospheric Research 100: 150-167. [5] Rossa A, Bruen M, Germann U, Haase G, Keil C, Krahe P, Zappa M, 2010. Overview and Main Results on the interdisciplinary effort in flood forecasting COST 731 - Propagation of Uncertainty in Advanced Meteo-Hydrological Forecast Systems. Proceedings of the Sixth European Conference on Radar in Meteorology and Hydrology, ERAD 2010. [6] Germann U, Berenguer M, Sempere-Torres D, Zappa M, 2009. REAL - ensemble radar precipitation estimation for hydrology in a mountainous region. Quarterly Journal of the Royal Meteorological Society 135: 445-456. [8] Bowler NEH, Pierce CE, Seed AW, 2006.
    STEPS: a probabilistic precipitation forecasting scheme which merges an extrapolation nowcast with downscaled NWP. Quarterly Journal of the Royal Meteorological Society 132: 2127-2155. [9] Zappa M, Rotach MW, Arpagaus M, Dorninger M, Hegg C, Montani A, Ranzi R, Ament F, Germann U, Grossi G et al., 2008. MAP D-PHASE: real-time demonstration of hydrological ensemble prediction systems. Atmospheric Science Letters 9: 80-87. [10] Liguori S, Rico-Ramirez MA. Quantitative assessment of short-term rainfall forecasts from radar nowcasts and MM5 forecasts. Hydrological Processes, accepted article. doi:10.1002/hyp.8415. [11] Liguori S, Rico-Ramirez MA, Schellart ANA, Saul AJ, 2012. Using probabilistic radar rainfall nowcasts and NWP forecasts for flow prediction in urban catchments. Atmospheric Research 103: 80-95. [12] Harrison DL, Driscoll SJ, Kitchen M, 2000. Improving precipitation estimates from weather radar using quality control and correction techniques. Meteorological Applications 7: 135-144. [13] Harrison DL, Scovell RW, Kitchen M, 2009. High-resolution precipitation estimates for hydrological uses. Proceedings of the Institution of Civil Engineers - Water Management 162: 125-135.

475. Short-period volcanic gas precursors to phreatic eruptions: Insights from Poás Volcano, Costa Rica

    NASA Astrophysics Data System (ADS)

    de Moor, J. M.; Aiuppa, A.; Pacheco, J.; Avard, G.; Kern, C.; Liuzzo, M.; Martínez, M.; Giudice, G.; Fischer, T. P.

    2016-05-01

    Volcanic eruptions involving interaction with water are amongst the most violent and unpredictable geologic phenomena on Earth. Phreatic eruptions are exceptionally difficult to forecast by traditional geophysical techniques. Here we report on short-term precursory variations in gas emissions related to phreatic blasts at Poás volcano, Costa Rica, as measured with an in situ multiple gas analyzer that was deployed at the edge of the erupting lake. Gas emitted from this hyper-acid crater lake approaches magmatic values of SO2/CO2 1-6 days prior to eruption. The SO2 flux derived from magmatic degassing through the lake is measurable by differential optical absorption spectrometry (sporadic campaign measurements), which allows us to constrain lake gas output and input for the major gas species during eruptive and non-eruptive periods. We can further calculate the power supply to the hydrothermal system using volatile mass balance and thermodynamics, which indicates that the magmatic heat flux into the shallow hydrothermal system increases from ∼27 MW during quiescence to ∼59 MW during periods of phreatic events. These transient pulses of gas and heat from the deeper magmatic system generate both the phreatic eruptions and the observed short-term changes in gas composition, because at high gas flux the scrubbing of sulfur by the hydrothermal system is both kinetically and thermodynamically inhibited, whereas CO2 gas is always essentially inert in hyperacid conditions. Thus, the SO2/CO2 of lake emissions approaches magmatic values as gas and power supply to the sub-limnic hydrothermal system increase, vaporizing fluids and priming the hydrothermal system for eruption.
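As an illustration only (not the authors' algorithm), a monitoring script in the spirit of the abstract above might scan the lake-gas SO2/CO2 time series and flag periods when the ratio climbs toward magmatic values; the threshold, window length and data below are all hypothetical.

```python
# Illustrative sketch: flag hours whose trailing-window median SO2/CO2 ratio
# exceeds a threshold standing in for the local "magmatic" signature.
# Threshold and data are hypothetical; this is not the published method.
import numpy as np

MAGMATIC_RATIO_THRESHOLD = 0.5   # hypothetical stand-in for the magmatic SO2/CO2 value

def precursor_flags(so2_co2_ratio, window=24):
    """Return a boolean array marking samples whose trailing-window median
    exceeds the threshold."""
    r = np.asarray(so2_co2_ratio, dtype=float)
    flags = np.zeros(r.size, dtype=bool)
    for i in range(window, r.size):
        flags[i] = np.median(r[i - window:i]) > MAGMATIC_RATIO_THRESHOLD
    return flags

# Hypothetical hourly record: low, scrubbed values, then a rise before an event.
ratios = np.concatenate([np.full(100, 0.05), np.linspace(0.05, 1.0, 48)])
print("first flagged hour:", int(np.argmax(precursor_flags(ratios))))
```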
    Our results suggest that high-frequency real-time gas monitoring could provide useful short-term eruptive precursors at volcanoes prone to phreatic explosions.

476. Short-period volcanic gas precursors to phreatic eruptions: Insights from Poás Volcano, Costa Rica

    USGS Publications Warehouse

    de Moor, Maarten; Aiuppa, Alessandro; Pacheco, Javier; Avard, Geoffroy; Kern, Christoph; Liuzzo, Marco; Martinez, Maria; Giudice, Gaetano; Fischer, Tobias P.

    2016-01-01

    Volcanic eruptions involving interaction with water are amongst the most violent and unpredictable geologic phenomena on Earth. Phreatic eruptions are exceptionally difficult to forecast by traditional geophysical techniques. Here we report on short-term precursory variations in gas emissions related to phreatic blasts at Poás volcano, Costa Rica, as measured with an in situ multiple gas analyzer that was deployed at the edge of the erupting lake. Gas emitted from this hyper-acid crater lake approaches magmatic values of SO2/CO2 1-6 days prior to eruption. The SO2 flux derived from magmatic degassing through the lake is measurable by differential optical absorption spectrometry (sporadic campaign measurements), which allows us to constrain lake gas output and input for the major gas species during eruptive and non-eruptive periods. We can further calculate the power supply to the hydrothermal system using volatile mass balance and thermodynamics, which indicates that the magmatic heat flux into the shallow hydrothermal system increases from ∼27 MW during quiescence to ∼59 MW during periods of phreatic events. These transient pulses of gas and heat from the deeper magmatic system generate both the phreatic eruptions and the observed short-term changes in gas composition, because at high gas flux the scrubbing of sulfur by the hydrothermal system is both kinetically and thermodynamically inhibited, whereas CO2 gas is always essentially inert in hyperacid conditions. Thus, the SO2/CO2 of lake emissions approaches magmatic values as gas and power supply to the sub-limnic hydrothermal system increase, vaporizing fluids and priming the hydrothermal system for eruption. Our results suggest that high-frequency real-time gas monitoring could provide useful short-term eruptive precursors at volcanoes prone to phreatic explosions.

477. A new aircraft hurricane wind climatology and applications in assessing the predictive skill of tropical cyclone intensity using high-resolution ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Judt, Falko; Chen, Shuyi S.

    2015-07-01

    Hurricane surface wind is a key measure of storm intensity. However, a climatology of hurricane winds is lacking to date, largely because hurricanes are relatively rare events and difficult to observe over the open ocean.
    Here we present a new hurricane wind climatology based on objective surface wind analyses, which are derived from Stepped Frequency Microwave Radiometer measurements acquired by NOAA WP-3D and U.S. Air Force WC-130J hurricane hunter aircraft. The wind data were collected during 72 aircraft reconnaissance missions into 21 western Atlantic hurricanes from 1998 to 2012. This climatology provides an opportunity to validate hurricane intensity forecasts beyond the simplistic maximum wind speed metric and allows evaluation of the predictive skill of probabilistic hurricane intensity forecasts using high-resolution model ensembles. An example application is presented here using a 1.3 km grid spacing Weather Research and Forecasting model ensemble forecast of Hurricane Earl (2010).

478. Avoiding the ensemble decorrelation problem using member-by-member post-processing

    NASA Astrophysics Data System (ADS)

    Van Schaeybroeck, Bert; Vannitsem, Stéphane

    2014-05-01

    Forecast calibration or post-processing has become a standard tool in atmospheric and climatological science due to the presence of systematic initial condition and model errors. For ensemble forecasts, the most competitive methods derive from the assumption of a fixed ensemble distribution. However, when such 'statistical' methods are applied independently at different locations, lead times or for multiple variables, the correlation structure of the individual ensemble members is destroyed. Instead of re-establishing the correlation structure as in Schefzik et al. (2013), we propose a calibration method that avoids this problem by correcting each ensemble member individually. Moreover, we analyse the fundamental mechanisms by which the probabilistic ensemble skill can be enhanced. In terms of the continuous ranked probability score, our member-by-member approach yields a skill gain that extends to lead times far beyond the error doubling time and is as good as that of the most competitive statistical approach, non-homogeneous Gaussian regression (Gneiting et al. 2005). Besides the conservation of the correlation structure, additional benefits arise, including the fact that higher-order ensemble moments such as kurtosis and skewness are inherited from the uncorrected forecasts. Our detailed analysis is performed in the context of the Kuramoto-Sivashinsky equation and different simple models, but the results extend successfully to the ensemble forecast of the European Centre for Medium-Range Weather Forecasts (Van Schaeybroeck and Vannitsem, 2013, 2014). References: [1] Gneiting, T., Raftery, A. E., Westveld, A., Goldman, T., 2005: Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Mon. Weather Rev. 133, 1098-1118. [2] Schefzik, R., T. L. Thorarinsdottir, and T. Gneiting, 2013: Uncertainty Quantification in Complex Simulation Models Using Ensemble Copula Coupling. To appear in Statistical Science 28. [3] Van Schaeybroeck, B., and S. Vannitsem, 2013: Reliable probabilities through statistical post-processing of ensemble forecasts.
    Proceedings of the European Conference on Complex Systems 2012, Springer Proceedings in Complexity, XVI, p. 347-352. [4] Van Schaeybroeck, B., and S. Vannitsem, 2014: Ensemble post-processing using member-by-member approaches: theoretical aspects, under review.

479. Inclusion of potential vorticity uncertainties into a hydrometeorological forecasting chain: application to a medium size basin of Mediterranean Spain

    NASA Astrophysics Data System (ADS)

    Amengual, A.; Romero, R.; Vich, M.; Alonso, S.

    2009-06-01

    The improvement of short- and mid-range numerical runoff forecasts over the flood-prone Spanish Mediterranean area is a challenging issue. This work analyses four intense precipitation events which produced floods of different magnitude over the Llobregat river basin, a medium-size catchment located in Catalonia, north-eastern Spain. One of them was a devastating flash flood - known as the "Montserrat" event - which caused 5 fatalities and material losses estimated at about 65 million euros. The Llobregat basin's hydrological response to these floods is first characterized using rain-gauge data and the Hydrologic Engineering Center's Hydrological Modeling System (HEC-HMS) runoff model. Second, the non-hydrostatic fifth-generation Pennsylvania State University/NCAR mesoscale model (MM5) is nested within the ECMWF large-scale forecast fields in a set of 54 h simulations to provide quantitative precipitation forecasts (QPFs) for each hydrometeorological episode. The hydrological model is forced with these QPFs to evaluate the reliability of the resulting discharge forecasts, while an ensemble prediction system (EPS) based on perturbed atmospheric initial and boundary conditions has been designed to test the value of a probabilistic strategy versus the previous deterministic approach. Specifically, a potential vorticity (PV) inversion technique has been used to perturb the MM5 model initial and boundary states (i.e. the ECMWF forecast fields). For that purpose, a PV error climatology has been derived in order to introduce realistic PV perturbations into the EPS. Results show the benefits of using a probabilistic approach in those cases where the deterministic QPF presents significant deficiencies over the Llobregat river basin in terms of rainfall amounts, timing and localization. These deficiencies in the precipitation fields have a major impact on flood forecasts. Our ensemble strategy has been found useful to reduce the biases at different hydrometric sections along the watershed.
    Therefore, in an operational context, the devised methodology could be useful to extend the lead times associated with the prediction of similar future floods, helping to alleviate their possible hazardous consequences.

480. Inclusion of potential vorticity uncertainties into a hydrometeorological forecasting chain: application to a medium size basin of Mediterranean Spain

    NASA Astrophysics Data System (ADS)

    Amengual, A.; Romero, R.; Vich, M.; Alonso, S.

    2009-01-01

    The improvement of short- and mid-range numerical runoff forecasts over the flood-prone Spanish Mediterranean area is a challenging issue. This work analyses four intense precipitation events which produced floods of different magnitude over the Llobregat river basin, a medium-size catchment located in Catalonia, north-eastern Spain. One of them was a devastating flash flood - known as the "Montserrat" event - which caused 5 fatalities and material losses estimated at about 65 million euros. The Llobregat basin's hydrological response to these floods is first characterized using rain-gauge data and the Hydrologic Engineering Center's Hydrological Modeling System (HEC-HMS) runoff model. Second, the non-hydrostatic fifth-generation Pennsylvania State University/NCAR mesoscale model (MM5) is nested within the ECMWF large-scale forecast fields in a set of 54 h simulations to provide quantitative precipitation forecasts (QPFs) for each hydrometeorological episode. The hydrological model is forced with these QPFs to evaluate the reliability of the resulting discharge forecasts, while an ensemble prediction system (EPS) based on perturbed atmospheric initial and boundary conditions has been designed to test the value of a probabilistic strategy versus the previous deterministic approach. Specifically, a potential vorticity (PV) inversion technique has been used to perturb the MM5 model initial and boundary states (i.e. the ECMWF forecast fields). For that purpose, a PV error climatology has been derived in order to introduce realistic PV perturbations into the EPS. Results show the benefits of using a probabilistic approach in those cases where the deterministic QPF presents significant deficiencies over the Llobregat river basin in terms of rainfall amounts, timing and localization. These deficiencies in the precipitation fields have a major impact on flood forecasts. Our ensemble strategy has been found useful to reduce the biases at different hydrometric sections along the watershed.
    Therefore, in an operational context, the devised methodology could be useful to extend the lead times associated with the prediction of similar future floods, helping to alleviate their possible hazardous consequences.

481. Severe rainfall prediction systems for civil protection purposes

    NASA Astrophysics Data System (ADS)

    Comellas, A.; Llasat, M. C.; Molini, L.; Parodi, A.; Siccardi, F.

    2010-09-01

    One of the most common natural hazards threatening Mediterranean regions is the occurrence of severe weather structures able to produce heavy rainfall. Floods have killed about 1000 people across Europe in the last 10 years. With the aim of mitigating this kind of risk, quantitative precipitation forecasts (QPFs) and rain probability forecasts are two tools nowadays available to national meteorological services and institutions responsible for weather forecasting in order to predict rainfall, using either a deterministic or a probabilistic approach. This study provides an insight into the different approaches used by the Italian (DPC) and Catalonian (SMC) civil protection authorities and the results they have achieved with their particular systems for issuing early warnings. For the former, the analysis considers the period 2006-2009, in which the predictive ability of the forecasting system, based on the numerical weather prediction model COSMO-I7, has been compared with ground-based observations (more than 2000 rain gauge stations; Molini et al., 2009). The Italian system is mainly focused on regional-scale warnings, providing forecasts for periods never shorter than 18 hours and very often with a maximum duration of 36 hours. The information contained in severe weather bulletins is not quantitative and usually refers to specific meteorological phenomena (thunderstorms, wind gales, etc.). Updates and refinements typically have a refresh time of 24 hours.
    SMC operates within the Catalonian boundaries and uses a warning system that mixes quantitative and probabilistic information. For each administrative region ("comarca") into which Catalonia is divided, forecasters give an approximate value of the average predicted rainfall and the probability of exceeding that threshold. Warnings are usually re-issued every 6 hours and their duration depends on the predicted time extent of the storm. In order to provide a comprehensive QPF verification, the rainfall predicted by the Mesoscale Model 5 (MM5), the SMC operational forecast model, is compared with the local rain gauge network for the year 2008 (Comellas et al., 2010). This study presents the benefits and drawbacks of both the Italian and the Catalonian systems. Moreover, particular attention is paid to the link between each system's predictive ability and the predicted severe weather type as a function of its space-time development.

482. Wind power application research on the fusion of the determination and ensemble prediction

    NASA Astrophysics Data System (ADS)

    Lan, Shi; Lina, Xu; Yuzhu, Hao

    2017-07-01

    A fused wind-speed product for the wind farm is designed through the use of wind speed products of ensemble prediction from the European Centre for Medium-Range Weather Forecasts (ECMWF) and professional numerical wind-power model products based on Mesoscale Model 5 (MM5) and the Beijing Rapid Update Cycle (BJ-RUC), which are suitable for short-term wind power forecasting and electric dispatch. The single-valued forecast is formed by calculating different ensemble statistics of the Bayesian probabilistic forecast representing the uncertainty of the ECMWF ensemble prediction. An autoregressive integrated moving average (ARIMA) model is used to improve the time resolution of the single-valued forecast, and, based on Bayesian model averaging (BMA) and the deterministic numerical model prediction, an optimal wind speed forecasting curve and a confidence interval are provided. The results show that the fused forecast clearly improves accuracy relative to the existing numerical forecasting products. Compared with the existing 0-24 h deterministic forecast in the validation period, the mean absolute error (MAE) is decreased by 24.3% and the correlation coefficient (R) is increased by 12.5%. In comparison with the ECMWF ensemble forecast, the MAE is reduced by 11.7% and R is increased by 14.5%.
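The flavour of the fusion described in this abstract, weighting a deterministic forecast against ensemble information and attaching an uncertainty band, can be caricatured as follows. The weights-from-likelihood scheme, the synthetic data and the naive interval are simplifications standing in for the paper's BMA/ARIMA machinery, not a reproduction of it.

```python
# Rough sketch of forecast fusion: weight a deterministic wind-speed forecast
# and an ensemble-mean forecast by their recent Gaussian likelihoods (a
# stripped-down Bayesian-model-averaging idea). All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(2)
truth = 8.0 + 2.0 * np.sin(np.linspace(0, 3 * np.pi, 48))    # past 48 h of "observed" wind (m/s)
det_fcst = truth + rng.normal(0.0, 1.2, truth.size)          # deterministic model, larger errors
ens_mean_fcst = truth + rng.normal(0.3, 0.8, truth.size)     # ensemble mean, small bias

def gaussian_log_likelihood(obs, fcst):
    err = obs - fcst
    sigma = err.std(ddof=1)
    return -0.5 * np.sum((err / sigma) ** 2 + np.log(2 * np.pi * sigma**2)), sigma

(ll_det, sig_det) = gaussian_log_likelihood(truth, det_fcst)
(ll_ens, sig_ens) = gaussian_log_likelihood(truth, ens_mean_fcst)
w_det = 1.0 / (1.0 + np.exp(ll_ens - ll_det))   # normalised weight from log-likelihoods
w_ens = 1.0 - w_det

next_det, next_ens = 9.4, 8.7                   # hypothetical forecasts for the next hour
fused = w_det * next_det + w_ens * next_ens
spread = np.sqrt(w_det * sig_det**2 + w_ens * sig_ens**2)    # naive uncertainty measure
print(f"fused forecast ~ {fused:.1f} m/s, ~95% interval ~ +/-{1.96 * spread:.1f} m/s")
```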
    Additionally, the MAE did not increase with increasing forecast lead time.

483. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias

    With the growing wind penetration into the power system worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated for 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network.
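A schematic of the two-layer ensemble idea described above (first-layer learners produce individual forecasts, a second-layer blender combines them), under stated assumptions: synthetic data, a hypothetical feature set, and no attempt to reproduce the paper's deep feature selection or probabilistic layer.

```python
# Schematic two-layer ("stacked") wind-speed forecasting sketch: first-layer
# models are fitted on a training block, a linear blender is fitted on a
# held-out block, and skill is checked on a test block. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge, LinearRegression

rng = np.random.default_rng(3)
n = 1000
X = rng.normal(size=(n, 5))                                   # hypothetical predictors
y = 6 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 2] ** 2 + rng.normal(0, 0.5, n)   # synthetic wind speed

train, blend, test = slice(0, 600), slice(600, 800), slice(800, None)

# Layer 1: independent models fitted on the training block.
layer1 = [RandomForestRegressor(n_estimators=100, random_state=0), Ridge(alpha=1.0)]
for m in layer1:
    m.fit(X[train], y[train])

def layer1_preds(rows):
    return np.column_stack([m.predict(X[rows]) for m in layer1])

# Layer 2: blender fitted on a held-out block to avoid leaking the training fit.
blender = LinearRegression().fit(layer1_preds(blend), y[blend])

rmse = np.sqrt(np.mean((blender.predict(layer1_preds(test)) - y[test]) ** 2))
print(f"blended test RMSE ~ {rmse:.2f} m/s")
```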
    Numerical results show that, compared to single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves forecasting accuracy by up to 30%.

484. Meteorology, Emissions, and Grid Resolution: Effects on Discrete and Probabilistic Model Performance

    EPA Science Inventory

    In this study, we analyze the impacts of perturbations in meteorology and emissions and variations in grid resolution on air quality forecast simulations. The meteorological perturbations considered in this study introduce a typical variability of ~1°C, 250 - 500 m, 1 m/s, and 1...

485. Probabilistic and spatially variable niches inferred from demography

    Treesearch

    Jeffrey M. Diez; Itamar Giladi; Robert Warren; H. Ronald Pulliam

    2014-01-01

    Summary 1. Mismatches between species distributions and habitat suitability are predicted by niche theory and have important implications for forecasting how species may respond to environmental changes. Quantifying these mismatches is challenging, however, due to the high dimensionality of species niches and the large spatial and temporal variability in population...

486. A Functional-Genetic Scheme for Seizure Forecasting in Canine Epilepsy.

    PubMed

    Bou Assi, Elie; Nguyen, Dang K; Rihana, Sandy; Sawan, Mohamad

    2018-06-01

    The objective of this work is the development of an accurate seizure forecasting algorithm that considers the brain's functional connectivity for electrode selection. We start by proposing the Kmeans-directed transfer function, an adaptive functional connectivity method intended for seizure onset zone localization in bilateral intracranial EEG recordings.

An Observationally Constrained Model of a Flux Rope that Formed in the Solar Corona

NASA Astrophysics Data System (ADS)

James, Alexander W.; Valori, Gherardo; Green, Lucie M.; Liu, Yang; Cheung, Mark C. M.; Guo, Yang; van Driel-Gesztelyi, Lidia

2018-03-01

Coronal mass ejections (CMEs) are large-scale eruptions of plasma from the coronae of stars. Understanding the plasma processes involved in CME initiation has applications for space weather forecasting and laboratory plasma experiments. James et al. used extreme-ultraviolet (EUV) observations to conclude that a magnetic flux rope formed in the solar corona above NOAA Active Region 11504 before it erupted on 2012 June 14 (SOL2012-06-14). In this work, we use data from the Solar Dynamics Observatory (SDO) to model the coronal magnetic field of the active region one hour prior to eruption using a nonlinear force-free field extrapolation, and find a flux rope reaching a maximum height of 150 Mm above the photosphere. Estimates of the average twist of the strongly asymmetric extrapolated flux rope are between 1.35 and 1.88 turns, depending on the choice of axis, although the erupting structure was not observed to kink. The decay index near the apex of the axis of the extrapolated flux rope is comparable to typical critical values required for the onset of the torus instability, so we suggest that the torus instability drove the eruption.
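
The decay index mentioned above measures how rapidly the overlying (strapping) field weakens with height, n(h) = -d ln B_ext / d ln h, with values near 1.5 commonly quoted as critical for the torus instability. A minimal numerical sketch, assuming an idealized power-law field rather than the extrapolated field of the study:

```python
# Decay index n(h) = -d ln(B_ext) / d ln(h) for an idealized strapping-field profile.
import numpy as np

h = np.linspace(20e6, 200e6, 200)          # heights above the photosphere [m] (illustrative)
B_ext = 5e-3 * (h / 50e6) ** -1.6          # assumed power-law field strength [T]; index chosen arbitrarily

n = -np.gradient(np.log(B_ext), np.log(h)) # decay index as a function of height

critical = 1.5                             # commonly quoted torus-instability threshold
n_150 = float(np.interp(150e6, h, n))      # decay index at 150 Mm, the apex height quoted above
print("decay index at 150 Mm:", round(n_150, 2))
print("exceeds critical value:", n_150 > critical)
```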

Degassing during quiescence as a trigger of magma ascent and volcanic eruptions

PubMed

Girona, Társilo; Costa, Fidel; Schubert, Gerald

2015-12-15

Understanding the mechanisms that control the start-up of volcanic unrest is crucial to improve the forecasting of eruptions at active volcanoes. Among the most active volcanoes in the world are the so-called persistently degassing ones (e.g., Etna, Italy; Merapi, Indonesia), which emit massive amounts of gas during quiescence (several kilotonnes per day) and erupt every few months or years. The hyperactivity of these volcanoes results from frequent pressurizations of the shallow magma plumbing system, which in most cases are thought to occur by the ascent of magma from deep to shallow reservoirs. However, the driving force that causes magma ascent from depth remains unknown. Here we demonstrate that magma ascent can be triggered by the passive release of gas during quiescence, which induces the opening of pathways connecting deep and shallow magma reservoirs. This top-down mechanism for volcanic eruptions contrasts with the more common bottom-up mechanisms in which magma ascent is driven only by processes occurring at depth. A cause-effect relationship between passive degassing and magma ascent can explain why repose times are typically much longer than the unrest times preceding eruptions, and may account for the frequent unrest episodes of persistently degassing volcanoes.

Preparing for floods: flood forecasting and early warning

NASA Astrophysics Data System (ADS)

Cloke, Hannah

2016-04-01

Flood forecasting and early warning has continued to stride ahead in strengthening the preparedness phases of disaster risk management, saving lives and property and reducing the overall impact of severe flood events. For example, continental- and global-scale flood forecasting systems such as the European Flood Awareness System and the Global Flood Awareness System provide early information about upcoming floods in real time to various decision-makers. Studies have found that there are monetary benefits to implementing these early flood warning systems, and with the science in place to provide evidence of benefit, and hydrometeorological institutions increasingly open to the use of probabilistic forecasts, uptake over the last decade has been rapid and sustained. However, many further challenges lie ahead to improve the science supporting flood early warning and to ensure that appropriate decisions are made to maximise flood preparedness.

Joint Seasonal ARMA Approach for Modeling of Load Forecast Errors in Planning Studies

DOE Office of Scientific and Technical Information (OSTI.GOV)

Hafen, Ryan P.; Samaan, Nader A.; Makarov, Yuri V.

2014-04-14

To make informed and robust decisions in the probabilistic power system operation and planning process, it is critical to conduct multiple simulations of the generated combinations of wind and load parameters and their forecast errors to handle the variability and uncertainty of these time series. In order for the simulation results to be trustworthy, the simulated series must preserve the salient statistical characteristics of the real series. In this paper, we analyze day-ahead load forecast error data from multiple balancing authority (BA) locations and characterize statistical properties such as mean, standard deviation, autocorrelation, correlation between series, time-of-day bias, and time-of-day autocorrelation. We then construct and validate a seasonal autoregressive moving average (ARMA) model to model these characteristics, and use the model to jointly simulate day-ahead load forecast error series for all BAs.
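
The fit-then-simulate idea can be sketched with statsmodels, here on a single synthetic day-ahead error series and an illustrative seasonal ARMA order rather than the orders and joint multi-BA structure selected in the study:

```python
# Fit a seasonal ARMA model to a load-forecast-error series and simulate new realizations.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 24 * 120                                   # 120 days of hourly day-ahead forecast errors
t = np.arange(n)
errors = 0.6 * np.sin(2 * np.pi * t / 24) + rng.normal(scale=0.4, size=n)  # placeholder series

# Illustrative order: ARMA(2,1) with a daily (24-hour) seasonal AR term.
model = sm.tsa.SARIMAX(errors, order=(2, 0, 1), seasonal_order=(1, 0, 0, 24))
fit = model.fit(disp=False)

# Simulate a synthetic error trajectory that preserves the fitted autocorrelation structure.
simulated = fit.simulate(nsimulations=24 * 7, anchor="end")
print("simulated std vs. observed std:", float(simulated.std()), float(errors.std()))
```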

Evaluation of precipitation nowcasting techniques for the Alpine region

NASA Astrophysics Data System (ADS)

Panziera, L.; Mandapaka, P.; Atencia, A.; Hering, A.; Germann, U.; Gabella, M.; Buzzi, M.

2010-09-01

This study presents a large-sample evaluation of different nowcasting systems over the Southern Swiss Alps. Radar observations are taken as a reference against which to assess the performance of the following short-term quantitative precipitation forecasting methods:

- Eulerian persistence: the current radar image is taken as the forecast.
- Lagrangian persistence: precipitation patterns are advected following the field of storm motion (the MAPLE algorithm is used).
- NORA: a novel nowcasting system which exploits the presence of orographic forcing; by comparing meteorological predictors estimated in real time with those from a large historical data set, the events with the highest resemblance are picked to produce the forecast.
- COSMO2: the limited-area numerical model used operationally at MeteoSwiss.
- A blending of the precipitation forecasts from the aforementioned nowcasting tools.

The investigation aims to set up a probabilistic radar rainfall-runoff model experiment for steep Alpine catchments as part of the European research project IMPRINTS.

Probabilistic Nowcasting of Low-Visibility Procedure States at Vienna International Airport During Cold Season

NASA Astrophysics Data System (ADS)

Kneringer, Philipp; Dietz, Sebastian J.; Mayr, Georg J.; Zeileis, Achim

2018-04-01

Airport operations are sensitive to visibility conditions. Low-visibility events may lead to capacity reduction, delays and economic losses. Different levels of low-visibility procedures (lvp) are enacted to ensure aviation safety. A nowcast of the probabilities for each of the lvp categories helps decision makers to optimally schedule their operations. An ordered logistic regression (OLR) model is used to forecast these probabilities directly. It is applied to cold-season forecasts at Vienna International Airport for lead times from 30 min out to 2 h. Model inputs are standard meteorological measurements. The skill of the forecasts is assessed with the ranked probability score. OLR outperforms persistence, which is a strong contender at the shortest lead times. The ranked probability score of the OLR is even better than that of nowcasts from human forecasters. The OLR-based nowcasting system is computationally fast and can be updated instantaneously when new data become available.
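
An ordered logistic regression of this kind can be sketched with statsmodels' OrderedModel; the visibility-related predictors and lvp categories below are synthetic stand-ins, not the Vienna data or the operational predictor set.

```python
# Ordered logistic regression returning probabilities for ordered lvp categories.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(3)
n = 1500
visibility = rng.uniform(100, 5000, n)            # placeholder visibility [m]
spread = rng.uniform(0, 5, n)                     # placeholder dew-point spread [K]
latent = -0.001 * visibility - 0.5 * spread + rng.logistic(size=n)
lvp = pd.Series(pd.cut(latent, bins=[-np.inf, -4, -2, 0, np.inf],
                       labels=["lvp3", "lvp2", "lvp1", "none"], ordered=True))

X = pd.DataFrame({"visibility": visibility, "spread": spread})
model = OrderedModel(lvp, X, distr="logit")
fit = model.fit(method="bfgs", disp=False)

# Category probabilities for a new observation (one row, same predictors).
probs = fit.predict(pd.DataFrame({"visibility": [300.0], "spread": [0.5]}))
print(np.round(probs, 3))                         # probability of each lvp state
```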

PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

NASA Astrophysics Data System (ADS)

Tonini, Roberto; Sandri, Laura; Thompson, Mary Anne

2015-06-01

PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the effort of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of the Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
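
The hazard-curve concept, exceedance probability as a function of intensity with percentiles expressing epistemic uncertainty, can be sketched as follows; the tephra-load distribution and eruption probabilities are invented for illustration and are not produced by PyBetVH itself.

```python
# Bayesian-style hazard curves: exceedance probability of tephra load, with percentiles
# taken over epistemic samples of the eruption-scenario probabilities.
import numpy as np

rng = np.random.default_rng(4)
loads = np.logspace(-1, 3, 50)                    # intensity thresholds, tephra load [kg/m^2]

n_epistemic = 500                                 # samples expressing epistemic uncertainty
p_eruption = rng.beta(2, 50, n_epistemic)         # probability of eruption in the time frame
curves = np.empty((n_epistemic, loads.size))
for i in range(n_epistemic):
    # Conditional load distribution at the target site (assumed lognormal, illustrative only).
    sample_loads = rng.lognormal(mean=2.0, sigma=1.5, size=2000)
    exceed_given_eruption = (sample_loads[:, None] >= loads[None, :]).mean(axis=0)
    curves[i] = p_eruption[i] * exceed_given_eruption   # absolute exceedance probability

p16, p50, p84 = np.percentile(curves, [16, 50, 84], axis=0)
print("median P(load >= 10 kg/m^2):", float(np.interp(10.0, loads, p50)))
```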

A multidisciplinary effort to assign realistic source parameters to models of volcanic ash-cloud transport and dispersion during eruptions

USGS Publications Warehouse

Mastin, Larry G.; Guffanti, Marianne C.; Servranckx, R.; Webley, P.; Barsotti, S.; Dean, K.; Durant, A.; Ewert, John W.; Neri, A.; Rose, W.I.; Schneider, David J.; Siebert, L.; Stunder, B.; Swanson, G.; Tupper, A.; Volentik, A.; Waythomas, Christopher F.

2009-01-01

During volcanic eruptions, volcanic ash transport and dispersion models (VATDs) are used to forecast the location and movement of ash clouds over hours to days in order to define hazards to aircraft and to communities downwind. Those models use input parameters, called "eruption source parameters", such as plume height H, mass eruption rate Ṁ, duration D, and the mass fraction m63 of erupted debris finer than about 4ϕ or 63 μm, which can remain in the cloud for many hours or days. Observational constraints on the value of such parameters are frequently unavailable in the first minutes or hours after an eruption is detected. Moreover, observed plume height may change during an eruption, requiring rapid assignment of new parameters. This paper reports on a group effort to improve the accuracy of source parameters used by VATDs in the early hours of an eruption. We do so by first compiling a list of eruptions for which these parameters are well constrained, and then using these data to review and update previously studied parameter relationships. We find that the existing scatter in plots of H versus Ṁ yields an uncertainty within the 50% confidence interval of plus or minus a factor of four in eruption rate for a given plume height. This scatter is not clearly attributable to biases in measurement techniques or to well-recognized processes such as elutriation from pyroclastic flows. Sparse data on total grain-size distribution suggest that the mass fraction of fine debris m63 could vary by nearly two orders of magnitude between small basaltic eruptions (∼0.01) and large silicic ones (>0.5). We classify eleven eruption types: four types each for different sizes of silicic and mafic eruptions; submarine eruptions; "brief" or Vulcanian eruptions; and eruptions that generate co-ignimbrite or co-pyroclastic flow plumes. For each eruption type we assign source parameters. We then assign a characteristic eruption type to each of the world's ∼1500 Holocene volcanoes. These eruption types and associated parameters can be used for ash-cloud modeling in the event of an eruption, when no observational constraints on these parameters are available.
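
As an illustration of how such a relationship is used operationally, the sketch below inverts a plume-height versus eruption-rate power law to estimate mass eruption rate, assuming the commonly cited best-fit form H ≈ 2.00 V̇^0.241 (H in km, V̇ the dense-rock-equivalent volumetric flow rate in m³/s) and a nominal DRE density; treat the coefficients as assumptions rather than values quoted in the abstract.

```python
# Invert a plume-height vs. eruption-rate power law to estimate mass eruption rate,
# carrying the roughly factor-of-four uncertainty quoted above.
# Assumes the commonly cited best-fit form H[km] = 2.00 * V[m^3/s DRE]**0.241.
H_km = 12.0                       # observed plume height above the vent [km]
k, p = 2.00, 0.241                # fit coefficients (assumed values of the published fit)
rho_dre = 2500.0                  # dense-rock-equivalent magma density [kg/m^3] (assumed)

V_dre = (H_km / k) ** (1.0 / p)   # volumetric flow rate [m^3/s DRE]
mer = rho_dre * V_dre             # mass eruption rate [kg/s]

print(f"best-estimate MER: {mer:.2e} kg/s")
print(f"factor-of-four range: {mer / 4:.2e} to {mer * 4:.2e} kg/s")
```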

Supporting inland waterway transport on German waterways by operational forecasting services - water-levels, discharges, river ice

NASA Astrophysics Data System (ADS)

Meißner, Dennis; Klein, Bastian; Ionita, Monica; Hemri, Stephan; Rademacher, Silke

2017-04-01

Inland waterway transport (IWT) is an important commercial sector that is significantly vulnerable to hydrological impacts. River ice and floods limit the availability of the waterway network and may cause considerable damage to waterway infrastructure. Low flows significantly affect IWT's operational efficiency, usually for several months a year, because of the close correlation between (low) water levels / water depths and (high) transport costs. "Navigation-related" hydrological forecasts focussing on the specific requirements of water-bound transport (relevant forecast locations, target parameters, skill characteristics, etc.) therefore play a major role in mitigating IWT's vulnerability to hydro-meteorological impacts. In light of continuing transport growth within the European Union, hydrological forecasts for the waterways are essential to encourage more consistent use of the free capacity that IWT still offers. An overview is presented of the current operational and pre-operational forecasting systems for the German waterways, predicting water levels, discharges and river ice thickness on various time scales. While short-term (deterministic) forecasts have a long tradition in navigation-related forecasting, (probabilistic) forecasting services offering extended lead times are not yet well established and are still subject to current research and development activities (e.g. within the EU projects EUPORIAS and IMPREX). The focus is on improving technical aspects as well as on exploring adequate ways of disseminating and communicating probabilistic forecast information. For the German stretch of the River Rhine, one of the most frequented inland waterways worldwide, the existing deterministic forecast scheme has been extended by ensemble forecasts combined with statistical post-processing modules applying EMOS (Ensemble Model Output Statistics) and ECC (Ensemble Copula Coupling) in order to generate water level predictions up to 10 days ahead and to estimate their predictive uncertainty properly. Additionally, for the key locations on the international waterways Rhine, Elbe and Danube, three competing forecast approaches are currently being tested in a pre-operational set-up in order to generate monthly to seasonal (up to 3 months) forecasts: (1) the well-known Ensemble Streamflow Prediction approach (an ensemble based on historical meteorology), (2) coupling hydrological models with post-processed outputs from ECMWF's general circulation model (System 4), and (3) a purely statistical approach based on the stable relationship (teleconnection) of global or regional oceanic, climate and hydrological data with river flows. The current, still pre-operational results reveal valuable predictability of water levels and streamflow at monthly to seasonal time scales along the larger rivers used as waterways in Germany. Finally, insight is given into the technical set-up of the aforementioned forecasting systems operated at the Federal Institute of Hydrology, which are based on a Delft-FEWS application, focusing on the step-wise extension of the former system with new components in order to meet the growing needs of customers and to improve and extend the forecast portfolio for waterway users.
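
EMOS in its common nonhomogeneous Gaussian regression form can be sketched as below, with a synthetic ensemble and parameters fitted by maximum likelihood; this is a generic illustration, not the configuration used for the Rhine forecasts.

```python
# EMOS / nonhomogeneous Gaussian regression: predictive N(a + b*ensmean, c + d*ensvar),
# fitted here by maximizing the Gaussian log-likelihood on a synthetic training set.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(5)
n, m = 800, 20
truth = rng.normal(3.0, 1.0, n)                           # "observed" water levels (synthetic)
ens = truth[:, None] + 0.4 + rng.normal(0, 0.8, (n, m))   # biased, underdispersive ensemble
ens_mean, ens_var = ens.mean(axis=1), ens.var(axis=1)

def nll(params):
    a, b, c, d = params
    mu = a + b * ens_mean
    sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))    # keep the variance positive
    return -norm.logpdf(truth, mu, sigma).sum()

res = minimize(nll, x0=[0.0, 1.0, 0.5, 0.5], method="Nelder-Mead")
a, b, c, d = res.x
print("bias correction a, weight b:", round(a, 2), round(b, 2))

# Calibrated 90% interval for one ensemble forecast (here: the first training case).
mu0 = a + b * ens_mean[0]
s0 = np.sqrt(max(c + d * ens_var[0], 1e-6))
print("90% interval:", norm.ppf([0.05, 0.95], mu0, s0))
```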

Stochastic Forcing for High-Resolution Regional and Global Ocean and Atmosphere-Ocean Coupled Ensemble Forecast System

NASA Astrophysics Data System (ADS)

Rowley, C. D.; Hogan, P. J.; Martin, P.; Thoppil, P.; Wei, M.

2017-12-01

An extended-range ensemble forecast system is being developed in the US Navy Earth System Prediction Capability (ESPC), and a global ocean ensemble generation capability to represent uncertainty in the ocean initial conditions has been developed. At extended forecast times, uncertainty due to model error overtakes initial-condition uncertainty as the primary source of forecast uncertainty. Recently, stochastic parameterization or stochastic forcing techniques have been applied to represent model error in research and operational atmospheric, ocean, and coupled ensemble forecasts. A simple stochastic forcing technique has been developed for application to US Navy high-resolution regional and global ocean models, for use in ocean-only and coupled atmosphere-ocean-ice-wave ensemble forecast systems. Perturbation forcing is added to the tendency equations for state variables, with the forcing defined by random 3- or 4-dimensional fields with horizontal, vertical, and temporal correlations specified to characterize different possible kinds of error. Here, we demonstrate the stochastic forcing in regional and global ensemble forecasts with varying perturbation amplitudes and length and time scales, and assess the change in ensemble skill as measured by a range of deterministic and probabilistic metrics.
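
A perturbation field that is random but correlated in space and time, as described above, can be sketched with an AR(1) process in time and Gaussian smoothing in space; the grid, amplitude and correlation scales below are arbitrary choices, not those of the Navy system.

```python
# Stochastic forcing field: AR(1) in time, spatially correlated via Gaussian smoothing.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(6)
ny, nx, nsteps = 64, 128, 48
amplitude = 0.05                 # forcing amplitude (model-variable units, arbitrary)
length_scale = 6.0               # horizontal correlation scale in grid points (arbitrary)
tau = 12.0                       # temporal decorrelation scale in time steps (arbitrary)
alpha = np.exp(-1.0 / tau)       # AR(1) coefficient corresponding to tau

def smooth_noise():
    """White noise smoothed to the desired horizontal scale, renormalized to unit variance."""
    field = gaussian_filter(rng.normal(size=(ny, nx)), sigma=length_scale)
    return field / field.std()

forcing = smooth_noise()
for _ in range(nsteps):
    # Evolve the perturbation; sqrt(1 - alpha^2) keeps the variance stationary.
    forcing = alpha * forcing + np.sqrt(1.0 - alpha**2) * smooth_noise()
    tendency_perturbation = amplitude * forcing   # would be added to the model's tendency equations

print("final field std (should stay near the chosen amplitude):", float(tendency_perturbation.std()))
```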

Forecasted masses for 7000 Kepler Objects of Interest

NASA Astrophysics Data System (ADS)

Chen, Jingjing; Kipping, David M.

2018-01-01

Recent transit surveys have discovered thousands of planetary candidates with directly measured radii, but only a small fraction have measured masses. Planetary mass is crucial in assessing the feasibility of numerous observational signatures, such as radial velocities (RVs), atmospheres, moons and rings. In the absence of a direct measurement, a data-driven, probabilistic forecast enables observational planning, and so here we compute posterior distributions for the forecasted mass of ∼7000 Kepler Objects of Interest (KOIs). Our forecasts reveal that the predicted RV amplitudes of Neptunian planets are relatively consistent, as a result of transit survey detection bias, hovering around the level of a few m s⁻¹. We find that mass forecasts are unlikely to improve through more precise planetary radii, with the error budget presently dominated by the intrinsic model uncertainty. Our forecasts identify a couple of dozen KOIs near the Terran-Neptunian divide with particularly large RV semi-amplitudes, which could be promising targets to follow up, particularly in the near-infrared. With several more transit surveys planned in the near future, the need to quickly forecast observational signatures is likely to grow, and the work here provides a template example of such calculations.
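
A toy version of such a forecast draws a planet mass from a probabilistic power-law mass-radius relation and propagates it to a radial-velocity semi-amplitude; the exponent, scatter and host-star properties below are placeholders rather than the fitted values of the authors' model.

```python
# Toy probabilistic mass forecast from radius, propagated to an RV semi-amplitude.
import numpy as np

G = 6.674e-11                                     # gravitational constant [SI]
M_earth, M_sun = 5.972e24, 1.989e30               # reference masses [kg]
day = 86400.0

rng = np.random.default_rng(7)
radius_re = 2.5                                   # planet radius [Earth radii]
period_days = 10.0                                # orbital period [days]
m_star = 1.0 * M_sun                              # host-star mass (placeholder)

# Assumed power law with lognormal scatter: M/M_earth ~ C * (R/R_earth)^beta.
C, beta, scatter_dex = 1.0, 1.7, 0.2              # illustrative values only
log_m = np.log10(C) + beta * np.log10(radius_re) + rng.normal(0.0, scatter_dex, 10000)
mass = 10**log_m * M_earth                        # samples of the forecasted planet mass [kg]

# Circular-orbit RV semi-amplitude: K = (2*pi*G/P)^(1/3) * M_p / (M_star + M_p)^(2/3).
P = period_days * day
K = (2 * np.pi * G / P) ** (1 / 3) * mass / (m_star + mass) ** (2 / 3)
print("median K and 16th/84th percentiles [m/s]:", np.percentile(K, [50, 16, 84]).round(2))
```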

How to improve an un-alterable model forecast? A sequential data assimilation based error updating approach

NASA Astrophysics Data System (ADS)

Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K. T.

2012-12-01

The accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and significantly influences the operation of hydropower reservoirs. Improving hourly reservoir inflow forecasts over a 24-hour lead time is considered here with the day-ahead (Elspot) market of the Nordic power exchange in perspective. The procedure presented comprises an error model added on top of an un-alterable constant-parameter conceptual model, and a sequential data assimilation routine. The structure of the error model was investigated using freely available software for detecting mathematical relationships in a given dataset (EUREQA) and was kept at minimum complexity for computational reasons. As new streamflow data become available, the extra information manifested in the discrepancies between measurements and conceptual model outputs is extracted and assimilated into the forecasting system recursively using a sequential Monte Carlo technique. Besides improving forecast skill significantly, the probabilistic inflow forecasts provided by the present approach carry information suitable for reducing uncertainty in decision-making processes related to hydropower system operation. The potential of the current procedure for improving the accuracy of inflow forecasts at lead times up to 24 hours, and its reliability in different seasons of the year, will be illustrated and discussed thoroughly.
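
The recursive error-updating idea, correcting an un-alterable base forecast with an error model conditioned on incoming measurements, can be sketched with a simple bootstrap particle filter tracking an additive bias; the AR(1) error model and all numbers are illustrative, not the EUREQA-derived structure of the study.

```python
# Bootstrap particle filter that tracks the error of a fixed base forecast as an AR(1) bias.
import numpy as np

rng = np.random.default_rng(8)
n_steps, n_particles = 200, 1000
phi, q, r = 0.9, 0.05, 0.2           # AR(1) coefficient, process noise, observation noise (assumed)

true_bias = 0.0
particles = rng.normal(0.0, 0.5, n_particles)
weights = np.full(n_particles, 1.0 / n_particles)

for t in range(n_steps):
    true_bias = 0.95 * true_bias + rng.normal(0, 0.05) + 0.003   # slowly drifting real error
    base_forecast = 10.0                                         # un-alterable model output
    obs = base_forecast + true_bias + rng.normal(0, r)           # new streamflow measurement

    particles = phi * particles + rng.normal(0, q, n_particles)  # propagate the error model
    weights *= np.exp(-0.5 * ((obs - base_forecast - particles) / r) ** 2)
    weights /= weights.sum()

    # Resample when the effective sample size collapses.
    if 1.0 / (weights**2).sum() < n_particles / 2:
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles, weights = particles[idx], np.full(n_particles, 1.0 / n_particles)

estimated_bias = float(np.sum(weights * particles))
corrected = base_forecast + estimated_bias                       # error-corrected forecast
print("estimated bias:", round(estimated_bias, 3), "true bias:", round(true_bias, 3))
print("corrected forecast:", round(corrected, 3))
```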

A synoptic view of the Third Uniform California Earthquake Rupture Forecast (UCERF3)

USGS Publications Warehouse

Field, Edward; Jordan, Thomas H.; Page, Morgan T.; Milner, Kevin R.; Shaw, Bruce E.; Dawson, Timothy E.; Biasi, Glenn; Parsons, Thomas E.; Hardebeck, Jeanne L.; Michael, Andrew J.; Weldon, Ray; Powers, Peter; Johnson, Kaj M.; Zeng, Yuehua; Bird, Peter; Felzer, Karen; van der Elst, Nicholas; Madden, Christopher; Arrowsmith, Ramon; Werner, Maximilian J.; Thatcher, Wayne R.

2017-01-01

Probabilistic forecasting of earthquake-producing fault ruptures informs all major decisions aimed at reducing seismic risk and improving earthquake resilience. Earthquake forecasting models rely on two scales of hazard evolution: long-term (decades to centuries) probabilities of fault rupture, constrained by stress renewal statistics, and short-term (hours to years) probabilities of distributed seismicity, constrained by earthquake-clustering statistics. Comprehensive datasets on both hazard scales have been integrated into the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3). UCERF3 is the first model to provide self-consistent rupture probabilities over forecasting intervals from less than an hour to more than a century, and it is the first capable of evaluating the short-term hazards that result from multievent sequences of complex faulting. This article gives an overview of UCERF3, illustrates the short-term probabilities with aftershock scenarios, and draws some valuable scientific conclusions from the modeling results. In particular, seismic, geologic, and geodetic data, when combined in the UCERF3 framework, reject two types of fault-based models: long-term forecasts constrained to have local Gutenberg-Richter scaling, and short-term forecasts that lack stress relaxation by elastic rebound.

Natural hazard metaphors for financial crises

NASA Astrophysics Data System (ADS)

Woo, Gordon

2001-02-01

Linguistic metaphors drawn from natural hazards are commonly used at times of financial crisis. A brewing storm, a seismic shock, etc., evoke the abruptness and severity of a market collapse. If the language of windstorms, earthquakes and volcanic eruptions is helpful in illustrating a financial crisis, what about the mathematics of natural catastrophes? Already, earthquake prediction methods have been applied to economic recessions, and volcanic eruption forecasting techniques have been applied to market crashes. The purpose of this contribution is to survey broadly the mathematics of natural catastrophes, so as to convey the range of underlying principles, some of which may serve as mathematical metaphors for financial applications.