Sample records for event based uncertainty

  1. Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.

    PubMed

    Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng

    2010-01-01

    Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrated with discrete sample analysis and error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff are 13.99%, 19.48% and 12.28%, respectively. Meanwhile, flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%, respectively.
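
    The quadrature rule behind this kind of error budget can be sketched in a few lines. This is an illustrative calculation only, assuming independent error sources; it reuses the event-volume and EMC figures above, while the paper's own 18.47% budget for event loads combines further contributions beyond this bare two-term product rule.

```python
import math

def combined_relative_uncertainty(*rel_uncerts):
    """First-order propagation for a product or quotient of independent
    quantities: relative standard uncertainties add in quadrature."""
    return math.sqrt(sum(u * u for u in rel_uncerts))

# Event load = event volume * EMC, reusing the 7.03% and 10.26% figures
u_load = combined_relative_uncertainty(0.0703, 0.1026)
print(round(u_load, 4))  # about 0.124, i.e. ~12.4% for the bare product
```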

  2. Lognormal Approximations of Fault Tree Uncertainty Distributions.

    PubMed

    El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P

    2018-01-26

    Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and the Wilks method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
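
    As a minimal illustration of the comparison discussed above, consider the special case of an AND gate, where the top event probability is a product of independent lognormal basic event probabilities and is therefore exactly lognormal. The basic event parameters below are invented for the sketch; the paper treats general multilinear fault trees, for which the lognormal form is only an approximation.

```python
import math, random

random.seed(1)

# Invented basic events: (mu, sigma) of ln(p) for each failure probability
basics = [(-7.0, 0.8), (-6.0, 0.5), (-5.5, 1.0)]

# Closed form: a product of independent lognormals is lognormal with
# summed parameters, so the 95th percentile is available analytically.
mu = sum(m for m, s in basics)
var = sum(s * s for m, s in basics)
p95_exact = math.exp(mu + 1.6449 * math.sqrt(var))

# Monte Carlo estimate of the same percentile
n = 20000
tops = sorted(math.exp(sum(random.gauss(m, s) for m, s in basics))
              for _ in range(n))
p95_mc = tops[int(0.95 * n)]

# Wilks order-statistics bound: the maximum of 59 samples exceeds the
# 95th percentile with at least 95% confidence (0.95**59 < 0.05).
wilks_bound = max(math.exp(sum(random.gauss(m, s) for m, s in basics))
                  for _ in range(59))

print(p95_exact, p95_mc, wilks_bound)
```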

  3. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first assumption is related to the likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and, even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
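
    The fuzzy part of such a formulation can be sketched with alpha-cut interval arithmetic on triangular fuzzy probabilities. The event values here are invented, and the paper's evidence-theory treatment and dependency coefficient are not shown; the gates below assume independent events.

```python
def tri_cut(a, m, b, alpha):
    """Alpha-cut interval [lo, hi] of a triangular fuzzy number (a, m, b)."""
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(cuts):
    """Interval AND gate: top = product of independent event probabilities."""
    lo = hi = 1.0
    for l, h in cuts:
        lo *= l
        hi *= h
    return (lo, hi)

def or_gate(cuts):
    """Interval OR gate: top = 1 - product of (1 - p_i)."""
    prod_lo = prod_hi = 1.0
    for l, h in cuts:
        prod_lo *= 1 - l
        prod_hi *= 1 - h
    return (1 - prod_lo, 1 - prod_hi)

# Two basic events as triangular fuzzy probabilities (invented values)
e1 = (0.01, 0.02, 0.04)
e2 = (0.005, 0.01, 0.02)

for alpha in (0.0, 0.5, 1.0):
    cuts = [tri_cut(*e1, alpha), tri_cut(*e2, alpha)]
    print(alpha, and_gate(cuts), or_gate(cuts))
```

    At alpha = 1 the intervals collapse to the crisp result; at alpha = 0 they span the full support of the fuzzy inputs.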

  4. Long-term stormwater quantity and quality analysis using continuous measurements in a French urban catchment.

    PubMed

    Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre

    2015-11-15

    The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of stormwater on natural waters and the environment. This study mainly addresses the long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measured data from 2004 to 2011. Storm event-based data series are obtained (716 rainfall events and 521 runoff events are available) from the measured continuous time series. The Mann-Kendall test is applied to these event-based data series for trend detection. A lack of trend is found in rainfall, whereas an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of total suspended solids (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in trend detection results lies in uncertainty due to available data. A lack of events due to missing data leads to dramatically increased uncertainty in trend detection results. In contrast, measurement uncertainty in time series data plays a negligible role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
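
    The Mann-Kendall test used above for event-based trend detection is simple to state: count concordant minus discordant pairs and normalize. A minimal version for tie-free series (the tie correction to the variance is omitted, and the example series is invented):

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and normal-approximation Z
    (tie corrections omitted, so this assumes a tie-free series)."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical event-based series (e.g. annual runoff coefficients)
series = [1.0, 2.1, 1.8, 3.2, 3.0, 4.5, 4.1, 5.3]
s, z = mann_kendall(series)
print(s, z)  # positive S with Z > 1.96 indicates a significant increase
```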

  5. Waveform-based Bayesian full moment tensor inversion and uncertainty determination for the induced seismicity in an oil/gas field

    NASA Astrophysics Data System (ADS)

    Gu, Chen; Marzouk, Youssef M.; Toksöz, M. Nafi

    2018-03-01

    Small earthquakes occur due to natural tectonic motions, and they can also be induced by oil and gas production processes. In many oil/gas fields and hydrofracking processes, induced earthquakes result from fluid extraction or injection. The locations and source mechanisms of these earthquakes provide valuable information about the reservoirs. Analysis of induced seismic events has mostly assumed a double-couple source mechanism. However, recent studies have shown a non-negligible percentage of non-double-couple components of source moment tensors in hydraulic fracturing events, assuming a full moment tensor source mechanism. Without uncertainty quantification of the moment tensor solution, it is difficult to determine the reliability of these source models. This study develops a Bayesian method to perform waveform-based full moment tensor inversion and uncertainty quantification for induced seismic events, accounting for both location and velocity model uncertainties. We conduct tests with synthetic events to validate the method, and then apply our newly developed Bayesian inversion approach to real induced seismicity in an oil/gas field in the Sultanate of Oman, determining the uncertainties in the source mechanism and in the location of that event.
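
    The Bayesian machinery can be sketched on a toy problem: a single source amplitude scaling a known waveform basis with Gaussian noise, sampled by random-walk Metropolis. Everything here (forward model, noise level, step size) is an invented stand-in for the full moment tensor and Green's function setup in the paper.

```python
import math, random

random.seed(0)

# Toy forward model: the "waveform" is a single source amplitude m
# scaling a known basis (a stand-in for Green's functions).
basis = [math.sin(0.3 * t) for t in range(50)]
m_true, sigma = 2.0, 0.2
data = [m_true * b + random.gauss(0, sigma) for b in basis]

def log_post(m):
    """Gaussian waveform misfit with a flat prior on the amplitude."""
    return -sum((d - m * b) ** 2 for d, b in zip(data, basis)) / (2 * sigma ** 2)

# Random-walk Metropolis over the source parameter
samples, m = [], 0.0
lp = log_post(m)
for _ in range(5000):
    prop = m + random.gauss(0, 0.1)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        m, lp = prop, lp_prop
    samples.append(m)

post = samples[1000:]          # discard burn-in
mean = sum(post) / len(post)
print(mean)                    # posterior mean near the true amplitude
```

    The spread of `post` is the uncertainty estimate that a point-estimate inversion would not provide.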

  6. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    PubMed

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and the performance of urban drainage systems on both rainfall event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques in order to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, and iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
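
    The last point, choosing how many events to sample per year to reach a target uncertainty on an annual indicator, follows from the normal approximation u = z·CV/sqrt(n). A small helper; the 30% between-event coefficient of variation below is an assumed value for illustration, not one of the paper's figures.

```python
import math

def events_needed(cv, target_rel_uncert, z=1.96):
    """Events to sample so that the relative uncertainty of the mean
    indicator falls below the target (normal approximation:
    u = z * cv / sqrt(n), solved for n and rounded up)."""
    return math.ceil((z * cv / target_rel_uncert) ** 2)

# Assumed 30% between-event variability, 10% target uncertainty
print(events_needed(0.30, 0.10))  # 35 events
```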

  7. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  8. Event-triggered resilient filtering with stochastic uncertainties and successive packet dropouts via variance-constrained approach

    NASA Astrophysics Data System (ADS)

    Jia, Chaoqing; Hu, Jun; Chen, Dongyan; Liu, Yurong; Alsaadi, Fuad E.

    2018-07-01

    In this paper, we discuss the event-triggered resilient filtering problem for a class of time-varying systems subject to stochastic uncertainties and successive packet dropouts. The event-triggered mechanism is employed in the hope of reducing the communication burden and saving network resources. The stochastic uncertainties are considered to describe the modelling errors, and the phenomenon of successive packet dropouts is characterized by a random variable obeying the Bernoulli distribution. The aim of the paper is to provide a resilient event-based filtering approach for the addressed time-varying systems such that, for all stochastic uncertainties, successive packet dropouts and filter gain perturbations, an optimized upper bound of the filtering error covariance is obtained by designing the filter gain. Finally, simulations are provided to demonstrate the effectiveness of the proposed robust optimal filtering strategy.
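
    The setting can be sketched with a scalar send-on-delta filter: the sensor transmits only when the innovation is large, and the network drops transmitted packets with Bernoulli probability. This toy uses a plain Kalman-style update with invented parameters; the paper's contribution, designing the gain to minimize an upper bound on the error covariance under gain perturbations, is not reproduced here.

```python
import random

random.seed(3)

# Scalar time-varying toy system: x(k+1) = a*x + w,  y = x + v
a, q, r = 0.95, 0.04, 0.25          # invented dynamics and noise variances
threshold, p_drop = 0.5, 0.2        # send-on-delta level, dropout probability

x, xhat, P = 0.0, 0.0, 1.0
sent = 0
for k in range(200):
    # true state and raw measurement
    x = a * x + random.gauss(0, q ** 0.5)
    y = x + random.gauss(0, r ** 0.5)

    # time update
    xhat = a * xhat
    P = a * a * P + q

    # event trigger: transmit only large innovations; the network may
    # still drop the packet (Bernoulli successive-dropout model)
    innov = y - xhat
    if abs(innov) > threshold and random.random() > p_drop:
        K = P / (P + r)             # plain Kalman gain (no resilience bound)
        xhat += K * innov
        P *= 1 - K
        sent += 1

print(sent, round(P, 3))  # far fewer than 200 transmissions
```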

  9. Toward sensor-based context aware systems.

    PubMed

    Sakurai, Yoshitaka; Takada, Kouhei; Anisetti, Marco; Bellandi, Valerio; Ceravolo, Paolo; Damiani, Ernesto; Tsuruta, Setsuo

    2012-01-01

    This paper proposes a methodology for sensor data interpretation that can combine sensor outputs with contexts represented as sets of annotated business rules. Sensor readings are interpreted to generate events labeled with the appropriate type and level of uncertainty. Then, the appropriate context is selected. Reconciliation of different uncertainty types is achieved by a simple technique that moves uncertainty from events to business rules by generating combs of standard Boolean predicates. Finally, context rules are evaluated together with the events to take a decision. The feasibility of our idea is demonstrated via a case study where a context-reasoning engine has been connected to simulated heartbeat sensors using prerecorded experimental data. We use sensor outputs to identify the proper context of operation of a system and trigger decision-making based on context information.

  10. From mess to mass: a methodology for calculating storm event pollutant loads with their uncertainties, from continuous raw data time series.

    PubMed

    Métadier, M; Bertrand-Krajewski, J-L

    2011-01-01

    With the increasing implementation of continuous monitoring of both discharge and water quality in sewer systems, large databases are now available. In order to manage large amounts of data and calculate various variables and indicators of interest, it is necessary to apply automated methods for data processing. This paper deals with the processing of short time step turbidity time series to estimate TSS (Total Suspended Solids) and COD (Chemical Oxygen Demand) event loads in sewer systems during storm events, together with their associated uncertainties. The following steps are described: (i) sensor calibration, (ii) estimation of data uncertainties, (iii) correction of raw data, (iv) data pre-validation tests, (v) final validation, and (vi) calculation of TSS and COD event loads and estimation of their uncertainties. These steps have been implemented in an integrated software tool. Examples of results are given for a set of 33 storm events monitored in a stormwater separate sewer system.
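
    Step (vi), the event load and its uncertainty, can be sketched as a discrete sum over the hydrograph with Monte Carlo propagation of assumed relative uncertainties. The concentrations, flows, and the 20%/10% uncertainty levels below are invented for illustration, not the paper's values.

```python
import random

random.seed(7)

# Hypothetical turbidity-derived TSS concentrations (mg/L) and flows (L/s)
# at a 2-minute time step for one storm event
c = [120, 340, 510, 430, 280, 150, 90]
q = [40, 95, 160, 140, 100, 60, 30]
dt = 120  # s

def event_load(conc, flow):
    """TSS event load in kg: sum of C_i * Q_i * dt."""
    return sum(ci * qi * dt for ci, qi in zip(conc, flow)) / 1e6

# Monte Carlo propagation of assumed relative standard uncertainties
# (20% on concentration, 10% on flow, uncorrelated)
loads = []
for _ in range(5000):
    cp = [ci * random.gauss(1, 0.20) for ci in c]
    qp = [qi * random.gauss(1, 0.10) for qi in q]
    loads.append(event_load(cp, qp))

loads.sort()
mean = sum(loads) / len(loads)
# nominal load, MC mean, and a 95% interval from the 2.5/97.5 percentiles
print(event_load(c, q), mean, loads[125], loads[4875])
```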

  11. Ensemble-based flash-flood modelling: Taking into account hydrodynamic parameters and initial soil moisture uncertainties

    NASA Astrophysics Data System (ADS)

    Edouard, Simon; Vincendon, Béatrice; Ducrocq, Véronique

    2018-05-01

    Intense precipitation events in the Mediterranean often lead to devastating flash floods (FF). FF modelling is affected by several kinds of uncertainties, and Hydrological Ensemble Prediction Systems (HEPS) are designed to take those uncertainties into account. The major source of uncertainty comes from the rainfall forcing, and convective-scale meteorological ensemble prediction systems can manage it for forecasting purposes. But other sources are related to the hydrological modelling part of the HEPS. This study focuses on the uncertainties arising from the hydrological model parameters and initial soil moisture, with the aim of designing an ensemble-based version of a hydrological model dedicated to simulations of Mediterranean fast-responding rivers, the ISBA-TOP coupled system. The first step consists in identifying the parameters that have the strongest influence on FF simulations by assuming perfect precipitation. A sensitivity study is carried out, first using a synthetic framework and then for several real events and several catchments. Perturbation methods varying the most sensitive parameters as well as the initial soil moisture allow designing an ensemble-based version of ISBA-TOP. The first results of this system on some real events are presented. The direct perspective of this work will be to drive this ensemble-based version with the members of a convective-scale meteorological ensemble prediction system to design a complete HEPS for FF forecasting.
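
    The ensemble-generation step can be sketched as multiplicative perturbations of the most sensitive parameters and the initial soil moisture. The parameter names, nominal values, and spreads below are placeholders invented for the sketch, not ISBA-TOP's actual parameters.

```python
import math, random

random.seed(4)

# Placeholder parameter names and nominal values (not ISBA-TOP's)
nominal = {"lateral_conductivity": 2.5e-4, "soil_depth": 1.2,
           "initial_soil_moisture": 0.55}
rel_spread = {"lateral_conductivity": 0.5, "soil_depth": 0.2,
              "initial_soil_moisture": 0.15}

def make_ensemble(n_members):
    """One parameter set per member: lognormal multiplicative
    perturbations keep every perturbed value positive."""
    return [{k: v * math.exp(random.gauss(0, rel_spread[k]))
             for k, v in nominal.items()}
            for _ in range(n_members)]

ensemble = make_ensemble(16)
print(len(ensemble), sorted(m["soil_depth"] for m in ensemble)[:3])
```

    Each member would then drive one hydrological simulation, and the spread of the simulated discharges represents the parameter and initial-state uncertainty.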

  12. Station Correction Uncertainty in Multiple Event Location Algorithms and the Effect on Error Ellipses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Jason P.; Carlson, Deborah K.; Ortiz, Anne

    Accurate location of seismic events is crucial for nuclear explosion monitoring. There are several sources of error in seismic location that must be taken into account to obtain high confidence results. Most location techniques account for uncertainties in the phase arrival times (measurement error) and the bias of the velocity model (model error), but they do not account for the uncertainty of the velocity model bias. By determining and incorporating this uncertainty in the location algorithm we seek to improve the accuracy of the calculated locations and uncertainty ellipses. In order to correct for deficiencies in the velocity model, it is necessary to apply station specific corrections to the predicted arrival times. Both master event and multiple event location techniques assume that the station corrections are known perfectly, when in reality there is an uncertainty associated with these corrections. For multiple event location algorithms that calculate station corrections as part of the inversion, it is possible to determine the variance of the corrections. The variance can then be used to weight the arrivals associated with each station, thereby giving more influence to stations with consistent corrections. We have modified an existing multiple event location program (based on PMEL, Pavlis and Booker, 1983). We are exploring weighting arrivals with the inverse of the station correction standard deviation, as well as using the conditional probability of the calculated station corrections. This is in addition to the weighting already given to the measurement and modeling error terms. We re-locate a group of mining explosions that occurred at Black Thunder, Wyoming, and compare the results to those generated without accounting for station correction uncertainty.
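
    The weighting idea reduces to adding the station-correction variance to the usual error terms in each arrival's inverse-variance weight. A one-function sketch with invented standard deviations (in seconds):

```python
def arrival_weight(sigma_meas, sigma_model, sigma_corr):
    """Inverse-variance weight for one arrival, with the
    station-correction variance added to the measurement and
    model error terms."""
    return 1.0 / (sigma_meas ** 2 + sigma_model ** 2 + sigma_corr ** 2)

# Invented standard deviations: a stable vs a noisy station correction
w_stable = arrival_weight(0.1, 0.2, 0.05)
w_noisy = arrival_weight(0.1, 0.2, 0.60)
print(round(w_stable / w_noisy, 2))  # the stable station counts ~8x more
```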

  13. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, the uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for the planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensembles, and local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria, which includes the city of Rosenheim and suffered significant losses during the 2013 flood event.

  14. Differential effects of uncertainty on LPP responses to emotional events during explicit and implicit anticipation.

    PubMed

    Lin, Huiyan; Liang, Jiafeng; Jin, Hua; Zhao, Dongmei

    2018-07-01

    Previous studies have investigated whether uncertainty influences neural responses to emotional events. The findings of such studies, particularly with respect to event-related potentials (ERPs), have been controversial due to several factors, such as the stimuli that serve as cues and the emotional content of the events. However, it is still unknown whether the effects of uncertainty on ERP responses to emotional events are influenced by anticipation patterns (e.g., explicit or implicit anticipation). To address this issue, participants in the present study were presented with anticipatory cues and then emotional (negative and neutral) pictures. The cues either did or did not signify the emotional content of the upcoming picture. In the inter-stimulus intervals between cues and pictures, participants in the explicit anticipation condition were asked to estimate, on a scale, the expected probability of occurrence of a specific emotional category for the subsequent picture, while in the implicit condition, participants were asked to indicate, using a number on a scale, which color was different from the others. The results revealed that in the explicit condition, uncertainty increased late positive potential (LPP) responses, particularly for negative pictures, whereas LPP responses were larger for certain negative pictures than for uncertain negative pictures in the implicit condition. The findings of the present study suggest that the anticipation pattern influences the effects of uncertainty on the evaluation of negative events. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Treesearch

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  16. An Integrated Gate Turnaround Management Concept Leveraging Big Data/Analytics for NAS Performance Improvements

    NASA Technical Reports Server (NTRS)

    Chung, William; Chachad, Girish; Hochstetler, Ronald

    2016-01-01

    The Integrated Gate Turnaround Management (IGTM) concept was developed to improve gate turnaround performance at the airport by leveraging relevant historical data to support optimization of airport gate operations, which include taxi to the gate, gate services, pushback, taxi to the runway, and takeoff, based on available resources, constraints, and uncertainties. By analyzing events of gate operations, primary performance-dependent attributes of these events were identified for the historical data analysis such that performance models can be developed based on uncertainties to support descriptive, predictive, and prescriptive functions. A system architecture was developed to examine system requirements in support of such a concept. An IGTM prototype was developed to demonstrate the concept, using a distributed network and collaborative decision tools for stakeholders to meet on-time pushback performance under uncertainties.

  17. New method for probabilistic traffic demand predictions for en route sectors based on uncertain predictions of individual flight events.

    DOT National Transportation Integrated Search

    2011-06-14

    This paper presents a novel analytical approach to and techniques for translating characteristics of uncertainty in predicting sector entry times and times in sector for individual flights into characteristics of uncertainty in predicting one-minute ...

  18. Temporal models for the episodic volcanism of Campi Flegrei caldera (Italy) with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Flandoli, Franco; Neri, Augusto; Isaia, Roberto; Vitale, Stefano

    2016-11-01

    After the large-scale event of the Neapolitan Yellow Tuff (15 ka B.P.), intense and mostly explosive volcanism has occurred within and along the boundaries of the Campi Flegrei caldera (Italy). Eruptions occurred closely spaced in time, over periods from a few centuries to a few millennia, and were alternated with periods of quiescence lasting up to several millennia. Events often also occurred close together in space, thus generating clusters of events. This study had two main objectives: (1) to describe the uncertainty in the geologic record by using a quantitative model and (2) to develop, based on the uncertainty assessment, a long-term subdomain-specific temporal probability model that describes the temporal and spatial eruptive behavior of the caldera. In particular, the study adopts a space-time doubly stochastic nonhomogeneous Poisson-type model with a local self-excitation feature able to generate clustering of events consistent with the reconstructed record of Campi Flegrei. Results allow the evaluation of similarities and differences between the three epochs of activity, as well as the derivation of the eruptive base rate of the caldera and its capacity to generate clusters of events. The temporal probability model is also used to investigate the effect of the most recent eruption of Monte Nuovo (A.D. 1538) on a possible reactivation of the caldera and to estimate the time to the next eruption under different volcanological and modeling assumptions.
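
    A self-exciting Poisson-type process of this kind can be simulated by Ogata's thinning algorithm. The sketch below uses an exponential excitation kernel and invented rates, not the calibrated Campi Flegrei parameters.

```python
import math, random

random.seed(5)

def simulate_self_exciting(mu, alpha, beta, t_end):
    """Ogata thinning for a Poisson-type process with exponential
    self-excitation: lambda(t) = mu + sum_i alpha * exp(-beta*(t - t_i)).
    The intensity decays between events, so its current value is a
    valid upper bound for the thinning step."""
    events, t = [], 0.0
    while True:
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += random.expovariate(lam_bar)
        if t >= t_end:
            return events
        lam = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if random.random() < lam / lam_bar:
            events.append(t)

# Invented rates: branching ratio alpha/beta = 0.4 gives visible clustering
ev = simulate_self_exciting(mu=0.5, alpha=0.8, beta=2.0, t_end=200.0)
print(len(ev))  # mean rate is mu / (1 - alpha/beta)
```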

  19. Dynamic Event Tree advancements and control logic improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego

    The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for Probabilistic Risk Assessment, uncertainty quantification, data mining analysis and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hypercube, Stratified, Grid Sampler, Factorials, etc.), Adaptive Samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.) and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities that have been done in order to: start the migration of the RAVEN/RELAP-7 control logic system into MOOSE, and develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide a control logic capability to all MOOSE-based applications, an initial migration activity has been initiated in this Fiscal Year, moving the control logic system, designed for RELAP-7 by the RAVEN team, into the MOOSE framework. In this document, a brief explanation of what has been done is reported. The second and most important subject of this report is the development of a Dynamic Event Tree (DET) sampler named “Hybrid Dynamic Event Tree” (HDET) and its Adaptive variant “Adaptive Hybrid Dynamic Event Tree” (AHDET). As other authors have already reported, among the different types of uncertainties, it is possible to discern two principal types: aleatory and epistemic uncertainties. The classical Dynamic Event Tree is in charge of treating the first class (aleatory) of uncertainties; the dependence of the probabilistic risk assessment and analysis on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees).
    The Monte Carlo method employs a pre-sampling of the input space characterized by epistemic uncertainties. The consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed, not limiting the exploration of the epistemic space to a Monte Carlo method but using all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hypercube, Grid, Stratified and Monte Carlo sampling in order to explore the epistemic space, without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its aleatory space exploration. As reported by the authors, the Dynamic Event Tree is a good fit for developing a goal-oriented sampling strategy, in which the DET is used to drive a Limit Surface search. The methodology developed by the authors last year performs a Limit Surface search in the aleatory space only. This report documents how this approach has been extended in order to consider the epistemic space interacting with the Hybrid Dynamic Event Tree methodology.
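
    The hybrid scheme, an epistemic pre-sampling with one aleatory event tree per sample, can be sketched in a few lines. The grid values and the two-outcome branching below are invented placeholders for RAVEN's samplers and branching logic.

```python
from itertools import product

# Epistemic pre-sampling: a grid over an uncertain per-demand failure
# probability (could equally be LHS, stratified or Monte Carlo)
epistemic_grid = [0.05, 0.10, 0.20]

def dynamic_event_tree(p_fail, n_branchings=3):
    """Enumerate every aleatory branch: at each branching point a
    safety system either succeeds or fails; return end-state
    probabilities keyed by the success/failure history."""
    tree = {}
    for path in product((True, False), repeat=n_branchings):
        prob = 1.0
        for ok in path:
            prob *= (1 - p_fail) if ok else p_fail
        tree[path] = prob
    return tree

# Hybrid scheme: one aleatory tree per epistemic sample (N samples -> N trees)
for p in epistemic_grid:
    tree = dynamic_event_tree(p)
    p_any_failure = sum(pr for path, pr in tree.items() if not all(path))
    print(p, round(p_any_failure, 4))
```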

  20. A multi-scale ensemble-based framework for forecasting compound coastal-riverine flooding: The Hackensack-Passaic watershed and Newark Bay

    NASA Astrophysics Data System (ADS)

    Saleh, F.; Ramaswamy, V.; Wang, Y.; Georgas, N.; Blumberg, A.; Pullen, J.

    2017-12-01

    Estuarine regions can experience compound impacts from coastal storm surge and riverine flooding. The challenges in forecasting flooding in such areas are multi-faceted due to uncertainties associated with meteorological drivers and interactions between hydrological and coastal processes. The objective of this work is to evaluate how uncertainties from meteorological predictions propagate through an ensemble-based flood prediction framework and translate into uncertainties in simulated inundation extents. A multi-scale framework, consisting of hydrologic, coastal and hydrodynamic models, was used to simulate two extreme flood events at the confluence of the Passaic and Hackensack rivers and Newark Bay. The events were Hurricane Irene (2011), a combination of inland flooding and coastal storm surge, and Hurricane Sandy (2012) where coastal storm surge was the dominant component. The hydrodynamic component of the framework was first forced with measured streamflow and ocean water level data to establish baseline inundation extents with the best available forcing data. The coastal and hydrologic models were then forced with meteorological predictions from 21 ensemble members of the Global Ensemble Forecast System (GEFS) to retrospectively represent potential future conditions up to 96 hours prior to the events. Inundation extents produced by the hydrodynamic model, forced with the 95th percentile of the ensemble-based coastal and hydrologic boundary conditions, were in good agreement with baseline conditions for both events. The USGS reanalysis of Hurricane Sandy inundation extents was encapsulated between the 50th and 95th percentile of the forecasted inundation extents, and that of Hurricane Irene was similar but with caveats associated with data availability and reliability. This work highlights the importance of accounting for meteorological uncertainty to represent a range of possible future inundation extents at high resolution (∼m).
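
    The percentile post-processing of such an ensemble is straightforward; a nearest-rank sketch on 21 invented peak water levels (matching the 21 GEFS members, but not taken from the study):

```python
import math

# 21 synthetic ensemble peak water levels (m) at one forecast point
levels = [2.1, 2.4, 1.9, 3.0, 2.7, 2.2, 2.5, 3.3, 2.0, 2.6, 2.8,
          2.3, 3.1, 2.45, 2.55, 2.9, 2.15, 2.35, 2.65, 3.4, 2.75]

def percentile(values, p):
    """Nearest-rank percentile (p in percent) of an ensemble."""
    ordered = sorted(values)
    k = max(0, math.ceil(p / 100 * len(ordered)) - 1)
    return ordered[k]

# The 50th and 95th percentile members bracket the forecast spread
print(percentile(levels, 50), percentile(levels, 95))  # 2.55 3.3
```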

  21. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based flood management approach. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments express flood risk in monetary terms (damage estimated for specific situations, or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. This results from uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, such as integrated assessment modelling, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties in flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore assesses the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty is attributed to the different input parameters using a variance-based sensitivity analysis. 
Assessing and visualizing the uncertainty of the final risk estimate will help decision makers make better-informed decisions, and attributing this uncertainty to the input parameters helps to identify which parameters contribute most to the uncertainty in the final estimate and should therefore receive additional attention in further research.
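A variance-based attribution of the kind described above can be illustrated with a minimal Monte Carlo sketch. The depth-damage model, the input distributions, and the binning estimator of the first-order sensitivity index Var(E[Y|X])/Var(Y) are all hypothetical simplifications, not the study's actual model:

```python
import random

random.seed(42)

def flood_damage(depth, factor):
    # illustrative depth-damage relation: damage grows linearly with depth
    return depth * factor

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def first_order_index(x, y, bins=50):
    # crude estimate of Var(E[Y | X]) / Var(Y) by binning on sorted X
    pairs = sorted(zip(x, y))
    size = len(pairs) // bins
    cond_means = [sum(p[1] for p in pairs[b * size:(b + 1) * size]) / size
                  for b in range(bins)]
    return variance(cond_means) / variance(y)

N = 20000
depths = [random.gauss(2.0, 0.5) for _ in range(N)]      # water depth [m]
factors = [random.gauss(100.0, 10.0) for _ in range(N)]  # damage per m depth
damages = [flood_damage(d, f) for d, f in zip(depths, factors)]

S_depth = first_order_index(depths, damages)    # depth dominates here
S_factor = first_order_index(factors, damages)  # smaller contribution
```

In this toy setup the depth uncertainty accounts for most of the damage variance, which is exactly the kind of ranking the abstract's sensitivity analysis aims to provide.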

  2. Optimal Futility Interim Design: A Predictive Probability of Success Approach with Time-to-Event Endpoint.

    PubMed

    Tang, Zhongwen

    2015-01-01

    An analytical method to compute the predictive probability of success (PPOS), together with a credible interval, at interim analysis (IA) is developed for large clinical trials with time-to-event endpoints. The method takes into account the data fixed up to the IA, the amount of uncertainty in future data, and uncertainty about the parameters. Predictive power is a special type of PPOS. The result is confirmed by simulation. An optimal design is proposed by finding the optimal combination of analysis time and futility cutoff based on PPOS criteria.
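The abstract's analytical PPOS is for time-to-event endpoints; as a loose illustration of the general idea only (draw parameters from the posterior, simulate the future data, count successful trials), here is a simulation sketch for a simplified single-arm binary-endpoint trial. All trial numbers, the flat prior, and the success rule are hypothetical:

```python
import math, random

random.seed(1)

# Hypothetical trial: 30/50 responders at interim, 100 patients planned;
# the trial "succeeds" if the final one-sided z-test beats p0 = 0.5.
n_interim, successes_interim = 50, 30
n_final, p0 = 100, 0.5

def ppos(n_sims=10000):
    wins = 0
    for _ in range(n_sims):
        # posterior draw of the response rate (flat Beta(1, 1) prior)
        p = random.betavariate(1 + successes_interim,
                               1 + n_interim - successes_interim)
        # simulate the remaining patients from that draw
        future = sum(random.random() < p for _ in range(n_final - n_interim))
        phat = (successes_interim + future) / n_final
        z = (phat - p0) / math.sqrt(p0 * (1 - p0) / n_final)
        wins += z > 1.645  # one-sided final analysis
    return wins / n_sims

ppos_val = ppos()
```

Scanning such a PPOS over candidate analysis times and futility cutoffs is the kind of search the proposed optimal design performs analytically.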

  3. Inexact Socio-Dynamic Modeling of Groundwater Contamination Management

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Zhang, X.

    2015-12-01

    Groundwater contamination may alter public behavior, for example through adaptation to the contamination event. Conversely, social behaviors may affect groundwater contamination and the associated risk levels, for example by changing the amount of groundwater ingested in response to the contamination. Decisions should consider not only the contamination itself, but also social attitudes toward such contamination events. Such decisions are inherently associated with uncertainty, such as decision makers' subjective judgement and their implicit knowledge in choosing whether to supply water, or to reduce the amount supplied, under a contamination scenario. A socio-dynamic model based on the theories of information-gap and fuzzy sets is being developed to address social behaviors in the face of groundwater contamination, and is applied to a synthetic problem designed after typical groundwater remediation sites, where the effects of social behaviors on decisions are investigated and analyzed. Different uncertainties, including deep uncertainty and vague/ambiguous uncertainty, are effectively and integrally addressed. The results can provide scientifically defensible decision support for groundwater management in the face of contamination.

  4. Climate change impacts on extreme events in the United States: an uncertainty analysis

    EPA Science Inventory

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  5. Assessment of Observational Uncertainty in Extreme Precipitation Events over the Continental United States

    NASA Astrophysics Data System (ADS)

    Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.; Goodman, A.

    2017-12-01

    Extreme precipitation events are associated with numerous societal and environmental impacts. Furthermore, anthropogenic climate change is projected to alter precipitation intensity across portions of the Continental United States (CONUS). Therefore, a spatial understanding and intuitive means of monitoring extreme precipitation over time is critical. Towards this end, we apply an event-based indicator, developed as a part of NASA's support of the ongoing efforts of the US National Climate Assessment, which assigns categories to extreme precipitation events based on 3-day storm totals as a basis for dataset intercomparison. To assess observational uncertainty across a wide range of historical precipitation measurement approaches, we intercompare in situ station data from the Global Historical Climatology Network (GHCN), satellite-derived precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM), gridded in situ station data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM), global reanalysis from NASA's Modern Era Retrospective-Analysis version 2 (MERRA 2), and regional reanalysis with gauge data assimilation from NCEP's North American Regional Reanalysis (NARR). Results suggest considerable variability across the five-dataset suite in the frequency, spatial extent, and magnitude of extreme precipitation events. Consistent with expectations, higher resolution datasets were found to resemble station data best and capture a greater frequency of high-end extreme events relative to lower spatial resolution datasets. The degree of dataset agreement varies regionally; however, all datasets successfully capture the seasonal cycle of precipitation extremes across the CONUS. These intercomparison results provide additional insight about observational uncertainty and the ability of a range of precipitation measurement and analysis products to capture extreme precipitation event climatology. 
While the event category threshold is fixed in this analysis, preliminary results from the development of a flexible categorization scheme that scales with grid resolution are presented.

  6. Estimation of the displacements among distant events based on parallel tracking of events in seismic traces under uncertainty

    NASA Astrophysics Data System (ADS)

    Huamán Bustamante, Samuel G.; Cavalcanti Pacheco, Marco A.; Lazo Lazo, Juan G.

    2018-07-01

    The method we propose in this paper seeks to estimate interface displacements among strata related to reflection seismic events, relative to the interfaces at other reference points. To do so, we search for reflection events at the reference point of a second seismic trace taken from the same 3D survey, close to a well. However, the nature of the seismic data introduces uncertainty into the results. Therefore, we perform an uncertainty analysis using the standard deviation of results from several experiments with cross-correlation of signals. To estimate the displacements of events in depth between two seismic traces, we create a synthetic seismic trace from an empirical wavelet and the sonic log of the well close to the second seismic trace. Then, we relate the events of the seismic traces to the depth of the sonic log. Finally, we test the method with data from the Namorado Field in Brazil. The results show that the accuracy of the estimated event depth depends on the results of the parallel cross-correlation, primarily those from the procedures used in the integration of seismic data with data from the well. The proposed approach can correctly identify several similar events in two seismic traces without requiring all seismic traces between two distant points of interest to correlate strata in the subsurface.
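The core cross-correlation step, with uncertainty taken as the spread over repeated experiments, can be sketched as follows. The spike "traces", the noise level, and the true shift of 7 samples are synthetic stand-ins for real seismic data, not values from the paper:

```python
import random

random.seed(0)

def best_lag(a, b, max_lag):
    """Lag of trace b relative to trace a maximizing the cross-correlation."""
    def xcorr(lag):
        return sum(a[i] * b[i + lag]
                   for i in range(len(a)) if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

# synthetic "trace" with one reflection event (a spike) at sample 40;
# the second trace carries the same event shifted by 7 samples, plus noise
reference = [0.0] * 100
reference[40] = 1.0
lags = []
for _ in range(30):  # repeat the experiment to get a spread on the shift
    noisy = [random.gauss(0.0, 0.1) for _ in range(100)]
    noisy[47] += 1.0
    lags.append(best_lag(reference, noisy, max_lag=15))
mean_lag = sum(lags) / len(lags)
std_lag = (sum((l - mean_lag) ** 2 for l in lags) / len(lags)) ** 0.5
```

The standard deviation of the estimated lags plays the role of the uncertainty measure described in the abstract.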

  7. Automatic generation of efficient orderings of events for scheduling applications

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.

    1994-01-01

    In scheduling a set of tasks, it is often not known with certainty how long a given event will take. We call this duration uncertainty. Duration uncertainty is a primary obstacle to the successful completion of a schedule. If the duration of one task is longer than expected, the remaining tasks are delayed. The delay may result in the abandonment of the schedule itself, a phenomenon known as schedule breakage. One response to schedule breakage is on-line, dynamic rescheduling. A more recent alternative is called proactive rescheduling. This method uses statistical data about the durations of events in order to anticipate the locations in the schedule where breakage is likely prior to the execution of the schedule. It generates alternative schedules at such sensitive points, which can then be applied by the scheduler at execution time, without the delay incurred by dynamic rescheduling. This paper proposes a technique for making proactive error management more effective. The technique is based on applying a similarity-based method of clustering to the problem of identifying similar events in a set of events.
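The abstract does not specify the clustering method in detail; as a purely hypothetical illustration of grouping similar events by their duration statistics, here is a greedy single-link sketch over mean historical durations:

```python
def cluster_by_duration(events, tol):
    """Greedy single-link grouping of events whose mean durations
    differ by at most `tol` (a stand-in for the similarity measure)."""
    means = {k: sum(v) / len(v) for k, v in events.items()}
    clusters = []
    for name in sorted(means, key=means.get):
        if clusters and means[name] - means[clusters[-1][-1]] <= tol:
            clusters[-1].append(name)  # similar to the previous event
        else:
            clusters.append([name])    # start a new cluster
    return clusters

# hypothetical task durations (e.g., minutes) from past executions
events = {"fuel": [10, 12], "power-up": [11, 13], "deploy": [30, 31]}
clusters = cluster_by_duration(events, tol=3)
```

Events that land in the same cluster could then share statistical duration models when deciding where alternative schedules are needed.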

  8. Assessing dry weather flow contribution in TSS and COD storm events loads in combined sewer systems.

    PubMed

    Métadier, M; Bertrand-Krajewski, J L

    2011-01-01

    Continuous high-resolution long-term turbidity measurements, along with continuous discharge measurements, are now recognised as an appropriate technique for the estimation of in-sewer total suspended solids (TSS) and Chemical Oxygen Demand (COD) loads during storm events. In the combined system of the Ecully urban catchment (Lyon, France), this technique has been implemented since 2003, with more than 200 storm events monitored. This paper presents a method for the estimation of the dry weather (DW) contribution to measured total TSS and COD event loads, with special attention devoted to uncertainty assessment. The method accounts for the dynamics of both discharge and turbidity time series at a two-minute time step. The study is based on 180 DW days monitored in 2007-2008. Three distinct classes of DW days were identified. Variability analysis and quantification showed that no seasonal effect and no trend over the year were detectable. The law of propagation of uncertainties is applicable for uncertainty estimation. The method has then been applied to all measured storm events. This study confirms the value of long-term continuous discharge and turbidity time series in sewer systems, especially in the perspective of wet weather quality modelling.
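The law of propagation of uncertainties, applied to an event load computed from concentration and flow series, can be sketched as below. Assuming uncorrelated errors, relative standard uncertainties combine in quadrature at first order; the sample values and the 12% / 12.8% uncertainty figures are illustrative, not the paper's:

```python
import math

def event_load_with_uncertainty(conc, flow, u_conc, u_flow):
    """Event load (summed C*Q over sampling intervals; multiply by the
    interval length for a mass) and its relative standard uncertainty,
    assuming uncorrelated errors so relative uncertainties combine in
    quadrature (first-order law of propagation of uncertainties)."""
    load = sum(c * q for c, q in zip(conc, flow))
    u_load = math.sqrt(u_conc ** 2 + u_flow ** 2)
    return load, u_load

# two hypothetical samples: COD [mg/L] and flowrate [L/s]
load, u_load = event_load_with_uncertainty([100.0, 200.0], [2.0, 1.0],
                                           0.12, 0.128)
```

The same quadrature rule extends to additional terms (sampling, storage, laboratory analysis) by adding their squared relative uncertainties under the square root.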

  9. Deterministic versus evidence-based attitude towards clinical diagnosis.

    PubMed

    Soltani, Akbar; Moayyeri, Alireza

    2007-08-01

    Generally, two basic classes have been proposed for the scientific explanation of events. Deductive reasoning emphasizes reaching conclusions about a hypothesis based on verification of universal laws pertinent to that hypothesis, while inductive or probabilistic reasoning explains an event by calculating the probability that the event is related to a given hypothesis. Although both types of reasoning are used in clinical practice, evidence-based medicine stresses the advantages of the second approach for most instances in medical decision making. While 'probabilistic or evidence-based' reasoning seems to involve more mathematical formulas at first glance, this attitude is more dynamic and less imprisoned by the rigidity of mathematics compared with the 'deterministic or mathematical attitude'. In the field of medical diagnosis, appreciation of uncertainty in clinical encounters and utilization of the likelihood ratio as a measure of accuracy seem to be the most important characteristics of evidence-based doctors. Other characteristics include the use of series of tests for refining probability, changing diagnostic thresholds in light of external evidence and the nature of the disease, and attention to confidence intervals to estimate the uncertainty of research-derived parameters.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knudsen, J.K.; Smith, C.L.

    The steps involved in incorporating parameter uncertainty into the Nuclear Regulatory Commission (NRC) accident sequence precursor (ASP) models are covered in this paper. Three different uncertainty distributions (i.e., lognormal, beta, gamma) were evaluated to determine the most appropriate distribution. From the evaluation, it was determined that the lognormal distribution will be used for the ASP models' uncertainty parameters. Selection of the uncertainty parameters for the basic events is also discussed. This paper covers the process of determining uncertainty parameters for the supercomponent basic events (i.e., basic events that are comprised of more than one component, which can have more than one failure mode) that are utilized in the ASP models. Once this is completed, the ASP model is ready to be utilized to propagate parameter uncertainty for event assessments.
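Propagating lognormal basic-event uncertainty to a top event can be sketched with Monte Carlo on a toy fault tree. The tree structure, medians, and error factors below are hypothetical; the median/error-factor parameterization (EF = 95th percentile / median) is the usual convention in probabilistic risk assessment:

```python
import math, random

random.seed(7)

def lognormal_p(median, error_factor):
    """Basic-event probability drawn from a lognormal distribution given
    its median and error factor (EF = 95th percentile / median)."""
    sigma = math.log(error_factor) / 1.645
    return median * math.exp(random.gauss(0.0, sigma))

N = 20000
tops = []
for _ in range(N):
    # toy tree: TOP = A AND (B OR C), independent rare basic events
    a = lognormal_p(1e-3, 3.0)
    b = lognormal_p(1e-2, 3.0)
    c = lognormal_p(5e-3, 3.0)
    tops.append(a * (b + c - b * c))  # AND -> product, OR -> inclusion-exclusion
mean_top = sum(tops) / N
```

Sorting `tops` also yields empirical percentiles of the top-event distribution, the quantity of interest in an event assessment.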

  11. Cued uncertainty modulates later recognition of emotional pictures: An ERP study.

    PubMed

    Lin, Huiyan; Xiang, Jing; Li, Saili; Liang, Jiafeng; Zhao, Dongmei; Yin, Desheng; Jin, Hua

    2017-06-01

    Previous studies have shown that uncertainty about the emotional content of an upcoming event modulates event-related potentials (ERPs) during the encoding of the event, and this modulation is affected by whether there are cues (i.e., cued uncertainty) or not (i.e., uncued uncertainty) prior to the encoding of the uncertain event. Recently, we showed that uncued uncertainty affected ERPs in later recognition of the emotional event. However, it is as yet unknown how the ERP effects of recognition are modulated by cued uncertainty. To address this issue, participants were asked to view emotional (negative and neutral) pictures that were presented after cues. The cues either indicated the emotional content of the pictures (the certain condition) or not (the cued uncertain condition). Subsequently, participants had to perform an unexpected old/new task in which old and novel pictures were shown without any cues. ERP data in the old/new task showed smaller P2 amplitudes for neutral pictures in the cued uncertain condition compared to the certain condition, but this uncertainty effect was not observed for negative pictures. Additionally, P3 amplitudes were generally enlarged for pictures in the cued uncertain condition. Taken together, the present findings indicate that cued uncertainty alters later recognition of emotional events in relation to feature processing and attention allocation. Copyright © 2017. Published by Elsevier B.V.

  12. An annually resolved marine proxy record for the 8.2K cold event from the northern North Sea based on bivalve shells

    NASA Astrophysics Data System (ADS)

    Butler, Paul; Estrella-Martínez, Juan; Scourse, James

    2017-04-01

    The so-called 8.2K cold event is a rapid cooling of about 6° +/- 2° recorded in the Greenland ice core record and thought to be a consequence of a freshwater pulse from the Laurentide ice sheet which reduced deepwater formation in the North Atlantic. In the Greenland ice cores the event is characterized by a maximum extent of 159 years and a central event lasting for 70 years. As discussed by Thomas et al. (QSR, 2007), the low resolution and dating uncertainty of much palaeoclimate data makes it difficult to determine the rates of change and causal sequence that characterise the event at different locations. We present here a bivalve shell chronology based on four shells of Arctica islandica from the northern North Sea which (within radiocarbon uncertainty) is coeval with the 8.2K event recorded in the Greenland ice cores. The years of death of each shell based on radiocarbon analysis and crossmatching are 8094, 8134, 8147, and 8208 yrs BP (where "present" = AD 1950), with an associated radiocarbon uncertainty of +/-80 yrs, and their longevities are 106, 122, 112 and 79 years respectively. The total length of the chronology is 192 years (8286 - 8094 BP +/- 80 yrs). The most noticeable feature of the chronology is a 60-year period of increasing growth which may correspond to a similar period of decreasing ice accumulation in the GRIP (central Greenland) ice core record. We tentatively suggest that this reflects increasing food supply to the benthos as summer stratification is weakened by colder seawater temperatures. Stable isotope analyses (results expected to be available when this abstract is presented) will show changes at annual and seasonal resolution, potentially giving a very detailed insight into the causal factors associated with the 8.2K event and its impact in the northern North Sea.

  13. Complementary contributions of basolateral amygdala and orbitofrontal cortex to value learning under uncertainty

    PubMed Central

    Stolyarova, Alexandra; Izquierdo, Alicia

    2017-01-01

    We make choices based on the values of expected outcomes, informed by previous experience in similar settings. When the outcomes of our decisions consistently violate expectations, new learning is needed to maximize rewards. Yet not every surprising event indicates a meaningful change in the environment. Even when conditions are stable overall, outcomes of a single experience can still be unpredictable due to small fluctuations (i.e., expected uncertainty) in reward or costs. In the present work, we investigate causal contributions of the basolateral amygdala (BLA) and orbitofrontal cortex (OFC) in rats to learning under expected outcome uncertainty in a novel delay-based task that incorporates both predictable fluctuations and directional shifts in outcome values. We demonstrate that OFC is required to accurately represent the distribution of wait times to stabilize choice preferences despite trial-by-trial fluctuations in outcomes, whereas BLA is necessary for the facilitation of learning in response to surprising events. DOI: http://dx.doi.org/10.7554/eLife.27483.001 PMID:28682238

  14. To what extent does variability of historical rainfall series influence extreme event statistics of sewer system surcharge and overflows?

    PubMed

    Schaarup-Jensen, K; Rasmussen, M R; Thorndahl, S

    2009-01-01

    In urban drainage modelling, long-term extreme statistics have become an important basis for decision-making, e.g. in connection with renovation projects. It is therefore of great importance to minimize the uncertainties with regard to long-term prediction of maximum water levels and combined sewer overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties regarding rainfall inputs, parameters, and assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme event statistics of maximum water levels in manholes and CSO volumes. Typically, long-term rainfall series from a local rain gauge are unavailable. In the present case study, however, long local rain series are available. Two rain gauges have recorded events for approximately 9 years at two locations within the catchment. Besides these two gauges, another seven gauges are located at most 20 kilometers from the catchment. All gauges are included in the Danish national rain gauge system, which was launched in 1976. The paper describes to what extent the extreme event statistics based on these 9 series diverge from each other and how this diversity can be handled, e.g. by introducing an "averaging procedure" based on the variability within the set of statistics. All simulations are performed by means of the MOUSE LTS model.
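One simple form such extreme event statistics can take is empirical return periods from ranked maxima, with a rank-wise average across gauges as one conceivable "averaging procedure". The Weibull plotting position and the gauge records below are illustrative assumptions, not the paper's actual method:

```python
def return_periods(maxima):
    """Empirical return periods [years] for annual maxima, using the
    Weibull plotting position T = (n + 1) / rank."""
    ranked = sorted(maxima, reverse=True)
    n = len(ranked)
    return [(x, (n + 1) / rank) for rank, x in enumerate(ranked, 1)]

# two hypothetical 9-year records of annual maximum CSO volumes [m3]
gauge_a = [120, 95, 300, 80, 150, 60, 210, 40, 110]
gauge_b = [100, 130, 250, 70, 160, 55, 190, 45, 105]
stats_a = return_periods(gauge_a)
stats_b = return_periods(gauge_b)

# rank-wise averaging across gauges: mean magnitude at each return period
avg = [(a[0] + b[0]) / 2 for a, b in zip(stats_a, stats_b)]
```

The spread between `stats_a` and `stats_b` at a given rank illustrates the inter-gauge variability the paper quantifies.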

  15. Regional Seismic Travel-Time Prediction, Uncertainty, and Location Improvement in Western Eurasia

    NASA Astrophysics Data System (ADS)

    Flanagan, M. P.; Myers, S. C.

    2004-12-01

    We investigate our ability to improve regional travel-time prediction and seismic event location using an a priori, three-dimensional velocity model of Western Eurasia and North Africa: WENA1.0 [Pasyanos et al., 2004]. Our objective is to improve the accuracy of seismic location estimates and calculate representative location uncertainty estimates. As we focus on the geographic region of Western Eurasia, the Middle East, and North Africa, we develop, test, and validate 3D model-based travel-time prediction models for 30 stations in the study region. Three principal results are presented. First, the 3D WENA1.0 velocity model improves travel-time prediction over the iasp91 model, as measured by variance reduction, for regional Pg, Pn, and P phases recorded at the 30 stations. Second, a distance-dependent uncertainty model is developed and tested for the WENA1.0 model. Third, an end-to-end validation test based on 500 event relocations demonstrates improved location performance over the 1-dimensional iasp91 model. Validation of the 3D model is based on a comparison of approximately 11,000 Pg, Pn, and P travel-time predictions and empirical observations from ground truth (GT) events. Ray coverage for the validation dataset is chosen to provide representative, regional-distance sampling across Eurasia and North Africa. The WENA1.0 model markedly improves travel-time predictions for most stations with an average variance reduction of 25% for all ray paths. We find that improvement is station dependent, with some stations benefiting greatly from WENA1.0 predictions (52% at APA, 33% at BKR, and 32% at NIL), some stations showing moderate improvement (12% at KEV, 14% at BOM, and 12% at TAM), some benefiting only slightly (6% at MOX, and 4% at SVE), and some are degraded (-6% at MLR and -18% at QUE). We further test WENA1.0 by comparing location accuracy with results obtained using the iasp91 model. 
Again, relocation of these events is dependent on ray paths that evenly sample WENA1.0 and therefore provide an unbiased assessment of location performance. A statistically significant sample is achieved by generating 500 location realizations based on 5 events with location accuracy between 1 km and 5 km. Each realization is a randomly selected event with location determined by randomly selecting 5 stations from the available network. In 340 cases (68% of the instances), locations are improved, and average mislocation is reduced from 31 km to 26 km. Preliminary tests of uncertainty estimates suggest that our uncertainty model produces location uncertainty ellipses that are representative of location accuracy. These results highlight the importance of accurate GT datasets in assessing regional travel-time models and demonstrate that an a priori 3D model can markedly improve our ability to locate small magnitude events in a regional monitoring context. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-CONF-206386.
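Variance reduction, the comparison metric used above, is commonly defined as the fractional reduction in residual variance relative to a reference model; a minimal sketch with hypothetical travel-time residuals (assuming this standard definition, which the abstract does not spell out):

```python
def variance_reduction(res_ref, res_new):
    """Percent variance reduction of travel-time residuals for a new model
    relative to a reference model (e.g., iasp91)."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return 100.0 * (1.0 - var(res_new) / var(res_ref))

# hypothetical residuals [s]: the 3D model halves the scatter
vr = variance_reduction([-2.0, 0.0, 2.0], [-1.0, 0.0, 1.0])
```

A negative value, as reported for stations MLR and QUE, indicates that the new model's residuals scatter more than the reference model's.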

  16. Spatial Uncertainty Modeling of Fuzzy Information in Images for Pattern Classification

    PubMed Central

    Pham, Tuan D.

    2014-01-01

    The modeling of the spatial distribution of image properties is important for many pattern recognition problems in science and engineering. Mathematical methods are needed to quantify the variability of this spatial distribution based on which a decision of classification can be made in an optimal sense. However, image properties are often subject to uncertainty due to both incomplete and imprecise information. This paper presents an integrated approach for estimating the spatial uncertainty of vagueness in images using the theory of geostatistics and the calculus of probability measures of fuzzy events. Such a model for the quantification of spatial uncertainty is utilized as a new image feature extraction method, based on which classifiers can be trained to perform the task of pattern recognition. Applications of the proposed algorithm to the classification of various types of image data suggest the usefulness of the proposed uncertainty modeling technique for texture feature extraction. PMID:25157744

  17. Physically-based modelling of high magnitude torrent events with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Wing-Yuen Chow, Candace; Ramirez, Jorge; Zimmermann, Markus; Keiler, Margreth

    2017-04-01

    High magnitude torrent events are associated with the rapid propagation of vast quantities of water and available sediment downslope where human settlements may be established. Assessing the vulnerability of built structures to these events is a part of consequence analysis, where hazard intensity is related to the degree of loss sustained. The specific contribution of the presented work is a procedure to simulate these damaging events with physically-based modelling and to include uncertainty information about the simulated results. This is a first step in the development of vulnerability curves based on several intensity parameters (i.e. maximum velocity, sediment deposition depth and impact pressure). The investigation process begins with the collection, organization and interpretation of detailed post-event documentation and photograph-based observation data of affected structures in three sites that exemplify the impact of highly destructive mudflows and flood occurrences on settlements in Switzerland. Hazard intensity proxies are then simulated with the physically-based FLO-2D model (O'Brien et al., 1993). Prior to modelling, global sensitivity analysis is conducted to support a better understanding of model behaviour, parameterization and the quantification of uncertainties (Song et al., 2015). The inclusion of information describing the degree of confidence in the simulated results supports the credibility of vulnerability curves developed with the modelled data. First, key parameters are identified and selected based on literature review. Truncated a priori ranges of parameter values are then defined by expert solicitation. Local sensitivity analysis is performed based on manual calibration to provide an understanding of the parameters relevant to the case studies of interest. 
Finally, automated parameter estimation is performed to comprehensively search for optimal parameter combinations and associated values, which are evaluated using the observed data collected in the first stage of the investigation. O'Brien, J.S., Julien, P.Y., Fullerton, W. T., 1993. Two-dimensional water flood and mudflow simulation. Journal of Hydraulic Engineering 119(2): 244-261.
 Song, X., Zhang, J., Zhan, C., Xuan, Y., Ye, M., Xu C., 2015. Global sensitivity analysis in hydrological modeling: Review of concepts, methods, theoretical frameworks, Journal of Hydrology 523: 739-757.

  18. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decisions in the face of uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method, which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.

  19. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
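The BUS rejection-sampling idea (accept a prior sample with probability proportional to its likelihood, then estimate the rare-event probability from the accepted posterior samples) can be sketched as follows. The prior, the observation, and the failure threshold are hypothetical, and crude Monte Carlo stands in for the FORM/IS/SuS machinery discussed in the abstract:

```python
import math, random

random.seed(3)

# Hypothetical setting: prior resistance R ~ Normal(5, 1); one noisy
# measurement R_obs = 4.5 with std 0.5; "failure" is the rare event R < 3.5.
def likelihood(r, r_obs=4.5, s=0.5):
    # unnormalized Gaussian likelihood, scaled so its maximum is 1
    return math.exp(-0.5 * ((r - r_obs) / s) ** 2)

accepted = []
while len(accepted) < 5000:
    r = random.gauss(5.0, 1.0)           # sample from the prior
    if random.random() < likelihood(r):  # BUS rejection step (c = max L = 1)
        accepted.append(r)

# posterior probability of the rare event, from the accepted samples
p_fail = sum(r < 3.5 for r in accepted) / len(accepted)
```

For genuinely rare events this crude estimator needs prohibitively many samples, which is precisely why BUS delegates the estimation to FORM, importance sampling, or Subset Simulation.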

  20. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
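    As a concrete illustration of possibilistic propagation (a sketch only: the triangular possibility distributions, the two-event OR gate, and all numbers below are assumptions, not values from the article), alpha-cuts of the basic-event possibility distributions can be pushed through a monotone fault tree function by interval arithmetic:

```python
# Minimal possibilistic fault tree propagation sketch.
# Each basic-event probability gets a triangular possibility distribution
# (min, mode, max); the top event of a two-event OR gate is propagated
# through alpha-cuts using interval arithmetic.

def alpha_cut(tri, alpha):
    """Interval [lo, hi] at possibility level alpha for a triangular (a, m, b)."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

p1 = (1e-4, 5e-4, 1e-3)   # assumed basic event 1
p2 = (2e-4, 4e-4, 9e-4)   # assumed basic event 2

for alpha in (0.0, 0.5, 1.0):
    lo1, hi1 = alpha_cut(p1, alpha)
    lo2, hi2 = alpha_cut(p2, alpha)
    # OR gate: P(top) = 1 - (1-p1)(1-p2) is monotone increasing in both
    # inputs, so alpha-cut endpoints map directly to output endpoints.
    top_lo = 1 - (1 - lo1) * (1 - lo2)
    top_hi = 1 - (1 - hi1) * (1 - hi2)
    print(f"alpha={alpha:.1f}: [{top_lo:.2e}, {top_hi:.2e}]")
```

    At alpha = 1 the cut collapses to the modes, while alpha = 0 gives the widest (support) interval; the nested family of intervals is the possibilistic counterpart of the probabilistic uncertainty distribution on the top event.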

  1. An Event-based Assessment of Uncertainty in Measurements Between Multiple Precipitation Sensors During the North American Monsoon

    NASA Astrophysics Data System (ADS)

    Kautz, M. A.; Keefer, T.; Demaria, E. M.; Goodrich, D. C.; Hazenberg, P.; Petersen, W. A.; Wingo, M. T.; Smith, J.

    2017-12-01

    The USDA Agricultural Research Service (USDA-ARS) Long-Term Agroecosystem Research network (LTAR) is a partnership among 18 long-term research sites across the United States. As part of the program, LTAR aims to assemble a network of common sensors and measurements of hydrological, meteorological, and biophysical variables to accompany the legacy datasets of individual LTAR sites. Uncertainty remains as to how the common sensor-based measurements will compare with those from the existing sensors at each site. The Walnut Gulch Experimental Watershed (WGEW), operated by the USDA-ARS Southwest Watershed Research Center (SWRC), represents the semiarid grazing lands of southeastern Arizona in the LTAR network. The bimodal precipitation regime of this region is characterized by large-scale frontal precipitation in the winter and isolated, high-intensity, convective thunderstorms in the summer during the North American Monsoon (NAM). SWRC maintains a network of 90 rain gauges across the 150 km2 WGEW and surrounding area, with measurements dating back to the 1950s. The high intensity and isolated nature of the summer storms have historically made them difficult to quantify compared to storms in other regimes in the US. This study assesses the measurement uncertainty between the common LTAR Belfort All Weather Precipitation Gauge (AEPG 600) and the legacy WGEW weighing-type rain gauge. Additionally, in a collaboration with the NASA Global Precipitation Measurement mission (GPM) and the University of Arizona, a dense array of precipitation-measuring sensors was installed at WGEW within a 10-meter radius for observation during the NAM, July through October 2017. In addition to two WGEW weighing-type gauges, the array includes: an AEPG 600, a tipping bucket, a weighing bucket installed with its orifice at ground level, an OTT Pluvio2 rain gauge, a Two-Dimensional Video Disdrometer (2DVD), and three OTT Parsivel2 disdrometers.
An event-based comparison was made between the precipitation sensors using metrics including total depth, peak intensity (1, 15, 30, and 60 minute), event duration, time to peak intensity, and event start time. These results provide further insight into the uncertainties of measuring point-based precipitation in this unique precipitation regime and into its representation in large-scale observation networks.
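    The event metrics listed above can be computed directly from a gauge's depth time series; a minimal sketch follows (the synthetic 1-minute series, the window choices, and the helper name are illustrative, not WGEW data or code):

```python
import numpy as np

# Illustrative event-metric calculation for a 1-minute rainfall depth
# series (mm per minute); the triangular synthetic event is hypothetical.
rain = np.zeros(120)
rain[30:75] = np.concatenate([np.linspace(0, 2.0, 20), np.linspace(2.0, 0, 25)])

dt_min = 1  # sampling interval in minutes

total_depth = rain.sum()                      # mm
duration = np.count_nonzero(rain) * dt_min    # minutes with measurable rain
time_to_peak = int(rain.argmax()) * dt_min    # minutes from series start

def peak_intensity(depths, window_min):
    """Max accumulated depth over a moving window, converted to mm/h."""
    w = window_min // dt_min
    sums = np.convolve(depths, np.ones(w), mode="valid")
    return sums.max() * 60.0 / window_min

for w in (1, 15, 30, 60):
    print(f"peak {w}-min intensity: {peak_intensity(rain, w):.1f} mm/h")
print(f"total depth {total_depth:.1f} mm, duration {duration} min, "
      f"time to peak {time_to_peak} min")
```

    Sensor-to-sensor differences in any of these metrics for the same event are one practical way to express the measurement uncertainty the study quantifies.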

  2. Enhancing Flood Prediction Reliability Using Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Merwade, V.

    2017-12-01

    Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to relying on the prediction from a single model simulation, using an ensemble of predictions that considers uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to obtain reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that account for uncertainty in channel shape, channel width, channel roughness, and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform every member of the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global BMA (BMA_G) prediction, which in turn is superior to the ensemble mean prediction. Additionally, the high-frequency flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
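    The BMA combination step can be sketched as follows (a simplification with assumed data: the weights here are normalized likelihoods under equal prior model probabilities and a fixed error variance, whereas a full BMA implementation would estimate weights and variances jointly, e.g. by EM):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: observed stage and three model predictions.
obs = rng.normal(10.0, 1.0, size=50)
preds = np.stack([obs + rng.normal(0.2, 0.3, 50),    # model 1: slight high bias
                  obs + rng.normal(0.0, 0.6, 50),    # model 2: unbiased, noisier
                  obs + rng.normal(-0.5, 0.4, 50)])  # model 3: low bias

sigma = 0.5  # assumed fixed predictive error sd; EM would estimate this

# Per-model Gaussian log-likelihoods of the observations.
loglik = -0.5 * np.sum(((obs - preds) / sigma) ** 2, axis=1)

# BMA weights: normalized likelihoods (equal prior model probabilities),
# shifted by the max for numerical stability before exponentiating.
w = np.exp(loglik - loglik.max())
w /= w.sum()

# Deterministic BMA prediction is the weight-averaged ensemble.
bma_pred = w @ preds
print("weights:", np.round(w, 3))
print("BMA RMSE:", np.round(np.sqrt(np.mean((bma_pred - obs) ** 2)), 3))
```

    The weighted mean gives the deterministic stage prediction; a probabilistic inundation extent follows from treating the weights as a mixture over the member-specific predictive distributions.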

  3. Jet energy scale and resolution in the CMS experiment in pp collisions at 8 TeV

    NASA Astrophysics Data System (ADS)

    Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.; Adam, W.; Asilar, E.; Bergauer, T.; Brandstetter, J.; Brondolin, E.; Dragicevic, M.; Erö, J.; Flechl, M.; Friedl, M.; Frühwirth, R.; Ghete, V. M.; Hartl, C.; Hörmann, N.; Hrubec, J.; Jeitler, M.; Knünz, V.; König, A.; Krammer, M.; Krätschmer, I.; Liko, D.; Matsushita, T.; Mikulec, I.; Rabady, D.; Rahbaran, B.; Rohringer, H.; Schieck, J.; Schöfbeck, R.; Strauss, J.; Treberer-Treberspurg, W.; Waltenberger, W.; Wulz, C.-E.; Mossolov, V.; Shumeiko, N.; Suarez Gonzalez, J.; Alderweireldt, S.; Cornelis, T.; De Wolf, E. A.; Janssen, X.; Knutsson, A.; Lauwers, J.; Luyckx, S.; Van De Klundert, M.; Van Haevermaet, H.; Van Mechelen, P.; Van Remortel, N.; Van Spilbeeck, A.; Abu Zeid, S.; Blekman, F.; D'Hondt, J.; Daci, N.; De Bruyn, I.; Deroover, K.; Heracleous, N.; Keaveney, J.; Lowette, S.; Moreels, L.; Olbrechts, A.; Python, Q.; Strom, D.; Tavernier, S.; Van Doninck, W.; Van Mulders, P.; Van Onsem, G. P.; Van Parijs, I.; Barria, P.; Brun, H.; Caillol, C.; Clerbaux, B.; De Lentdecker, G.; Fasanella, G.; Favart, L.; Grebenyuk, A.; Karapostoli, G.; Lenzi, T.; Léonard, A.; Maerschalk, T.; Marinov, A.; Perniè, L.; Randle-conde, A.; Reis, T.; Seva, T.; Vander Velde, C.; Vanlaer, P.; Yonamine, R.; Zenoni, F.; Zhang, F.; Beernaert, K.; Benucci, L.; Cimmino, A.; Crucy, S.; Dobur, D.; Fagot, A.; Garcia, G.; Gul, M.; Mccartin, J.; Ocampo Rios, A. A.; Poyraz, D.; Ryckbosch, D.; Salva, S.; Sigamani, M.; Strobbe, N.; Tytgat, M.; Van Driessche, W.; Yazgan, E.; Zaganidis, N.; Basegmez, S.; Beluffi, C.; Bondu, O.; Brochet, S.; Bruno, G.; Caudron, A.; Ceard, L.; Da Silveira, G. G.; Delaere, C.; Favart, D.; Forthomme, L.; Giammanco, A.; Hollar, J.; Jafari, A.; Jez, P.; Komm, M.; Lemaitre, V.; Mertens, A.; Nuttens, C.; Perrini, L.; Pin, A.; Piotrzkowski, K.; Popov, A.; Quertenmont, L.; Selvaggi, M.; Vidal Marono, M.; Beliy, N.; Hammad, G. H.; Aldá Júnior, W. L.; Alves, F. L.; Alves, G. 
A.; Brito, L.; Correa Martins Junior, M.; Hamer, M.; Hensel, C.; Mora Herrera, C.; Moraes, A.; Pol, M. E.; Rebello Teles, P.; Belchior Batista Das Chagas, E.; Carvalho, W.; Chinellato, J.; Custódio, A.; Da Costa, E. M.; Damiao, D. De Jesus; De Oliveira Martins, C.; Fonseca De Souza, S.; Huertas Guativa, L. M.; Malbouisson, H.; Matos Figueiredo, D.; Mundim, L.; Nogima, H.; Prado Da Silva, W. L.; Santoro, A.; Sznajder, A.; Tonelli Manganote, E. J.; Vilela Pereira, A.; Ahuja, S.; Bernardes, C. A.; De Souza Santos, A.; Dogra, S.; Fernandez Perez Tomei, T. R.; Gregores, E. M.; Mercadante, P. G.; Moon, C. S.; Novaes, S. F.; Padula, Sandra S.; Romero Abad, D.; Ruiz Vargas, J. C.; Aleksandrov, A.; Hadjiiska, R.; Iaydjiev, P.; Rodozov, M.; Stoykova, S.; Sultanov, G.; Vutova, M.; Dimitrov, A.; Glushkov, I.; Litov, L.; Pavlov, B.; Petkov, P.; Ahmad, M.; Bian, J. G.; Chen, G. M.; Chen, H. S.; Chen, M.; Cheng, T.; Du, R.; Jiang, C. H.; Plestina, R.; Romeo, F.; Shaheen, S. M.; Tao, J.; Wang, C.; Wang, Z.; Zhang, H.; Asawatangtrakuldee, C.; Ban, Y.; Li, Q.; Liu, S.; Mao, Y.; Qian, S. J.; Wang, D.; Xu, Z.; Avila, C.; Cabrera, A.; Chaparro Sierra, L. F.; Florez, C.; Gomez, J. P.; Gomez Moreno, B.; Sanabria, J. C.; Godinovic, N.; Lelas, D.; Puljak, I.; Ribeiro Cipriano, P. M.; Antunovic, Z.; Kovac, M.; Brigljevic, V.; Kadija, K.; Luetic, J.; Micanovic, S.; Sudic, L.; Attikis, A.; Mavromanolakis, G.; Mousa, J.; Nicolaou, C.; Ptochos, F.; Razis, P. A.; Rykaczewski, H.; Bodlak, M.; Finger, M.; Finger, M., Jr.; Assran, Y.; Elgammal, S.; Ellithi Kamel, A.; Mahmoud, M. 
A.; Mohammed, Y.; Calpas, B.; Kadastik, M.; Murumaa, M.; Raidal, M.; Tiko, A.; Veelken, C.; Eerola, P.; Pekkanen, J.; Voutilainen, M.; Härkönen, J.; Karimäki, V.; Kinnunen, R.; Lampén, T.; Lassila-Perini, K.; Lehti, S.; Lindén, T.; Luukka, P.; Mäenpää, T.; Peltola, T.; Tuominen, E.; Tuominiemi, J.; Tuovinen, E.; Wendland, L.; Talvitie, J.; Tuuva, T.; Besancon, M.; Couderc, F.; Dejardin, M.; Denegri, D.; Fabbro, B.; Faure, J. L.; Favaro, C.; Ferri, F.; Ganjour, S.; Givernaud, A.; Gras, P.; Hamel de Monchenault, G.; Jarry, P.; Locci, E.; Machet, M.; Malcles, J.; Rander, J.; Rosowsky, A.; Titov, M.; Zghiche, A.; Antropov, I.; Baffioni, S.; Beaudette, F.; Busson, P.; Cadamuro, L.; Chapon, E.; Charlot, C.; Dahms, T.; Davignon, O.; Filipovic, N.; Florent, A.; Granier de Cassagnac, R.; Lisniak, S.; Mastrolorenzo, L.; Miné, P.; Naranjo, I. N.; Nguyen, M.; Ochando, C.; Ortona, G.; Paganini, P.; Pigard, P.; Regnard, S.; Salerno, R.; Sauvan, J. B.; Sirois, Y.; Strebler, T.; Yilmaz, Y.; Zabi, A.; Agram, J.-L.; Andrea, J.; Aubin, A.; Bloch, D.; Brom, J.-M.; Buttignol, M.; Chabert, E. C.; Chanon, N.; Collard, C.; Conte, E.; Coubez, X.; Fontaine, J.-C.; Gelé, D.; Goerlach, U.; Goetzmann, C.; Le Bihan, A.-C.; Merlin, J. A.; Skovpen, K.; Van Hove, P.; Gadrat, S.; Beauceron, S.; Bernet, C.; Boudoul, G.; Bouvier, E.; Carrillo Montoya, C. A.; Chierici, R.; Contardo, D.; Courbon, B.; Depasse, P.; El Mamouni, H.; Fan, J.; Fay, J.; Gascon, S.; Gouzevitch, M.; Ille, B.; Lagarde, F.; Laktineh, I. B.; Lethuillier, M.; Mirabito, L.; Pequegnot, A. L.; Perries, S.; Ruiz Alvarez, J. D.; Sabes, D.; Sgandurra, L.; Sordini, V.; Vander Donckt, M.; Verdier, P.; Viret, S.; Toriashvili, T.; Tsamalaidze, Z.; Autermann, C.; Beranek, S.; Edelhoff, M.; Feld, L.; Heister, A.; Kiesel, M. K.; Klein, K.; Lipinski, M.; Ostapchuk, A.; Preuten, M.; Raupach, F.; Schael, S.; Schulte, J. 
F.; Verlage, T.; Weber, H.; Wittmer, B.; Zhukov, V.; Ata, M.; Brodski, M.; Dietz-Laursonn, E.; Duchardt, D.; Endres, M.; Erdmann, M.; Erdweg, S.; Esch, T.; Fischer, R.; Güth, A.; Hebbeker, T.; Heidemann, C.; Hoepfner, K.; Klingebiel, D.; Knutzen, S.; Kreuzer, P.; Merschmeyer, M.; Meyer, A.; Millet, P.; Olschewski, M.; Padeken, K.; Papacz, P.; Pook, T.; Radziej, M.; Reithler, H.; Rieger, M.; Scheuch, F.; Sonnenschein, L.; Teyssier, D.; Thüer, S.; Cherepanov, V.; Erdogan, Y.; Flügge, G.; Geenen, H.; Geisler, M.; Hoehle, F.; Kargoll, B.; Kress, T.; Kuessel, Y.; Künsken, A.; Lingemann, J.; Nehrkorn, A.; Nowack, A.; Nugent, I. M.; Pistone, C.; Pooth, O.; Stahl, A.; Aldaya Martin, M.; Asin, I.; Bartosik, N.; Behnke, O.; Behrens, U.; Bell, A. J.; Borras, K.; Burgmeier, A.; Cakir, A.; Calligaris, L.; Campbell, A.; Choudhury, S.; Costanza, F.; Diez Pardos, C.; Dolinska, G.; Dooling, S.; Dorland, T.; Eckerlin, G.; Eckstein, D.; Eichhorn, T.; Flucke, G.; Gallo, E.; Garay Garcia, J.; Geiser, A.; Gizhko, A.; Gunnellini, P.; Hauk, J.; Hempel, M.; Jung, H.; Kalogeropoulos, A.; Karacheban, O.; Kasemann, M.; Katsas, P.; Kieseler, J.; Kleinwort, C.; Korol, I.; Lange, W.; Leonard, J.; Lipka, K.; Lobanov, A.; Lohmann, W.; Mankel, R.; Marfin, I.; Melzer-Pellmann, I.-A.; Meyer, A. B.; Mittag, G.; Mnich, J.; Mussgiller, A.; Naumann-Emme, S.; Nayak, A.; Ntomari, E.; Perrey, H.; Pitzl, D.; Placakyte, R.; Raspereza, A.; Roland, B.; Sahin, M. Ö.; Saxena, P.; Schoerner-Sadenius, T.; Schröder, M.; Seitz, C.; Spannagel, S.; Trippkewitz, K. D.; Walsh, R.; Wissing, C.; Blobel, V.; Centis Vignali, M.; Draeger, A. R.; Erfle, J.; Garutti, E.; Goebel, K.; Gonzalez, D.; Görner, M.; Haller, J.; Hoffmann, M.; Höing, R. 
S.; Junkes, A.; Klanner, R.; Kogler, R.; Lapsien, T.; Lenz, T.; Marchesini, I.; Marconi, D.; Meyer, M.; Nowatschin, D.; Ott, J.; Pantaleo, F.; Peiffer, T.; Perieanu, A.; Pietsch, N.; Poehlsen, J.; Rathjens, D.; Sander, C.; Schettler, H.; Schleper, P.; Schlieckau, E.; Schmidt, A.; Schwandt, J.; Seidel, M.; Sola, V.; Stadie, H.; Steinbrück, G.; Tholen, H.; Troendle, D.; Usai, E.; Vanelderen, L.; Vanhoefer, A.; Vormwald, B.; Akbiyik, M.; Barth, C.; Baus, C.; Berger, J.; Böser, C.; Butz, E.; Chwalek, T.; Colombo, F.; De Boer, W.; Descroix, A.; Dierlamm, A.; Fink, S.; Frensch, F.; Giffels, M.; Gilbert, A.; Haitz, D.; Hartmann, F.; Heindl, S. M.; Husemann, U.; Katkov, I.; Kornmayer, A.; Lobelle Pardo, P.; Maier, B.; Mildner, H.; Mozer, M. U.; Müller, T.; Müller, Th.; Plagge, M.; Quast, G.; Rabbertz, K.; Röcker, S.; Roscher, F.; Simonis, H. J.; Stober, F. M.; Ulrich, R.; Wagner-Kuhr, J.; Wayand, S.; Weber, M.; Weiler, T.; Wöhrmann, C.; Wolf, R.; Anagnostou, G.; Daskalakis, G.; Geralis, T.; Giakoumopoulou, V. A.; Kyriakis, A.; Loukas, D.; Psallidas, A.; Topsis-Giotis, I.; Agapitos, A.; Kesisoglou, S.; Panagiotou, A.; Saoulidou, N.; Tziaferi, E.; Evangelou, I.; Flouris, G.; Foudas, C.; Kokkas, P.; Loukas, N.; Manthos, N.; Papadopoulos, I.; Paradas, E.; Strologas, J.; Bencze, G.; Hajdu, C.; Hazi, A.; Hidas, P.; Horvath, D.; Sikler, F.; Veszpremi, V.; Vesztergombi, G.; Zsigmond, A. J.; Beni, N.; Czellar, S.; Karancsi, J.; Molnar, J.; Szillasi, Z.; Bartók, M.; Makovec, A.; Raics, P.; Trocsanyi, Z. L.; Ujvari, B.; Mal, P.; Mandal, K.; Sahoo, D. K.; Sahoo, N.; Swain, S. K.; Bansal, S.; Beri, S. B.; Bhatnagar, V.; Chawla, R.; Gupta, R.; Bhawandeep, U.; Kalsi, A. K.; Kaur, A.; Kaur, M.; Kumar, R.; Mehta, A.; Mittal, M.; Singh, J. B.; Walia, G.; Kumar, Ashok; Bhardwaj, A.; Choudhary, B. C.; Garg, R. 
B.; Kumar, A.; Malhotra, S.; Naimuddin, M.; Nishu, N.; Ranjan, K.; Sharma, R.; Sharma, V.; Bhattacharya, S.; Chatterjee, K.; Dey, S.; Dutta, S.; Jain, Sa.; Majumdar, N.; Modak, A.; Mondal, K.; Mukherjee, S.; Mukhopadhyay, S.; Roy, A.; Roy, D.; Chowdhury, S. Roy; Sarkar, S.; Sharan, M.; Abdulsalam, A.; Chudasama, R.; Dutta, D.; Jha, V.; Kumar, V.; Mohanty, A. K.; Pant, L. M.; Shukla, P.; Topkar, A.; Aziz, T.; Banerjee, S.; Bhowmik, S.; Chatterjee, R. M.; Dewanjee, R. K.; Dugad, S.; Ganguly, S.; Ghosh, S.; Guchait, M.; Gurtu, A.; Kole, G.; Kumar, S.; Mahakud, B.; Maity, M.; Majumder, G.; Mazumdar, K.; Mitra, S.; Mohanty, G. B.; Parida, B.; Sarkar, T.; Sur, N.; Sutar, B.; Wickramage, N.; Chauhan, S.; Dube, S.; Sharma, S.; Bakhshiansohi, H.; Behnamian, H.; Etesami, S. M.; Fahim, A.; Goldouzian, R.; Khakzad, M.; Najafabadi, M. Mohammadi; Naseri, M.; Paktinat Mehdiabadi, S.; Rezaei Hosseinabadi, F.; Safarzadeh, B.; Zeinali, M.; Felcini, M.; Grunewald, M.; Abbrescia, M.; Calabria, C.; Caputo, C.; Colaleo, A.; Creanza, D.; Cristella, L.; De Filippis, N.; De Palma, M.; Fiore, L.; Iaselli, G.; Maggi, G.; Maggi, M.; Miniello, G.; My, S.; Nuzzo, S.; Pompili, A.; Pugliese, G.; Radogna, R.; Ranieri, A.; Selvaggi, G.; Silvestris, L.; Venditti, R.; Verwilligen, P.; Abbiendi, G.; Battilana, C.; Benvenuti, A. C.; Bonacorsi, D.; Braibant-Giacomelli, S.; Brigliadori, L.; Campanini, R.; Capiluppi, P.; Castro, A.; Cavallo, F. R.; Chhibra, S. S.; Codispoti, G.; Cuffiani, M.; Dallavalle, G. M.; Fabbri, F.; Fanfani, A.; Fasanella, D.; Giacomelli, P.; Grandi, C.; Guiducci, L.; Marcellini, S.; Masetti, G.; Montanari, A.; Navarria, F. L.; Perrotta, A.; Rossi, A. M.; Rovelli, T.; Siroli, G. 
P.; Tosi, N.; Travaglini, R.; Cappello, G.; Chiorboli, M.; Costa, S.; Giordano, F.; Potenza, R.; Tricomi, A.; Tuve, C.; Barbagli, G.; Ciulli, V.; Civinini, C.; D'Alessandro, R.; Focardi, E.; Gonzi, S.; Gori, V.; Lenzi, P.; Meschini, M.; Paoletti, S.; Sguazzoni, G.; Tropiano, A.; Viliani, L.; Benussi, L.; Bianco, S.; Fabbri, F.; Piccolo, D.; Primavera, F.; Calvelli, V.; Ferro, F.; Lo Vetere, M.; Monge, M. R.; Robutti, E.; Tosi, S.; Brianza, L.; Dinardo, M. E.; Fiorendi, S.; Gennai, S.; Gerosa, R.; Ghezzi, A.; Govoni, P.; Malvezzi, S.; Manzoni, R. A.; Marzocchi, B.; Menasce, D.; Moroni, L.; Paganoni, M.; Pedrini, D.; Ragazzi, S.; Redaelli, N.; Tabarelli de Fatis, T.; Buontempo, S.; Cavallo, N.; Di Guida, S.; Esposito, M.; Fabozzi, F.; Iorio, A. O. M.; Lanza, G.; Lista, L.; Meola, S.; Merola, M.; Paolucci, P.; Sciacca, C.; Thyssen, F.; Azzi, P.; Bacchetta, N.; Bellato, M.; Benato, L.; Bisello, D.; Boletti, A.; Branca, A.; Carlin, R.; Checchia, P.; Dall'Osso, M.; Dorigo, T.; Dosselli, U.; Fanzago, F.; Gasparini, F.; Gasparini, U.; Gonella, F.; Gozzelino, A.; Kanishchev, K.; Lacaprara, S.; Maron, G.; Pazzini, J.; Pozzobon, N.; Ronchese, P.; Tosi, M.; Vanini, S.; Ventura, S.; Zanetti, M.; Zucchetta, A.; Zumerle, G.; Braghieri, A.; Magnani, A.; Montagna, P.; Ratti, S. P.; Re, V.; Riccardi, C.; Salvini, P.; Vai, I.; Vitulo, P.; Alunni Solestizi, L.; Biasini, M.; Bilei, G. M.; Ciangottini, D.; Fanò, L.; Lariccia, P.; Mantovani, G.; Menichelli, M.; Saha, A.; Santocchia, A.; Spiezia, A.; Androsov, K.; Azzurri, P.; Bagliesi, G.; Bernardini, J.; Boccali, T.; Broccolo, G.; Castaldi, R.; Ciocci, M. A.; Dell'Orso, R.; Donato, S.; Fedi, G.; Foà, L.; Giassi, A.; Grippo, M. T.; Ligabue, F.; Lomtadze, T.; Martini, L.; Messineo, A.; Palla, F.; Rizzi, A.; Savoy-Navarro, A.; Serban, A. T.; Spagnolo, P.; Squillacioti, P.; Tenchini, R.; Tonelli, G.; Venturi, A.; Verdini, P. 
G.; Barone, L.; Cavallari, F.; D'imperio, G.; Del Re, D.; Diemoz, M.; Gelli, S.; Jorda, C.; Longo, E.; Margaroli, F.; Meridiani, P.; Organtini, G.; Paramatti, R.; Preiato, F.; Rahatlou, S.; Rovelli, C.; Santanastasio, F.; Traczyk, P.; Amapane, N.; Arcidiacono, R.; Argiro, S.; Arneodo, M.; Bellan, R.; Biino, C.; Cartiglia, N.; Costa, M.; Covarelli, R.; Degano, A.; Demaria, N.; Finco, L.; Kiani, B.; Mariotti, C.; Maselli, S.; Migliore, E.; Monaco, V.; Monteil, E.; Musich, M.; Obertino, M. M.; Pacher, L.; Pastrone, N.; Pelliccioni, M.; Pinna Angioni, G. L.; Ravera, F.; Romero, A.; Ruspa, M.; Sacchi, R.; Solano, A.; Staiano, A.; Tamponi, U.; Belforte, S.; Candelise, V.; Casarsa, M.; Cossutti, F.; Della Ricca, G.; Gobbo, B.; La Licata, C.; Marone, M.; Schizzi, A.; Zanetti, A.; Kropivnitskaya, A.; Nam, S. K.; Kim, D. H.; Kim, G. N.; Kim, M. S.; Kong, D. J.; Lee, S.; Oh, Y. D.; Sakharov, A.; Son, D. C.; Brochero Cifuentes, J. A.; Kim, H.; Kim, T. J.; Song, S.; Choi, S.; Go, Y.; Gyun, D.; Hong, B.; Jo, M.; Kim, H.; Kim, Y.; Lee, B.; Lee, K.; Lee, K. S.; Lee, S.; Park, S. K.; Roh, Y.; Yoo, H. D.; Choi, M.; Kim, H.; Kim, J. H.; Lee, J. S. H.; Park, I. C.; Ryu, G.; Ryu, M. S.; Choi, Y.; Goh, J.; Kim, D.; Kwon, E.; Lee, J.; Yu, I.; Juodagalvis, A.; Vaitkus, J.; Ahmed, I.; Ibrahim, Z. A.; Komaragiri, J. R.; Ali, M. A. B. Md; Mohamad Idris, F.; Abdullah, W. A. T. Wan; Yusli, M. N.; Casimiro Linares, E.; Castilla-Valdez, H.; De La Cruz-Burelo, E.; Heredia-De La Cruz, I.; Hernandez-Almada, A.; Lopez-Fernandez, R.; Sanchez-Hernandez, A.; Carrillo Moreno, S.; Vazquez Valencia, F.; Pedraza, I.; Salazar Ibarguen, H. A.; Morelos Pineda, A.; Krofcheck, D.; Butler, P. H.; Ahmad, A.; Ahmad, M.; Hassan, Q.; Hoorani, H. R.; Khan, W. 
A.; Khurshid, T.; Shoaib, M.; Bialkowska, H.; Bluj, M.; Boimska, B.; Frueboes, T.; Górski, M.; Kazana, M.; Nawrocki, K.; Romanowska-Rybinska, K.; Szleper, M.; Zalewski, P.; Brona, G.; Bunkowski, K.; Byszuk, A.; Doroba, K.; Kalinowski, A.; Konecki, M.; Krolikowski, J.; Misiura, M.; Olszewski, M.; Walczak, M.; Bargassa, P.; Silva, C. Beirão Da Cruz E.; Di Francesco, A.; Faccioli, P.; Ferreira Parracho, P. G.; Gallinaro, M.; Leonardo, N.; Lloret Iglesias, L.; Nguyen, F.; Rodrigues Antunes, J.; Seixas, J.; Toldaiev, O.; Vadruccio, D.; Varela, J.; Vischia, P.; Afanasiev, S.; Bunin, P.; Gavrilenko, M.; Golutvin, I.; Gorbunov, I.; Kamenev, A.; Karjavin, V.; Konoplyanikov, V.; Lanev, A.; Malakhov, A.; Matveev, V.; Moisenz, P.; Palichik, V.; Perelygin, V.; Shmatov, S.; Shulha, S.; Skatchkov, N.; Smirnov, V.; Zarubin, A.; Golovtsov, V.; Ivanov, Y.; Kim, V.; Kuznetsova, E.; Levchenko, P.; Murzin, V.; Oreshkin, V.; Smirnov, I.; Sulimov, V.; Uvarov, L.; Vavilov, S.; Vorobyev, A.; Andreev, Yu.; Dermenev, A.; Gninenko, S.; Golubev, N.; Karneyeu, A.; Kirsanov, M.; Krasnikov, N.; Pashenkov, A.; Tlisov, D.; Toropin, A.; Epshteyn, V.; Gavrilov, V.; Lychkovskaya, N.; Popov, V.; Pozdnyakov, I.; Safronov, G.; Spiridonov, A.; Vlasov, E.; Zhokin, A.; Bylinkin, A.; Andreev, V.; Azarkin, M.; Dremin, I.; Kirakosyan, M.; Leonidov, A.; Mesyats, G.; Rusakov, S. 
V.; Baskakov, A.; Belyaev, A.; Boos, E.; Dubinin, M.; Dudko, L.; Ershov, A.; Gribushin, A.; Kaminskiy, A.; Klyukhin, V.; Kodolova, O.; Lokhtin, I.; Myagkov, I.; Obraztsov, S.; Petrushanko, S.; Savrin, V.; Azhgirey, I.; Bayshev, I.; Bitioukov, S.; Kachanov, V.; Kalinin, A.; Konstantinov, D.; Krychkine, V.; Petrov, V.; Ryutin, R.; Sobol, A.; Tourtchanovitch, L.; Troshin, S.; Tyurin, N.; Uzunian, A.; Volkov, A.; Adzic, P.; Milosevic, J.; Rekovic, V.; Alcaraz Maestre, J.; Calvo, E.; Cerrada, M.; Chamizo Llatas, M.; Colino, N.; De La Cruz, B.; Delgado Peris, A.; Domínguez Vázquez, D.; Escalante Del Valle, A.; Fernandez Bedoya, C.; Fernández Ramos, J. P.; Flix, J.; Fouz, M. C.; Garcia-Abia, P.; Gonzalez Lopez, O.; Goy Lopez, S.; Hernandez, J. M.; Josa, M. I.; Navarro De Martino, E.; Pérez-Calero Yzquierdo, A.; Puerta Pelayo, J.; Quintario Olmeda, A.; Redondo, I.; Romero, L.; Santaolalla, J.; Soares, M. S.; Albajar, C.; de Trocóniz, J. F.; Missiroli, M.; Moran, D.; Cuevas, J.; Fernandez Menendez, J.; Folgueras, S.; Gonzalez Caballero, I.; Palencia Cortezon, E.; Vizan Garcia, J. M.; Cabrillo, I. J.; Calderon, A.; Castiñeiras De Saa, J. R.; De Castro Manzano, P.; Duarte Campderros, J.; Fernandez, M.; Garcia-Ferrero, J.; Gomez, G.; Lopez Virto, A.; Marco, J.; Marco, R.; Martinez Rivero, C.; Matorras, F.; Munoz Sanchez, F. J.; Piedra Gomez, J.; Rodrigo, T.; Rodríguez-Marrero, A. Y.; Ruiz-Jimeno, A.; Scodellaro, L.; Trevisani, N.; Vila, I.; Vilar Cortabitarte, R.; Abbaneo, D.; Auffray, E.; Auzinger, G.; Bachtis, M.; Baillon, P.; Ball, A. H.; Barney, D.; Benaglia, A.; Bendavid, J.; Benhabib, L.; Benitez, J. F.; Berruti, G. 
M.; Bloch, P.; Bocci, A.; Bonato, A.; Botta, C.; Breuker, H.; Camporesi, T.; Castello, R.; Cerminara, G.; D'Alfonso, M.; d'Enterria, D.; Dabrowski, A.; Daponte, V.; David, A.; De Gruttola, M.; De Guio, F.; De Roeck, A.; De Visscher, S.; Di Marco, E.; Dobson, M.; Dordevic, M.; Dorney, B.; du Pree, T.; Dünser, M.; Dupont, N.; Elliott-Peisert, A.; Franzoni, G.; Funk, W.; Gigi, D.; Gill, K.; Giordano, D.; Girone, M.; Glege, F.; Guida, R.; Gundacker, S.; Guthoff, M.; Hammer, J.; Harris, P.; Hegeman, J.; Innocente, V.; Janot, P.; Kirschenmann, H.; Kortelainen, M. J.; Kousouris, K.; Krajczar, K.; Lecoq, P.; Lourenço, C.; Lucchini, M. T.; Magini, N.; Malgeri, L.; Mannelli, M.; Martelli, A.; Masetti, L.; Meijers, F.; Mersi, S.; Meschi, E.; Moortgat, F.; Morovic, S.; Mulders, M.; Nemallapudi, M. V.; Neugebauer, H.; Orfanelli, S.; Orsini, L.; Pape, L.; Perez, E.; Peruzzi, M.; Petrilli, A.; Petrucciani, G.; Pfeiffer, A.; Piparo, D.; Racz, A.; Rolandi, G.; Rovere, M.; Ruan, M.; Sakulin, H.; Schäfer, C.; Schwick, C.; Sharma, A.; Silva, P.; Simon, M.; Sphicas, P.; Steggemann, J.; Stieger, B.; Stoye, M.; Takahashi, Y.; Treille, D.; Triossi, A.; Tsirou, A.; Veres, G. I.; Wardle, N.; Wöhri, H. K.; Zagozdzinska, A.; Zeuner, W. D.; Bertl, W.; Deiters, K.; Erdmann, W.; Horisberger, R.; Ingram, Q.; Kaestli, H. C.; Kotlinski, D.; Langenegger, U.; Renker, D.; Rohe, T.; Bachmair, F.; Bäni, L.; Bianchini, L.; Casal, B.; Dissertori, G.; Dittmar, M.; Donegà, M.; Eller, P.; Grab, C.; Heidegger, C.; Hits, D.; Hoss, J.; Kasieczka, G.; Lustermann, W.; Mangano, B.; Marionneau, M.; Martinez Ruiz del Arbol, P.; Masciovecchio, M.; Meister, D.; Micheli, F.; Musella, P.; Nessi-Tedaldi, F.; Pandolfi, F.; Pata, J.; Pauss, F.; Perrozzi, L.; Quittnat, M.; Rossini, M.; Starodumov, A.; Takahashi, M.; Tavolaro, V. R.; Theofilatos, K.; Wallny, R.; Aarrestad, T. K.; Amsler, C.; Caminada, L.; Canelli, M. 
F.; Chiochia, V.; De Cosa, A.; Galloni, C.; Hinzmann, A.; Hreus, T.; Kilminster, B.; Lange, C.; Ngadiuba, J.; Pinna, D.; Robmann, P.; Ronga, F. J.; Salerno, D.; Yang, Y.; Cardaci, M.; Chen, K. H.; Doan, T. H.; Jain, Sh.; Khurana, R.; Konyushikhin, M.; Kuo, C. M.; Lin, W.; Lu, Y. J.; Yu, S. S.; Kumar, Arun; Bartek, R.; Chang, P.; Chang, Y. H.; Chang, Y. W.; Chao, Y.; Chen, K. F.; Chen, P. H.; Dietz, C.; Fiori, F.; Grundler, U.; Hou, W.-S.; Hsiung, Y.; Liu, Y. F.; Lu, R.-S.; Miñano Moya, M.; Petrakou, E.; Tsai, J. f.; Tzeng, Y. M.; Asavapibhop, B.; Kovitanggoon, K.; Singh, G.; Srimanobhas, N.; Suwonjandee, N.; Adiguzel, A.; Cerci, S.; Demiroglu, Z. S.; Dozen, C.; Dumanoglu, I.; Girgis, S.; Gokbulut, G.; Guler, Y.; Gurpinar, E.; Hos, I.; Kangal, E. E.; Kayis Topaksu, A.; Onengut, G.; Ozdemir, K.; Ozturk, S.; Tali, B.; Topakli, H.; Vergili, M.; Zorbilmez, C.; Akin, I. V.; Bilin, B.; Bilmis, S.; Isildak, B.; Karapinar, G.; Yalvac, M.; Zeyrek, M.; Albayrak, E. A.; Gülmez, E.; Kaya, M.; Kaya, O.; Yetkin, T.; Cankocak, K.; Sen, S.; Vardarli, F. I.; Grynyov, B.; Levchuk, L.; Sorokin, P.; Aggleton, R.; Ball, F.; Beck, L.; Brooke, J. J.; Clement, E.; Cussans, D.; Flacher, H.; Goldstein, J.; Grimes, M.; Heath, G. P.; Heath, H. F.; Jacob, J.; Kreczko, L.; Lucas, C.; Meng, Z.; Newbold, D. M.; Paramesvaran, S.; Poll, A.; Sakuma, T.; Seif El Nasr-storey, S.; Senkin, S.; Smith, D.; Smith, V. J.; Bell, K. W.; Belyaev, A.; Brew, C.; Brown, R. M.; Cieri, D.; Cockerill, D. J. A.; Coughlan, J. A.; Harder, K.; Harper, S.; Olaiya, E.; Petyt, D.; Shepherd-Themistocleous, C. H.; Thea, A.; Tomalin, I. R.; Williams, T.; Womersley, W. J.; Worm, S. 
D.; Baber, M.; Bainbridge, R.; Buchmuller, O.; Bundock, A.; Burton, D.; Casasso, S.; Citron, M.; Colling, D.; Corpe, L.; Cripps, N.; Dauncey, P.; Davies, G.; De Wit, A.; Della Negra, M.; Dunne, P.; Elwood, A.; Ferguson, W.; Fulcher, J.; Futyan, D.; Hall, G.; Iles, G.; Kenzie, M.; Lane, R.; Lucas, R.; Lyons, L.; Magnan, A.-M.; Malik, S.; Nash, J.; Nikitenko, A.; Pela, J.; Pesaresi, M.; Petridis, K.; Raymond, D. M.; Richards, A.; Rose, A.; Seez, C.; Tapper, A.; Uchida, K.; Vazquez Acosta, M.; Virdee, T.; Zenz, S. C.; Cole, J. E.; Hobson, P. R.; Khan, A.; Kyberd, P.; Leggat, D.; Leslie, D.; Reid, I. D.; Symonds, P.; Teodorescu, L.; Turner, M.; Borzou, A.; Call, K.; Dittmann, J.; Hatakeyama, K.; Kasmi, A.; Liu, H.; Pastika, N.; Charaf, O.; Cooper, S. I.; Henderson, C.; Rumerio, P.; Avetisyan, A.; Bose, T.; Fantasia, C.; Gastler, D.; Lawson, P.; Rankin, D.; Richardson, C.; Rohlf, J.; St. John, J.; Sulak, L.; Zou, D.; Alimena, J.; Berry, E.; Bhattacharya, S.; Cutts, D.; Dhingra, N.; Ferapontov, A.; Garabedian, A.; Hakala, J.; Heintz, U.; Laird, E.; Landsberg, G.; Mao, Z.; Narain, M.; Piperov, S.; Sagir, S.; Sinthuprasith, T.; Syarif, R.; Breedon, R.; Breto, G.; Calderon De La Barca Sanchez, M.; Chauhan, S.; Chertok, M.; Conway, J.; Conway, R.; Cox, P. T.; Erbacher, R.; Gardner, M.; Ko, W.; Lander, R.; Mulhearn, M.; Pellett, D.; Pilot, J.; Ricci-Tam, F.; Shalhout, S.; Smith, J.; Squires, M.; Stolp, D.; Tripathi, M.; Wilbur, S.; Yohay, R.; Cousins, R.; Everaerts, P.; Farrell, C.; Hauser, J.; Ignatenko, M.; Saltzberg, D.; Takasugi, E.; Valuev, V.; Weber, M.; Burt, K.; Clare, R.; Ellison, J.; Gary, J. W.; Hanson, G.; Heilman, J.; Ivova PANEVA, M.; Jandir, P.; Kennedy, E.; Lacroix, F.; Long, O. R.; Luthra, A.; Malberti, M.; Olmedo Negrete, M.; Shrinivas, A.; Wei, H.; Wimpenny, S.; Yates, B. R.; Branson, J. G.; Cerati, G. B.; Cittolin, S.; D'Agnolo, R. 
T.; Holzner, A.; Kelley, R.; Klein, D.; Letts, J.; Macneill, I.; Olivito, D.; Padhi, S.; Pieri, M.; Sani, M.; Sharma, V.; Simon, S.; Tadel, M.; Vartak, A.; Wasserbaech, S.; Welke, C.; Würthwein, F.; Yagil, A.; Zevi Della Porta, G.; Barge, D.; Bradmiller-Feld, J.; Campagnari, C.; Dishaw, A.; Dutta, V.; Flowers, K.; Sevilla, M. Franco; Geffert, P.; George, C.; Golf, F.; Gouskos, L.; Gran, J.; Incandela, J.; Justus, C.; Mccoll, N.; Mullin, S. D.; Richman, J.; Stuart, D.; Suarez, I.; To, W.; West, C.; Yoo, J.; Anderson, D.; Apresyan, A.; Bornheim, A.; Bunn, J.; Chen, Y.; Duarte, J.; Mott, A.; Newman, H. B.; Pena, C.; Pierini, M.; Spiropulu, M.; Vlimant, J. R.; Xie, S.; Zhu, R. Y.; Andrews, M. B.; Azzolini, V.; Calamba, A.; Carlson, B.; Ferguson, T.; Paulini, M.; Russ, J.; Sun, M.; Vogel, H.; Vorobiev, I.; Cumalat, J. P.; Ford, W. T.; Gaz, A.; Jensen, F.; Johnson, A.; Krohn, M.; Mulholland, T.; Nauenberg, U.; Stenson, K.; Wagner, S. R.; Alexander, J.; Chatterjee, A.; Chaves, J.; Chu, J.; Dittmer, S.; Eggert, N.; Mirman, N.; Kaufman, G. Nicolas; Patterson, J. R.; Rinkevicius, A.; Ryd, A.; Skinnari, L.; Soffi, L.; Sun, W.; Tan, S. M.; Teo, W. D.; Thom, J.; Thompson, J.; Tucker, J.; Weng, Y.; Wittich, P.; Abdullin, S.; Albrow, M.; Anderson, J.; Apollinari, G.; Banerjee, S.; Bauerdick, L. A. T.; Beretvas, A.; Berryhill, J.; Bhat, P. C.; Bolla, G.; Burkett, K.; Butler, J. N.; Cheung, H. W. K.; Chlebana, F.; Cihangir, S.; Elvira, V. D.; Fisk, I.; Freeman, J.; Gottschalk, E.; Gray, L.; Green, D.; Grünendahl, S.; Gutsche, O.; Hanlon, J.; Hare, D.; Harris, R. M.; Hasegawa, S.; Hirschauer, J.; Hu, Z.; Jindariani, S.; Johnson, M.; Joshi, U.; Jung, A. W.; Klima, B.; Kreis, B.; Kwan, S.; Lammel, S.; Linacre, J.; Lincoln, D.; Lipton, R.; Liu, T.; Lopes De Sá, R.; Lykken, J.; Maeshima, K.; Marraffino, J. M.; Martinez Outschoorn, V. 
I.; Maruyama, S.; Mason, D.; McBride, P.; Merkel, P.; Mishra, K.; Mrenna, S.; Nahn, S.; Newman-Holmes, C.; O'Dell, V.; Pedro, K.; Prokofyev, O.; Rakness, G.; Sexton-Kennedy, E.; Soha, A.; Spalding, W. J.; Spiegel, L.; Taylor, L.; Tkaczyk, S.; Tran, N. V.; Uplegger, L.; Vaandering, E. W.; Vernieri, C.; Verzocchi, M.; Vidal, R.; Weber, H. A.; Whitbeck, A.; Yang, F.; Acosta, D.; Avery, P.; Bortignon, P.; Bourilkov, D.; Carnes, A.; Carver, M.; Curry, D.; Das, S.; Di Giovanni, G. P.; Field, R. D.; Furic, I. K.; Hugon, J.; Konigsberg, J.; Korytov, A.; Low, J. F.; Ma, P.; Matchev, K.; Mei, H.; Milenovic, P.; Mitselmakher, G.; Rank, D.; Rossin, R.; Shchutska, L.; Snowball, M.; Sperka, D.; Terentyev, N.; Thomas, L.; Wang, J.; Wang, S.; Yelton, J.; Hewamanage, S.; Linn, S.; Markowitz, P.; Martinez, G.; Rodriguez, J. L.; Ackert, A.; Adams, J. R.; Adams, T.; Askew, A.; Bochenek, J.; Diamond, B.; Haas, J.; Hagopian, S.; Hagopian, V.; Johnson, K. F.; Khatiwada, A.; Prosper, H.; Weinberg, M.; Baarmand, M. M.; Bhopatkar, V.; Colafranceschi, S.; Hohlmann, M.; Kalakhety, H.; Noonan, D.; Roy, T.; Yumiceva, F.; Adams, M. R.; Apanasevich, L.; Berry, D.; Betts, R. R.; Bucinskaite, I.; Cavanaugh, R.; Evdokimov, O.; Gauthier, L.; Gerber, C. E.; Hofman, D. J.; Kurt, P.; O'Brien, C.; Sandoval Gonzalez, I. D.; Silkworth, C.; Turner, P.; Varelas, N.; Wu, Z.; Zakaria, M.; Bilki, B.; Clarida, W.; Dilsiz, K.; Durgut, S.; Gandrajula, R. P.; Haytmyradov, M.; Khristenko, V.; Merlo, J.-P.; Mermerkaya, H.; Mestvirishvili, A.; Moeller, A.; Nachtman, J.; Ogul, H.; Onel, Y.; Ozok, F.; Penzo, A.; Snyder, C.; Tan, P.; Tiras, E.; Wetzel, J.; Yi, K.; Anderson, I.; Barnett, B. A.; Blumenfeld, B.; Eminizer, N.; Fehling, D.; Feng, L.; Gritsan, A. V.; Maksimovic, P.; Martin, C.; Osherson, M.; Roskes, J.; Sady, A.; Sarica, U.; Swartz, M.; Xiao, M.; Xin, Y.; You, C.; Baringer, P.; Bean, A.; Benelli, G.; Bruner, C.; Kenny, R. 
P., III; Majumder, D.; Malek, M.; Murray, M.; Sanders, S.; Stringer, R.; Wang, Q.; Ivanov, A.; Kaadze, K.; Khalil, S.; Makouski, M.; Maravin, Y.; Mohammadi, A.; Saini, L. K.; Skhirtladze, N.; Toda, S.; Lange, D.; Rebassoo, F.; Wright, D.; Anelli, C.; Baden, A.; Baron, O.; Belloni, A.; Calvert, B.; Eno, S. C.; Ferraioli, C.; Gomez, J. A.; Hadley, N. J.; Jabeen, S.; Kellogg, R. G.; Kolberg, T.; Kunkle, J.; Lu, Y.; Mignerey, A. C.; Shin, Y. H.; Skuja, A.; Tonjes, M. B.; Tonwar, S. C.; Apyan, A.; Barbieri, R.; Baty, A.; Bierwagen, K.; Brandt, S.; Busza, W.; Cali, I. A.; Demiragli, Z.; Di Matteo, L.; Gomez Ceballos, G.; Goncharov, M.; Gulhan, D.; Iiyama, Y.; Innocenti, G. M.; Klute, M.; Kovalskyi, D.; Lai, Y. S.; Lee, Y.-J.; Levin, A.; Luckey, P. D.; Marini, A. C.; Mcginn, C.; Mironov, C.; Niu, X.; Paus, C.; Ralph, D.; Roland, C.; Roland, G.; Salfeld-Nebgen, J.; Stephans, G. S. F.; Sumorok, K.; Varma, M.; Velicanu, D.; Veverka, J.; Wang, J.; Wang, T. W.; Wyslouch, B.; Yang, M.; Zhukova, V.; Dahmes, B.; Evans, A.; Finkel, A.; Gude, A.; Hansen, P.; Kalafut, S.; Kao, S. C.; Klapoetke, K.; Kubota, Y.; Lesko, Z.; Mans, J.; Nourbakhsh, S.; Ruckstuhl, N.; Rusack, R.; Tambe, N.; Turkewitz, J.; Acosta, J. G.; Oliveros, S.; Avdeeva, E.; Bloom, K.; Bose, S.; Claes, D. R.; Dominguez, A.; Fangmeier, C.; Gonzalez Suarez, R.; Kamalieddin, R.; Keller, J.; Knowlton, D.; Kravchenko, I.; Lazo-Flores, J.; Meier, F.; Monroy, J.; Ratnikov, F.; Siado, J. E.; Snow, G. R.; Alyari, M.; Dolen, J.; George, J.; Godshalk, A.; Harrington, C.; Iashvili, I.; Kaisen, J.; Kharchilava, A.; Kumar, A.; Rappoccio, S.; Roozbahani, B.; Alverson, G.; Barberis, E.; Baumgartel, D.; Chasco, M.; Hortiangtham, A.; Massironi, A.; Morse, D. M.; Nash, D.; Orimoto, T.; Teixeira De Lima, R.; Trocino, D.; Wang, R.-J.; Wood, D.; Zhang, J.; Hahn, K. 
A.; Kubik, A.; Mucia, N.; Odell, N.; Pollack, B.; Pozdnyakov, A.; Schmitt, M.; Stoynev, S.; Sung, K.; Trovato, M.; Velasco, M.; Brinkerhoff, A.; Dev, N.; Hildreth, M.; Jessop, C.; Karmgard, D. J.; Kellams, N.; Lannon, K.; Lynch, S.; Marinelli, N.; Meng, F.; Mueller, C.; Musienko, Y.; Pearson, T.; Planer, M.; Reinsvold, A.; Ruchti, R.; Smith, G.; Taroni, S.; Valls, N.; Wayne, M.; Wolf, M.; Woodard, A.; Antonelli, L.; Brinson, J.; Bylsma, B.; Durkin, L. S.; Flowers, S.; Hart, A.; Hill, C.; Hughes, R.; Ji, W.; Kotov, K.; Ling, T. Y.; Liu, B.; Luo, W.; Puigh, D.; Rodenburg, M.; Winer, B. L.; Wulsin, H. W.; Driga, O.; Elmer, P.; Hardenbrook, J.; Hebda, P.; Koay, S. A.; Lujan, P.; Marlow, D.; Medvedeva, T.; Mooney, M.; Olsen, J.; Palmer, C.; Piroué, P.; Quan, X.; Saka, H.; Stickland, D.; Tully, C.; Werner, J. S.; Zuranski, A.; Malik, S.; Barnes, V. E.; Benedetti, D.; Bortoletto, D.; Gutay, L.; Jha, M. K.; Jones, M.; Jung, K.; Miller, D. H.; Neumeister, N.; Radburn-Smith, B. C.; Shi, X.; Shipsey, I.; Silvers, D.; Sun, J.; Svyatkovskiy, A.; Wang, F.; Xie, W.; Xu, L.; Parashar, N.; Stupak, J.; Adair, A.; Akgun, B.; Chen, Z.; Ecklund, K. M.; Geurts, F. J. M.; Guilbaud, M.; Li, W.; Michlin, B.; Northup, M.; Padley, B. P.; Redjimi, R.; Roberts, J.; Rorie, J.; Tu, Z.; Zabel, J.; Betchart, B.; Bodek, A.; de Barbaro, P.; Demina, R.; Eshaq, Y.; Ferbel, T.; Galanti, M.; Garcia-Bellido, A.; Han, J.; Harel, A.; Hindrichs, O.; Khukhunaishvili, A.; Petrillo, G.; Verzetti, M.; Arora, S.; Barker, A.; Chou, J. 
P.; Contreras-Campana, C.; Contreras-Campana, E.; Duggan, D.; Ferencek, D.; Gershtein, Y.; Gray, R.; Halkiadakis, E.; Hidas, D.; Hughes, E.; Kaplan, S.; Kunnawalkam Elayavalli, R.; Lath, A.; Nash, K.; Panwalkar, S.; Park, M.; Salur, S.; Schnetzer, S.; Sheffield, D.; Somalwar, S.; Stone, R.; Thomas, S.; Thomassen, P.; Walker, M.; Foerster, M.; Riley, G.; Rose, K.; Spanier, S.; York, A.; Bouhali, O.; Castaneda Hernandez, A.; Dalchenko, M.; De Mattia, M.; Delgado, A.; Dildick, S.; Eusebi, R.; Gilmore, J.; Kamon, T.; Krutelyov, V.; Mueller, R.; Osipenkov, I.; Pakhotin, Y.; Patel, R.; Perloff, A.; Rose, A.; Safonov, A.; Tatarinov, A.; Ulmer, K. A.; Akchurin, N.; Cowden, C.; Damgov, J.; Dragoiu, C.; Dudero, P. R.; Faulkner, J.; Kunori, S.; Lamichhane, K.; Lee, S. W.; Libeiro, T.; Undleeb, S.; Volobouev, I.; Appelt, E.; Delannoy, A. G.; Greene, S.; Gurrola, A.; Janjam, R.; Johns, W.; Maguire, C.; Mao, Y.; Melo, A.; Ni, H.; Sheldon, P.; Snook, B.; Tuo, S.; Velkovska, J.; Xu, Q.; Arenton, M. W.; Cox, B.; Francis, B.; Goodell, J.; Hirosky, R.; Ledovskoy, A.; Li, H.; Lin, C.; Neu, C.; Sun, X.; Wang, Y.; Wolfe, E.; Wood, J.; Xia, F.; Clarke, C.; Harr, R.; Karchin, P. E.; Kottachchi Kankanamge Don, C.; Lamichhane, P.; Sturdy, J.; Belknap, D. A.; Carlsmith, D.; Cepeda, M.; Dasu, S.; Dodd, L.; Duric, S.; Friis, E.; Gomber, B.; Grothe, M.; Hall-Wilton, R.; Herndon, M.; Hervé, A.; Klabbers, P.; Lanaro, A.; Levine, A.; Long, K.; Loveless, R.; Mohapatra, A.; Ojalvo, I.; Perry, T.; Pierro, G. A.; Polese, G.; Ruggles, T.; Sarangi, T.; Savin, A.; Sharma, A.; Smith, N.; Smith, W. H.; Taylor, D.; Woods, N.

    2017-02-01

    Improved jet energy scale corrections, based on a data sample corresponding to an integrated luminosity of 19.7 fb-1 collected by the CMS experiment in proton-proton collisions at a center-of-mass energy of 8 TeV, are presented. The corrections as a function of pseudorapidity η and transverse momentum pT are extracted from data and simulated events combining several channels and methods. They account successively for the effects of pileup, uniformity of the detector response, and residual data-simulation jet energy scale differences. Further corrections, depending on the jet flavor and distance parameter (jet size) R, are also presented. The jet energy resolution is measured in data and simulated events and is studied as a function of pileup, jet size, and jet flavor. Typical jet energy resolutions at the central rapidities are 15-20% at 30 GeV, about 10% at 100 GeV, and 5% at 1 TeV. The studies exploit events with dijet topology, as well as photon+jet, Z+jet and multijet events. Several new techniques are used to account for the various sources of jet energy scale corrections, and a full set of uncertainties, and their correlations, are provided. The final uncertainties on the jet energy scale are below 3% across the phase space considered by most analyses (pT > 30 GeV and |η| < 5.0). In the barrel region (|η| < 1.3) an uncertainty below 1% for pT > 30 GeV is reached, when excluding the jet flavor uncertainties, which are provided separately for different jet flavors. A new benchmark for jet energy scale determination at hadron colliders is achieved with 0.32% uncertainty for jets with pT of the order of 165-330 GeV, and |η| < 0.8.

  4. Estimation of full moment tensors, including uncertainties, for earthquakes, volcanic events, and nuclear explosions

    NASA Astrophysics Data System (ADS)

    Alvizuri, Celso; Silwal, Vipul; Krischer, Lion; Tape, Carl

    2017-04-01

    A seismic moment tensor is a 3 × 3 symmetric matrix that provides a compact representation of seismic events within Earth's crust. We develop an algorithm to estimate moment tensors and their uncertainties from observed seismic data. For a given event, the algorithm performs a grid search over the six-dimensional space of moment tensors by generating synthetic waveforms at each grid point and then evaluating a misfit function between the observed and synthetic waveforms. 'The' moment tensor M for the event is then the moment tensor with minimum misfit. To describe the uncertainty associated with M, we first convert the misfit function to a probability function. The uncertainty, or rather the confidence, is then given by the 'confidence curve' P(V), where P(V) is the probability that the true moment tensor for the event lies within the neighborhood of M that has fractional volume V. The area under the confidence curve provides a single, abbreviated 'confidence parameter' for M. We apply the method to data from events in different regions and tectonic settings: small (Mw < 2.5) events at Uturuncu volcano in Bolivia, moderate (Mw > 4) earthquakes in the southern Alaska subduction zone, and natural and man-made events at the Nevada Test Site. Moment tensor uncertainties allow us to better discriminate among moment tensor source types and to assign physical processes to the events.
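    The grid search, misfit-to-probability conversion, and confidence curve described above can be sketched in miniature. This is a minimal sketch using a hypothetical one-dimensional parameter grid and a made-up quadratic misfit in place of the real six-dimensional moment tensor space and waveform misfit:

    ```python
    import math

    # Hypothetical 1-D parameter grid standing in for the 6-D moment tensor space.
    grid = [i / 100.0 for i in range(101)]
    misfit = [50.0 * (m - 0.3) ** 2 for m in grid]   # made-up waveform misfit

    # Convert misfit to a normalised probability over the grid.
    w = [math.exp(-f) for f in misfit]
    total = sum(w)
    p = [x / total for x in w]

    # Best-fit "moment tensor": the grid point with minimum misfit.
    best = grid[misfit.index(min(misfit))]

    # Confidence curve P(V): probability captured when grid cells are added in
    # order of increasing misfit; V is the fractional volume of cells included.
    order = sorted(range(len(grid)), key=lambda i: misfit[i])
    P, V = [0.0], [0.0]
    acc = 0.0
    for k, i in enumerate(order, start=1):
        acc += p[i]
        P.append(acc)
        V.append(k / len(grid))

    # Abbreviated "confidence parameter": area under P(V) (trapezoidal rule).
    conf = sum((V[j] - V[j - 1]) * (P[j] + P[j - 1]) / 2 for j in range(1, len(V)))
    ```

    A sharply peaked probability function yields a curve that rises quickly, so the area `conf` approaches 1; a diffuse one yields an area near 0.5.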

  5. Visualizing the uncertainty in the relationship between seasonal average climate and malaria risk.

    PubMed

    MacLeod, D A; Morse, A P

    2014-12-02

    Around $1.6 billion per year is spent financing anti-malaria initiatives, and though malaria morbidity is falling, the impact of annual epidemics remains significant. Whilst malaria risk may increase with climate change, projections are highly uncertain; to sidestep this intractable uncertainty, adaptation efforts should improve societal ability to anticipate and mitigate individual events. Anticipation of climate-related events is made possible by seasonal climate forecasting, from which warnings of anomalous seasonal average temperature and rainfall are possible months in advance. Seasonal climate hindcasts have been used to drive climate-based models for malaria, showing significant skill for observed malaria incidence. However, the relationship between seasonal average climate and malaria risk remains unquantified. Here we explore this relationship, using a dynamic weather-driven malaria model. We also quantify key uncertainty in the malaria model by introducing variability in one of the first-order uncertainties in model formulation. Results are visualized as location-specific impact surfaces: easily integrated with ensemble seasonal climate forecasts, and intuitively communicating quantified uncertainty. Methods are demonstrated for two epidemic regions, and are not limited to malaria modeling; the visualization method could be applied to any climate impact.

  6. Visualizing the uncertainty in the relationship between seasonal average climate and malaria risk

    NASA Astrophysics Data System (ADS)

    MacLeod, D. A.; Morse, A. P.

    2014-12-01

    Around $1.6 billion per year is spent financing anti-malaria initiatives, and though malaria morbidity is falling, the impact of annual epidemics remains significant. Whilst malaria risk may increase with climate change, projections are highly uncertain; to sidestep this intractable uncertainty, adaptation efforts should improve societal ability to anticipate and mitigate individual events. Anticipation of climate-related events is made possible by seasonal climate forecasting, from which warnings of anomalous seasonal average temperature and rainfall are possible months in advance. Seasonal climate hindcasts have been used to drive climate-based models for malaria, showing significant skill for observed malaria incidence. However, the relationship between seasonal average climate and malaria risk remains unquantified. Here we explore this relationship, using a dynamic weather-driven malaria model. We also quantify key uncertainty in the malaria model by introducing variability in one of the first-order uncertainties in model formulation. Results are visualized as location-specific impact surfaces: easily integrated with ensemble seasonal climate forecasts, and intuitively communicating quantified uncertainty. Methods are demonstrated for two epidemic regions, and are not limited to malaria modeling; the visualization method could be applied to any climate impact.

  7. Markov logic network based complex event detection under uncertainty

    NASA Astrophysics Data System (ADS)

    Lu, Jingyang; Jia, Bin; Chen, Genshe; Chen, Hua-mei; Sullivan, Nichole; Pham, Khanh; Blasch, Erik

    2018-05-01

    In a cognitive reasoning system, the four-stage Observe-Orient-Decide-Act (OODA) reasoning loop is of interest. The OODA loop is essential for situational awareness, especially in heterogeneous data fusion. Cognitive reasoning for making decisions can take advantage of different formats of information, such as symbolic observations, various real-world sensor readings, or the relationships between intelligent modalities. A Markov Logic Network (MLN) provides a mathematically sound technique for representing and fusing data at multiple levels of abstraction, and across multiple intelligent sensors, to conduct complex decision-making tasks. In this paper, a scenario about vehicle interaction is investigated, in which uncertainty is taken into consideration, as no systematic approach can perfectly characterize the complex event scenario. MLNs are applied to the terrestrial domain, where the dynamic features and relationships among vehicles are captured through multiple sensors and information sources subject to data uncertainty.
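    The weighted-formula semantics of an MLN can be illustrated on a tiny grounded example. The two formulas below are hypothetical, not the paper's model: a world's weight is the exponentiated sum of the weights of the formulas it satisfies, and event probabilities follow by normalizing over all possible worlds:

    ```python
    import itertools
    import math

    # Hypothetical ground atoms for one vehicle:
    # S = "vehicle is speeding", E = "vehicle is evading a checkpoint".
    # Weighted formulas (illustrative weights): S -> E, and soft evidence S.
    formulas = [
        (1.5, lambda S, E: (not S) or E),  # soft rule: speeding implies evading
        (0.8, lambda S, E: S),             # soft evidence that S holds
    ]

    def world_weight(S, E):
        # exp of the summed weights of the formulas satisfied in this world
        return math.exp(sum(w for w, f in formulas if f(S, E)))

    worlds = list(itertools.product([False, True], repeat=2))
    Z = sum(world_weight(S, E) for S, E in worlds)          # partition function
    p_evading = sum(world_weight(S, E) for S, E in worlds if E) / Z
    ```

    With these weights the rule and the evidence jointly push the probability of the complex event `E` above one half, illustrating how soft, conflicting knowledge is fused rather than hard-coded.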

  8. Flood risk analysis for flood control and sediment transportation in sandy regions: A case study in the Loess Plateau, China

    NASA Astrophysics Data System (ADS)

    Guo, Aijun; Chang, Jianxia; Wang, Yimin; Huang, Qiang; Zhou, Shuai

    2018-05-01

    Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on regional flood control systems. This work advances traditional flood risk analysis by proposing a univariate and copula-based bivariate hydrological risk framework which incorporates both flood control and sediment transport. In developing the framework, the conditional probabilities of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula-based model. Moreover, a Monte Carlo-based algorithm is designed to quantify the sampling uncertainty associated with univariate and bivariate hydrological risk analyses. Two catchments located on the Loess Plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The univariate and bivariate return periods, risk and reliability in the context of uncertainty for the purposes of flood control and sediment transport are assessed for the study regions. The results indicate that, in the UCX and UCH, sedimentation poses a higher risk to the safety of local flood control systems than the event that the annual maximum flood (AMF) exceeds the design flood of downstream hydraulic structures. Moreover, there is considerable sampling uncertainty affecting the univariate and bivariate hydrologic risk evaluation, which greatly challenges future flood mitigation measures. In addition, the results also confirm that the developed framework can estimate conditional probabilities associated with different flood events under various extreme precipitation scenarios for flood control and sediment transport. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
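    The conditional probabilities described above can be illustrated with a closed-form copula. This is a minimal sketch assuming a Clayton copula with an illustrative dependence parameter; the study's fitted copula family, parameters, and marginals are not reproduced here:

    ```python
    # Sketch of a copula-based conditional exceedance probability, assuming a
    # Clayton copula with hypothetical dependence parameter theta = 2.
    theta = 2.0

    def clayton_cdf(u, v):
        # Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta)
        return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

    def cond_exceedance(u, v):
        # P(Y > y | X > x) with u = F_X(x), v = F_Y(y), via the survival form:
        # (1 - u - v + C(u, v)) / (1 - u)
        return (1.0 - u - v + clayton_cdf(u, v)) / (1.0 - u)

    # Probability that the flood peak exceeds its 100-year quantile (v = 0.99)
    # given that precipitation exceeds its 100-year quantile (u = 0.99).
    p_cond = cond_exceedance(0.99, 0.99)
    # Under independence this would simply be 1 - 0.99 = 0.01; positive
    # dependence through the copula raises it.
    ```

    Sampling uncertainty in such estimates, as in the paper's Monte Carlo algorithm, would come from refitting `theta` to resampled data.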

  9. Estimating the empirical probability of submarine landslide occurrence

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events by performing a statistical test of age dates representing the main failure episode of the Holocene Storegga landslide complex.
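    The Poisson-Gamma conjugate model lends itself to a compact sketch. Assuming an illustrative prior and event count (not the paper's data), the posterior for the rate λ is available in closed form, and its spread quantifies the uncertainty in the estimate:

    ```python
    import random

    # Poisson-Gamma conjugate update sketch (hypothetical prior and counts).
    # Prior: lambda ~ Gamma(a0, rate b0); data: n landslides in a T-kyr record.
    a0, b0 = 1.0, 1.0      # weakly informative prior (assumed, not the paper's)
    n, T = 4, 10.0         # e.g. 4 dated events in a 10 kyr interval

    # Conjugacy: the posterior is Gamma(a0 + n, rate b0 + T).
    a_post, b_post = a0 + n, b0 + T
    post_mean = a_post / b_post            # expected rate, events per kyr

    # Range of uncertainty: sample the posterior and take percentiles.
    random.seed(0)
    draws = sorted(random.gammavariate(a_post, 1.0 / b_post) for _ in range(10000))
    lo, hi = draws[249], draws[9749]       # approximate 95% credible interval
    ```

    Note that `random.gammavariate` takes a scale parameter, hence the `1.0 / b_post`; the open intervals before and after the observed sequence would enter through the choice of `T`.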

  10. Wave-height hazard analysis in Eastern Coast of Spain - Bayesian approach using generalized Pareto distribution

    NASA Astrophysics Data System (ADS)

    Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.

    2005-03-01

    Standard practice of wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data previous to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Time-occurrence of events is assumed Poisson distributed. The wave-height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave-heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (Poisson rate, shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean sea and experience with these phenomena. The posterior distribution of the parameters allows one to obtain posterior distributions of other derived parameters, such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
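    The Poisson-plus-GPD exceedance model above implies a simple return-period formula. This is a minimal sketch with illustrative parameter values, not the posterior estimates produced by BGPE:

    ```python
    # Return-period sketch for the Poisson + GPD exceedance model
    # (all parameter values are illustrative).
    lam = 3.0              # events per year above the threshold (Poisson rate)
    u = 3.0                # threshold wave height, metres
    xi, sigma = 0.1, 0.8   # GPD shape and scale (assumed)

    def exceed_prob(z):
        # P(event wave height > z): GPD survival function above the threshold u
        return (1.0 + xi * (z - u) / sigma) ** (-1.0 / xi)

    def return_period(z):
        # Mean years between events exceeding level z under the Poisson model
        return 1.0 / (lam * exceed_prob(z))
    ```

    In the Bayesian setting, evaluating `return_period` over posterior draws of `(lam, xi, sigma)` yields the posterior distribution of the return period, which is exactly the uncertainty the abstract argues should not be ignored.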

  11. Probabilistic short-term forecasting of eruption rate at Kīlauea Volcano using a physics-based model

    NASA Astrophysics Data System (ADS)

    Anderson, K. R.

    2016-12-01

    Deterministic models of volcanic eruptions yield predictions of future activity conditioned on uncertainty in the current state of the system. Physics-based eruption models are well-suited for deterministic forecasting as they can relate magma physics with a wide range of observations. Yet, physics-based eruption forecasting is strongly limited by an inadequate understanding of volcanic systems, and the need for eruption models to be computationally tractable. At Kīlauea Volcano, Hawaii, episodic depressurization-pressurization cycles of the magma system generate correlated, quasi-exponential variations in ground deformation and surface height of the active summit lava lake. Deflations are associated with reductions in eruption rate, or even brief eruptive pauses, and thus partly control lava flow advance rates and associated hazard. Because of the relatively well-understood nature of Kīlauea's shallow magma plumbing system, and because more than 600 of these events have been recorded to date, they offer a unique opportunity to refine a physics-based effusive eruption forecasting approach and apply it to lava eruption rates over short (hours to days) time periods. A simple physical model of the volcano ascribes observed data to temporary reductions in magma supply to an elastic reservoir filled with compressible magma. This model can be used to predict the evolution of an ongoing event, but because the mechanism that triggers events is unknown, event durations are modeled stochastically from previous observations. A Bayesian approach incorporates diverse data sets and prior information to simultaneously estimate uncertain model parameters and future states of the system. Forecasts take the form of probability distributions for eruption rate or cumulative erupted volume at some future time. 
Results demonstrate the significant uncertainties that still remain even for short-term eruption forecasting at a well-monitored volcano - but also the value of a physics-based, mixed deterministic-probabilistic eruption forecasting approach in reducing and quantifying these uncertainties.
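    The mixed deterministic-probabilistic idea, a deterministic recovery model driven by stochastically sampled event durations, can be caricatured in a few lines. All numbers below are illustrative assumptions, not USGS parameters or observations:

    ```python
    import math
    import random

    # Sketch: eruption rate is zero during a deflation event and recovers
    # exponentially to a baseline afterwards; event durations are sampled
    # from a hypothetical catalog of past events.
    q0, tau = 2.0, 6.0                               # m^3/s baseline, recovery (h)
    past_durations = [8.0, 12.0, 15.0, 20.0, 30.0]   # hours, hypothetical

    def erupted_volume(duration, horizon=48.0, dt=0.5):
        # Integrate the eruption rate over the forecast horizon (hours).
        vol, t = 0.0, 0.0
        while t < horizon:
            rate = 0.0 if t < duration else q0 * (1 - math.exp(-(t - duration) / tau))
            vol += rate * dt * 3600.0                # m^3 per time step
            t += dt
        return vol

    # Monte Carlo forecast: the probability distribution of erupted volume
    # at the end of the horizon, given uncertainty in the event duration.
    random.seed(1)
    forecast = sorted(erupted_volume(random.choice(past_durations))
                      for _ in range(2000))
    median = forecast[1000]
    ```

    A Bayesian treatment, as in the abstract, would additionally place posterior distributions on `q0` and `tau` conditioned on deformation and lava-lake data rather than fixing them.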

  12. Changing Global Risk Landscape - Challenges for Risk Management (Invited)

    NASA Astrophysics Data System (ADS)

    Wenzel, F.

    2009-12-01

    The exponentially growing losses related to natural disasters on a global scale reflect a changing risk landscape that is characterized by the influence of climate change and a growing population, particularly in urban agglomerations and coastal zones. In consequence of these trends we witness (a) new hazards such as landslides due to dwindling permafrost, new patterns of strong precipitation and related floods, potential for tropical cyclones in the Mediterranean, sea level rise and others; (b) new risks related to large numbers of people in very dense urban areas, and risks related to the vulnerability of infrastructure such as energy supply, water supply, transportation, communication, etc. (c) extreme events with unprecedented size and implications. An appropriate answer to these challenges goes beyond classical views of risk assessment and protection. It must include an understanding of risk as changing with time so that risk assessment needs to be supplemented by risk monitoring. It requires decision making under high uncertainty. The risks (i.e. potentials for future losses) of extreme events are not only high but also very difficult to quantify, as they are characterized by high levels of uncertainty. Uncertainties relate to frequency, time of occurrence, strength and impact of extreme events but also to the coping capacities of society in response to them. The characterization, quantification, reduction in the extent possible of the uncertainties is an inherent topic of extreme event research. However, they will not disappear, so a rational approach to extreme events must include more than reducing uncertainties. It requires us to assess and rate the irreducible uncertainties, to evaluate options for mitigation under large uncertainties, and their communication to societal sectors. Thus scientist need to develop methodologies that aim at a rational approach to extreme events associated with high levels of uncertainty.

  13. Cancer Risk Assessment for Space Radiation

    NASA Technical Reports Server (NTRS)

    Richmond, Robert C.; Cruz, Angela; Bors, Karen; Curreri, Peter A. (Technical Monitor)

    2001-01-01

    Predicting the occurrence of human cancer following exposure to any agent causing genetic damage is a difficult task. This is because the uncertainty of uniform exposure to the damaging agent, and the uncertainty of uniform processing of that damage within a complex set of biological variables, degrade the confidence of predicting the delayed expression of cancer as a relatively rare event within any given clinically normal individual. The radiation health research priorities for enabling long-duration human exploration of space were established in the 1996 NRC Report entitled 'Radiation Hazards to Crews of Interplanetary Missions: Biological Issues and Research Strategies'. This report emphasized that a 15-fold uncertainty in predicting radiation-induced cancer incidence must be reduced before NASA can commit humans to extended interplanetary missions. That report concluded that the great majority of this uncertainty is biologically based, while a minority is physically based due to uncertainties in radiation dosimetry and radiation transport codes. Since that report, the biologically based uncertainty has remained large, and the relatively small uncertainty associated with radiation dosimetry has increased due to the considerations raised by concepts of microdosimetry. In a practical sense, however, the additional uncertainties introduced by microdosimetry are encouraging since they are in a direction of lowered effective dose absorbed through infrequent interactions of any given cell with the high energy particle component of space radiation. Additional information is contained in the original extended abstract.

  14. Risk based adaptation of infrastructures to floods and storm surges induced by climate change.

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Garrè, Luca; Hansen, Peter Friis

    2014-05-01

    Coastal natural hazards are changing in frequency and intensity in association with climate change. These extreme events, combined with an increase in the extent of vulnerable societies, will lead to substantial monetary losses. For this reason, adaptive measures are required to identify effective and adequate responses to withstand the impacts of climate change. Decision strategies are needed for the timing of investments and for the allocation of resources to safeguard the future in a sustainable manner. Adapting structures to climate change requires decision making under uncertainty. Therefore, it is vital that risk assessments are based on a reliable and appropriate evaluation of the involved uncertainties. Linking a Bayesian network (BN) to a Geographic Information System (GIS) for a risk assessment makes it possible to model all the relevant parameters, their causal relations and the involved uncertainties. The integration of the probabilistic approach into a GIS allows uncertainties to be quantified and visualized in a spatial manner. By addressing these uncertainties, the Bayesian network approach allows their effects to be quantified, and facilitates the identification of future model improvements and of where other efforts should be concentrated. The final results can be applied as a supportive tool for presenting reliable risk assessments to decision-makers. Based on these premises, a case study was performed to assess the storm surge magnitude and flooding extent of an event with characteristics similar to Superstorm Sandy occurring in 2050 and 2090.
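    The Bayesian-network part of such a risk assessment can be illustrated with a toy discrete network. The structure and conditional probability tables below are hypothetical, not the study's model:

    ```python
    # Toy discrete Bayesian network (hypothetical CPTs):
    # scenario -> high surge -> flood.
    p_scenario = {"2050": 0.5, "2090": 0.5}          # P(scenario)
    p_high_surge = {"2050": 0.3, "2090": 0.5}        # P(high surge | scenario)
    p_flood = {True: 0.8, False: 0.1}                # P(flood | surge state)

    def p_flood_total():
        # Marginalise over scenario and surge state to get P(flood).
        total = 0.0
        for s, ps in p_scenario.items():
            for high in (True, False):
                p_h = p_high_surge[s] if high else 1.0 - p_high_surge[s]
                total += ps * p_h * p_flood[high]
        return total
    ```

    In the GIS coupling described above, a marginal like `p_flood_total()` would be evaluated per grid cell, turning the network's uncertainty into a map.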

  15. Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu

    2013-04-01

    Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.
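    The GLUE evaluation mentioned above weights Monte Carlo parameter sets by an informal likelihood and derives prediction limits from the behavioural subset. This is a toy sketch with synthetic simulated water levels standing in for LISFLOOD-FP output; the likelihood measure and threshold are illustrative choices:

    ```python
    import random

    # GLUE sketch with synthetic data.
    random.seed(2)
    obs_level = 5.0                      # surveyed high-water mark, metres (toy)

    # Hypothetical ensemble: each parameter set yields one simulated level.
    sims = [obs_level + random.gauss(0.0, 1.0) for _ in range(500)]

    # Informal likelihood: inverse squared error; behavioural if error < 1.5 m.
    pairs = [(s, 1.0 / (1e-6 + (s - obs_level) ** 2))
             for s in sims if abs(s - obs_level) < 1.5]
    wsum = sum(w for _, w in pairs)
    pairs = sorted((s, w / wsum) for s, w in pairs)

    def quantile(q):
        # Likelihood-weighted quantile of the behavioural simulations.
        acc = 0.0
        for s, w in pairs:
            acc += w
            if acc >= q:
                return s
        return pairs[-1][0]

    # Weighted 5% and 95% prediction limits on the simulated water level.
    lo, hi = quantile(0.05), quantile(0.95)
    ```

    The framework's appeal for the Mitch case is visible even in the toy: the uncertain evaluation data enter only through the informal likelihood, which can be made as forgiving as the data quality demands.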

  16. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, the added dimension ('probability' or 'likelihood') makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, a risk-based method which defines whether or not to issue a warning or implement mitigation measures. With the cost-loss method a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it bases decisions only on economic values and is relatively static (a yes/no decision with no further reasoning). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with such situations were analysed and applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness emerged as fitting complements to the existing method. Instead of taking one big decision with large consequences at once, the idea is to cut actions and decisions into smaller pieces, with the final decision to implement made on the basis of the economic costs of decisions and measures and the reduced effect of flooding.
The more lead time there is in flood event management, the more damage can be reduced. With decisions based on probabilistic forecasts, partial decisions can be made earlier in time (at a lower probability) and can be scaled up or down later, when there is more certainty about whether the event will take place or not. Partial decisions are often cheaper, or shorten the final mitigation time at the moment when there is more certainty. The proposed method is tested on Stonehaven, on the Carron River in Scotland. Decisions to implement demountable defences in the town are currently made on a very short lead time due to the absence of certainty. Application showed that staged decision making is possible and gives the decision maker more time to respond to a situation. The decision maker is able to take a lower-regret decision under higher uncertainty, with fewer negative consequences. Although it is not possible to quantify intangible effects, reducing these effects is part of the analysis. Above all, the proposed approach has shown to be a possible improvement in economic terms and opens up possibilities for more flexible and robust decision making.
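    The cost-loss criterion and the staged variant proposed here are easy to state in code. This is a minimal sketch; the costs, losses, and probabilities are illustrative, not figures from the Stonehaven case:

    ```python
    # Cost-loss decision sketch: act when the forecast probability of the
    # flood event reaches the cost/loss ratio of the measure.
    def should_act(p_event, cost, loss):
        # Expected avoided damage (p * loss) outweighs the response cost
        # exactly when p >= cost / loss: the classic cost-loss criterion.
        return p_event >= cost / loss

    # Staged decisions (illustrative numbers): a cheap preparatory step is
    # justified at long lead time and low probability, while full deployment
    # waits until the probability firms up closer to the event.
    prepare = should_act(0.2, cost=10_000, loss=200_000)    # C/L = 0.05 -> act
    deploy  = should_act(0.2, cost=80_000, loss=200_000)    # C/L = 0.40 -> wait
    ```

    Splitting the measure this way is what makes the otherwise static yes/no rule flexible: each stage gets its own cost-loss threshold against the evolving forecast probability.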

  17. Quantile-based bias correction and uncertainty quantification of extreme event attribution statements

    DOE PAGES

    Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.

    2016-02-16

    Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
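    The quantile-based rescaling can be sketched with empirical quantile mapping on synthetic data. The Gaussian samples and the one-degree anthropogenic shift below are illustrative assumptions, not CESM output or the paper's estimates:

    ```python
    import bisect
    import random

    # Synthetic "observed" and biased "model" temperature samples (degrees C).
    random.seed(3)
    obs = sorted(random.gauss(30.0, 2.0) for _ in range(1000))
    model = sorted(random.gauss(28.0, 3.0) for _ in range(1000))

    def corrected(x, model_sorted, obs_sorted):
        # Empirical quantile mapping: find x's quantile in the model
        # distribution and return the observed value at the same quantile.
        q = bisect.bisect_left(model_sorted, x) / len(model_sorted)
        return obs_sorted[min(int(q * len(obs_sorted)), len(obs_sorted) - 1)]

    # Counterfactual world: the model minus an assumed anthropogenic shift.
    threshold = 34.0
    actual = [corrected(x, model, obs) for x in model]
    counter = [corrected(x - 1.0, model, obs) for x in model]

    p1 = sum(v > threshold for v in actual) / len(actual)    # with forcing
    p0 = sum(v > threshold for v in counter) / len(counter)  # without forcing
    risk_ratio = p1 / p0 if p0 > 0 else float("inf")
    ```

    The `p0 > 0` guard mirrors the paper's concern: when no counterfactual exceedances are simulated, the point estimate of the risk ratio is infinite, and only a lower confidence bound is informative.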

  18. Propagation of hydro-meteorological uncertainty in a model cascade framework to inundation prediction

    NASA Astrophysics Data System (ADS)

    Rodríguez-Rincón, J. P.; Pedrozo-Acuña, A.; Breña-Naranjo, J. A.

    2015-07-01

This investigation aims to study the propagation of meteorological uncertainty within a cascade modelling approach to flood prediction. The methodology comprised a numerical weather prediction (NWP) model, a distributed rainfall-runoff model and a 2-D hydrodynamic model. The uncertainty evaluation was carried out at the meteorological and hydrological levels of the model chain, which enabled the investigation of how errors that originated in the rainfall prediction interact at a catchment level and propagate to an estimated inundation area and depth. For this, a hindcast scenario was utilised, removing non-behavioural ensemble members at each stage based on their fit with observed data. At the hydrodynamic level, an uncertainty assessment was not incorporated; instead, the model was set up following guidelines for the best possible representation of the case study. The selected extreme event corresponds to a flood that took place in the southeast of Mexico during November 2009, for which field data (e.g. rain gauges; discharge) and satellite imagery were available. Uncertainty in the meteorological model was estimated by means of a multi-physics ensemble technique, which is designed to represent errors from our limited knowledge of the processes generating precipitation. In the hydrological model, a multi-response validation was implemented through the definition of six sets of plausible parameters from past flood events. Precipitation fields from the meteorological model were employed as input to a distributed hydrological model, and the resulting flood hydrographs were used as forcing conditions in the 2-D hydrodynamic model. The evolution of skill within the model cascade shows a complex aggregation of errors between models, suggesting that in valley-filling events hydro-meteorological uncertainty has a larger effect on inundation depths than that observed in estimated flood inundation extents.

  19. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

Probabilistic models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.

  20. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    PubMed

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and computing event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.

  1. Bayesian analysis of rare events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
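The core reinterpretation behind BUS can be sketched in a toy Gaussian example (one noisy observation of a scalar, not the paper's engineering models): Bayesian updating becomes rejection sampling, where a prior sample theta is accepted when a uniform draw u falls below L(theta)/c. The acceptance event {u < L(theta)/c} is then exactly the kind of "rare event" that FORM, IS or SuS can estimate efficiently; plain rejection sampling is shown here only to make the equivalence concrete:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_obs, y = 0.5, 1.0                      # one noisy observation of theta

def likelihood(theta):
    # unnormalised Gaussian likelihood; its maximum is 1, so c = 1 works
    return np.exp(-0.5 * ((y - theta) / sigma_obs) ** 2)

c = 1.0                                      # any constant with c >= max L

theta = rng.standard_normal(200_000)         # prior samples, theta ~ N(0, 1)
u = rng.uniform(size=theta.size)
accepted = theta[u < likelihood(theta) / c]  # the BUS acceptance event

print(accepted.mean())   # analytic posterior mean for this setup is 0.8
```

For this conjugate case the posterior is N(0.8, 1/5), so the accepted-sample mean converges to 0.8; in realistic problems the acceptance probability is tiny, which is precisely why the rare-event machinery pays off.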

  2. Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues

    NASA Astrophysics Data System (ADS)

    Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.

    2015-12-01

    Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.

  3. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

Hydrometeorological events (e.g. floods, droughts, extreme precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of these phenomena. In these forecasts, the probability of the event, over some lead time, is estimated based on model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and an unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and an unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided by the use of the Poisson-Binomial distribution.
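The Poisson-Binomial distribution (the number of successes among independent Bernoulli trials with unequal probabilities) has an exact pmf that can be built by repeated convolution. A small sketch with hypothetical forecast probabilities, showing how one might test whether an observed event count is plausible under the issued probabilities:

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """Exact pmf of the number of successes among independent Bernoulli trials."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1 - p, p])   # add one trial at a time
    return pmf

# Hypothetical probabilities attached to 5 independent event forecasts:
probs = [0.1, 0.3, 0.5, 0.7, 0.9]
pmf = poisson_binomial_pmf(probs)

# If all 5 events occurred, how surprising is that under reliable forecasts?
observed = 5
p_value = pmf[observed:].sum()   # P(N >= observed)
print(round(p_value, 5))         # → 0.00945
```

A small p-value indicates that the forecasts systematically understated the event probabilities, i.e. the forecast is likely unreliable.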

  4. A terrestrial lidar-based workflow for determining three-dimensional slip vectors and associated uncertainties

    USGS Publications Warehouse

    Gold, Peter O.; Cowgill, Eric; Kreylos, Oliver; Gold, Ryan D.

    2012-01-01

    Three-dimensional (3D) slip vectors recorded by displaced landforms are difficult to constrain across complex fault zones, and the uncertainties associated with such measurements become increasingly challenging to assess as landforms degrade over time. We approach this problem from a remote sensing perspective by using terrestrial laser scanning (TLS) and 3D structural analysis. We have developed an integrated TLS data collection and point-based analysis workflow that incorporates accurate assessments of aleatoric and epistemic uncertainties using experimental surveys, Monte Carlo simulations, and iterative site reconstructions. Our scanning workflow and equipment requirements are optimized for single-operator surveying, and our data analysis process is largely completed using new point-based computing tools in an immersive 3D virtual reality environment. In a case study, we measured slip vector orientations at two sites along the rupture trace of the 1954 Dixie Valley earthquake (central Nevada, United States), yielding measurements that are the first direct constraints on the 3D slip vector for this event. These observations are consistent with a previous approximation of net extension direction for this event. We find that errors introduced by variables in our survey method result in <2.5 cm of variability in components of displacement, and are eclipsed by the 10–60 cm epistemic errors introduced by reconstructing the field sites to their pre-erosion geometries. Although the higher resolution TLS data sets enabled visualization and data interactivity critical for reconstructing the 3D slip vector and for assessing uncertainties, dense topographic constraints alone were not sufficient to significantly narrow the wide (<26°) range of allowable slip vector orientations that resulted from accounting for epistemic uncertainties.

  5. A variational technique to estimate snowfall rate from coincident radar, snowflake, and fall-speed observations

    NASA Astrophysics Data System (ADS)

    Cooper, Steven J.; Wood, Norman B.; L'Ecuyer, Tristan S.

    2017-07-01

    Estimates of snowfall rate as derived from radar reflectivities alone are non-unique. Different combinations of snowflake microphysical properties and particle fall speeds can conspire to produce nearly identical snowfall rates for given radar reflectivity signatures. Such ambiguities can result in retrieval uncertainties on the order of 100-200 % for individual events. Here, we use observations of particle size distribution (PSD), fall speed, and snowflake habit from the Multi-Angle Snowflake Camera (MASC) to constrain estimates of snowfall derived from Ka-band ARM zenith radar (KAZR) measurements at the Atmospheric Radiation Measurement (ARM) North Slope Alaska (NSA) Climate Research Facility site at Barrow. MASC measurements of microphysical properties with uncertainties are introduced into a modified form of the optimal-estimation CloudSat snowfall algorithm (2C-SNOW-PROFILE) via the a priori guess and variance terms. Use of the MASC fall speed, MASC PSD, and CloudSat snow particle model as base assumptions resulted in retrieved total accumulations with a -18 % difference relative to nearby National Weather Service (NWS) observations over five snow events. The average error was 36 % for the individual events. Use of different but reasonable combinations of retrieval assumptions resulted in estimated snowfall accumulations with differences ranging from -64 to +122 % for the same storm events. Retrieved snowfall rates were particularly sensitive to assumed fall speed and habit, suggesting that in situ measurements can help to constrain key snowfall retrieval uncertainties. More accurate knowledge of these properties dependent upon location and meteorological conditions should help refine and improve ground- and space-based radar estimates of snowfall.
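The general machinery behind an optimal-estimation retrieval such as 2C-SNOW-PROFILE can be illustrated with the linear-Gaussian special case, where the a priori state (here standing in for the MASC-constrained microphysics) and the measurement are combined in closed form. All matrices and values below are illustrative, not the algorithm's actual forward model:

```python
import numpy as np

K  = np.array([[1.0, 0.5]])     # linear forward model: 1 measurement, 2 state vars
Sa = np.diag([0.5, 0.5])        # a priori covariance (in situ uncertainty)
Se = np.array([[0.1]])          # measurement-error covariance
xa = np.array([1.0, 2.0])       # a priori state (e.g. PSD and fall-speed params)
y  = np.array([2.5])            # observed reflectivity-like quantity

# Maximum a posteriori state for the linear-Gaussian case:
G = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)   # gain matrix
x_hat = xa + (G @ (y - K @ xa)).ravel()
print(x_hat)   # pulled from xa toward the state explaining the measurement
```

Tightening `Sa` (i.e. trusting the in situ constraints more) pulls `x_hat` toward `xa`, which is how MASC-derived variances enter the retrieval in spirit.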

  6. Combining empirical approaches and error modelling to enhance predictive uncertainty estimation in extrapolation for operational flood forecasting. Tests on flood events on the Loire basin, France.

    NASA Astrophysics Data System (ADS)

    Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles

    2017-04-01

An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end users' needs (i.e. prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality of operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach resting on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, an attempt is made at combining the operational strength of the empirical statistical analysis with a simple error model. Since the heteroscedasticity of forecast errors can considerably weaken the predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which has been shown to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012).
Exploratory tests on some operational forecasts issued during the recent floods experienced in France (major spring floods in June 2016 on the Loire river tributaries and flash floods in fall 2016) will be shown and discussed. References: Bourgin, F. (2014). How to assess the predictive uncertainty in hydrological modelling? An exploratory work on a large sample of watersheds. AgroParisTech. Wang, Q. J., Shrestha, D. L., Robertson, D. E. and Pokhrel, P. (2012). A log-sinh transformation for data normalization and variance stabilization. Water Resources Research, W05514, doi:10.1029/2011WR010973.
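The log-sinh transformation of Wang et al. (2012) has a simple closed form, z = log(sinh(a + b*y))/b, with an explicit inverse. A minimal sketch with illustrative (uncalibrated) parameters a and b, showing the forward and inverse transforms:

```python
import numpy as np

def log_sinh(y, a, b):
    """Wang et al. (2012) variance-stabilising transform."""
    return np.log(np.sinh(a + b * y)) / b

def log_sinh_inv(z, a, b):
    """Exact inverse: back-transform from the stabilised space."""
    return (np.arcsinh(np.exp(b * z)) - a) / b

y = np.array([1.0, 10.0, 100.0, 1000.0])   # e.g. discharge values, m^3/s
z = log_sinh(y, a=0.1, b=0.01)
back = log_sinh_inv(z, a=0.1, b=0.01)
print(np.allclose(back, y))   # the transform is invertible
```

For small a + b*y the transform behaves like a logarithm (strong variance compression) and becomes nearly linear for large values, which is why it tames heteroscedasticity without over-compressing flood peaks.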

  7. Determining SAFOD area microearthquake locations solely with the Pilot Hole seismic array data

    NASA Astrophysics Data System (ADS)

    Oye, Volker; Chavarria, J. Andres; Malin, Peter E.

    2004-05-01

In August 2002, an array of 32 three-component geophones was installed in the San Andreas Fault Observatory at Depth (SAFOD) Pilot Hole (PH) at Parkfield, CA. As an independent test of surface-observation-based microearthquake locations, we have located such events using only data recorded on the PH array. We then compared these locations with locations from a combined set of PH and Parkfield High Resolution Seismic Network (HRSN) observations. We determined the uncertainties in the locations as they relate to errors in the travel time picks and the velocity model by the bootstrap method. Based on the PH and combined locations, we find that the "C2" cluster to the northeast of the PH has the smallest location uncertainties. Events in this cluster also have the most similar waveforms and largest magnitudes. This confirms earlier suggestions that the C2 cluster is a promising target for the SAFOD Main Hole.
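The bootstrap step can be illustrated in a deliberately simplified one-dimensional setting (synthetic S-P picks and assumed velocities, not the SAFOD data or its full 3-D relocation): resample the picks with replacement and look at the spread of the resulting distance estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
vp, vs = 6.0, 3.5                        # assumed P and S velocities, km/s
true_dist = 4.0                          # km
sp_true = true_dist * (1 / vs - 1 / vp)  # S-P time for that distance
picks = sp_true + rng.normal(0, 0.01, size=30)   # 30 noisy S-P picks, s

def dist(sp_times):
    """Distance implied by the mean S-P time."""
    return sp_times.mean() / (1 / vs - 1 / vp)

# Bootstrap: resample picks with replacement, re-estimate the distance each time.
boot = np.array([dist(rng.choice(picks, picks.size)) for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(round(hi - lo, 3))   # width of the 95% bootstrap interval, km
```

The same idea extends to the full problem by also perturbing the velocity model within its uncertainty on each bootstrap iteration.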

  8. Characterizing Drought Events from a Hydrological Model Ensemble

    NASA Astrophysics Data System (ADS)

    Smith, Katie; Parry, Simon; Prudhomme, Christel; Hannaford, Jamie; Tanguy, Maliko; Barker, Lucy; Svensson, Cecilia

    2017-04-01

Hydrological droughts are a slow-onset natural hazard that can affect large areas. Within the United Kingdom there have been eight major drought events over the last 50 years, with several events acting at the continental scale and covering the entire nation. Many of these events have lasted several years and had significant impacts on agriculture, the environment and the economy. Generally in the UK, due to a northwest-southeast gradient in rainfall and relief, as well as varying underlying geology, droughts tend to be most severe in the southeast, which can threaten water supplies to the capital, London. With the impacts of climate change likely to increase the severity and duration of drought events worldwide, it is crucial that we gain an understanding of the characteristics of some of the longer and more extreme droughts of the 19th and 20th centuries, so we may utilize this information in planning for the future. Hydrological models are essential both for reconstructing such events that predate streamflow records, and for use in drought forecasting. However, whilst the uncertainties involved in modelling hydrological extremes on the flooding end of the flow regime have been studied in depth over the past few decades, the uncertainties in simulating droughts and low flow events have not yet received such rigorous academic attention. The "Cascade of Uncertainty" approach has been applied to explore uncertainty and coherence across simulations of notable drought events from the past 50 years using the airGR family of daily lumped catchment models. Parameter uncertainty has been addressed using a Latin Hypercube sampled experiment of 500,000 parameter sets per model (GR4J, GR5J and GR6J), over more than 200 catchments across the UK.
The best performing model parameterisations, determined using a multi-objective function approach, have then been taken forward for use in the assessment of the impact of model parameters and model structure on drought event detection and characterization. This ensemble approach allows for uncertainty estimates and confidence intervals to be explored in simulations of drought event characteristics, such as duration and severity, which would not otherwise be available from a deterministic approach. The acquired understanding of uncertainty in drought events may then be applied to historic drought reconstructions, supplying evidence which could prove vital in decision making scenarios.
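Latin hypercube sampling, used above to generate the parameter sets, stratifies each parameter range into n equal-probability intervals and draws exactly one sample per interval, with independent permutations across dimensions. A minimal numpy-only sketch (the parameter names and ranges are illustrative, not the study's calibrated GR4J/GR5J/GR6J ranges):

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """Stratified sample: one draw per equal-probability stratum per dimension."""
    d = len(bounds)
    # one uniform draw inside each of the n strata, then shuffle per dimension
    u = (rng.random((n_samples, d)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(3)
bounds = [(1, 2000), (-10, 10), (1, 500), (0.5, 5)]  # 4 hypothetical parameters
sets = latin_hypercube(500, bounds, rng)
print(sets.shape)   # → (500, 4)
```

Compared with plain random sampling, the stratification guarantees that the full range of each parameter is covered even with modest sample counts.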

  9. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    NASA Astrophysics Data System (ADS)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

This paper introduces an application of a novel EventTracker platform for instantaneous Sensitivity Analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on its state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the trigger of an event, the greater its potential impact will be on the final analysis of the system state. Experiments were designed to compare the proposed event tracking sensitivity analysis with existing Entropy-based sensitivity analysis methods. The results have shown a 10% improvement in computational efficiency with no compromise in accuracy. They have also shown that the time required to perform the sensitivity analysis is 0.5% of that required by the Entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs.
One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs, with the final aim of recovering the content of such reservoirs, both in onshore and offshore regions. Drilling a well is always guided by technical, economic and security constraints to protect the crew, equipment and environment from injury, damage and pollution. Although risk assessment and local practice provide a high degree of security, uncertainty arises from the behaviour of the formation, which may cause crucial situations at the rig. To overcome such uncertainties, real-time sensor measurements form a basis for predicting, and thus preventing, such crises; the proposed method supports the identification of the data necessary for that.

  10. A comparison of the momentum budget in reanalysis datasets during sudden stratospheric warming events

    NASA Astrophysics Data System (ADS)

    Martineau, Patrick; Son, Seok-Woo; Taguchi, Masakazu; Butler, Amy H.

    2018-05-01

    The agreement between reanalysis datasets, in terms of the zonal-mean momentum budget, is evaluated during sudden stratospheric warming (SSW) events. It is revealed that there is a good agreement among datasets in the lower stratosphere and troposphere concerning zonal-mean zonal wind, but less so in the upper stratosphere. Forcing terms of the momentum equation are also relatively similar in the lower atmosphere, but their uncertainties are typically larger than uncertainties of the zonal-wind tendency. Similar to zonal-wind tendency, the agreement among forcing terms is degraded in the upper stratosphere. Discrepancies among reanalyses increase during the onset of SSW events, a period characterized by unusually large fluxes of planetary-scale waves from the troposphere to the stratosphere, and decrease substantially after the onset. While the largest uncertainties in the resolved terms of the momentum budget are found in the Coriolis torque, momentum flux convergence also presents a non-negligible spread among the reanalyses. Such a spread is reduced in the latest reanalysis products, decreasing the uncertainty of the momentum budget. It is also found that the uncertainties in the Coriolis torque depend on the strength of SSW events: the SSW events that exhibit the most intense deceleration of zonal-mean zonal wind are subject to larger discrepancies among reanalyses. These uncertainties in stratospheric circulation, however, are not communicated to the troposphere.

  11. Uncertainty based modeling of rainfall-runoff: Combined differential evolution adaptive Metropolis (DREAM) and K-means clustering

    NASA Astrophysics Data System (ADS)

    Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara

    2015-09-01

    Simulation of rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large scale climate signals have been proved to be effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps of rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data driven models are developed and used to forecast rainfall using large scale climate signals as rainfall predictors. Due to high effect of different sources of uncertainty on the output of hydrologic models, in the second step uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined, through developing a weighting method based on K-means clustering. Model parameters and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. 
It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is improved by up to 50% in comparison to the simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures when dealing with potentially extreme runoff events and flood hazard. Results of this study can be used to identify the main factors affecting flood hazard analysis.

  12. BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods

    NASA Astrophysics Data System (ADS)

    Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.

    2017-12-01

Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in frequency and intensity of heavy rainfall events in many areas and an ongoing urbanization may increase pluvial flood losses in the future. For an efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed particularly for pluvial floods. These models usually use simple water-level- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data was gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models, were used to identify the most important loss influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian Networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge.
Loss predictions are made through Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. With the ability to cope with incomplete information and use expert knowledge, as well as inherently providing quantitative uncertainty information, it is shown that loss models based on BNs are superior to deterministic approaches for pluvial flood risk assessment.

  13. Statistical approaches for the definition of landslide rainfall thresholds and their uncertainty using rain gauge and satellite data

    NASA Astrophysics Data System (ADS)

    Rossi, M.; Luciani, S.; Valigi, D.; Kirschbaum, D.; Brunetti, M. T.; Peruccacci, S.; Guzzetti, F.

    2017-05-01

    Models for forecasting rainfall-induced landslides are mostly based on the identification of empirical rainfall thresholds obtained by exploiting rain gauge data. Despite their increased availability, satellite rainfall estimates are scarcely used for this purpose. Satellite data should be useful in ungauged and remote areas, and should provide a significant spatial and temporal reference in gauged areas. In this paper, we analyze the reliability of rainfall thresholds based on remotely sensed and rain gauge rainfall data for the prediction of landslide occurrence. To date, the estimation of the uncertainty associated with the empirical rainfall thresholds is mostly based on a bootstrap resampling of the rainfall duration and cumulated event rainfall pairs (D,E) characterizing rainfall events responsible for past failures. This estimation does not consider the measurement uncertainty associated with D and E. In the paper, we propose (i) a new automated procedure to reconstruct the ED conditions responsible for landslide triggering and their uncertainties, and (ii) three new methods to identify rainfall thresholds for possible landslide occurrence, exploiting rain gauge and satellite data. In particular, the proposed methods are based on Least Square (LS), Quantile Regression (QR) and Nonlinear Least Square (NLS) statistical approaches. We applied the new procedure and methods to define empirical rainfall thresholds and their associated uncertainties in the Umbria region (central Italy) using both rain-gauge measurements and satellite estimates. We finally validated the thresholds and tested the effectiveness of the different threshold definition methods with independent landslide information. Among the three, the NLS method performed best in calculating thresholds over the full range of rainfall durations. We found that the thresholds obtained from satellite data are lower than those obtained from rain gauge measurements.
This is in agreement with the literature, where satellite rainfall data underestimate the "ground" rainfall registered by rain gauges.
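    The LS-style threshold fitting can be sketched as a power-law fit in log-log space whose intercept is lowered to a low quantile of the residuals, so that most triggering events plot above the threshold curve. The data below are synthetic and the procedure is a simplification of the paper's methods:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic (D, E) pairs standing in for landslide-triggering rainfall
# events: duration D in hours, cumulated event rainfall E in mm.
D = rng.uniform(1, 200, 300)
E = 10.0 * D**0.4 * rng.lognormal(0.0, 0.3, 300)

# Least-squares fit of log10(E) = log10(alpha) + beta * log10(D)
beta, log_alpha = np.polyfit(np.log10(D), np.log10(E), 1)

# Lower the intercept to the 5th percentile of the residuals so that ~95%
# of triggering events lie above the threshold curve.
resid = np.log10(E) - (log_alpha + beta * np.log10(D))
log_alpha_5 = log_alpha + np.quantile(resid, 0.05)

threshold = lambda d: 10**log_alpha_5 * d**beta
print(round(beta, 2))  # recovered scaling exponent
```

    A bootstrap over the (D,E) pairs, repeated around this fit, is how the conventional threshold uncertainty described in the abstract is obtained.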

  14. Statistical Approaches for the Definition of Landslide Rainfall Thresholds and their Uncertainty Using Rain Gauge and Satellite Data

    NASA Technical Reports Server (NTRS)

    Rossi, M.; Luciani, S.; Valigi, D.; Kirschbaum, D.; Brunetti, M. T.; Peruccacci, S.; Guzzetti, F.

    2017-01-01

    Models for forecasting rainfall-induced landslides are mostly based on the identification of empirical rainfall thresholds obtained by exploiting rain gauge data. Despite their increased availability, satellite rainfall estimates are scarcely used for this purpose. Satellite data should be useful in ungauged and remote areas, and should provide a significant spatial and temporal reference in gauged areas. In this paper, we analyze the reliability of rainfall thresholds based on remotely sensed and rain gauge rainfall data for the prediction of landslide occurrence. To date, the estimation of the uncertainty associated with the empirical rainfall thresholds is mostly based on a bootstrap resampling of the rainfall duration and cumulated event rainfall pairs (D,E) characterizing rainfall events responsible for past failures. This estimation does not consider the measurement uncertainty associated with D and E. In the paper, we propose (i) a new automated procedure to reconstruct the ED conditions responsible for landslide triggering and their uncertainties, and (ii) three new methods to identify rainfall thresholds for possible landslide occurrence, exploiting rain gauge and satellite data. In particular, the proposed methods are based on Least Square (LS), Quantile Regression (QR) and Nonlinear Least Square (NLS) statistical approaches. We applied the new procedure and methods to define empirical rainfall thresholds and their associated uncertainties in the Umbria region (central Italy) using both rain-gauge measurements and satellite estimates. We finally validated the thresholds and tested the effectiveness of the different threshold definition methods with independent landslide information. Among the three, the NLS method performed best in calculating thresholds over the full range of rainfall durations. We found that the thresholds obtained from satellite data are lower than those obtained from rain gauge measurements.
This is in agreement with the literature, where satellite rainfall data underestimate the 'ground' rainfall registered by rain gauges.

  15. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    NASA Astrophysics Data System (ADS)

    Florian Wellmann, J.; Thiele, Sam T.; Lindsay, Mark D.; Jessell, Mark W.

    2016-03-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilize the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, the effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on GitHub, including documentation and tutorial examples, and we encourage contributions to this project.
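    The uncertainty-propagation idea can be sketched without the pynoddy API itself: perturb the kinematic parameters of a single fault event in a Monte Carlo loop and track the resulting spread in the depth of a stratigraphic horizon. All parameter values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Monte Carlo over the kinematic parameters of one normal-fault event:
# uncertain fault dip and slip propagate into horizon-depth uncertainty
# on the hanging wall.
n = 5000
dip = np.radians(rng.normal(60.0, 5.0, n))   # fault dip (degrees -> radians)
slip = rng.normal(100.0, 20.0, n)            # fault slip (m)

throw = slip * np.sin(dip)                   # vertical offset of the horizon
base_depth = 500.0                           # pre-event horizon depth (m)
depth = base_depth + throw

print(round(float(depth.mean()), 1), round(float(depth.std()), 1))
```

    In pynoddy the same loop runs over full kinematic event histories and the spread is mapped in 3-D; encapsulating the loop in a class is what makes the experiments reproducible.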

  16. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    NASA Astrophysics Data System (ADS)

    Wellmann, J. F.; Thiele, S. T.; Lindsay, M. D.; Jessell, M. W.

    2015-11-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilise the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, the effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on GitHub, including documentation and tutorial examples, and we encourage contributions to this project.

  17. Out of the black box: expansion of a theory-based intervention to self-manage the uncertainty associated with active surveillance (AS) for prostate cancer.

    PubMed

    Kazer, Meredith Wallace; Bailey, Donald E; Whittemore, Robin

    2010-01-01

    Active surveillance (AS) (sometimes referred to as watchful waiting) is an alternative approach to managing low-risk forms of prostate cancer. This management approach allows men to avoid expensive prostate cancer treatments and their well-documented adverse events of erectile dysfunction and incontinence. However, AS is associated with illness uncertainty and reduced quality of life (QOL; Wallace, 2003). An uncertainty management intervention (UMI) was developed by Mishel et al. (2002) to manage uncertainty in women treated for breast cancer and men treated for prostate cancer. However, the UMI was not developed for men undergoing AS for prostate cancer and has not been adequately tested in this population. This article reports on the expansion of a theory-based intervention to manage the uncertainty associated with AS for prostate cancer. Intervention Theory (Sidani & Braden, 1998) is discussed as a framework for revising the UMI intervention for men undergoing AS for prostate cancer (UMI-AS). The article concludes with plans for testing of the expanded intervention and implications for the extended theory.

  18. Jet energy scale and resolution in the CMS experiment in pp collisions at 8 TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.

    Improved jet energy scale corrections, based on a data sample corresponding to an integrated luminosity of 19.7 fb$^{-1}$ collected by the CMS experiment in proton-proton collisions at a center-of-mass energy of 8 TeV, are presented. The corrections as a function of pseudorapidity $\eta$ and transverse momentum $p_{\mathrm{T}}$ are extracted from data and simulated events combining several channels and methods. They account successively for the effects of pileup, uniformity of the detector response, and residual data-simulation jet energy scale differences. Further corrections, depending on the jet flavor and distance parameter (jet size) $R$, are also presented. The jet energy resolution is measured in data and simulated events and is studied as a function of pileup, jet size, and jet flavor. Typical jet energy resolutions at the central rapidities are 15-20% at 30 GeV, about 10% at 100 GeV, and 5% at 1 TeV. The studies exploit events with dijet topology, as well as photon+jet, Z+jet and multijet events. Several new techniques are used to account for the various sources of jet energy scale corrections, and a full set of uncertainties, and their correlations, are provided. The final uncertainties on the jet energy scale are below 3% across the phase space considered by most analyses ($p_{\mathrm{T}} >$ 30 GeV and $|\eta| <$ 5.0). In the barrel region ($|\eta| <$ 1.3) an uncertainty below 1% for $p_{\mathrm{T}} >$ 30 GeV is reached, when excluding the jet flavor uncertainties, which are provided separately for different jet flavors. Finally, a new benchmark for jet energy scale determination at hadron colliders is achieved with 0.32% uncertainty for jets with $p_{\mathrm{T}}$ of the order of 165-330 GeV, and $|\eta| <$ 0.8.
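    The quoted resolutions follow the standard noise/stochastic/constant (N/S/C) calorimeter parameterization, $\sigma/p_T = \sqrt{N^2/p_T^2 + S^2/p_T + C^2}$. Since $(\sigma/p_T)^2$ is linear in $(N^2, S^2, C^2)$, a fit through the three representative points above (taking 17.5% at 30 GeV) can be solved exactly. The resulting N, S, C values are illustrative, not the published CMS fit:

```python
import numpy as np

pt = np.array([30.0, 100.0, 1000.0])   # jet pT (GeV)
res = np.array([0.175, 0.10, 0.05])    # sigma/pT from the abstract

# (sigma/pT)^2 = N^2/pT^2 + S^2/pT + C^2 is linear in (N^2, S^2, C^2)
A = np.column_stack([1.0 / pt**2, 1.0 / pt, np.ones_like(pt)])
n2, s2, c2 = np.linalg.solve(A, res**2)
N, S, C = np.sqrt([n2, s2, c2])

print(round(float(N), 2), round(float(S), 2), round(float(C), 3))
```

    The three terms dominate in turn at low, intermediate, and high $p_T$, which is why the fractional resolution improves from ~17% at 30 GeV to ~5% at 1 TeV.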

  19. Jet energy scale and resolution in the CMS experiment in pp collisions at 8 TeV

    DOE PAGES

    Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.; ...

    2017-02-22

    Improved jet energy scale corrections, based on a data sample corresponding to an integrated luminosity of 19.7 fb$^{-1}$ collected by the CMS experiment in proton-proton collisions at a center-of-mass energy of 8 TeV, are presented. The corrections as a function of pseudorapidity $\eta$ and transverse momentum $p_{\mathrm{T}}$ are extracted from data and simulated events combining several channels and methods. They account successively for the effects of pileup, uniformity of the detector response, and residual data-simulation jet energy scale differences. Further corrections, depending on the jet flavor and distance parameter (jet size) $R$, are also presented. The jet energy resolution is measured in data and simulated events and is studied as a function of pileup, jet size, and jet flavor. Typical jet energy resolutions at the central rapidities are 15-20% at 30 GeV, about 10% at 100 GeV, and 5% at 1 TeV. The studies exploit events with dijet topology, as well as photon+jet, Z+jet and multijet events. Several new techniques are used to account for the various sources of jet energy scale corrections, and a full set of uncertainties, and their correlations, are provided. The final uncertainties on the jet energy scale are below 3% across the phase space considered by most analyses ($p_{\mathrm{T}} >$ 30 GeV and $|\eta| <$ 5.0). In the barrel region ($|\eta| <$ 1.3) an uncertainty below 1% for $p_{\mathrm{T}} >$ 30 GeV is reached, when excluding the jet flavor uncertainties, which are provided separately for different jet flavors. Finally, a new benchmark for jet energy scale determination at hadron colliders is achieved with 0.32% uncertainty for jets with $p_{\mathrm{T}}$ of the order of 165-330 GeV, and $|\eta| <$ 0.8.

  20. Risk Assessment: Evidence Base

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.

    2007-01-01

    Human systems PRA (Probabilistic Risk Assessment): a) provides quantitative measures of probability, consequence, and uncertainty; and b) communicates risk and informs decision-making. The human health risks rated highest in the ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.

  1. Probability Forecasting Using Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Duncan, M.; Frisbee, J.; Wysack, J.

    2014-09-01

    Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high-risk and potentially high-risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. Constructing a method for estimating how the collision probability will evolve therefore improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as time to the event is reduced. Collision probability forecasting is a predictive process where the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial.
This yields a collision probability distribution given known, predicted uncertainty. This paper presents the details of the collision probability forecasting method. We examine various conjunction event scenarios and numerically demonstrate the utility of this approach. We also explore the utility of a probability-based track scenario simulation that models expected tracking-data frequency as tasking levels are increased. The resulting orbital uncertainty is subsequently used in the forecasting algorithm.
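    The core Monte Carlo step can be sketched as sampling the relative position at closest approach from the combined covariance and counting trials whose miss distance falls below the combined hard-body radius. The numbers below are illustrative, not an operational conjunction:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 200_000
mean_miss = np.array([120.0, 0.0, 0.0])   # nominal relative position (m)
cov = np.diag([80.0, 50.0, 30.0])**2      # combined position covariance (m^2)
hard_body_radius = 20.0                    # combined object radius (m)

# Each sample is one possible relative geometry at closest approach
rel = rng.multivariate_normal(mean_miss, cov, size=n)
miss = np.linalg.norm(rel, axis=1)
p_collision = float(np.mean(miss < hard_body_radius))
print(p_collision)
```

    Repeating this calculation with covariances propagated to successively earlier epochs yields the distribution of future probability values, i.e. the "forecast" described in the abstract.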

  2. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models.

    PubMed

    Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik

    2017-12-15

    Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation study and a case study. The approaches were compared based on point estimates and distributions of time-to-event and health economic outcomes. To assess the impact of sample size on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
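    The recommended non-parametric bootstrap (approach 1) can be sketched for the simplest case: resample patient-level time-to-event data, refit the distribution on each resample, and read a confidence interval off the resulting parameter distribution. The data are hypothetical and an exponential model is used for brevity:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical patient-level data: months to progression, exponential model
times = rng.exponential(scale=12.0, size=200)

n_boot = 2000
scales = np.empty(n_boot)
for b in range(n_boot):
    resample = rng.choice(times, size=times.size, replace=True)
    scales[b] = resample.mean()   # MLE of the exponential scale parameter

lo, hi = np.quantile(scales, [0.025, 0.975])
print(round(float(lo), 1), round(float(hi), 1))  # 95% CI for the scale
```

    In a probabilistic sensitivity analysis, each model run would draw one bootstrapped parameter set, so that parameter uncertainty propagates into the sampled patient-level outcomes; for multi-parameter distributions the refit preserves the parameter correlations automatically.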

  3. Uncertainty estimation of water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, D. C.; Halldin, S.; Lundin, L.; Xu, C.

    2012-12-01

    Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Simulation of elevated water surfaces provides a good way to understand the hydraulic mechanism of large flood events. In this study the one-dimensional HEC-RAS model for steady flow conditions together with the two-dimensional Lisflood-fp model were used to estimate the water level for the Mitch event in the river reaches at Tegucigalpa. Parameter uncertainty of the model was investigated using the generalized likelihood uncertainty estimation (GLUE) framework. Because of the extremely large magnitude of the Mitch flood, no hydrometric measurements were taken during the event. However, post-event indirect measurements of discharge and observed water levels were obtained in previous works by JICA and USGS. To overcome the lack of direct hydrometric measurement data, the uncertainty in the discharge was estimated. Both models constrained the channel roughness value well, though the floodplain value showed more dispersion. Analysis of the data interaction showed that there was a tradeoff between discharge at the outlet and floodplain roughness for the 1D model. The estimated discharge range at the outlet of the study area encompassed the value indirectly estimated by JICA; however, the indirect method used by the USGS overestimated the value. If behavioral parameter sets can well reproduce water surface levels for past events such as Mitch, more reliable predictions for future events can be expected. The results acquired in this research will provide guidelines to deal with the problem of modeling past floods when no direct data were measured during the event, and to predict future large events taking uncertainty into account. The obtained uncertainty range of the flood extent will be a useful outcome for decision makers.
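    The GLUE workflow can be sketched in a few lines: sample an uncertain roughness parameter, score each simulation against observed water levels, keep the "behavioral" sets, and form likelihood-weighted predictions. The rating-curve model and all numbers below are toys, not HEC-RAS or Lisflood-fp:

```python
import numpy as np

rng = np.random.default_rng(3)

q = np.array([50.0, 150.0, 220.0])         # discharges (m^3/s)

def simulate(n_manning):
    """Toy stage model: water level (m) as a function of roughness."""
    return (n_manning * q) ** 0.6

# Synthetic "observations" generated with a known roughness plus noise
obs = simulate(0.05) + rng.normal(0.0, 0.05, q.size)

n_sets = 5000
params = rng.uniform(0.01, 0.2, n_sets)     # prior roughness range
sims = np.array([simulate(n) for n in params])
rmse = np.sqrt(((sims - obs) ** 2).mean(axis=1))

behavioral = rmse < np.quantile(rmse, 0.1)  # keep the best 10% of sets
weights = 1.0 / rmse[behavioral]
weights /= weights.sum()

pred = weights @ sims[behavioral]           # likelihood-weighted levels
print(np.round(pred, 2))
```

    Replacing the weighted mean with weighted quantiles of `sims[behavioral]` gives the prediction bounds that GLUE studies report; the choice of likelihood measure and behavioral threshold is itself a subjective modelling decision.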

  4. Improved Event Location Uncertainty Estimates

    DTIC Science & Technology

    2008-06-30

    throughout this study. The data set consists of GT0-2 nuclear explosions from the SAIC Nuclear Explosion Database (www.rdss.info, Bahavar et al. ...) ... errors: bias and variance. In this study the SNR dependence of both the delay and the variance of reading errors of first-arriving P waves is analyzed ... ground truth and range of event size. For other datasets we turn to estimates based on double-differences between arrival times of station pairs

  5. Developing a Signature Based Safeguards Approach for the Electrorefiner and Salt Cleanup Unit Operations in Pyroprocessing Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, Chantell Lynne-Marie

    Traditional nuclear materials accounting does not work well for safeguards when applied to pyroprocessing. Alternate methods such as Signature Based Safeguards (SBS) are being investigated. The goal of SBS is real-time/near-real-time detection of anomalous events in the pyroprocessing facility, as they could indicate loss of special nuclear material. In high-throughput reprocessing facilities, metric tons of separated material are processed that must be accounted for. Even with very low uncertainties of accountancy measurements (<0.1%) the uncertainty of the material balances is still greater than the desired level. Novel contributions of this work are as follows: (1) significant enhancement of SBS development for the salt cleanup process by creating a new gas sparging process model, selecting sensors to monitor normal operation, identifying safeguards-significant off-normal scenarios, and simulating those off-normal events and generating sensor output; (2) further enhancement of SBS development for the electrorefiner by simulating off-normal events caused by changes in salt concentration and identifying which conditions lead to Pu and Cm not tracking throughout the rest of the system; and (3) new contribution in applying statistical techniques to analyze the signatures gained from these two models to help draw real-time conclusions on anomalous events.
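    One standard statistical technique for near-real-time detection of anomalous shifts in a monitored process signal is the CUSUM control chart, shown below on a simulated sensor stream. This is a generic illustration of the kind of analysis described in contribution (3), not the dissertation's actual method, and the signal values are made up:

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulated process-monitoring signal with an off-normal shift at t = 100
baseline_mean, sigma = 500.0, 2.0
signal = rng.normal(baseline_mean, sigma, 300)
signal[100:] += 4.0                      # a 2-sigma mean shift

k = 0.5 * sigma                          # allowance (half the target shift)
h = 5.0 * sigma                          # decision threshold
s = 0.0
alarm_at = None
for t, x in enumerate(signal):
    # One-sided CUSUM: accumulate excess above the baseline, reset at zero
    s = max(0.0, s + (x - baseline_mean) - k)
    if s > h:
        alarm_at = t
        break

print(alarm_at)
```

    Because the statistic accumulates small deviations over time, a CUSUM detects sustained off-normal behavior much earlier than a fixed per-sample threshold would, which suits the real-time goal of SBS.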

  6. A variational technique to estimate snowfall rate from coincident radar, snowflake, and fall-speed observations

    DOE PAGES

    Cooper, Steven J.; Wood, Norman B.; L'Ecuyer, Tristan S.

    2017-07-20

    Estimates of snowfall rate as derived from radar reflectivities alone are non-unique. Different combinations of snowflake microphysical properties and particle fall speeds can conspire to produce nearly identical snowfall rates for given radar reflectivity signatures. Such ambiguities can result in retrieval uncertainties on the order of 100–200% for individual events. Here, we use observations of particle size distribution (PSD), fall speed, and snowflake habit from the Multi-Angle Snowflake Camera (MASC) to constrain estimates of snowfall derived from Ka-band ARM zenith radar (KAZR) measurements at the Atmospheric Radiation Measurement (ARM) North Slope Alaska (NSA) Climate Research Facility site at Barrow. MASC measurements of microphysical properties with uncertainties are introduced into a modified form of the optimal-estimation CloudSat snowfall algorithm (2C-SNOW-PROFILE) via the a priori guess and variance terms. Use of the MASC fall speed, MASC PSD, and CloudSat snow particle model as base assumptions resulted in retrieved total accumulations with a -18% difference relative to nearby National Weather Service (NWS) observations over five snow events. The average error was 36% for the individual events. The use of different but reasonable combinations of retrieval assumptions resulted in estimated snowfall accumulations with differences ranging from -64 to +122% for the same storm events. Retrieved snowfall rates were particularly sensitive to assumed fall speed and habit, suggesting that in situ measurements can help to constrain key snowfall retrieval uncertainties. Furthermore, accurate knowledge of these properties dependent upon location and meteorological conditions should help refine and improve ground- and space-based radar estimates of snowfall.
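    The optimal-estimation update at the heart of such retrievals can be sketched as a one-step linear Gaussian inversion, where the a priori mean and variance are exactly the terms through which MASC constraints would enter. The Jacobian, error covariances, and observation below are illustrative, not the algorithm's actual values:

```python
import numpy as np

# Linearized forward model: y = K x + noise
K = np.array([[1.6]])          # Jacobian dZ/dS (dB per mm/h)
S_e = np.array([[2.0**2]])     # measurement-error covariance (dB^2)
x_a = np.array([1.0])          # a priori snowfall rate (mm/h)
S_a = np.array([[0.5**2]])     # a priori variance (e.g. MASC-constrained)

y = np.array([4.0])            # observed (linearized) reflectivity signal

# Posterior mean and covariance of the retrieved state
gain = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)
x_hat = x_a + gain @ (y - K @ x_a)
S_hat = (np.eye(1) - gain @ K) @ S_a

print(round(float(x_hat[0]), 3), round(float(S_hat[0, 0]), 3))
```

    Tightening the a priori variance (a better in situ constraint) pulls the retrieval toward the prior and shrinks the posterior uncertainty, which is the mechanism by which MASC observations reduce the 100–200% ambiguity quoted above.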

  7. A variational technique to estimate snowfall rate from coincident radar, snowflake, and fall-speed observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, Steven J.; Wood, Norman B.; L'Ecuyer, Tristan S.

    Estimates of snowfall rate as derived from radar reflectivities alone are non-unique. Different combinations of snowflake microphysical properties and particle fall speeds can conspire to produce nearly identical snowfall rates for given radar reflectivity signatures. Such ambiguities can result in retrieval uncertainties on the order of 100–200% for individual events. Here, we use observations of particle size distribution (PSD), fall speed, and snowflake habit from the Multi-Angle Snowflake Camera (MASC) to constrain estimates of snowfall derived from Ka-band ARM zenith radar (KAZR) measurements at the Atmospheric Radiation Measurement (ARM) North Slope Alaska (NSA) Climate Research Facility site at Barrow. MASC measurements of microphysical properties with uncertainties are introduced into a modified form of the optimal-estimation CloudSat snowfall algorithm (2C-SNOW-PROFILE) via the a priori guess and variance terms. Use of the MASC fall speed, MASC PSD, and CloudSat snow particle model as base assumptions resulted in retrieved total accumulations with a -18% difference relative to nearby National Weather Service (NWS) observations over five snow events. The average error was 36% for the individual events. The use of different but reasonable combinations of retrieval assumptions resulted in estimated snowfall accumulations with differences ranging from -64 to +122% for the same storm events. Retrieved snowfall rates were particularly sensitive to assumed fall speed and habit, suggesting that in situ measurements can help to constrain key snowfall retrieval uncertainties. Furthermore, accurate knowledge of these properties dependent upon location and meteorological conditions should help refine and improve ground- and space-based radar estimates of snowfall.

  8. Tools used by the insurance industry to assess risk from hydroclimatic extremes

    NASA Astrophysics Data System (ADS)

    Higgs, Stephanie; McMullan, Caroline

    2016-04-01

    Probabilistic catastrophe models are widely used within the insurance industry to assess and price the risk of natural hazards, from individual residences through to portfolios of millions of properties. Over the relatively short period that catastrophe models have been available (almost 30 years), the insurance industry has built up financial resilience to key natural hazards in certain areas (e.g. US tropical cyclone, European extra-tropical cyclone and flood). However, due to the rapidly expanding global population and increase in wealth, together with uncertainties in the behaviour of meteorological phenomena introduced by climate change, the domain in which natural hazards impact society is growing. As a result, the insurance industry faces new challenges in assessing the risk and uncertainty from natural hazards. As a catastrophe modelling company, AIR Worldwide has a toolbox of options available to help the insurance industry assess extreme climatic events and their associated uncertainty. Here we discuss several of these tools: helping analysts understand how uncertainty is inherently built into probabilistic catastrophe models; understanding alternative stochastic catalogs for tropical cyclone based on climate conditioning; using stochastic extreme disaster events, such as those provided through AIR's catalogs or through the Lloyd's of London marketplace (RDSs), as benchmarks for the loss exceedance probability and tail-at-risk metrics output by catastrophe models; and visualising 1000+ year event footprints and hazard intensity maps. Ultimately the increased transparency of catastrophe models and the flexibility of a software platform that allows for customisation of modelled and non-modelled risks will drive a greater understanding of extreme hydroclimatic events within the insurance industry.
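    The loss exceedance probability curve mentioned above is the core output insurers read off a catastrophe model, and it can be sketched directly from a stochastic event-loss table. The events, rates, and losses below are made up for illustration:

```python
import numpy as np

# Toy stochastic event-loss table: annual occurrence rate and loss per event
rate = np.array([0.10, 0.05, 0.02, 0.01, 0.004])   # events per year
loss = np.array([5e6, 2e7, 8e7, 1.5e8, 4e8])       # loss per event (USD)

def exceedance_prob(threshold):
    """Annual probability of at least one event with loss >= threshold,
    assuming independent Poisson event occurrences."""
    lam = rate[loss >= threshold].sum()
    return 1.0 - np.exp(-lam)

# The "1-in-100-year" loss level is where this curve crosses 1%
for t in loss:
    print(f"{t:.0e}: {exceedance_prob(t):.4f}")
```

    Tail metrics such as the expected loss beyond a given return period are computed from the same table, which is why uncertainty in the event catalog feeds directly into both the curve and its tail.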

  9. How might Model-based Probabilities Extracted from Imperfect Models Guide Rational Decisions: The Case for non-probabilistic odds

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.

    2010-05-01

    This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves. It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given the fact we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run. In other words, to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether or not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds-on an event and the odds-against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied-probabilities" that sum to more than one. And/or the set of odds may be incomplete, not covering all events.
We ask whether or not probabilities based on imperfect models can be expected to yield probabilistic odds which are sustainable. Evidence is provided that suggests this is not the case. Even with very good models (good in a root-mean-square sense), the risk of ruin of probabilistic odds is significantly higher than might be expected. Methods for constructing model-based non-probabilistic odds which are sustainable are discussed. The aim here is to be relevant to real-world decision support, and so unrealistic assumptions of equal knowledge, equal compute power, or equal access to information are to be avoided. Finally, the use of non-probabilistic odds as a method for communicating deep uncertainty (uncertainty in a probability forecast itself) is discussed in the context of other methods, such as stating one's subjective probability that the models will prove inadequate in each particular instance (that is, the probability of a "Big Surprise").
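    The broken symmetry the abstract describes can be made concrete with decimal ("odds-for") odds on two mutually exclusive events: the implied probabilities can sum to more than one, so the quoted odds cannot be read back as a single coherent probability distribution. The event names and odds below are illustrative:

```python
# Decimal "odds-for" quotes on a hypothetical two-outcome event
# (stake returned, so implied probability = 1 / odds)
odds_for = {"flood": 4.0, "no_flood": 1.30}

implied = {k: 1.0 / v for k, v in odds_for.items()}
overround = sum(implied.values())

print({k: round(p, 3) for k, p in implied.items()}, round(overround, 3))
```

    Here the implied probabilities sum to about 1.02: the excess over one (the "over-round") is exactly the margin a sustainable odds-setter keeps against the deep uncertainty in its own forecast probabilities.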

  10. An Improved Method for Seismic Event Depth and Moment Tensor Determination: CTBT Related Application

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.

    2016-12-01

    According to the Protocol to the CTBT, the International Data Center is required to conduct expert technical analysis and special studies to improve event parameters and assist State Parties in identifying the source of a specific event. Determining a seismic event's source mechanism and depth is part of these tasks. It is typically done through a linearized inversion of the waveforms for a complete or subset of source parameters, or through a similarly defined grid search over precomputed Green's functions created for particular source models. We show preliminary results using the latter approach from an improved software design, applied on a moderately powered computer. In this development we tried to be compliant with the different modes of the CTBT monitoring regime: to cover a wide range of source-receiver distances (regional to teleseismic), resolve shallow source depths, provide full moment tensor solutions based on body and surface wave recordings, be fast enough to satisfy both on-demand studies and automatic processing, and properly incorporate observed waveforms and any a priori uncertainties, as well as accurately estimate a posteriori uncertainties. The implemented HDF5-based Green's function pre-packaging allows much greater flexibility in utilizing different software packages and methods for computation. Further additions will allow rapid use of Instaseis/AXISEM full-waveform synthetics alongside the pre-computed GF archive. Along with traditional post-processing analysis of waveform misfits through several objective functions and variance reduction, we follow a probabilistic approach to assess the robustness of the moment tensor solution. In the course of this project, full moment tensor and depth estimates were determined for the DPRK 2009, 2013 and 2016 events and for shallow earthquakes, using a new implementation of waveform fitting of teleseismic P waves. A full grid search over the entire moment tensor space is used to appropriately sample all possible solutions.
A recent method by Tape & Tape (2012) to discretize the complete moment tensor space from a geometric perspective is used. Moment tensors for the DPRK events show isotropic percentages greater than 50%. Depth estimates for the DPRK events range from 1.0 to 1.4 km. Probabilistic uncertainty estimates on the moment tensor parameters add robustness to the solutions.

  11. Statistical uncertainty of extreme wind storms over Europe derived from a probabilistic clustering technique

    NASA Astrophysics Data System (ADS)

    Walz, Michael; Leckebusch, Gregor C.

    2016-04-01

    Extratropical wind storms pose one of the most dangerous and loss-intensive natural hazards for Europe. However, with only 50 years of high-quality observational data, it is difficult to assess the statistical uncertainty of these sparse events from observations alone. Over the last decade seasonal ensemble forecasts have become indispensable in quantifying the uncertainty of weather prediction on seasonal timescales. In this study seasonal forecasts are used in a climatological context: by making use of the up to 51 ensemble members, a broad and physically consistent statistical base can be created, which can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (based either on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered by the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features like average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared across different seasonal forecast products.
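The idea of summarizing a storm's whole lifetime by a fitted quadratic curve can be sketched as follows. This is a simplified stand-in, not the authors' method: a full regression mixture model fits clusters and curves jointly, whereas here each synthetic track is reduced to its quadratic coefficients and the families are separated by the linear coefficient alone.

```python
import numpy as np

# Hedged illustration: describe each storm track by the coefficients of a
# quadratic curve lat(lon) ~ c2*lon**2 + c1*lon + c0 fitted over its lifetime.
rng = np.random.default_rng(1)

def track_coefficients(lons, lats):
    """Least-squares quadratic fit to one track (lon -> lat).
    Returns [c2, c1, c0], highest power first (numpy convention)."""
    return np.polyfit(lons, lats, deg=2)

# Two synthetic track families: roughly horizontal vs. diagonal progression.
lons = np.linspace(0.0, 10.0, 20)
horizontal = [track_coefficients(lons, 50.0 + rng.normal(0, 0.1, 20))
              for _ in range(5)]
diagonal = [track_coefficients(lons, 45.0 + 0.8 * lons + rng.normal(0, 0.1, 20))
            for _ in range(5)]

# The linear coefficient c1 separates the two families cleanly, so clustering
# in coefficient space recovers the distinct progression types.
assert all(abs(c[1]) < 0.3 for c in horizontal)
assert all(c[1] > 0.5 for c in diagonal)
```

Clustering the coefficient vectors (rather than raw positions) is what lets the whole lifetime of a track enter the comparison at once.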

  12. Advanced Methods for Determining Prediction Uncertainty in Model-Based Prognostics with Application to Planetary Rovers

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Sankararaman, Shankar

    2013-01-01

    Prognostics is centered on predicting the time of and time until adverse events in components, subsystems, and systems. It typically involves both a state estimation phase, in which the current health state of a system is identified, and a prediction phase, in which the state is projected forward in time. Since prognostics is mainly a prediction problem, prognostic approaches cannot avoid uncertainty, which arises due to several sources. Prognostics algorithms must both characterize this uncertainty and incorporate it into the predictions so that informed decisions can be made about the system. In this paper, we describe three methods to solve these problems, including Monte Carlo-, unscented transform-, and first-order reliability-based methods. Using a planetary rover as a case study, we demonstrate and compare the different methods in simulation for battery end-of-discharge prediction.
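Of the three methods named above, the Monte Carlo approach is the simplest to sketch. The toy battery model and all numbers below are hypothetical, not the paper's rover model; the point is only how state-estimate uncertainty becomes a distribution of predicted end-of-discharge (EOD) times rather than a single number.

```python
import numpy as np

# Hedged sketch: Monte Carlo propagation of state uncertainty to an EOD prediction.
rng = np.random.default_rng(0)

def time_to_eod(charge, current, q_min=0.2):
    """Toy battery model: constant-current discharge until charge reaches q_min."""
    return (charge - q_min) / current

# Uncertain state estimate: charge ~ N(1.0, 0.05) Ah, load current ~ N(0.1, 0.01) A.
n = 10_000
charges = rng.normal(1.0, 0.05, n)
currents = rng.normal(0.1, 0.01, n)
eod_samples = time_to_eod(charges, currents)

# Report the prediction as a distribution: mean, spread, and a conservative quantile.
print(f"EOD mean = {eod_samples.mean():.2f} h, std = {eod_samples.std():.2f} h")
print(f"5th percentile (conservative EOD) = {np.percentile(eod_samples, 5):.2f} h")
```

The unscented-transform and first-order reliability methods replace the brute-force sampling loop with a small set of deterministically chosen points, trading some accuracy for far fewer model evaluations.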

  13. Performance evaluation of a smart buffer control at a wastewater treatment plant.

    PubMed

    van Daal-Rombouts, P; Benedetti, L; de Jonge, J; Weijers, S; Langeveld, J

    2017-11-15

    Real time control (RTC) is increasingly seen as a viable method to optimise the functioning of wastewater systems. Model exercises and case studies reported in the literature claim a positive impact of RTC based on results lacking uncertainty analysis and drawn from flawed evaluation periods. This paper describes two integrated RTC strategies at the wastewater treatment plant (WWTP) Eindhoven, the Netherlands, that aim to improve the use of the available tanks at the WWTP and the storage in the contributing catchments in order to reduce the impact on the receiving water. For the first time it is demonstrated that a significant improvement can be achieved through the application of RTC in practice. The Storm Tank Control is evaluated based on measurements: it reduces the number of storm water settling tank discharges by 44% and the discharged volume by an estimated 33%, decreasing dissolved oxygen depletion in the river. The Primary Clarifier Control is evaluated based on model simulations. The maximum event NH4 concentration in the effluent was reduced on average by 19% for large events, and the load by 20%. Over all 31 events the reductions are 11% and 4%, respectively. The reductions are significant when taking uncertainties into account and using representative evaluation periods. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Propagation of the velocity model uncertainties to the seismic event location

    NASA Astrophysics Data System (ADS)

    Gesret, A.; Desassis, N.; Noble, M.; Romary, T.; Maisons, C.

    2015-01-01

    Earthquake hypocentre locations are crucial in many domains of application (academic and industrial), as seismic event location maps are commonly used to delineate faults or fractures. The interpretation of these maps depends on location accuracy and on the reliability of the associated uncertainties. The largest contribution to location and uncertainty errors is the fact that velocity model errors are usually not correctly taken into account. We propose a new Bayesian formulation that properly integrates knowledge of the velocity model into the probabilistic earthquake location. In this work, the velocity model uncertainties are first estimated with a Bayesian tomography of active shot data. We implement a Monte Carlo-type sampling algorithm to generate velocity models distributed according to the posterior distribution. In a second step, we propagate the velocity model uncertainties to the seismic event location in a probabilistic framework. This enables us to obtain more reliable hypocentre locations, as well as their associated uncertainties, accounting for both picking and velocity model uncertainties. We illustrate the tomography results and the gain in accuracy of earthquake location for two synthetic examples and one real data case study in the context of induced microseismicity.

  15. Effects of Uncertainty on ERPs to Emotional Pictures Depend on Emotional Valence

    PubMed Central

    Lin, Huiyan; Jin, Hua; Liang, Jiafeng; Yin, Ruru; Liu, Ting; Wang, Yiwen

    2015-01-01

    Uncertainty about the emotional content of an upcoming event has been found to modulate neural activity to the event before its occurrence. However, it is still under debate whether uncertainty effects also occur after the occurrence of the event. To address this issue, participants were asked to view emotional pictures, each presented shortly after a cue that either did or did not indicate the emotion of the picture. Both certain and uncertain cues were neutral symbols. The anticipatory phase (i.e., inter-trial interval, ITI) between the cue and the picture was kept short to enhance the effects of uncertainty. In addition, we used positive and negative pictures that differed only in valence, not in arousal, to investigate whether the uncertainty effect depends on emotional valence. Electroencephalography (EEG) was recorded during the presentation of the pictures. Event-related potential (ERP) results showed that negative pictures evoked smaller P2 and late LPP but larger N2 in the uncertain as compared to the certain condition, whereas we did not find an uncertainty effect in the early LPP. For positive pictures, the early LPP was larger in the uncertain as compared to the certain condition; however, there were no uncertainty effects in the other ERP components (e.g., P2, N2, and late LPP). The findings suggest that uncertainty modulates neural activity to emotional pictures and that this modulation is altered by the valence of the pictures, indicating that individuals allocate attentional resources toward uncertain emotional pictures depending on the valence of the pictures. PMID:26733916

  16. Propagation of uncertainties for an evaluation of the Azores-Gibraltar Fracture Zone tsunamigenic potential

    NASA Astrophysics Data System (ADS)

    Antoshchenkova, Ekaterina; Imbert, David; Richet, Yann; Bardet, Lise; Duluc, Claire-Marie; Rebour, Vincent; Gailler, Audrey; Hébert, Hélène

    2016-04-01

    The aim of this study is to evaluate the tsunamigenic potential of the Azores-Gibraltar Fracture Zone (AGFZ). This work is part of the French project TANDEM (Tsunamis in the Atlantic and English ChaNnel: Definition of the Effects through numerical Modeling; www-tandem.cea.fr), in which special attention is paid to the French Atlantic coasts. Structurally, the AGFZ region is complex and not well understood. However, many of its faults produce earthquakes with significant vertical slip, of a type that can result in tsunamis. We use the major tsunami event of the AGFZ to obtain a regional estimate of the tsunamigenic potential of this zone. The major reported event for this zone is the 1755 Lisbon event, and there are large uncertainties concerning the source location and focal mechanism of this earthquake. Hence, a simple deterministic approach is not sufficient to cover, on the one hand, the whole AGFZ with its geological complexity and, on the other, the lack of information concerning the 1755 Lisbon tsunami. The parametric modeling environment Promethée (promethee.irsn.org/doku.php) was coupled to tsunami simulation software based on the shallow-water equations in order to propagate uncertainties. Such a statistical point of view allows us to work with multiple hypotheses simultaneously. In our work we introduce the seismic source parameters in the form of distributions, thus generating a database of thousands of tsunami scenarios and tsunami wave height distributions. Exploring our tsunami scenario database, we present preliminary results for France. Tsunami wave heights (within one standard deviation of the mean) can be about 0.5 m - 1 m for the Atlantic coast and approach 0.3 m for the English Channel.

  17. Defining Tsunami Magnitude as Measure of Potential Impact

    NASA Astrophysics Data System (ADS)

    Titov, V. V.; Tang, L.

    2016-12-01

    Tsunami forecasting, as a system for predicting the potential impact of a tsunami at coastlines, requires a quick estimate of tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, estimating tsunami energy from available measurements at coastal sea-level stations carries significant uncertainties and has been virtually impossible in real time, before the tsunami impacts coastlines. The slow process of estimating tsunami magnitude, including the collection of vast amounts of coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scale in tsunami warning operations. The uncertainties of the estimates also made tsunami magnitudes difficult to use as a universal scale for tsunami analysis. Historically, earthquake magnitude has been used as a proxy for tsunami impact estimates, since seismic data are available for real-time processing and in ample amounts for elaborate post-event analysis. This measure of tsunami impact carries significant uncertainties in quantitative impact estimates, since the relation between earthquake and generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing a robust tsunami magnitude that will be useful in tsunami warning, as a quick estimate of tsunami impact, and in post-event analysis, as a universal scale for tsunami inter-comparison. We present a method for estimating tsunami magnitude based on tsunami energy and apply the magnitude analysis to several historical events for inter-comparison with existing methods.

  18. Uncertainty and stress: Why it causes diseases and how it is mastered by the brain.

    PubMed

    Peters, Achim; McEwen, Bruce S; Friston, Karl

    2017-09-01

    The term 'stress' - coined in 1936 - has many definitions, but until now has lacked a theoretical foundation. Here we present an information-theoretic approach - based on the 'free energy principle' - defining the essence of stress; namely, uncertainty. We address three questions: What is uncertainty? What does it do to us? What are our resources to master it? Mathematically speaking, uncertainty is entropy or 'expected surprise'. The 'free energy principle' rests upon the fact that self-organizing biological agents resist a tendency to disorder and must therefore minimize the entropy of their sensory states. Applied to our everyday life, this means that we feel uncertain when we anticipate that outcomes will turn out to be something other than expected - and that we are unable to avoid surprise. As all cognitive systems strive to reduce their uncertainty about future outcomes, they face a critical constraint: reducing uncertainty requires cerebral energy. The tendency of the vertebrate brain to prioritize its own high energy needs is captured by the notion of the 'selfish brain'. Accordingly, in times of uncertainty, the selfish brain demands extra energy from the body. If, despite all this, the brain cannot reduce uncertainty, a persistent cerebral energy crisis may develop, burdening the individual with 'allostatic load' that contributes to systemic and brain malfunction (impaired memory, atherogenesis, diabetes and subsequent cardio- and cerebrovascular events). Based on the basic tenet that stress originates from uncertainty, we discuss the strategies our brain uses to avoid surprise and thereby resolve uncertainty. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Event-Based Robust Control for Uncertain Nonlinear Systems Using Adaptive Dynamic Programming.

    PubMed

    Zhang, Qichao; Zhao, Dongbin; Wang, Ding

    2018-01-01

    In this paper, the robust control problem for a class of continuous-time nonlinear systems with unmatched uncertainties is investigated using an event-based control method. First, the robust control problem is transformed into a corresponding optimal control problem with an augmented control and an appropriate cost function. Under the event-based mechanism, we prove that the solution of the optimal control problem can asymptotically stabilize the uncertain system with an adaptive triggering condition; that is, the designed event-based controller is robust to the original uncertain system. Note that the event-based controller is updated only when the triggering condition is satisfied, which saves communication resources between the plant and the controller. Then, a single-network adaptive dynamic programming structure with an experience replay technique is constructed to approximate the optimal control policies. The stability of the closed-loop system with the event-based control policy and the augmented control policy is analyzed using the Lyapunov approach. Furthermore, we prove that the minimal intersample time is bounded below by a nonzero positive constant, which excludes Zeno behavior during the learning process. Finally, two simulation examples are provided to demonstrate the effectiveness of the proposed control scheme.

  20. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    NASA Astrophysics Data System (ADS)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach iterates until the difference between subsequent solutions satisfies pre-determined termination criteria. Its effectiveness is illustrated by an example, which yields near-optimal results with much faster solving times than a conventional simulation-based optimization model. The proposed hybrid approach is promising and can be applied as a powerful tool in designing real supply chain networks. It also makes it possible to model and solve more realistic problems that incorporate dynamism and uncertainty.

  1. Parton shower and NLO-matching uncertainties in Higgs boson pair production

    NASA Astrophysics Data System (ADS)

    Jones, Stephen; Kuttimalai, Silvan

    2018-02-01

    We perform a detailed study of NLO parton shower matching uncertainties in Higgs boson pair production through gluon fusion at the LHC, based on a generic and process-independent implementation of NLO subtraction and parton shower matching schemes for loop-induced processes in the Sherpa event generator. We take into account the full top-quark mass dependence in the two-loop virtual corrections and compare the results to an effective theory approximation. In the full calculation, our findings suggest large parton shower matching uncertainties that are absent in the effective theory approximation. We observe large uncertainties even in regions of phase space where fixed-order calculations are theoretically well motivated and parton shower effects are expected to be small. We compare our results to NLO matched parton shower simulations and analytic resummation results that are available in the literature.

  2. Spatiotemporal analysis and mapping of oral cancer risk in changhua county (taiwan): an application of generalized bayesian maximum entropy method.

    PubMed

    Yu, Hwa-Lung; Chiang, Chi-Ting; Lin, Shu-De; Chang, Tsun-Kuo

    2010-02-01

    The incidence rate of oral cancer in Changhua County was the highest among the 23 counties of Taiwan in 2001. However, in health data analysis, crude or adjusted incidence rates of a rare event (e.g., cancer) for small populations often exhibit high variances and are thus less reliable. We proposed a generalized Bayesian Maximum Entropy (GBME) analysis of spatiotemporal disease mapping under conditions of considerable data uncertainty. GBME was used to study the oral cancer population incidence in Changhua County (Taiwan). Methodologically, GBME is based on an epistemic principles framework and generates spatiotemporal estimates of oral cancer incidence rates. In this way, it accounts for the multi-sourced uncertainty of the rates, including small-population effects, and the composite space-time dependence of rare events in terms of an extended Poisson-based semivariogram. The results showed that GBME analysis alleviates the noise in the oral cancer data arising from the population size effect. Compared to the raw incidence data, maps of the GBME-estimated results can identify high-risk oral cancer regions in Changhua County, where the prevalence of betel quid chewing and cigarette smoking is relatively higher than in the rest of the areas. The GBME method is a valuable tool for spatiotemporal disease mapping under conditions of uncertainty. 2010 Elsevier Inc. All rights reserved.

  3. Improved Event Location Uncertainty Estimates

    DTIC Science & Technology

    2006-09-21

    validation purposes, we use GT0-2 event clusters. These include the Nevada, Lop Nor, Semipalatinsk, and Novaya Zemlya test sites, as well as the Azgir...uncertainties. Furthermore, the tails of real seismic data distributions are heavier than Gaussian. The main objectives of this project are to develop, test

  4. Visualizing Uncertainty of Point Phenomena by Redesigned Error Ellipses

    NASA Astrophysics Data System (ADS)

    Murphy, Christian E.

    2018-05-01

    Visualizing uncertainty remains one of the great challenges in modern cartography. There is no overarching strategy to display the nature of uncertainty, as an effective and efficient visualization depends, besides on the spatial data feature type, heavily on the type of uncertainty. This work presents a design strategy to visualize uncertainty connected to point features. The error ellipse, well-known from mathematical statistics, is adapted to display the uncertainty of point information originating from spatial generalization. Modified designs of the error ellipse show the potential of quantitative and qualitative symbolization and simultaneous point-based uncertainty symbolization. The user can intuitively depict the centers of gravity, the major orientation of the point arrays as well as estimate the extents and possible spatial distributions of multiple point phenomena. The error ellipse represents uncertainty in an intuitive way, particularly suitable for laymen. Furthermore it is shown how applicable an adapted design of the error ellipse is to display the uncertainty of point features originating from incomplete data. The suitability of the error ellipse to display the uncertainty of point information is demonstrated within two showcases: (1) the analysis of formations of association football players, and (2) uncertain positioning of events on maps for the media.
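The error ellipse the record builds on comes directly from the eigen decomposition of a 2x2 covariance matrix. A minimal sketch (our own function, not from the paper) computing the ellipse parameters in closed form:

```python
import math

# Hedged sketch: 1-sigma error ellipse of a 2-D point from its covariance
# matrix [[sxx, sxy], [sxy, syy]], using the closed-form eigen decomposition
# of a symmetric 2x2 matrix.
def error_ellipse(sxx, syy, sxy):
    """Return (semi-major axis, semi-minor axis, orientation in degrees)."""
    mean = 0.5 * (sxx + syy)
    diff = math.hypot(0.5 * (sxx - syy), sxy)
    lam1, lam2 = mean + diff, mean - diff          # eigenvalues, lam1 >= lam2
    theta = 0.5 * math.degrees(math.atan2(2.0 * sxy, sxx - syy))
    return math.sqrt(lam1), math.sqrt(lam2), theta

# Uncorrelated case with larger variance in x: axes align with the coordinate axes.
a, b, theta = error_ellipse(4.0, 1.0, 0.0)
assert abs(a - 2.0) < 1e-12 and abs(b - 1.0) < 1e-12 and abs(theta) < 1e-12
```

The semi-axes are the square roots of the eigenvalues (standard deviations along the principal directions), which is exactly what the redesigned symbols in the paper encode graphically.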

  5. Sensitivity to experimental data of pollutant site mean concentration in stormwater runoff.

    PubMed

    Mourad, M; Bertrand-Krajewski, J L; Chebbo, G

    2005-01-01

    Urban wet weather discharges are known to be a major source of pollutants for receiving waters, whose protection requires the estimation of long-term discharged pollutant loads. Pollutant loads can be estimated by multiplying a site mean concentration (SMC) by the total runoff volume during a given period of time. The estimation of the SMC value as a weighted mean with event runoff volumes as weights is affected by uncertainties due to the variability of event mean concentrations and to the number of events used. This study, carried out on 13 catchments, gives orders of magnitude for these uncertainties and shows the limitations of usual practices based on few measured events. The results obtained show that it is not possible to propose a standard minimal number of events to be measured on an arbitrary catchment in order to evaluate the SMC value with a given uncertainty.
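The volume-weighted SMC described above is a one-line computation; the sketch below (our own example numbers) makes the weighting explicit:

```python
# Minimal sketch of the site mean concentration (SMC): the volume-weighted
# mean of event mean concentrations (EMCs), with event runoff volumes as weights.
def site_mean_concentration(emcs, volumes):
    """SMC = sum(EMC_i * V_i) / sum(V_i)."""
    if len(emcs) != len(volumes) or not emcs:
        raise ValueError("need one runoff volume per monitored event")
    return sum(c * v for c, v in zip(emcs, volumes)) / sum(volumes)

# Three hypothetical monitored events: EMCs in mg/L, runoff volumes in m^3.
smc = site_mean_concentration([120.0, 80.0, 200.0], [500.0, 1500.0, 250.0])
# Large events dominate the weighting, pulling the SMC toward their EMC.
assert abs(smc - (120*500 + 80*1500 + 200*250) / 2250) < 1e-9
```

Because the weights are the event volumes themselves, a handful of large unmeasured events can shift the SMC substantially, which is exactly the sampling uncertainty the study quantifies.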

  6. Nuclear event zero-time calculation and uncertainty evaluation.

    PubMed

    Pan, Pujing; Ungar, R Kurt

    2012-04-01

    It is important to know the initial time, or zero-time, of a nuclear event such as a nuclear weapon's test, a nuclear power plant accident or a nuclear terrorist attack (e.g. with an improvised nuclear device, IND). Together with relevant meteorological information, the calculated zero-time is used to help locate the origin of a nuclear event. The zero-time of a nuclear event can be derived from measured activity ratios of two nuclides. The calculated zero-time of a nuclear event would not be complete without an appropriately evaluated uncertainty term. In this paper, analytical equations for zero-time and the associated uncertainty calculations are derived using a measured activity ratio of two nuclides. Application of the derived equations is illustrated in a realistic example using data from the last Chinese thermonuclear test in 1980. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
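The underlying idea can be sketched numerically. This is a hedged illustration of the general decay-ratio relation, not a reproduction of the paper's equations, and the nuclide half-lives below are hypothetical: for two co-produced nuclides with decay constants lam1 and lam2, the activity ratio evolves as R(t) = R0 * exp(-(lam1 - lam2) * t), so the elapsed time since the event, and its propagated uncertainty, follow from a measured ratio.

```python
import math

# Hedged sketch: zero-time from a measured two-nuclide activity ratio, with
# first-order propagation of the ratio's measurement uncertainty only.
def zero_time(r_measured, r0, lam1, lam2, sigma_r):
    """Elapsed time since the event and its 1-sigma uncertainty.
    t = ln(R0 / R) / (lam1 - lam2); sigma_t = |dt/dR| * sigma_R."""
    dlam = lam1 - lam2
    t = math.log(r0 / r_measured) / dlam
    sigma_t = sigma_r / (abs(dlam) * r_measured)
    return t, sigma_t

# Hypothetical nuclide pair with half-lives of 8 d and 30 d; the ratio has
# halved since production, measured with 5% (absolute) uncertainty.
lam1 = math.log(2) / 8.0
lam2 = math.log(2) / 30.0
t, sigma_t = zero_time(r_measured=0.5, r0=1.0, lam1=lam1, lam2=lam2, sigma_r=0.05)
print(f"elapsed time = {t:.1f} +/- {sigma_t:.1f} days")
```

A complete treatment, as in the paper, would also propagate the uncertainties of R0 and the decay constants through the same derivative-based expansion.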

  7. Earthquake prediction analysis based on empirical seismic rate: the M8 algorithm

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.

    2010-12-01

    The quality of space-time earthquake prediction is usually characterized by a 2-D error diagram (n, τ), where n is the fraction of failures-to-predict and τ is the local rate of alarm averaged in space. The most reasonable averaging measure for analysis of a prediction strategy is the normalized rate of target events λ(dg) in a subarea dg. In that case the quantity H = 1 - (n + τ) determines the prediction capability of the strategy. The uncertainty of λ(dg) causes difficulties in estimating H and the statistical significance, α, of prediction results. We investigate this problem theoretically and show how the uncertainty of the measure can be taken into account in two situations, viz., the estimation of α and the construction of a confidence zone for the (n, τ)-parameters of the random strategies. We use our approach to analyse the results from prediction of M >= 8.0 events by the M8 method for the period 1985-2009 (the M8.0+ test). The model of λ(dg) based on the events Mw >= 5.5, 1977-2004, and the magnitude range of target events 8.0 <= M < 8.5 are considered as basic to this M8 analysis. We find the point and upper estimates of α and show that they are still unstable because the number of target events in the experiment is small. However, our results argue in favour of non-triviality of the M8 prediction algorithm.
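The prediction-capability measure quoted above is simple enough to illustrate numerically (example numbers are ours, not from the M8.0+ test):

```python
# Small illustration of H = 1 - (n + tau), where n is the fraction of
# failures-to-predict and tau the space-averaged alarm rate.
def prediction_capability(n_misses, n_targets, alarm_rate):
    n = n_misses / n_targets        # fraction of failures-to-predict
    return 1.0 - (n + alarm_rate)

# A random strategy alarming a fraction tau of space-time misses, on average,
# a fraction (1 - tau) of the targets, so H = 0: the trivial baseline.
assert prediction_capability(n_misses=7, n_targets=10, alarm_rate=0.3) == 0.0

# A skillful strategy: only 2 of 10 targets missed with the same 30% alarm coverage.
h = prediction_capability(n_misses=2, n_targets=10, alarm_rate=0.3)
assert abs(h - 0.5) < 1e-12
```

The paper's point is that tau is averaged with the uncertain measure lambda(dg), so the H estimate itself (and the significance alpha) inherits that uncertainty.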

  8. Estimation of full moment tensors, including uncertainties, for earthquakes, volcanic events, and nuclear explosions

    NASA Astrophysics Data System (ADS)

    Alvizuri, Celso R.

    We present a catalog of full seismic moment tensors for 63 events from Uturuncu volcano in Bolivia. The events were recorded during 2011-2012 in the PLUTONS seismic array of 24 broadband stations. Most events had magnitudes between 0.5 and 2.0 and did not generate discernible surface waves; the largest event was Mw 2.8. For each event we computed the misfit between observed and synthetic waveforms, and we used first-motion polarity measurements to reduce the number of possible solutions. Each moment tensor solution was obtained using a grid search over the six-dimensional space of moment tensors. For each event we show the misfit function in eigenvalue space, represented by a lune. We identify three subsets of the catalog: (1) 6 isotropic events, (2) 5 tensional crack events, and (3) a swarm of 14 events southeast of the volcanic center that appear to be double couples. The occurrence of positively isotropic events is consistent with other published results from volcanic and geothermal regions. Several of these previous results, as well as our results, cannot be interpreted within the context of either an oblique opening crack or a crack-plus-double-couple model. Proper characterization of uncertainties for full moment tensors is critical for distinguishing among physical models of source processes. A seismic moment tensor is a 3x3 symmetric matrix that provides a compact representation of a seismic source. We develop an algorithm to estimate moment tensors and their uncertainties from observed seismic data. For a given event, the algorithm performs a grid search over the six-dimensional space of moment tensors by generating synthetic waveforms for each moment tensor and then evaluating a misfit function between the observed and synthetic waveforms. 'The' moment tensor M0 for the event is then the moment tensor with minimum misfit. To describe the uncertainty associated with M0, we first convert the misfit function to a probability function. 
The uncertainty, or rather the confidence, is then given by the 'confidence curve' P(V), where P(V) is the probability that the true moment tensor for the event lies within the neighborhood of M0 that has fractional volume V. The area under the confidence curve provides a single, abbreviated 'confidence parameter' for M0. We apply the method to data from events in different regions and tectonic settings: 63 small (Mw < 4) earthquakes in the southern Alaska subduction zone, and 12 earthquakes and 17 nuclear explosions at the Nevada Test Site. Characterization of moment tensor uncertainties puts us in a better position to discriminate among moment tensor source types and to assign physical processes to the events.
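The misfit-to-probability step and the confidence curve can be sketched on a stand-in grid. This is our own hedged illustration, with random numbers in place of waveform misfits and a generic exponential misfit-to-probability mapping, not the authors' code:

```python
import numpy as np

# Hedged sketch: grid search over candidate moment tensors, misfits converted
# to probabilities, uncertainty summarized by how fast probability accumulates
# around the best-fitting grid point.
rng = np.random.default_rng(2)

misfits = rng.uniform(0.0, 5.0, 1000)    # stand-in for waveform misfit values
misfits[0] = 0.0                         # the best-fitting grid point, "M0"

# Convert misfit to an (unnormalized) probability, then normalize over the grid.
probs = np.exp(-misfits)
probs /= probs.sum()

# Confidence curve P(V): probability contained in the fraction V of the grid
# closest (in misfit) to M0, evaluated at V = k / N for k = 1..N.
order = np.argsort(misfits)
cumulative = np.cumsum(probs[order])

# The area under P(V) collapses the curve into one scalar confidence parameter:
# values near 1 mean probability is tightly concentrated around M0.
confidence_parameter = cumulative.mean()

assert cumulative[-1] > 0.999            # P(1) = 1 by construction
assert 0.5 < confidence_parameter <= 1.0
```

On a real problem the grid would span the six-dimensional moment tensor space and the misfit would come from observed-versus-synthetic waveform comparisons, but the bookkeeping is the same.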

  9. Warning and prevention based on estimates with large uncertainties: the case of low-frequency and large-impact events like tsunamis

    NASA Astrophysics Data System (ADS)

    Tinti, Stefano; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo

    2013-04-01

    Geoscientists often deal with hazardous processes like earthquakes, volcanic eruptions, tsunamis and hurricanes, and their research is aimed not only at a better understanding of the physical processes, but also at providing assessments of the spatial and temporal evolution of a given individual event (i.e. short-term prediction) and of the expected evolution of a group of events (i.e. statistical estimates referring to a given return period and geographical area). One of the main issues of any scientific method is how to cope with measurement errors, a topic which, in the case of forecasts of ongoing or future events, translates into how to deal with forecast uncertainties. In general, the more data are available and processed to make a prediction, the more accurate the prediction is expected to be (if the scientific approach is sound) and the smaller the associated uncertainties. However, there are several important cases where the assessment has to be made with insufficient data or insufficient time for processing, which leads to large uncertainties. Two examples can be taken from tsunami science, since tsunamis are rare events that may have destructive power and very large impact. The first is warning for a tsunami generated by a near-coast earthquake, an issue at the focus of the European-funded project NearToWarn. The warning has to be launched before the tsunami hits the coast, that is, within a few minutes of its generation. This may imply that the data collected in such a short time are not yet enough for an accurate evaluation, also because the implemented monitoring system (if any) could be inadequate (for instance, because implementing a dense instrumental network could be judged too expensive for rare events). The second case is long-term prevention from tsunami strikes.
The infrequency of tsunamis may imply that the historical record for a given stretch of coast is too short to capture a statistically sufficient number of large tsunamis, so that tsunami hazard has to be estimated by means of speculative worst-case scenarios, whose consequences are evaluated accordingly and usually come with large uncertainty bands. In the case of large uncertainties, the main issues for geoscientists are how to communicate the information (predictions and uncertainties) to stakeholders and citizens, and how to build and implement adequate response procedures together with them. There is usually a trade-off between the cost of a countermeasure (warning and prevention) and its efficacy (i.e. its capability to minimise damage). The level of acceptable trade-off is an issue pertaining to decision makers and to the threatened local communities. This paper, which represents a contribution from the European project TRIDEC on the management of emergency crises, discusses the role of geoscientists in providing predictions and the related uncertainties. It is stressed that academic education trains geoscientists to improve their understanding of processes and their quantification of uncertainties, but often leaves them unprepared to communicate their results in a way appropriate for society. Filling this gap is crucial for improving the way geoscience and society handle natural hazards and devise proper means of defence.

  10. Uncertainty in flood forecasting: A distributed modeling approach in a sparse data catchment

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo A.; McPhee, James; Vargas, Ximena

    2012-09-01

Data scarcity has traditionally precluded the application of advanced hydrologic techniques in developing countries. In this paper, we evaluate the performance of a flood forecasting scheme in a sparsely monitored catchment based on distributed hydrologic modeling, discharge assimilation, and numerical weather predictions, with explicit validation uncertainty analysis. For the hydrologic component of our framework, we apply TopNet to the Cautin River basin, located in southern Chile, using a fully distributed a priori parameterization based on both literature-suggested values and data gathered during field campaigns. Results obtained from this step indicate that the incremental effort spent in directly measuring a set of model parameters was insufficient to adequately represent the most relevant hydrologic processes related to spatiotemporal runoff patterns. Subsequent uncertainty validation performed over a six-month ensemble simulation shows that streamflow uncertainty is better represented during flood events, due to both the increase of state perturbation introduced by rainfall and the flood-oriented calibration strategy adopted here. Results from different assimilation configurations suggest that the upper part of the basin is the major source of uncertainty in the hydrologic process representation, and hint at the usefulness of interpreting assimilation results in terms of model input and parameterization inadequacy. Furthermore, in this case study the violation of Markovian state properties by the Ensemble Kalman filter did affect the numerical results, showing that an explicit treatment of the time delay between the generation of surface runoff and its arrival at the basin outlet is required in the assimilation scheme. Peak flow forecasting results demonstrate that there is a major problem with the Weather Research and Forecasting model outputs, which systematically overestimate precipitation over the catchment.
A final analysis performed for a large flooding event that occurred in July 2006 shows that, in the absence of bias introduced by an incorrect model calibration, the updating of both model states and meteorological forecasts contributes to a better representation of streamflow uncertainty and to better hydrologic forecasts.

  11. Quantifying radar-rainfall uncertainties in urban drainage flow modelling

    NASA Astrophysics Data System (ADS)

    Rico-Ramirez, M. A.; Liguori, S.; Schellart, A. N. A.

    2015-09-01

    This work presents the results of implementing a probabilistic system to model the uncertainty associated with radar rainfall (RR) estimates and the way this uncertainty propagates through the sewer system of an urban area in the North of England. The spatial and temporal correlations of the RR errors, as well as the error covariance matrix, were computed to build an RR error model able to generate RR ensembles that reproduce the uncertainty associated with the measured rainfall. The results showed that the RR ensembles provide important information about the uncertainty in the rainfall measurement that can be propagated through the urban sewer system, and that the measured flow peaks and flow volumes are often bounded within the uncertainty band produced by the RR ensembles. In 55% of the simulated events, the uncertainties in the RR measurements can explain the uncertainties observed in the simulated flow volumes. However, there are also events where the RR uncertainty cannot explain the whole of the observed uncertainty in the simulated flow volumes, indicating that additional sources of uncertainty must be considered, such as uncertainty in the urban drainage model structure, in its calibrated parameters, and in the measured sewer flows.
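A minimal sketch of the ensemble-generation idea described above, assuming a hypothetical exponential spatial correlation of the RR errors (the study's actual error model and covariance values are not reproduced here): correlated error fields are drawn by applying the Cholesky factor of the error covariance matrix to uncorrelated normal samples.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical error covariance for 4 radar pixels: constant variance on
# the diagonal, correlation decaying exponentially with pixel separation.
distances = np.abs(np.subtract.outer(np.arange(4), np.arange(4)))
cov = 0.25 * np.exp(-distances / 2.0)  # assumed correlation model

# The Cholesky factor turns uncorrelated normal draws into errors with
# the prescribed spatial covariance.
L = np.linalg.cholesky(cov)

radar_rainfall = np.array([5.0, 6.2, 4.8, 5.5])  # mm/h, hypothetical scan

# Generate an ensemble of plausible "true" rainfall fields by perturbing
# the radar estimate with spatially correlated errors (clipped so that
# rainfall stays non-negative).
n_members = 100
errors = rng.standard_normal((n_members, 4)) @ L.T
ensemble = np.clip(radar_rainfall + errors, 0.0, None)

print(ensemble.shape)  # one row per ensemble member
```

Each ensemble member can then be fed through the sewer-flow model to propagate the rainfall uncertainty to simulated flows.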

  12. Dispersion analysis for baseline reference mission 2

    NASA Technical Reports Server (NTRS)

    Snow, L. S.

    1975-01-01

    A dispersion analysis considering uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for baseline reference mission (BRM) 2. The dispersion analysis is based on the nominal trajectory for BRM 2. The analysis was performed to determine state vector and performance dispersions (or variations) which result from the indicated uncertainties. The dispersions are determined at major mission events and fixed times from liftoff (time slices). The dispersion results will be used to evaluate the capability of the vehicle to perform the mission within a specified level of confidence and to determine flight performance reserves.

  13. Run-up Variability due to Source Effects

    NASA Astrophysics Data System (ADS)

    Del Giudice, Tania; Zolezzi, Francesca; Traverso, Chiara; Valfrè, Giulio; Poggi, Pamela; Parker, Eric J.

    2010-05-01

    This paper investigates the variability of tsunami run-up at a specific location due to uncertainty in earthquake source parameters. It is important to quantify this 'inter-event' variability for probabilistic assessments of tsunami hazard. In principle, this aspect of variability could be studied by comparing field observations at a single location from a number of tsunamigenic events caused by the same source. As such an extensive dataset does not exist, we studied the inter-event variability through numerical modelling, attempting to answer the question: 'What is the potential variability of tsunami wave run-up at a specific site, for a given magnitude earthquake occurring at a known location?' The uncertainty is expected to arise from the lack of knowledge regarding the specific details of the fault rupture ('source') parameters. The following steps were followed: the statistical distributions of the main earthquake source parameters affecting tsunami height were established by studying fault plane solutions of known earthquakes; a case study based on a possible tsunami impact on the Egyptian coast was set up and simulated, varying the geometrical parameters of the source; the simulation results were analysed to derive relationships between run-up height and source parameters; and, using the derived relationships, a Monte Carlo simulation was performed to create the dataset needed to investigate the inter-event variability of the run-up height along the coast. Given the distributions of the source parameters and their variability, we studied how this variability propagates to the run-up height, using the Cornell 'Multi-grid coupled Tsunami Model' (COMCOT). The case study was based on the large thrust fault offshore the south-western Greek coast thought to have been responsible for the infamous 1303 tsunami.
Numerical modelling of the event was used to assess the impact on the North African coast. The effects of uncertainty in the fault parameters were assessed by perturbing the base model and observing the variation in wave height along the coast. The tsunami wave run-up was computed at 4020 locations along the Egyptian coast between longitudes 28.7 E and 33.8 E. To assess the effects of uncertainty in the fault parameters, the input model parameters were varied and the effects on run-up analysed. The simulations show that, for a given point, there are linear relationships between run-up and both fault dislocation and rupture length. A superposition analysis shows that a linear combination of the effects of the different source parameters leads to a good approximation of the simulated results. This relationship was then used as the basis for a Monte Carlo simulation, performed for 1600 scenarios at each of the 4020 points along the coast. The coefficient of variation (the ratio between the standard deviation and the average of the run-up heights along the coast) ranges between 0.14 and 3.11, with an average value along the coast of 0.67. The coefficient of variation of normalized run-up was compared with the standard deviation of the spectral acceleration attenuation laws used in probabilistic seismic hazard assessment studies; these values have a similar meaning, and the uncertainty in the two cases is similar. The 'rule of thumb' relationship between mean and sigma can be expressed as μ + σ ≈ 2μ, implying that the uncertainty in run-up estimation spans a range of values up to approximately twice the average. This uncertainty should be considered in tsunami hazard analysis, for example in inundation and risk maps, evacuation plans and the other related steps.
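The Monte Carlo step can be illustrated with a small sketch. All numbers here are assumptions for illustration: a hypothetical linear response surface relating run-up to fault slip and rupture length (standing in for the relationships fitted to the COMCOT simulations) is sampled over assumed source-parameter distributions, and the coefficient of variation of the resulting run-up heights is computed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear response surface at one coastal point:
# run-up (m) = a * slip (m) + b * rupture length (km) + c.
# The coefficients are illustrative, not from the paper.
a, b, c = 0.8, 0.01, 0.5

n = 1600  # number of Monte Carlo scenarios, as in the study
slip = rng.normal(8.0, 2.0, n)       # assumed slip distribution
length = rng.normal(100.0, 20.0, n)  # assumed rupture-length distribution

runup = a * slip + b * length + c

# Coefficient of variation: std / mean of the simulated run-up heights.
cv = runup.std() / runup.mean()
print(round(cv, 2))
```

Repeating this at every coastal point yields the along-coast distribution of the coefficient of variation discussed above.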

  14. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches used to model rare events, such as fault trees and event trees, suffer from a number of weaknesses: the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible approach for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in the data is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique is demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.

  15. Uncertainty quantification in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties, and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with assessing the existing risk of extreme events, additional uncertainty arises from temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take all these sources of uncertainty into account. They should be based on an understanding of how flood extremes are generated and how they change over time, and should also account for the dynamics of risk perception of decision makers and of the population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach to integrated flood risk management that aims at enhancing resilience rather than searching for optimal defense strategies.

  16. Energy reconstruction in the long-baseline neutrino experiment.

    PubMed

    Mosel, U; Lalakulich, O; Gallmeister, K

    2014-04-18

    The Long-Baseline Neutrino Experiment aims at measuring fundamental physical parameters to high precision and exploring physics beyond the standard model. Nuclear targets introduce complications towards that aim. We investigate the uncertainties in the energy reconstruction, based on quasielastic scattering relations, due to nuclear effects. The reconstructed event distributions as a function of energy tend to be smeared out and shifted by several hundred MeV in their oscillatory structure if standard event selection is used. We show that a more restrictive experimental event selection offers the possibility of reaching the accuracy needed for a determination of the mass ordering and the CP-violating phase. Quasielastic-based energy reconstruction could thus be a viable alternative to calorimetric reconstruction also at higher energies.

  17. Parton shower and NLO-matching uncertainties in Higgs boson pair production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Stephen; Kuttimalai, Silvan

    We perform a detailed study of NLO parton shower matching uncertainties in Higgs boson pair production through gluon fusion at the LHC, based on a generic and process-independent implementation of NLO subtraction and parton shower matching schemes for loop-induced processes in the Sherpa event generator. We take into account the full top-quark mass dependence in the two-loop virtual corrections and compare the results to an effective theory approximation. In the full calculation, our findings suggest large parton shower matching uncertainties that are absent in the effective theory approximation: we observe large uncertainties even in regions of phase space where fixed-order calculations are theoretically well motivated and parton shower effects are expected to be small. We compare our results to NLO matched parton shower simulations and analytic resummation results that are available in the literature.

  18. Parton shower and NLO-matching uncertainties in Higgs boson pair production

    DOE PAGES

    Jones, Stephen; Kuttimalai, Silvan

    2018-02-28

    We perform a detailed study of NLO parton shower matching uncertainties in Higgs boson pair production through gluon fusion at the LHC, based on a generic and process-independent implementation of NLO subtraction and parton shower matching schemes for loop-induced processes in the Sherpa event generator. We take into account the full top-quark mass dependence in the two-loop virtual corrections and compare the results to an effective theory approximation. In the full calculation, our findings suggest large parton shower matching uncertainties that are absent in the effective theory approximation: we observe large uncertainties even in regions of phase space where fixed-order calculations are theoretically well motivated and parton shower effects are expected to be small. We compare our results to NLO matched parton shower simulations and analytic resummation results that are available in the literature.

  19. Propagating Mixed Uncertainties in Cyber Attacker Payoffs: Exploration of Two-Phase Monte Carlo Sampling and Probability Bounds Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.

    Securing cyber-systems on a continual basis against a multitude of adverse events is a challenging undertaking. Game-theoretic approaches that model the actions of strategic decision-makers are increasingly being applied to address cybersecurity resource allocation challenges. Such game-based models account for multiple player actions and represent cyber attacker payoffs mostly as point utility estimates. Since a cyber-attacker's payoff generation mechanism is largely unknown, appropriate representation and propagation of uncertainty is a critical task. In this paper we expand on prior work and focus on operationalizing the probabilistic uncertainty quantification framework, for a notional cyber system, through: 1) representation of uncertain attacker- and system-related modeling variables as probability distributions and mathematical intervals, and 2) exploration of uncertainty propagation techniques, including two-phase Monte Carlo sampling and probability bounds analysis.

  20. Assessment of Uncertainty-Based Screening Volumes for NASA Robotic LEO and GEO Conjunction Risk Assessment

    NASA Technical Reports Server (NTRS)

    Narvet, Steven W.; Frigm, Ryan C.; Hejduk, Matthew D.

    2011-01-01

    Conjunction Assessment operations require screening assets against the space object catalog by placing a pre-determined spatial volume around each asset and predicting when another object will violate that volume. The selection of the screening volume used for each spacecraft is a trade-off between observing all conjunction events that may pose a potential risk to the primary spacecraft and the ability to analyze those predicted events. If the screening volumes are larger, more conjunctions can be observed and the probability of a missed detection of a high-risk conjunction event is therefore small; however, the amount of data that needs to be analyzed increases. This paper characterizes the sensitivity of screening volume size to capturing typical orbit uncertainties and the expected number of conjunction events observed. These sensitivities are quantified in the form of a trade space that allows for selection of appropriate screening volumes to fit the desired concept of operations, system limitations, and tolerable analyst workloads. This analysis specifically highlights the screening volume determination and selection process for use in the NASA Conjunction Assessment Risk Analysis process, but also provides a general framework for other owner/operators faced with similar decisions.

  1. Fast radio burst event rate counts - I. Interpreting the observations

    NASA Astrophysics Data System (ADS)

    Macquart, J.-P.; Ekers, R. D.

    2018-02-01

    The fluence distribution of the fast radio burst (FRB) population (the 'source count' distribution, N(>F) ∝ F^α) is a crucial diagnostic of its distance distribution, and hence of the progenitor evolutionary history. We critically reanalyse current estimates of the FRB source count distribution. We demonstrate that the Lorimer burst (FRB 010724) is subject to discovery bias and should be excluded from all statistical studies of the population. We re-examine the evidence for flat (α > -1) source count estimates based on the ratio of single-beam to multiple-beam detections with the Parkes multibeam receiver, and show that current data imply only a very weak constraint of α ≲ -1.3. A maximum-likelihood analysis applied to the portion of the Parkes FRB population detected above the observational completeness fluence of 2 Jy ms yields α = -2.6 (+0.7, -1.3). Uncertainties in the location of each FRB within the Parkes beam render estimates of the Parkes event rate uncertain in both the normalizing survey area and the estimated post-beam-corrected completeness fluence; this uncertainty needs to be accounted for when comparing the event rate against rates measured at other telescopes.
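As a sketch of the maximum-likelihood slope estimate, the standard Hill estimator for a cumulative power-law distribution can be applied above the completeness fluence (the paper's actual likelihood treatment, including beam corrections, is more involved):

```python
import numpy as np

def cumulative_slope_mle(fluences, f_min):
    """Maximum-likelihood estimate of the cumulative source-count slope
    alpha in N(>F) ∝ F^alpha for fluences above completeness f_min:
    the classic Hill estimator, alpha = -n / sum(ln(F_i / f_min))."""
    f = np.asarray(fluences, dtype=float)
    f = f[f >= f_min]
    return -len(f) / np.sum(np.log(f / f_min))

# Synthetic check: draw fluences from a power law with known alpha = -1.5
# by inverse-CDF sampling, P(F' > F) = (F / f_min)^alpha.
rng = np.random.default_rng(1)
alpha_true = -1.5
f_min = 2.0  # Jy ms, completeness fluence used in the study
u = 1.0 - rng.uniform(size=5000)  # uniform on (0, 1]
samples = f_min * u ** (1.0 / alpha_true)

print(round(cumulative_slope_mle(samples, f_min), 1))  # ≈ -1.5
```

The estimator recovers the true slope from the synthetic sample, illustrating why restricting the fit to fluences above the completeness limit matters: below it, missed faint events bias the slope flat.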

  2. Reproducing an extreme flood with uncertain post-event information

    NASA Astrophysics Data System (ADS)

    Fuentes-Andino, Diana; Beven, Keith; Halldin, Sven; Xu, Chong-Yu; Reynolds, José Eduardo; Di Baldassarre, Giuliano

    2017-07-01

    Studies for the prevention and mitigation of floods require information on discharge and extent of inundation that is commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood in Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. We hypothesized that this extreme 1998 flood discharge and the extent of the ensuing inundation could be estimated in a trustworthy way, given the large data uncertainties, from a combination of models and post-event measurements. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum-Cunge-Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peaks, and more than 90 % of the observed high-water marks within the uncertainty bounds of the evaluation data, allowing an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain; identifying these locations is useful for improving the model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak, and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa in spite of no hydrometric gauging during the event.
The method proposed here can be part of a Bayesian framework in which more events can be added into the analysis as they become available.
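The GLUE procedure mentioned above can be sketched in a few lines. Everything here is a toy stand-in, not the authors' actual model chain: a one-parameter surrogate model, an informal Gaussian-style likelihood against an uncertain observed peak, and likelihood-weighted percentile bounds over the behavioural runs.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-in for the TOPMODEL / routing / LISFLOOD-FP chain: predicted
# peak discharge as a function of a single uncertain parameter theta.
def model(theta):
    return 100.0 * theta

observed_peak = 250.0  # m3/s, hypothetical post-event estimate

# GLUE sketch: sample parameter sets, score each run with an informal
# likelihood, reject non-behavioural runs, and weight the rest.
thetas = rng.uniform(1.0, 4.0, 5000)
sims = model(thetas)
likelihood = np.exp(-0.5 * ((sims - observed_peak) / 50.0) ** 2)

behavioural = likelihood > 0.1  # assumed rejection threshold
w = likelihood[behavioural]
w /= w.sum()

# Likelihood-weighted 5th/95th percentile prediction bounds.
order = np.argsort(sims[behavioural])
s = sims[behavioural][order]
cw = np.cumsum(w[order])
lower, upper = s[np.searchsorted(cw, 0.05)], s[np.searchsorted(cw, 0.95)]
print(lower < observed_peak < upper)  # observation falls inside the bounds
```

In the study itself, the behavioural runs are those consistent with the post-event peak, timing, and high-water-mark data within their uncertainty bounds, and the weighted ensemble yields the inundation likelihood map.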

  3. Charm dimuon production in neutrino-nucleon interactions in the NOMAD experiment

    NASA Astrophysics Data System (ADS)

    Petti, Roberto; Samoylov, Oleg

    2012-09-01

    We present our new measurement of charm dimuon production in neutrino-iron interactions based upon the full statistics collected by the NOMAD experiment. After background subtraction we observe 15,340 charm dimuon events, providing the largest sample currently available. The analysis exploits the large inclusive charged current sample (about 9 million events after all analysis cuts) to constrain the total systematic uncertainty to about 2%. The extraction of strange sea and charm production parameters is also discussed.

  4. Charm dimuon production in neutrino-nucleon interactions in the NOMAD experiment

    NASA Astrophysics Data System (ADS)

    Petti, R.; Samoylov, O. B.

    2011-12-01

    We present our new measurement of charm dimuon production in neutrino-iron interactions based upon the full statistics collected by the NOMAD experiment. After background subtraction we observe 15,340 charm dimuon events, providing the largest sample currently available. The analysis exploits the large inclusive charged current sample (about 9 million events after all analysis cuts) to constrain the total systematic uncertainty to ˜2%. The extraction of strange sea and charm production parameters is also discussed.

  5. Extreme geomagnetic storms: Probabilistic forecasts and their uncertainties

    USGS Publications Warehouse

    Riley, Pete; Love, Jeffrey J.

    2017-01-01

    Extreme space weather events are low-frequency, high-risk phenomena. Estimating their rates of occurrence, as well as their associated uncertainties, is difficult. In this study, we derive statistical estimates and uncertainties for the occurrence rate of an extreme geomagnetic storm on the scale of the Carrington event (or worse) occurring within the next decade. We model the distribution of events as either a power law or lognormal distribution and use (1) the Kolmogorov-Smirnov statistic to estimate goodness of fit, (2) bootstrapping to quantify the uncertainty in the estimates, and (3) likelihood ratio tests to assess whether one distribution is preferred over another. Our best estimate for the probability of another extreme geomagnetic event comparable to the Carrington event occurring within the next 10 years is 10.3% (95% confidence interval (CI) [0.9, 18.7]) for a power law distribution, but only 3.0% (95% CI [0.6, 9.0]) for a lognormal distribution. However, our results depend crucially on (1) how we define an extreme event, (2) the statistical model used to describe how the events are distributed in intensity, (3) the techniques used to infer the model parameters, and (4) the data and duration used for the analysis. We test a major assumption that the data represent time-stationary processes and discuss the implications. If the current trends persist, suggesting that we are entering a period of lower activity, our forecasts may represent upper limits rather than best estimates.
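A back-of-the-envelope check of the decade probability, assuming storm occurrences follow a homogeneous Poisson process (an assumption, and the annual rate below is simply back-computed from the study's 10.3 % best estimate rather than taken from the paper):

```python
import math

# Hypothetical annual rate of a Carrington-scale storm; roughly one
# event per ~92 years reproduces the study's power-law best estimate.
annual_rate = 1.0 / 92.0

# Under a Poisson process, the probability of at least one event in the
# next T years is 1 - exp(-rate * T).
T = 10.0
p_decade = 1.0 - math.exp(-annual_rate * T)
print(round(p_decade * 100, 1))  # ≈ 10.3 (percent)
```

This shows how a fitted occurrence rate translates into the headline decade probability; the paper's bootstrap and likelihood-ratio machinery then quantifies how uncertain that rate itself is.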

  6. Predictive uncertainty in auditory sequence processing

    PubMed Central

    Hansen, Niels Chr.; Pearce, Marcus T.

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty—a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
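The entropy measure used here is the standard Shannon entropy of a discrete expectedness distribution; a minimal sketch (with illustrative, hypothetical distributions):

```python
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy (bits) of a discrete probability distribution,
    used as a measure of predictive uncertainty: higher entropy means a
    flatter, less predictable distribution over continuations."""
    p = np.asarray(probs, dtype=float)
    p = p / p.sum()   # normalise
    p = p[p > 0]      # zero-probability outcomes contribute nothing
    return float(-np.sum(p * np.log2(p)))

# A peaked expectedness distribution (one continuation dominates) has
# low entropy; a flat distribution over the 12 chromatic pitches has
# the maximum possible entropy, log2(12) ≈ 3.58 bits.
peaked = [0.8, 0.05, 0.05, 0.05, 0.05]
flat = [1 / 12] * 12
print(round(shannon_entropy(peaked), 2), round(shannon_entropy(flat), 2))
```

The study's "inferred uncertainty" is exactly this quantity computed over the expectedness ratings collected in the probe-tone paradigm.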

  7. Object-Based Land Use Classification of Agricultural Land by Coupling Multi-Temporal Spectral Characteristics and Phenological Events in Germany

    NASA Astrophysics Data System (ADS)

    Knoefel, Patrick; Loew, Fabian; Conrad, Christopher

    2015-04-01

    Crop maps based on classification of remotely sensed data are of increasing importance in agricultural management, which calls for more detailed knowledge about the reliability of such spatial information. However, classification of agricultural land use is often limited by the high spectral similarity of the studied crop types. Moreover, spatially and temporally varying agro-ecological conditions can introduce confusion in crop mapping, and classification errors in crop maps may in turn influence model outputs, such as agricultural production monitoring. One major goal of the PhenoS project ("Phenological structuring to determine optimal acquisition dates for Sentinel-2 data for field crop classification") is the detection of optimal phenological time windows for land cover classification purposes. Since many crop species are spectrally highly similar, accurate classification requires the right selection of satellite images for a certain classification task. In the course of one growing season, there are phenological phases in which crops are separable with higher accuracy. For this purpose, coupling multi-temporal spectral characteristics and phenological events is promising. The focus of this study is set on the separation of spectrally similar cereal crops like winter wheat, barley, and rye at two test sites in Germany called "Harz/Central German Lowland" and "Demmin". This study uses object-based random forest (RF) classification to investigate the impact of image acquisition frequency and timing on crop classification uncertainty by permuting all possible combinations of available RapidEye time series recorded at the test sites between 2010 and 2014. The permutations were applied to different segmentation parameters. Classification uncertainty was then assessed and analysed, based on the probabilistic soft output from the RF algorithm, on a per-field basis.
From this soft output, entropy was calculated as a spatial measure of classification uncertainty. The results indicate that uncertainty estimates provide a valuable addition to traditional accuracy assessments and help the user to allocate error in crop maps.
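The per-field entropy computation can be sketched as follows, using a hypothetical soft-output matrix in place of real RF class-membership probabilities:

```python
import numpy as np

# Hypothetical per-field soft output from a random forest: each row is
# the class-membership probability over three cereal classes (e.g.
# wheat, barley, rye) for one field object.
proba = np.array([
    [0.90, 0.05, 0.05],  # confidently classified field
    [0.40, 0.35, 0.25],  # spectrally ambiguous field
    [0.34, 0.33, 0.33],  # nearly uninformative soft output
])

# Shannon entropy of each row as a per-field uncertainty measure:
# 0 bits for a certain assignment, up to log2(3) ≈ 1.58 bits when the
# three classes are equally likely.
p = np.clip(proba, 1e-12, 1.0)  # guard against log(0)
entropy = -np.sum(p * np.log2(p), axis=1)

print(np.round(entropy, 2))  # one uncertainty value per field
```

Mapping this value per field object yields exactly the kind of spatial uncertainty layer the study proposes alongside the crop map.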

  8. Quantifications of Geomagnetic Storm Impact on TEC and NmF2 during 2013 Mar. event

    NASA Astrophysics Data System (ADS)

    Shim, J. S.; Tsagouri, I.; Goncharenko, L. P.; Mays, M. L.; Taktakishvili, A.; Rastaetter, L.; Kuznetsova, M. M.

    2016-12-01

    We investigate the ionospheric response to the March 2013 geomagnetic storm event using GPS TEC, ISR, and ionosonde observations in the North American sector. In order to quantify the variations of TEC and NmF2 (or foF2) due to the storm, we remove the background quiet-time values (e.g., the TEC of one day prior to the storm, and the NmF2 median and average of the five quietest days in the 30 days prior to the storm). In addition, to assess the capability of models to reproduce storm impacts on TEC and NmF2, we compare the observations with various model simulations obtained from empirical, physics-based, and data assimilation models. Further, we investigate how uncertainty in the interplanetary magnetic field (IMF) impacts TEC and NmF2 during the geomagnetic storm event. For this uncertainty study, we use a physics-based coupled ionosphere-thermosphere model, CTIPe, and solar wind parameters obtained from an ensemble of WSA-ENLIL+Cone model simulations. This study has been supported by the Community Coordinated Modeling Center (CCMC) at the Goddard Space Flight Center. Model outputs and observational data used for the study will be permanently posted at the CCMC website (http://ccmc.gsfc.nasa.gov) for the space science communities to use.

  9. Search for electroweak single top quark production with cdf in proton - anti-proton collisions at √s = 1.96-TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, Thorsten

    2005-06-17

    In this thesis two searches for electroweak single top quark production with the CDF experiment are presented: a cut-based search and an iterated discriminant analysis. Both searches find no significant evidence for electroweak single top production using a data set corresponding to an integrated luminosity of 162 pb^-1 collected with CDF. Therefore, limits on s- and t-channel single top production are determined using a likelihood technique. For the cut-based search, a likelihood function based on lepton charge times pseudorapidity of the non-bottom jet was used if exactly one bottom jet was identified in the event. In the case of two identified bottom jets, a likelihood function based on the total number of observed events was used. The systematic uncertainties have been treated in a Bayesian approach; all sources of systematic uncertainty have been integrated out. An improved signal modeling using the MadEvent Monte Carlo program matched to NLO calculations has been used. The obtained limits for the s- and t-channel single top production cross sections are 13.6 pb and 10.1 pb, respectively. To date, these are the most stringent published limits for the s- and t-channel single top quark production modes.

  10. Quasi-continuous stochastic simulation framework for flood modelling

    NASA Astrophysics Data System (ADS)

    Moustakis, Yiannis; Kossieris, Panagiotis; Tsoukalas, Ioannis; Efstratiadis, Andreas

    2017-04-01

    Typically, flood modelling in the context of everyday engineering practice is addressed through event-based deterministic tools, e.g., the well-known SCS-CN method. A major shortcoming of such approaches is the neglect of uncertainty, which is associated with the variability of soil moisture conditions and the variability of rainfall during the storm event. In event-based modelling, the sole expression of uncertainty is the return period of the design storm, which is assumed to represent the acceptable risk of all output quantities (flood volume, peak discharge, etc.). On the other hand, the varying antecedent soil moisture conditions across the basin are represented by means of scenarios (e.g., the three AMC types of the SCS), while the temporal distribution of rainfall is represented through standard deterministic patterns (e.g., the alternating blocks method). In order to address these major inconsistencies, while preserving the simplicity and parsimony of the SCS-CN method, we have developed a quasi-continuous stochastic simulation approach comprising the following steps: (1) generation of synthetic daily rainfall time series; (2) update of the potential maximum soil moisture retention, on the basis of accumulated five-day rainfall; (3) estimation of daily runoff through the SCS-CN formula, using as inputs the daily rainfall and the updated value of soil moisture retention; (4) selection of extreme events and application of the standard SCS-CN procedure for each specific event, on the basis of the synthetic rainfall. This scheme requires the use of two stochastic modelling components, namely the CastaliaR model, for the generation of synthetic daily data, and the HyetosMinute model, for the disaggregation of daily rainfall to finer temporal scales. Outcomes of this approach are a large number of synthetic flood events, allowing the design variables to be expressed in statistical terms and the flood risk to be properly evaluated.
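
Steps (2) and (3) of the scheme can be sketched with the standard SCS-CN relations. The AMC thresholds and curve-number conversion formulas below are the conventional textbook ones, used here for illustration only; they are not necessarily the exact variant adopted in the study.

```python
def scs_cn_runoff(p_mm, cn):
    """Step (3): daily runoff depth (mm) from the standard SCS-CN formula,
    with initial abstraction Ia = 0.2 S."""
    s = 25400.0 / cn - 254.0              # potential maximum retention (mm)
    ia = 0.2 * s                          # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

def adjust_cn(cn2, p5_mm, dry=35.6, wet=53.3):
    """Step (2): shift the average-condition curve number CN(II) to CN(I)
    or CN(III) according to accumulated five-day antecedent rainfall
    p5_mm. Thresholds are the conventional growing-season AMC limits (mm)."""
    if p5_mm < dry:                       # AMC I (dry antecedent conditions)
        return 4.2 * cn2 / (10.0 - 0.058 * cn2)
    if p5_mm > wet:                       # AMC III (wet antecedent conditions)
        return 23.0 * cn2 / (10.0 + 0.13 * cn2)
    return cn2                            # AMC II (average)

# Same 50 mm storm, drier vs wetter antecedent five-day rainfall:
q_dry = scs_cn_runoff(50.0, adjust_cn(75.0, 10.0))
q_wet = scs_cn_runoff(50.0, adjust_cn(75.0, 60.0))
print(q_dry < q_wet)  # True: wetter antecedent conditions yield more runoff
```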

  11. New Insights into the Estimation of Extreme Geomagnetic Storm Occurrences

    NASA Astrophysics Data System (ADS)

    Ruffenach, Alexis; Winter, Hugo; Lavraud, Benoit; Bernardara, Pietro

    2017-04-01

    Space weather events such as intense geomagnetic storms are major disturbances of the near-Earth environment that may lead to serious impacts on our modern society. As such, it is of great importance to estimate their probability, and in particular that of extreme events. One approach widely used in the statistical sciences for estimating the probability of extreme events is Extreme Value Analysis (EVA). Using this rigorous statistical framework, estimates of the occurrence of extreme geomagnetic storms are derived here from the most relevant global parameters related to geomagnetic storms, such as ground parameters (e.g., the geomagnetic Dst and aa indices) and space parameters related to the characteristics of Coronal Mass Ejections (CMEs) (velocity, southward magnetic field component, electric field). Using our fitted model, we estimate the annual probability of a Carrington-type event (Dst = -850 nT) to be on the order of 10^-3, with a lower limit of the uncertainties on the return period of ~500 years. Our estimate is significantly higher than those of most past studies, which typically found a return period of a few hundred years at most; precautions are therefore required when extrapolating to intense values. Currently, the complexity of the processes and the length of the available data inevitably lead to significant uncertainties in return period estimates for the occurrence of extreme geomagnetic storms. However, our application of extreme value models for extrapolating into the tail of the distribution provides a mathematically justified framework for the estimation of extreme return periods, thereby enabling more accurate estimates and reduced associated uncertainties.
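
The quoted numbers can be related through the elementary correspondence between annual exceedance probability and return period (assuming Poisson occurrences). This is a back-of-the-envelope sketch, not the paper's fitted EVA model.

```python
import math

def return_period_years(annual_exceedance_prob):
    """Return period T = 1 / p for annual exceedance probability p."""
    return 1.0 / annual_exceedance_prob

def annual_exceedance_prob(rate_per_year):
    """Annual probability of at least one exceedance for Poisson-occurring
    events with the given mean rate: p = 1 - exp(-rate)."""
    return 1.0 - math.exp(-rate_per_year)

# The fitted annual probability of ~1e-3 for a Carrington-type storm
# (Dst = -850 nT) corresponds to a ~1000-year return period; the stated
# lower uncertainty limit of ~500 years corresponds to p ~ 2e-3.
print(return_period_years(1e-3))  # 1000.0
```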

  12. PubMed Central

    Capanna, A.; Gervasi, G.; Terracciano, E.; Zaratti, L.

    2017-01-01

    Summary Routine mass immunization programs have contributed greatly to the control of infectious diseases and to the improvement of the health of populations. Over the last decades, the rise of antivaccination movements has threatened the advances made in this field to the point that vaccination coverage rates have decreased and outbreaks of vaccine-preventable diseases have resurfaced. One of the critical points of the immunization debate revolves around the level of risk attributable to vaccination, namely the possibility of experiencing serious and possibly irreversible adverse events. Unfortunately, the knowledge about adverse events, especially rare ones, is usually incomplete at best and the attribution of a causal relationship with vaccinations is subject to significant uncertainties. The aim of this paper is to provide a narrative review of seven rare or very rare adverse events: hypotonic hyporesponsive episode, multiple sclerosis, apnea in preterm newborns, Guillain-Barré syndrome, vasculitides, arthritis/arthralgia, immune thrombocytopenic purpura. We have selected these adverse events based on our experience of questions asked by health care workers involved in vaccination services. Information on the chosen adverse events was retrieved from Medline using appropriate search terms. The review is in the form of questions and answers for each adverse event, with a view to providing useful and actionable concepts while not ignoring the uncertainties that remain. We also highlight in the conclusion possible future improvements to adverse event detection and assessment that could help identify individuals at higher risk against the probable future backdrop of ever-greater abandonment of compulsory vaccination policies. PMID:28515627

  13. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.
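
The system 1 conjunction mechanism, a primitive average of component degrees of belief, can be illustrated in a few lines; the numbers below are hypothetical, chosen to show the predicted violation of the joint probability distribution.

```python
def primitive_average(p_a, p_b):
    """System 1's conjunction estimate in the theory: a primitive
    average of the two component degrees of belief."""
    return 0.5 * (p_a + p_b)

p_a, p_b = 0.8, 0.4
p_and = primitive_average(p_a, p_b)
# The estimate (0.6) exceeds min(P(A), P(B)) = 0.4, violating the
# constraint P(A and B) <= min(P(A), P(B)) -- the kind of systematic
# violation of the joint probability distribution the theory predicts.
print(p_and > min(p_a, p_b))  # True
```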

  14. A Measurement of the proton-proton inelastic scattering cross-section at center of mass energy = 7 TeV with the ATLAS detector at the LHC

    NASA Astrophysics Data System (ADS)

    Tompkins, Lauren Alexandra

    The first measurement of the inelastic cross-section for proton-proton collisions at a center of mass energy of 7 TeV using the ATLAS detector at the Large Hadron Collider is presented. From a dataset corresponding to an integrated luminosity of 20 inverse microbarns, events are selected by requiring activity in scintillation counters mounted in the forward region of the ATLAS detector. An inelastic cross-section of 60.1 +/- 2.1 millibarns is measured for the subset of events visible to the scintillation counters. The uncertainty includes the statistical and systematic uncertainty on the measurement. The visible events satisfy xi > 5 x 10^-6, where xi = M_X^2/s is calculated from the invariant mass, M_X, of hadrons selected using the largest rapidity gap in the event. For diffractive events this corresponds to requiring at least one of the dissociation masses to be larger than 15.7 GeV. Using an extrapolation dependent on the model for the differential diffractive mass distribution, an inelastic cross-section of 69.1 +/- 2.4 (exp) +/- 6.9 (extr) millibarns is determined, where (exp) indicates the experimental uncertainties and (extr) indicates the uncertainty due to the extrapolation from the limited xi-range to the full inelastic cross-section.
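
The quoted mass threshold follows directly from the xi cut; this is a minimal check using only numbers stated in the abstract.

```python
import math

sqrt_s = 7000.0                # GeV, centre-of-mass energy
xi_min = 5e-6                  # selection cut on xi = M_X^2 / s

# Minimum diffractive mass visible to the selection:
m_x_min = math.sqrt(xi_min) * sqrt_s
print(round(m_x_min, 1))       # 15.7 (GeV), as quoted in the abstract
```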

  15. Quantifying and Qualifying USGS ShakeMap Uncertainty

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-Wan; Quitoriano, Vincent

    2008-01-01

    We describe algorithms for quantifying and qualifying uncertainties associated with USGS ShakeMap ground motions. The uncertainty values computed consist of latitude/longitude grid-based multiplicative factors that scale the standard deviation associated with the ground motion prediction equation (GMPE) used within the ShakeMap algorithm for estimating ground motions. The resulting grid-based 'uncertainty map' is essential for evaluation of losses derived using ShakeMaps as the hazard input. For ShakeMap, ground motion uncertainty at any point is dominated by two main factors: (i) the influence of any proximal ground motion observations, and (ii) the uncertainty of estimating ground motions from the GMPE, most notably, elevated uncertainty due to initial, unconstrained source rupture geometry. The uncertainty is highest for larger magnitude earthquakes when source finiteness is not yet constrained and, hence, the distance to rupture is also uncertain. In addition to a spatially dependent, quantitative assessment, many users may prefer a simple, qualitative grading for the entire ShakeMap. We developed a grading scale that allows one to quickly gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of the post-earthquake decision-making process or for qualitative assessments of archived or historical earthquake ShakeMaps. We describe an uncertainty letter grading ('A' through 'F', for high to poor quality, respectively) based on the uncertainty map. A middle-range ('C') grade corresponds to a ShakeMap for a moderate-magnitude earthquake suitably represented with a point-source location. Lower grades 'D' and 'F' are assigned for larger events (M>6) where finite-source dimensions are not yet constrained. The addition of ground motion observations (or observed macroseismic intensities) reduces uncertainties over data-constrained portions of the map. 
Higher grades ('A' and 'B') correspond to ShakeMaps with constrained fault dimensions and numerous stations, depending on the density of station/data coverage. Due to these dependencies, the letter grade can change with subsequent ShakeMap revisions if more data are added or when finite-faulting dimensions are added. We emphasize that the greatest uncertainties are associated with unconstrained source dimensions for large earthquakes where the distance term in the GMPE is most uncertain; this uncertainty thus scales with magnitude (and consequently rupture dimension). Since this distance uncertainty produces potentially large uncertainties in ShakeMap ground-motion estimates, this factor dominates over compensating constraints for all but the most dense station distributions.
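
A letter-grading function of this kind might look like the sketch below. The A-F scale follows the paper, but the numeric cut-offs on the mean uncertainty factor are invented placeholders, not the published USGS thresholds.

```python
def shakemap_grade(mean_uncertainty_factor):
    """Map the mean multiplicative uncertainty factor of a ShakeMap to a
    letter grade. The cut-offs below are invented placeholders for
    illustration; they are NOT the published USGS thresholds."""
    cutoffs = (("A", 0.80), ("B", 0.95), ("C", 1.05), ("D", 1.25), ("E", 1.60))
    for grade, cutoff in cutoffs:
        if mean_uncertainty_factor <= cutoff:
            return grade
    return "F"

# Dense station coverage shrinks the uncertainty factor -> better grade;
# an unconstrained large-event source inflates it -> worse grade.
print(shakemap_grade(0.75), shakemap_grade(1.8))  # A F
```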

  16. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach may provide important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take all uncertainties into consideration, and it cannot be used to assess risk quantitatively because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard) for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. 
The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
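
The event-tree idea can be illustrated by multiplying conditional probabilities along one branch. The node names and values below are purely illustrative and do not reproduce the actual BET_VH_ST structure or its assessed probabilities.

```python
import math

# One branch of a simplified event tree (illustrative node names and
# probabilities only; not the actual BET_VH_ST structure or values):
branch = {
    "unrest_is_magmatic": 0.6,
    "eruption_given_magmatic": 0.3,
    "vent_opens_in_sector": 0.2,
    "tephra_load_exceeds_threshold": 0.5,
}

# The short-term exceedance probability for this branch is the product
# of the conditional probabilities down the tree.
p_exceedance = math.prod(branch.values())
print(round(p_exceedance, 3))  # 0.018
```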

  17. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.

  18. Efficient spatial privacy preserving scheme for sensor network

    NASA Astrophysics Data System (ADS)

    Debnath, Ashmita; Singaravelu, Pradheepkumar; Verma, Shekhar

    2013-03-01

    The privacy of sensitive events observed by a wireless sensor network (WSN) needs to be protected. Adversaries with knowledge of the sensor deployment and network protocols can infer the location of a sensed event by monitoring the communication from the sensors, even when the messages are encrypted. Encryption provides confidentiality; however, the context of the event can be used to breach the privacy of sensed objects. An adversary can track the trajectory of a moving object or determine the location of the occurrence of a critical event to breach its privacy. In this paper, we propose ring signatures to obfuscate the spatial information. First, the extended region of location of an event of interest, as estimated from sensor communication, is presented. Then, the increase in this region of spatial uncertainty due to the effect of the ring signature is determined. We observe that ring signatures can effectively enhance the region of location uncertainty of a sensed event. As the event of interest can be situated anywhere in the enhanced region of uncertainty, its privacy against a local or global adversary is ensured. Both analytical and simulation results show that the induced delay and throughput overhead are insignificant, with negligible impact on the performance of a WSN.

  19. Characterize kinematic rupture history of large earthquakes with Multiple Haskell sources

    NASA Astrophysics Data System (ADS)

    Jia, Z.; Zhan, Z.

    2017-12-01

    Earthquakes are often regarded as continuous rupture along a single fault, but the occurrence of complex large events involving multiple faults and dynamic triggering challenges this view. Such rupture complexities cause difficulties for existing finite fault inversion algorithms, because they rely on specific parameterizations and regularizations to obtain physically meaningful solutions. Furthermore, it is difficult to assess the reliability and uncertainty of the obtained rupture models. Here we develop a Multi-Haskell Source (MHS) method to estimate the rupture process of large earthquakes as a series of sub-events of varying location, timing, and directivity. Each sub-event is characterized by a Haskell rupture model with uniform dislocation and constant unilateral rupture velocity. This flexible yet simple source parameterization allows us to constrain the first-order rupture complexity of large earthquakes robustly. Additionally, the relatively small number of parameters in the inverse problem allows improved uncertainty analysis based on Markov chain Monte Carlo sampling in a Bayesian framework. Synthetic tests and applications of the MHS method to real earthquakes show that our method can capture the major features of large earthquake rupture processes and provide information for more detailed rupture history analysis.

  20. Enterprise Systems Analysis

    DTIC Science & Technology

    2016-03-14

    flows, or continuous state changes, with feedback loops and lags modeled in the flow system. Agent based simulations operate using a discrete event...DeLand, S. M., Rutherford, B. M., Diegert, K. V., & Alvin, K. F. (2002). Error and uncertainty in modeling and simulation. Reliability Engineering...intrinsic complexity of the underlying social systems fundamentally limits the ability to make

  1. Drought Persistence Errors in Global Climate Models

    NASA Astrophysics Data System (ADS)

    Moon, H.; Gudmundsson, L.; Seneviratne, S. I.

    2018-04-01

    The persistence of drought events largely determines the severity of socioeconomic and ecological impacts, but the capability of current global climate models (GCMs) to simulate such events is subject to large uncertainties. In this study, the representation of drought persistence in GCMs is assessed by comparing state-of-the-art GCM simulations to observation-based data sets. To do so, we consider dry-to-dry transition probabilities at monthly and annual scales as estimates of drought persistence, where a dry status is defined as a negative precipitation anomaly. Though there is a substantial spread in the drought persistence bias, most of the simulations show systematic underestimation of drought persistence at the global scale. Subsequently, we analyzed to what degree (i) inaccurate observations, (ii) differences among models, (iii) internal climate variability, and (iv) uncertainty of the employed statistical methods contribute to the spread in drought persistence errors, using an analysis of variance approach. The results show that at the monthly scale, model uncertainty and observational uncertainty dominate, while the contribution from internal variability is small in most cases. At the annual scale, the spread of the drought persistence error is dominated by the statistical estimation error of drought persistence, indicating that the partitioning of the error is impaired by the limited number of considered time steps. These findings reveal systematic errors in the representation of drought persistence in current GCMs and suggest directions for further model improvement.
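
The persistence estimator, a dry-to-dry transition probability with "dry" defined as a negative precipitation anomaly, can be sketched as follows; a single scalar climatological mean is assumed here for simplicity.

```python
import numpy as np

def dry_to_dry_probability(precip, climatological_mean):
    """Drought persistence as the probability of a dry-to-dry transition,
    where "dry" means a negative precipitation anomaly (precipitation
    below the climatological mean; a scalar mean is assumed here)."""
    dry = np.asarray(precip, dtype=float) < climatological_mean
    from_dry = dry[:-1]
    if from_dry.sum() == 0:
        return float("nan")
    return float((dry[1:] & from_dry).sum() / from_dry.sum())

# Toy monthly series around a climatological mean of 3.0:
p = dry_to_dry_probability([1.0, 1.0, 5.0, 1.0, 1.0, 1.0], 3.0)
print(p)  # 0.75
```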

  2. Paleoseismic event dating and the conditional probability of large earthquakes on the southern San Andreas fault, California

    USGS Publications Warehouse

    Biasi, G.P.; Weldon, R.J.; Fumal, T.E.; Seitz, G.G.

    2002-01-01

    We introduce a quantitative approach to paleoearthquake dating and apply it to paleoseismic data from the Wrightwood and Pallett Creek sites on the southern San Andreas fault. We illustrate how stratigraphic ordering, sedimentological, and historical data can be used quantitatively in the process of estimating earthquake ages. Calibrated radiocarbon age distributions are used directly from layer dating through recurrence intervals and recurrence probability estimation. The method does not eliminate subjective judgements in event dating, but it does provide a means of systematically and objectively approaching the dating process. Date distributions for the most recent 14 events at Wrightwood are based on sample and contextual evidence in Fumal et al. (2002) and site context and slip history in Weldon et al. (2002). Pallett Creek event and dating descriptions are from published sources. For the five most recent events at Wrightwood, our results are consistent with previously published estimates, with generally comparable or narrower uncertainties. For Pallett Creek, our earthquake date estimates generally overlap with previous results but typically have broader uncertainties. Some event date estimates are very sensitive to details of data interpretation. The historical earthquake in 1857 ruptured the ground at both sites but is not constrained by radiocarbon data. Radiocarbon ages, peat accumulation rates, and historical constraints at Pallett Creek for event X yield a date estimate in the earliest 1800s and preclude a date in the late 1600s. This event is almost certainly the historical 1812 earthquake, as previously concluded by Sieh et al. (1989). This earthquake also produced ground deformation at Wrightwood. All events at Pallett Creek, except for event T, about A.D. 1360, and possibly event I, about A.D. 960, have corresponding events at Wrightwood with some overlap in age ranges. 
Event T falls during a period of low sedimentation at Wrightwood when conditions were not favorable for recording earthquake evidence. Previously proposed correlations of Pallett Creek X with Wrightwood W3 in the 1690s and Pallett Creek event V with W5 around 1480 (Fumal et al., 1993) appear unlikely after our dating reevaluation. Apparent internal inconsistencies among event, layer, and dating relationships around events R and V identify them as candidates for further investigation at the site. Conditional probabilities of earthquake recurrence were estimated using Poisson, lognormal, and empirical models. The presence of 12 or 13 events at Wrightwood during the same interval that 10 events are reported at Pallett Creek is reflected in mean recurrence intervals of 105 and 135 years, respectively. Average Poisson model 30-year conditional probabilities are about 20% at Pallett Creek and 25% at Wrightwood. The lognormal model conditional probabilities are somewhat higher, about 25% for Pallett Creek and 34% for Wrightwood. Lognormal variance σ_ln estimates of 0.76 and 0.70, respectively, imply only weak time predictability. Conditional probabilities of 29% and 46%, respectively, were estimated for an empirical distribution derived from the data alone. Conditional probability uncertainties are dominated by the brevity of the event series; dating uncertainty contributes only secondarily. Wrightwood and Pallett Creek event chronologies both suggest variations in recurrence interval with time, hinting that some form of recurrence rate modulation may be at work, but formal testing shows that neither series is more ordered than might be produced by a Poisson process.
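
The Poisson conditional probabilities quoted above can be reproduced from the mean recurrence intervals alone:

```python
import math

def poisson_conditional_probability(mean_recurrence_yr, window_yr=30.0):
    """Conditional probability of at least one event in the next window
    under a Poisson model with the given mean recurrence interval:
    P = 1 - exp(-window / mean_recurrence)."""
    return 1.0 - math.exp(-window_yr / mean_recurrence_yr)

# Mean recurrence intervals from the abstract:
print(round(poisson_conditional_probability(135.0), 2))  # 0.2  (Pallett Creek)
print(round(poisson_conditional_probability(105.0), 2))  # 0.25 (Wrightwood)
```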

  3. Reducing uncertainty in Climate Response Time Scale by Bayesian Analysis of the 8.2 ka event

    NASA Astrophysics Data System (ADS)

    Lorenz, A.; Held, H.; Bauer, E.; Schneider von Deimling, T.

    2009-04-01

    We analyze the possibility of uncertainty reduction in the climate response time scale by utilizing Greenland ice-core data that contain the 8.2 ka event, within a Bayesian model-data intercomparison with the Earth system model of intermediate complexity CLIMBER-2.3. Within a stochastic version of the model it has been possible to mimic the 8.2 ka event within a plausible experimental setting and with relatively good accuracy considering the timing of the event in comparison to other modeling exercises [1]. The simulation of the centennial cold event is effectively determined by the oceanic cooling rate, which depends largely on the ocean diffusivity, described by diffusion coefficients with relatively wide uncertainty ranges. The idea is to discriminate between the different values of the diffusivities according to their likelihood of correctly representing the duration of the 8.2 ka event, and thus to exploit the paleo data to constrain uncertainty in model parameters in analogy to [2]. In implementing this inverse Bayesian analysis with this model, the technical difficulty arises of establishing the related likelihood numerically in addition to the uncertain model parameters: while mainstream uncertainty analyses can assume a quasi-Gaussian shape of the likelihood, with weather fluctuating around a long-term mean, the 8.2 ka event as a highly nonlinear effect precludes such an a priori assumption. As a result of this study [3], the Bayesian analysis showed a reduction of uncertainty in the vertical ocean diffusivity parameters by a factor of 2 compared to prior knowledge. This learning effect on the model parameters is propagated to other model outputs of interest, e.g., the inverse ocean heat capacity, which is important for the dominant time scale of the climate response to anthropogenic forcing and which, in combination with climate sensitivity, strongly influences the climate system's reaction in the near- and medium-term future.
References: [1] E. Bauer, A. Ganopolski, M. Montoya: Simulation of the cold climate event 8200 years ago by meltwater outburst from Lake Agassiz. Paleoceanography 19:PA3014 (2004). [2] T. Schneider von Deimling, H. Held, A. Ganopolski, S. Rahmstorf: Climate sensitivity estimated from ensemble simulations of glacial climates. Climate Dynamics 27, 149-163, DOI 10.1007/s00382-006-0126-8 (2006). [3] A. Lorenz, Diploma Thesis, U Potsdam (2007).

  4. Robust Adaptation? Assessing the sensitivity of safety margins in flood defences to uncertainty in future simulations - a case study from Ireland.

    NASA Astrophysics Data System (ADS)

    Murphy, Conor; Bastola, Satish; Sweeney, John

    2013-04-01

    Climate change impact and adaptation assessments have traditionally adopted a 'top-down', scenario-based approach, where information from different Global Climate Models (GCMs) and emission scenarios is employed to develop impacts-led adaptation strategies. Given the tradeoff between computational cost and the need to include a wide range of GCMs for a fuller characterization of uncertainties, scenarios are better used for sensitivity testing and adaptation options appraisal. One common approach to adaptation that has been described as robust is the use of safety margins. In this work, the sensitivity of the safety margins adopted by the agency responsible for flood risk management in Ireland to uncertainty in future projections is examined. The sensitivity of fluvial flood risk to climate change is assessed for four Irish catchments using a large number of GCMs (17) forced with three emissions scenarios (SRES A1B, A2, B1) as input to four hydrological models. Both uncertainty within and between hydrological models is assessed using the GLUE framework. Regionalisation is achieved using a change factor method to infer changes in the parameters of a weather generator from monthly GCM output, while flood frequency analysis is conducted using the method of probability weighted moments to fit the Generalised Extreme Value distribution to ~20,000 annual maxima series. The sensitivity of design margins to the uncertainty space considered is visualised using risk response surfaces. The hydrological sensitivity is measured as the percentage change in flood peak for specified recurrence intervals. 
Results indicate that there is considerable residual risk associated with allowances of +20% when uncertainties are accounted for, and that the risk of exceedance of design allowances is greatest for more extreme, low frequency events, with considerable implications for critical infrastructure, e.g., culverts, bridges, and flood defences, whose designs are normally associated with such return periods. Sensitivity results show that the impact of climate change is not as great for flood peaks with higher return periods. The average width of the uncertainty range and the size of the range for each catchment reveal that the uncertainties in low frequency events are greater than in high frequency events. In addition, the uncertainty interval, estimated as the average width of the uncertainty range of flow for the five return periods, grows wider with a decrease in the runoff coefficient and wetness index of each catchment, both of which tend to increase the nonlinearity in the rainfall response. A key management question that emerges is the acceptability of residual risk where high exposure of vulnerable populations and/or critical infrastructure coincides with high costs of additional capacity in safety margins.
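
The flood-frequency step can be sketched with SciPy's GEV distribution. Note that the sketch uses synthetic data and SciPy's maximum-likelihood fit in place of the probability-weighted-moments estimation used in the study.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Synthetic stand-in for a simulated annual-maximum flood series (the
# study fits ~20,000 annual maxima; the parameters here are arbitrary).
annual_maxima = genextreme.rvs(c=-0.1, loc=100.0, scale=30.0,
                               size=2000, random_state=rng)

# Maximum-likelihood fit (the study uses probability weighted moments):
c, loc, scale = genextreme.fit(annual_maxima)

def return_level(T_years):
    """Flood peak exceeded on average once every T years."""
    return genextreme.isf(1.0 / T_years, c, loc=loc, scale=scale)

print(return_level(100.0) > return_level(10.0))  # True: rarer -> larger
```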

  5. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.
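    The core bookkeeping of a PTHA like this can be illustrated with a toy hazard-curve calculation: each earthquake scenario carries an annual occurrence rate and a modelled run-up, the exceedance rate of a given run-up level is the summed rate of all scenarios reaching it, and epistemic uncertainty in earthquake recurrence maps directly into a band on the curve. The scenario rates and run-ups below are invented for illustration only.

```python
# Hypothetical scenario set: (annual occurrence rate, modelled run-up in metres).
scenarios = [(1 / 50, 1.0), (1 / 200, 3.0), (1 / 1000, 8.0), (1 / 5000, 15.0)]

def exceedance_rate(h, scenarios):
    """Annual rate at which tsunami run-up >= h is produced by any scenario."""
    return sum(rate for rate, runup in scenarios if runup >= h)

def hazard_curve(levels, scenarios, rate_factor=1.0):
    """Run-up exceedance rates at several levels; rate_factor stands in for an
    epistemic branch (e.g. a low/high alternative for large-earthquake rates)."""
    return [(h, rate_factor * exceedance_rate(h, scenarios)) for h in levels]

levels = [1.0, 3.0, 8.0, 15.0]
best = hazard_curve(levels, scenarios)                           # best estimate
band = [hazard_curve(levels, scenarios, f) for f in (0.5, 2.0)]  # epistemic band
return_period_8m = 1 / exceedance_rate(8.0, scenarios)           # in years
```

    A factor-of-two spread in large-earthquake rates translates directly into a factor-of-two band on the run-up exceedance rates, which is why the record stresses epistemic rate uncertainty as a dominant contribution.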

  6. Automated parton-shower variations in PYTHIA 8

    DOE PAGES

    Mrenna, S.; Skands, P.

    2016-10-03

    In the era of precision physics measurements at the LHC, efficient and exhaustive estimation of theoretical uncertainties plays an increasingly crucial role. In the context of Monte Carlo (MC) event generators, the estimation of such uncertainties traditionally requires independent MC runs for each variation, for a linear increase in total run time. In this work, we report on an automated evaluation of the dominant (renormalization-scale and nonsingular) perturbative uncertainties in the PYTHIA 8 event generator, with only a modest computational overhead. Each generated event is accompanied by a vector of alternative weights (one for each uncertainty variation), with each set separately preserving the total cross section. Explicit scale-compensating terms can be included, reflecting known coefficients of higher-order splitting terms and reducing the effect of the variations. The formalism also allows for the enhancement of rare partonic splittings, such as g→bb¯ and q→qγ, to obtain weighted samples enriched in these splittings while preserving the correct physical Sudakov factors.
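    The reweighting idea can be demonstrated with a toy emission loop. A single sample is generated with the nominal coupling; for each accepted or rejected trial emission the alternative weight is multiplied by the ratio of varied to nominal probabilities, so one run yields both predictions. This is a deliberately simplified caricature of the veto-algorithm reweighting, not PYTHIA code; the constant acceptance probabilities and trial count are assumptions of the sketch.

```python
import random

def shower_event(p_nom=0.12, p_var=0.15, n_trials=8):
    """Toy 'shower': n_trials candidate emissions, each accepted with
    probability p_nom. w_var reweights the event to coupling p_var."""
    w_var = 1.0
    n_emit = 0
    for _ in range(n_trials):
        if random.random() < p_nom:          # emission accepted
            n_emit += 1
            w_var *= p_var / p_nom           # ratio of accept probabilities
        else:                                # trial rejected (no-emission step)
            w_var *= (1 - p_var) / (1 - p_nom)
    return n_emit, w_var

# The mean alternative weight is 1 by construction, so the variation
# preserves the total cross section; the weighted emission multiplicity
# matches a dedicated run at p_var (expectation n_trials * p_var).
random.seed(7)
events = [shower_event() for _ in range(20000)]
mean_weight = sum(w for _, w in events) / len(events)
weighted_mult = sum(n * w for n, w in events) / len(events)
```

    The same bookkeeping, applied per variation, is what lets a vector of weights replace a set of independent generator runs.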

  7. Uncertainty in urban flood damage assessment due to urban drainage modelling and depth-damage curve estimation.

    PubMed

    Freni, G; La Loggia, G; Notaro, V

    2010-01-01

    Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases is in most cases overcome by combining the output of urban drainage models with damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become standard practice in hydraulic research and application. Flood damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancement, pushed forward by increasing computer capacity. The details of the flood propagation process on the surface and of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached for damage curves, for which improvement is highly dependent on data availability; this remains the main bottleneck in expected flood damage estimation. Such functions are usually affected by significant uncertainty, intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty in the construction of the depth-damage function with the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. The paper demonstrated that the use of detailed hydraulic models might not be justified, because of their higher computational cost and the significant uncertainty in the damage estimation curves: a large part of the total uncertainty derives from the depth-damage curves, so improving the estimation of these curves may provide better results, in terms of uncertainty reduction, than the adoption of detailed hydraulic models.
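    The comparison argued for here, with damage-curve uncertainty dominating hydraulic-model uncertainty, can be mimicked with a small Monte Carlo sketch. The curve shape and the two uncertainty magnitudes below are illustrative assumptions, not values from the study.

```python
import random
import statistics

def depth_damage(h):
    """Hypothetical depth-damage curve: damage fraction rising linearly
    with flood depth and saturating at total damage above 4 m."""
    return min(1.0, 0.25 * max(h, 0.0))

def simulate(n=20000, depth=1.0, sigma_depth=0.1, sigma_curve=0.3, seed=42):
    """Propagate both error sources to the damage estimate:
    sigma_depth  - relative uncertainty of the modelled flood depth,
    sigma_curve  - relative uncertainty of the damage curve itself."""
    random.seed(seed)
    damages = []
    for _ in range(n):
        h = random.gauss(depth, sigma_depth * depth)        # hydraulic model error
        f = depth_damage(h) * max(random.gauss(1.0, sigma_curve), 0.0)
        damages.append(f)
    return statistics.mean(damages), statistics.stdev(damages)

mean_d, sd_total = simulate()
_, sd_curve_only = simulate(sigma_depth=0.0)
_, sd_depth_only = simulate(sigma_curve=0.0)
# With these magnitudes the curve term dominates: refining the hydraulic
# model alone removes only a small part of the spread in estimated damage.
```

    Under these assumed magnitudes, shrinking sigma_depth (a more detailed hydraulic model) barely changes the total spread, which is the paper's argument in miniature.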

  8. Jet energy measurement and its systematic uncertainty in proton-proton collisions at √s = 7 TeV with the ATLAS detector

    NASA Astrophysics Data System (ADS)

    Aad, G.; Abajyan, T.; Abbott, B.; Abdallah, J.; Abdel Khalek, S.; Abdinov, O.; Aben, R.; Abi, B.; Abolins, M.; AbouZeid, O. S.; Abramowicz, H.; Abreu, H.; Abulaiti, Y.; Acharya, B. S.; Adamczyk, L.; Adams, D. L.; Addy, T. N.; Adelman, J.; Adomeit, S.; Adye, T.; Aefsky, S.; Agatonovic-Jovin, T.; Aguilar-Saavedra, J. A.; Agustoni, M.; Ahlen, S. P.; Ahmad, A.; Ahmadov, F.; Aielli, G.; Åkesson, T. P. A.; Akimoto, G.; Akimov, A. V.; Alam, M. A.; Albert, J.; Albrand, S.; Alconada Verzini, M. J.; Aleksa, M.; Aleksandrov, I. N.; Alessandria, F.; Alexa, C.; Alexander, G.; Alexandre, G.; Alexopoulos, T.; Alhroob, M.; Aliev, M.; Alimonti, G.; Alio, L.; Alison, J.; Allbrooke, B. M. M.; Allison, L. J.; Allport, P. P.; Allwood-Spiers, S. E.; Almond, J.; Aloisio, A.; Alon, R.; Alonso, A.; Alonso, F.; Altheimer, A.; Alvarez Gonzalez, B.; Alviggi, M. G.; Amako, K.; Amaral Coutinho, Y.; Amelung, C.; Ammosov, V. V.; Amor Dos Santos, S. P.; Amorim, A.; Amoroso, S.; Amram, N.; Amundsen, G.; Anastopoulos, C.; Ancu, L. S.; Andari, N.; Andeen, T.; Anders, C. F.; Anders, G.; Anderson, K. J.; Andreazza, A.; Andrei, V.; Anduaga, X. S.; Angelidakis, S.; Anger, P.; Angerami, A.; Anghinolfi, F.; Anisenkov, A. V.; Anjos, N.; Annovi, A.; Antonaki, A.; Antonelli, M.; Antonov, A.; Antos, J.; Anulli, F.; Aoki, M.; Aperio Bella, L.; Apolle, R.; Arabidze, G.; Aracena, I.; Arai, Y.; Arce, A. T. H.; Arfaoui, S.; Arguin, J.-F.; Argyropoulos, S.; Arik, E.; Arik, M.; Armbruster, A. J.; Arnaez, O.; Arnal, V.; Arslan, O.; Artamonov, A.; Artoni, G.; Asai, S.; Asbah, N.; Ask, S.; Åsman, B.; Asquith, L.; Assamagan, K.; Astalos, R.; Astbury, A.; Atkinson, M.; Atlay, N. B.; Auerbach, B.; Auge, E.; Augsten, K.; Aurousseau, M.; Avolio, G.; Azuelos, G.; Azuma, Y.; Baak, M. A.; Bacci, C.; Bach, A. M.; Bachacou, H.; Bachas, K.; Backes, M.; Backhaus, M.; Backus Mayes, J.; Badescu, E.; Bagiacchi, P.; Bagnaia, P.; Bai, Y.; Bailey, D. C.; Bain, T.; Baines, J. T.; Baker, O. 
K.; Baker, S.; Balek, P.; Balli, F.; Banas, E.; Banerjee, Sw.; Banfi, D.; Bangert, A.; Bansal, V.; Bansil, H. S.; Barak, L.; Baranov, S. P.; Barber, T.; Barberio, E. L.; Barberis, D.; Barbero, M.; Barillari, T.; Barisonzi, M.; Barklow, T.; Barlow, N.; Barnett, B. M.; Barnett, R. M.; Baroncelli, A.; Barone, G.; Barr, A. J.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Bartoldus, R.; Barton, A. E.; Bartos, P.; Bartsch, V.; Bassalat, A.; Basye, A.; Bates, R. L.; Batkova, L.; Batley, J. R.; Battistin, M.; Bauer, F.; Bawa, H. S.; Beau, T.; Beauchemin, P. H.; Beccherle, R.; Bechtle, P.; Beck, H. P.; Becker, K.; Becker, S.; Beckingham, M.; Beddall, A. J.; Beddall, A.; Bedikian, S.; Bednyakov, V. A.; Bee, C. P.; Beemster, L. J.; Beermann, T. A.; Begel, M.; Behr, K.; Belanger-Champagne, C.; Bell, P. J.; Bell, W. H.; Bella, G.; Bellagamba, L.; Bellerive, A.; Bellomo, M.; Belloni, A.; Beloborodova, O. L.; Belotskiy, K.; Beltramello, O.; Benary, O.; Benchekroun, D.; Bendtz, K.; Benekos, N.; Benhammou, Y.; Benhar Noccioli, E.; Benitez Garcia, J. A.; Benjamin, D. P.; Bensinger, J. R.; Benslama, K.; Bentvelsen, S.; Berge, D.; Bergeaas Kuutmann, E.; Berger, N.; Berghaus, F.; Berglund, E.; Beringer, J.; Bernard, C.; Bernat, P.; Bernhard, R.; Bernius, C.; Bernlochner, F. U.; Berry, T.; Berta, P.; Bertella, C.; Bertolucci, F.; Besana, M. I.; Besjes, G. J.; Bessidskaia, O.; Besson, N.; Bethke, S.; Bhimji, W.; Bianchi, R. M.; Bianchini, L.; Bianco, M.; Biebel, O.; Bieniek, S. P.; Bierwagen, K.; Biesiada, J.; Biglietti, M.; Bilbao De Mendizabal, J.; Bilokon, H.; Bindi, M.; Binet, S.; Bingul, A.; Bini, C.; Bittner, B.; Black, C. W.; Black, J. E.; Black, K. M.; Blackburn, D.; Blair, R. E.; Blanchard, J.-B.; Blazek, T.; Bloch, I.; Blocker, C.; Blocki, J.; Blum, W.; Blumenschein, U.; Bobbink, G. J.; Bobrovnikov, V. S.; Bocchetta, S. S.; Bocci, A.; Boddy, C. R.; Boehler, M.; Boek, J.; Boek, T. T.; Boelaert, N.; Bogaerts, J. A.; Bogdanchikov, A. 
G.; Bogouch, A.; Bohm, C.; Bohm, J.; Boisvert, V.; Bold, T.; Boldea, V.; Boldyrev, A. S.; Bolnet, N. M.; Bomben, M.; Bona, M.; Boonekamp, M.; Bordoni, S.; Borer, C.; Borisov, A.; Borissov, G.; Borri, M.; Borroni, S.; Bortfeldt, J.; Bortolotto, V.; Bos, K.; Boscherini, D.; Bosman, M.; Boterenbrood, H.; Bouchami, J.; Boudreau, J.; Bouhova-Thacker, E. V.; Boumediene, D.; Bourdarios, C.; Bousson, N.; Boutouil, S.; Boveia, A.; Boyd, J.; Boyko, I. R.; Bozovic-Jelisavcic, I.; Bracinik, J.; Branchini, P.; Brandt, A.; Brandt, G.; Brandt, O.; Bratzler, U.; Brau, B.; Brau, J. E.; Braun, H. M.; Brazzale, S. F.; Brelier, B.; Brendlinger, K.; Brenner, R.; Bressler, S.; Bristow, T. M.; Britton, D.; Brochu, F. M.; Brock, I.; Brock, R.; Broggi, F.; Bromberg, C.; Bronner, J.; Brooijmans, G.; Brooks, T.; Brooks, W. K.; Brosamer, J.; Brost, E.; Brown, G.; Brown, J.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruneliere, R.; Brunet, S.; Bruni, A.; Bruni, G.; Bruschi, M.; Bryngemark, L.; Buanes, T.; Buat, Q.; Bucci, F.; Buchholz, P.; Buckingham, R. M.; Buckley, A. G.; Buda, S. I.; Budagov, I. A.; Budick, B.; Buehrer, F.; Bugge, L.; Bugge, M. K.; Bulekov, O.; Bundock, A. C.; Bunse, M.; Burckhart, H.; Burdin, S.; Burgess, T.; Burghgrave, B.; Burke, S.; Burmeister, I.; Busato, E.; Büscher, V.; Bussey, P.; Buszello, C. P.; Butler, B.; Butler, J. M.; Butt, A. I.; Buttar, C. M.; Butterworth, J. M.; Buttinger, W.; Buzatu, A.; Byszewski, M.; Cabrera Urbán, S.; Caforio, D.; Cakir, O.; Calafiura, P.; Calderini, G.; Calfayan, P.; Calkins, R.; Caloba, L. P.; Caloi, R.; Calvet, D.; Calvet, S.; Camacho Toro, R.; Camarri, P.; Cameron, D.; Caminada, L. M.; Caminal Armadans, R.; Campana, S.; Campanelli, M.; Canale, V.; Canelli, F.; Canepa, A.; Cantero, J.; Cantrill, R.; Cao, T.; Capeans Garrido, M. D. M.; Caprini, I.; Caprini, M.; Capua, M.; Caputo, R.; Cardarelli, R.; Carli, T.; Carlino, G.; Carminati, L.; Caron, S.; Carquin, E.; Carrillo-Montoya, G. D.; Carter, A. A.; Carter, J. 
R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Caso, C.; Castaneda-Miranda, E.; Castelli, A.; Castillo Gimenez, V.; Castro, N. F.; Catastini, P.; Catinaccio, A.; Catmore, J. R.; Cattai, A.; Cattani, G.; Caughron, S.; Cavaliere, V.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerio, B.; Cerny, K.; Cerqueira, A. S.; Cerri, A.; Cerrito, L.; Cerutti, F.; Cervelli, A.; Cetin, S. A.; Chafaq, A.; Chakraborty, D.; Chalupkova, I.; Chan, K.; Chang, P.; Chapleau, B.; Chapman, J. D.; Charfeddine, D.; Charlton, D. G.; Chavda, V.; Chavez Barajas, C. A.; Cheatham, S.; Chekanov, S.; Chekulaev, S. V.; Chelkov, G. A.; Chelstowska, M. A.; Chen, C.; Chen, H.; Chen, K.; Chen, L.; Chen, S.; Chen, X.; Chen, Y.; Cheng, Y.; Cheplakov, A.; Cherkaoui El Moursli, R.; Chernyatin, V.; Cheu, E.; Chevalier, L.; Chiarella, V.; Chiefari, G.; Childers, J. T.; Chilingarov, A.; Chiodini, G.; Chisholm, A. S.; Chislett, R. T.; Chitan, A.; Chizhov, M. V.; Chouridou, S.; Chow, B. K. B.; Christidi, I. A.; Chromek-Burckhart, D.; Chu, M. L.; Chudoba, J.; Ciapetti, G.; Ciftci, A. K.; Ciftci, R.; Cinca, D.; Cindro, V.; Ciocio, A.; Cirilli, M.; Cirkovic, P.; Citron, Z. H.; Citterio, M.; Ciubancan, M.; Clark, A.; Clark, P. J.; Clarke, R. N.; Cleland, W.; Clemens, J. C.; Clement, B.; Clement, C.; Coadou, Y.; Cobal, M.; Coccaro, A.; Cochran, J.; Coelli, S.; Coffey, L.; Cogan, J. G.; Coggeshall, J.; Colas, J.; Cole, B.; Cole, S.; Colijn, A. P.; Collins-Tooth, C.; Collot, J.; Colombo, T.; Colon, G.; Compostella, G.; Conde Muiño, P.; Coniavitis, E.; Conidi, M. C.; Connelly, I. A.; Consonni, S. M.; Consorti, V.; Constantinescu, S.; Conta, C.; Conti, G.; Conventi, F.; Cooke, M.; Cooper, B. D.; Cooper-Sarkar, A. M.; Cooper-Smith, N. J.; Copic, K.; Cornelissen, T.; Corradi, M.; Corriveau, F.; Corso-Radu, A.; Cortes-Gonzalez, A.; Cortiana, G.; Costa, G.; Costa, M. J.; Costanzo, D.; Côté, D.; Cottin, G.; Courneyea, L.; Cowan, G.; Cox, B. 
E.; Cranmer, K.; Cree, G.; Crépé-Renaudin, S.; Crescioli, F.; Crispin Ortuzar, M.; Cristinziani, M.; Crosetti, G.; Cuciuc, C.-M.; Cuenca Almenar, C.; Cuhadar Donszelmann, T.; Cummings, J.; Curatolo, M.; Cuthbert, C.; Czirr, H.; Czodrowski, P.; Czyczula, Z.; D'Auria, S.; D'Onofrio, M.; D'Orazio, A.; Da Cunha Sargedas De Sousa, M. J.; Da Via, C.; Dabrowski, W.; Dafinca, A.; Dai, T.; Dallaire, F.; Dallapiccola, C.; Dam, M.; Daniells, A. C.; Dano Hoffmann, M.; Dao, V.; Darbo, G.; Darlea, G. L.; Darmora, S.; Dassoulas, J. A.; Davey, W.; David, C.; Davidek, T.; Davies, E.; Davies, M.; Davignon, O.; Davison, A. R.; Davygora, Y.; Dawe, E.; Dawson, I.; Daya-Ishmukhametova, R. K.; De, K.; de Asmundis, R.; De Castro, S.; De Cecco, S.; de Graat, J.; De Groot, N.; de Jong, P.; De La Taille, C.; De la Torre, H.; De Lorenzi, F.; De Nooij, L.; De Pedis, D.; De Salvo, A.; De Sanctis, U.; De Santo, A.; De Vivie De Regie, J. B.; De Zorzi, G.; Dearnaley, W. J.; Debbe, R.; Debenedetti, C.; Dechenaux, B.; Dedovich, D. V.; Degenhardt, J.; Del Peso, J.; Del Prete, T.; Delemontex, T.; Deliot, F.; Deliyergiyev, M.; Dell'Acqua, A.; Dell'Asta, L.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delsart, P. A.; Deluca, C.; Demers, S.; Demichev, M.; Demilly, A.; Demirkoz, B.; Denisov, S. P.; Derendarz, D.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K.; Deviveiros, P. O.; Dewhurst, A.; DeWilde, B.; Dhaliwal, S.; Dhullipudi, R.; Di Ciaccio, A.; Di Ciaccio, L.; Di Domenico, A.; Di Donato, C.; Di Girolamo, A.; Di Girolamo, B.; Di Mattia, A.; Di Micco, B.; Di Nardo, R.; Di Simone, A.; Di Sipio, R.; Di Valentino, D.; Diaz, M. A.; Diehl, E. B.; Dietrich, J.; Dietzsch, T. A.; Diglio, S.; Dindar Yagci, K.; Dingfelder, J.; Dionisi, C.; Dita, P.; Dita, S.; Dittus, F.; Djama, F.; Djobava, T.; do Vale, M. A. B.; Do Valle Wemans, A.; Doan, T. K. O.; Dobos, D.; Dobson, E.; Dodd, J.; Doglioni, C.; Doherty, T.; Dohmae, T.; Dolejsi, J.; Dolezal, Z.; Dolgoshein, B. 
A.; Donadelli, M.; Donati, S.; Dondero, P.; Donini, J.; Dopke, J.; Doria, A.; Dos Anjos, A.; Dotti, A.; Dova, M. T.; Doyle, A. T.; Dris, M.; Dubbert, J.; Dube, S.; Dubreuil, E.; Duchovni, E.; Duckeck, G.; Ducu, O. A.; Duda, D.; Dudarev, A.; Dudziak, F.; Duflot, L.; Duguid, L.; Dührssen, M.; Dunford, M.; Duran Yildiz, H.; Düren, M.; Dwuznik, M.; Ebke, J.; Edson, W.; Edwards, C. A.; Edwards, N. C.; Ehrenfeld, W.; Eifert, T.; Eigen, G.; Einsweiler, K.; Eisenhandler, E.; Ekelof, T.; El Kacimi, M.; Ellert, M.; Elles, S.; Ellinghaus, F.; Ellis, K.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Emeliyanov, D.; Enari, Y.; Endner, O. C.; Endo, M.; Engelmann, R.; Erdmann, J.; Ereditato, A.; Eriksson, D.; Ernis, G.; Ernst, J.; Ernst, M.; Ernwein, J.; Errede, D.; Errede, S.; Ertel, E.; Escalier, M.; Esch, H.; Escobar, C.; Espinal Curull, X.; Esposito, B.; Etienne, F.; Etienvre, A. I.; Etzion, E.; Evangelakou, D.; Evans, H.; Fabbri, L.; Facini, G.; Fakhrutdinov, R. M.; Falciano, S.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farooque, T.; Farrell, S.; Farrington, S. M.; Farthouat, P.; Fassi, F.; Fassnacht, P.; Fassouliotis, D.; Fatholahzadeh, B.; Favareto, A.; Fayard, L.; Federic, P.; Fedin, O. L.; Fedorko, W.; Fehling-Kaschek, M.; Feligioni, L.; Feng, C.; Feng, E. J.; Feng, H.; Fenyuk, A. B.; Fernando, W.; Ferrag, S.; Ferrando, J.; Ferrara, V.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferreira de Lima, D. E.; Ferrer, A.; Ferrere, D.; Ferretti, C.; Ferretto Parodi, A.; Fiascaris, M.; Fiedler, F.; Filipčič, A.; Filipuzzi, M.; Filthaut, F.; Fincke-Keeler, M.; Finelli, K. D.; Fiolhais, M. C. N.; Fiorini, L.; Firan, A.; Fischer, J.; Fisher, M. J.; Fitzgerald, E. A.; Flechl, M.; Fleck, I.; Fleischmann, P.; Fleischmann, S.; Fletcher, G. T.; Fletcher, G.; Flick, T.; Floderus, A.; Flores Castillo, L. R.; Florez Bustos, A. C.; Flowerdew, M. 
J.; Fonseca Martin, T.; Formica, A.; Forti, A.; Fortin, D.; Fournier, D.; Fox, H.; Francavilla, P.; Franchini, M.; Franchino, S.; Francis, D.; Franklin, M.; Franz, S.; Fraternali, M.; Fratina, S.; French, S. T.; Friedrich, C.; Friedrich, F.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fullana Torregrosa, E.; Fulsom, B. G.; Fuster, J.; Gabaldon, C.; Gabizon, O.; Gabrielli, A.; Gabrielli, A.; Gadatsch, S.; Gadfort, T.; Gadomski, S.; Gagliardi, G.; Gagnon, P.; Galea, C.; Galhardo, B.; Gallas, E. J.; Gallo, V.; Gallop, B. J.; Gallus, P.; Galster, G.; Gan, K. K.; Gandrajula, R. P.; Gao, J.; Gao, Y. S.; Garay Walls, F. M.; Garberson, F.; García, C.; García Navarro, J. E.; Garcia-Sciveres, M.; Gardner, R. W.; Garelli, N.; Garonne, V.; Gatti, C.; Gaudio, G.; Gaur, B.; Gauthier, L.; Gauzzi, P.; Gavrilenko, I. L.; Gay, C.; Gaycken, G.; Gazis, E. N.; Ge, P.; Gecse, Z.; Gee, C. N. P.; Geerts, D. A. A.; Geich-Gimbel, Ch.; Gellerstedt, K.; Gemme, C.; Gemmell, A.; Genest, M. H.; Gentile, S.; George, M.; George, S.; Gerbaudo, D.; Gershon, A.; Ghazlane, H.; Ghodbane, N.; Giacobbe, B.; Giagu, S.; Giangiobbe, V.; Giannetti, P.; Gianotti, F.; Gibbard, B.; Gibson, S. M.; Gilchriese, M.; Gillam, T. P. S.; Gillberg, D.; Gillman, A. R.; Gingrich, D. M.; Giokaris, N.; Giordani, M. P.; Giordano, R.; Giorgi, F. M.; Giovannini, P.; Giraud, P. F.; Giugni, D.; Giuliani, C.; Giunta, M.; Gjelsten, B. K.; Gkialas, I.; Gladilin, L. K.; Glasman, C.; Glatzer, J.; Glazov, A.; Glonti, G. L.; Goblirsch-Kolb, M.; Goddard, J. R.; Godfrey, J.; Godlewski, J.; Goeringer, C.; Goldfarb, S.; Golling, T.; Golubkov, D.; Gomes, A.; Gomez Fajardo, L. S.; Gonçalo, R.; Goncalves Pinto Firmino Da Costa, J.; Gonella, L.; González de la Hoz, S.; Gonzalez Parra, G.; Gonzalez Silva, M. L.; Gonzalez-Sevilla, S.; Goodson, J. J.; Goossens, L.; Gorbounov, P. A.; Gordon, H. A.; Gorelov, I.; Gorfine, G.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Goshaw, A. T.; Gössling, C.; Gostkin, M. 
I.; Gouighri, M.; Goujdami, D.; Goulette, M. P.; Goussiou, A. G.; Goy, C.; Gozpinar, S.; Grabas, H. M. X.; Graber, L.; Grabowska-Bold, I.; Grafström, P.; Grahn, K.-J.; Gramling, J.; Gramstad, E.; Grancagnolo, F.; Grancagnolo, S.; Grassi, V.; Gratchev, V.; Gray, H. M.; Gray, J. A.; Graziani, E.; Grebenyuk, O. G.; Greenwood, Z. D.; Gregersen, K.; Gregor, I. M.; Grenier, P.; Griffiths, J.; Grigalashvili, N.; Grillo, A. A.; Grimm, K.; Grinstein, S.; Gris, Ph.; Grishkevich, Y. V.; Grivaz, J.-F.; Grohs, J. P.; Grohsjean, A.; Gross, E.; Grosse-Knetter, J.; Grossi, G. C.; Groth-Jensen, J.; Grout, Z. J.; Grybel, K.; Guescini, F.; Guest, D.; Gueta, O.; Guicheney, C.; Guido, E.; Guillemin, T.; Guindon, S.; Gul, U.; Gumpert, C.; Gunther, J.; Guo, J.; Gupta, S.; Gutierrez, P.; Gutierrez Ortiz, N. G.; Gutschow, C.; Guttman, N.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haber, C.; Hadavand, H. K.; Haefner, P.; Hageboeck, S.; Hajduk, Z.; Hakobyan, H.; Haleem, M.; Hall, D.; Halladjian, G.; Hamacher, K.; Hamal, P.; Hamano, K.; Hamer, M.; Hamilton, A.; Hamilton, S.; Han, L.; Hanagaki, K.; Hanawa, K.; Hance, M.; Hanke, P.; Hansen, J. R.; Hansen, J. B.; Hansen, J. D.; Hansen, P. H.; Hansson, P.; Hara, K.; Hard, A. S.; Harenberg, T.; Harkusha, S.; Harper, D.; Harrington, R. D.; Harris, O. M.; Harrison, P. F.; Hartjes, F.; Harvey, A.; Hasegawa, S.; Hasegawa, Y.; Hassani, S.; Haug, S.; Hauschild, M.; Hauser, R.; Havranek, M.; Hawkes, C. M.; Hawkings, R. J.; Hawkins, A. D.; Hayashi, T.; Hayden, D.; Hays, C. P.; Hayward, H. S.; Haywood, S. J.; Head, S. J.; Heck, T.; Hedberg, V.; Heelan, L.; Heim, S.; Heinemann, B.; Heisterkamp, S.; Hejbal, J.; Helary, L.; Heller, C.; Heller, M.; Hellman, S.; Hellmich, D.; Helsens, C.; Henderson, J.; Henderson, R. C. W.; Henrichs, A.; Henriques Correia, A. M.; Henrot-Versille, S.; Hensel, C.; Herbert, G. H.; Hernandez, C. M.; Hernández Jiménez, Y.; Herrberg-Schubert, R.; Herten, G.; Hertenberger, R.; Hervas, L.; Hesketh, G. G.; Hessey, N. 
P.; Hickling, R.; Higón-Rodriguez, E.; Hill, J. C.; Hiller, K. H.; Hillert, S.; Hillier, S. J.; Hinchliffe, I.; Hines, E.; Hirose, M.; Hirschbuehl, D.; Hobbs, J.; Hod, N.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoffman, J.; Hoffmann, D.; Hofmann, J. I.; Hohlfeld, M.; Holmes, T. R.; Hong, T. M.; Hooft van Huysduynen, L.; Hostachy, J.-Y.; Hou, S.; Hoummada, A.; Howard, J.; Howarth, J.; Hrabovsky, M.; Hristova, I.; Hrivnac, J.; Hryn'ova, T.; Hsu, P. J.; Hsu, S.-C.; Hu, D.; Hu, X.; Huang, Y.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huettmann, A.; Huffman, T. B.; Hughes, E. W.; Hughes, G.; Huhtinen, M.; Hülsing, T. A.; Hurwitz, M.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Iakovidis, G.; Ibragimov, I.; Iconomidou-Fayard, L.; Idarraga, J.; Ideal, E.; Iengo, P.; Igonkina, O.; Iizawa, T.; Ikegami, Y.; Ikematsu, K.; Ikeno, M.; Iliadis, D.; Ilic, N.; Inamaru, Y.; Ince, T.; Ioannou, P.; Iodice, M.; Iordanidou, K.; Ippolito, V.; Irles Quiles, A.; Isaksson, C.; Ishino, M.; Ishitsuka, M.; Ishmukhametov, R.; Issever, C.; Istin, S.; Ivashin, A. V.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jackson, B.; Jackson, J. N.; Jackson, M.; Jackson, P.; Jaekel, M. R.; Jain, V.; Jakobs, K.; Jakobsen, S.; Jakoubek, T.; Jakubek, J.; Jamin, D. O.; Jana, D. K.; Jansen, E.; Jansen, H.; Janssen, J.; Janus, M.; Jared, R. C.; Jarlskog, G.; Jeanty, L.; Jeng, G.-Y.; Jen-La Plante, I.; Jennens, D.; Jenni, P.; Jentzsch, J.; Jeske, C.; Jézéquel, S.; Jha, M. K.; Ji, H.; Ji, W.; Jia, J.; Jiang, Y.; Jimenez Belenguer, M.; Jin, S.; Jinaru, A.; Jinnouchi, O.; Joergensen, M. D.; Joffe, D.; Johansson, K. E.; Johansson, P.; Johns, K. A.; Jon-And, K.; Jones, G.; Jones, R. W. L.; Jones, T. J.; Jorge, P. M.; Joshi, K. D.; Jovicevic, J.; Ju, X.; Jung, C. A.; Jungst, R. M.; Jussel, P.; Juste Rozas, A.; Kaci, M.; Kaczmarska, A.; Kadlecik, P.; Kado, M.; Kagan, H.; Kagan, M.; Kajomovitz, E.; Kalinin, S.; Kama, S.; Kanaya, N.; Kaneda, M.; Kaneti, S.; Kanno, T.; Kantserov, V. 
A.; Kanzaki, J.; Kaplan, B.; Kapliy, A.; Kar, D.; Karakostas, K.; Karastathis, N.; Karnevskiy, M.; Karpov, S. N.; Karthik, K.; Kartvelishvili, V.; Karyukhin, A. N.; Kashif, L.; Kasieczka, G.; Kass, R. D.; Kastanas, A.; Kataoka, Y.; Katre, A.; Katzy, J.; Kaushik, V.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kazama, S.; Kazanin, V. F.; Kazarinov, M. Y.; Keeler, R.; Keener, P. T.; Kehoe, R.; Keil, M.; Keller, J. S.; Keoshkerian, H.; Kepka, O.; Kerševan, B. P.; Kersten, S.; Kessoku, K.; Keung, J.; Khalil-zada, F.; Khandanyan, H.; Khanov, A.; Kharchenko, D.; Khodinov, A.; Khomich, A.; Khoo, T. J.; Khoriauli, G.; Khoroshilov, A.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kim, H.; Kim, S. H.; Kimura, N.; Kind, O.; King, B. T.; King, M.; King, R. S. B.; King, S. B.; Kirk, J.; Kiryunin, A. E.; Kishimoto, T.; Kisielewska, D.; Kitamura, T.; Kittelmann, T.; Kiuchi, K.; Kladiva, E.; Klein, M.; Klein, U.; Kleinknecht, K.; Klimek, P.; Klimentov, A.; Klingenberg, R.; Klinger, J. A.; Klinkby, E. B.; Klioutchnikova, T.; Klok, P. F.; Kluge, E.-E.; Kluit, P.; Kluth, S.; Kneringer, E.; Knoops, E. B. F. G.; Knue, A.; Kobayashi, T.; Kobel, M.; Kocian, M.; Kodys, P.; Koenig, S.; Koevesarki, P.; Koffas, T.; Koffeman, E.; Kogan, L. A.; Kohlmann, S.; Kohout, Z.; Kohriki, T.; Koi, T.; Kolanoski, H.; Koletsou, I.; Koll, J.; Komar, A. A.; Komori, Y.; Kondo, T.; Köneke, K.; König, A. C.; Kono, T.; Konoplich, R.; Konstantinidis, N.; Kopeliansky, R.; Koperny, S.; Köpke, L.; Kopp, A. K.; Korcyl, K.; Kordas, K.; Korn, A.; Korol, A. A.; Korolkov, I.; Korolkova, E. V.; Korotkov, V. A.; Kortner, O.; Kortner, S.; Kostyukhin, V. V.; Kotov, S.; Kotov, V. M.; Kotwal, A.; Kourkoumelis, C.; Kouskoura, V.; Koutsman, A.; Kowalewski, R.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kral, V.; Kramarenko, V. A.; Kramberger, G.; Krasny, M. W.; Krasznahorkay, A.; Kraus, J. 
K.; Kravchenko, A.; Kreiss, S.; Kretzschmar, J.; Kreutzfeldt, K.; Krieger, N.; Krieger, P.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Krüger, H.; Kruker, T.; Krumnack, N.; Krumshteyn, Z. V.; Kruse, A.; Kruse, M. C.; Kruskal, M.; Kubota, T.; Kuday, S.; Kuehn, S.; Kugel, A.; Kuhl, T.; Kukhtin, V.; Kulchitsky, Y.; Kuleshov, S.; Kuna, M.; Kunkle, J.; Kupco, A.; Kurashige, H.; Kurata, M.; Kurochkin, Y. A.; Kurumida, R.; Kus, V.; Kuwertz, E. S.; Kuze, M.; Kvita, J.; Kwee, R.; La Rosa, A.; La Rotonda, L.; Labarga, L.; Lablak, S.; Lacasta, C.; Lacava, F.; Lacey, J.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Laier, H.; Laisne, E.; Lambourne, L.; Lampen, C. L.; Lampl, W.; Lançon, E.; Landgraf, U.; Landon, M. P. J.; Lang, V. S.; Lange, C.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Laplace, S.; Lapoire, C.; Laporte, J. F.; Lari, T.; Larner, A.; Lassnig, M.; Laurelli, P.; Lavorini, V.; Lavrijsen, W.; Laycock, P.; Le, B. T.; Le Dortz, O.; Le Guirriec, E.; Le Menedeu, E.; LeCompte, T.; Ledroit-Guillon, F.; Lee, C. A.; Lee, H.; Lee, J. S. H.; Lee, S. C.; Lee, L.; Lefebvre, G.; Lefebvre, M.; Legger, F.; Leggett, C.; Lehan, A.; Lehmacher, M.; Lehmann Miotto, G.; Leister, A. G.; Leite, M. A. L.; Leitner, R.; Lellouch, D.; Lemmer, B.; Lendermann, V.; Leney, K. J. C.; Lenz, T.; Lenzen, G.; Lenzi, B.; Leone, R.; Leonhardt, K.; Leontsinis, S.; Leroy, C.; Lessard, J.-R.; Lester, C. G.; Lester, C. M.; Levêque, J.; Levin, D.; Levinson, L. J.; Lewis, A.; Lewis, G. H.; Leyko, A. M.; Leyton, M.; Li, B.; Li, B.; Li, H.; Li, H. L.; Li, S.; Li, X.; Liang, Z.; Liao, H.; Liberti, B.; Lichard, P.; Lie, K.; Liebal, J.; Liebig, W.; Limbach, C.; Limosani, A.; Limper, M.; Lin, S. C.; Linde, F.; Lindquist, B. E.; Linnemann, J. T.; Lipeles, E.; Lipniacka, A.; Lisovyi, M.; Liss, T. M.; Lissauer, D.; Lister, A.; Litke, A. M.; Liu, B.; Liu, D.; Liu, J. 
B.; Liu, K.; Liu, L.; Liu, M.; Liu, M.; Liu, Y.; Livan, M.; Livermore, S. S. A.; Lleres, A.; Llorente Merino, J.; Lloyd, S. L.; Lo Sterzo, F.; Lobodzinska, E.; Loch, P.; Lockman, W. S.; Loddenkoetter, T.; Loebinger, F. K.; Loevschall-Jensen, A. E.; Loginov, A.; Loh, C. W.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Lombardo, V. P.; Long, J. D.; Long, R. E.; Lopes, L.; Lopez Mateos, D.; Lopez Paredes, B.; Lorenz, J.; Lorenzo Martinez, N.; Losada, M.; Loscutoff, P.; Losty, M. J.; Lou, X.; Lounis, A.; Love, J.; Love, P. A.; Lowe, A. J.; Lu, F.; Lubatti, H. J.; Luci, C.; Lucotte, A.; Ludwig, D.; Ludwig, I.; Luehring, F.; Lukas, W.; Luminari, L.; Lund, E.; Lundberg, J.; Lundberg, O.; Lund-Jensen, B.; Lungwitz, M.; Lynn, D.; Lysak, R.; Lytken, E.; Ma, H.; Ma, L. L.; Maccarrone, G.; Macchiolo, A.; Maček, B.; Machado Miguens, J.; Macina, D.; Mackeprang, R.; Madar, R.; Madaras, R. J.; Maddocks, H. J.; Mader, W. F.; Madsen, A.; Maeno, M.; Maeno, T.; Magnoni, L.; Magradze, E.; Mahboubi, K.; Mahlstedt, J.; Mahmoud, S.; Mahout, G.; Maiani, C.; Maidantchik, C.; Maio, A.; Majewski, S.; Makida, Y.; Makovec, N.; Mal, P.; Malaescu, B.; Malecki, Pa.; Maleev, V. P.; Malek, F.; Mallik, U.; Malon, D.; Malone, C.; Maltezos, S.; Malyshev, V. M.; Malyukov, S.; Mamuzic, J.; Mandelli, L.; Mandić, I.; Mandrysch, R.; Maneira, J.; Manfredini, A.; Manhaes de Andrade Filho, L.; Manjarres Ramos, J. A.; Mann, A.; Manning, P. M.; Manousakis-Katsikakis, A.; Mansoulie, B.; Mantifel, R.; Mapelli, L.; March, L.; Marchand, J. F.; Marchese, F.; Marchiori, G.; Marcisovsky, M.; Marino, C. P.; Marques, C. N.; Marroquim, F.; Marshall, Z.; Marti, L. F.; Marti-Garcia, S.; Martin, B.; Martin, B.; Martin, J. P.; Martin, T. A.; Martin, V. J.; Martin dit Latour, B.; Martinez, H.; Martinez, M.; Martin-Haugh, S.; Martyniuk, A. C.; Marx, M.; Marzano, F.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. 
L.; Massa, I.; Massol, N.; Mastrandrea, P.; Mastroberardino, A.; Masubuchi, T.; Matsunaga, H.; Matsushita, T.; Mättig, P.; Mättig, S.; Mattmann, J.; Mattravers, C.; Maurer, J.; Maxfield, S. J.; Maximov, D. A.; Mazini, R.; Mazzaferro, L.; Mazzanti, M.; Mc Goldrick, G.; Mc Kee, S. P.; McCarn, A.; McCarthy, R. L.; McCarthy, T. G.; McCubbin, N. A.; McFarlane, K. W.; Mcfayden, J. A.; Mchedlidze, G.; Mclaughlan, T.; McMahon, S. J.; McPherson, R. A.; Meade, A.; Mechnich, J.; Mechtel, M.; Medinnis, M.; Meehan, S.; Meera-Lebbai, R.; Mehlhase, S.; Mehta, A.; Meier, K.; Meineck, C.; Meirose, B.; Melachrinos, C.; Mellado Garcia, B. R.; Meloni, F.; Mendoza Navas, L.; Mengarelli, A.; Menke, S.; Meoni, E.; Mercurio, K. M.; Mergelmeyer, S.; Meric, N.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Merritt, H.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meyer, C.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Meyer, J.; Michal, S.; Middleton, R. P.; Migas, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikuž, M.; Miller, D. W.; Mills, C.; Milov, A.; Milstead, D. A.; Milstein, D.; Minaenko, A. A.; Miñano Moya, M.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mirabelli, G.; Mitani, T.; Mitrevski, J.; Mitsou, V. A.; Mitsui, S.; Miyagawa, P. S.; Mjörnmark, J. U.; Moa, T.; Moeller, V.; Mohapatra, S.; Mohr, W.; Molander, S.; Moles-Valls, R.; Molfetas, A.; Mönig, K.; Monini, C.; Monk, J.; Monnier, E.; Montejo Berlingen, J.; Monticelli, F.; Monzani, S.; Moore, R. W.; Mora Herrera, C.; Moraes, A.; Morange, N.; Morel, J.; Moreno, D.; Moreno Llácer, M.; Morettini, P.; Morgenstern, M.; Morii, M.; Moritz, S.; Morley, A. K.; Mornacchi, G.; Morris, J. D.; Morvaj, L.; Moser, H. G.; Mosidze, M.; Moss, J.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. W.; Mudd, R. D.; Mueller, F.; Mueller, J.; Mueller, K.; Mueller, T.; Mueller, T.; Muenstermann, D.; Munwes, Y.; Murillo Quijada, J. A.; Murray, W. J.; Mussche, I.; Musto, E.; Myagkov, A. 
G.; Myska, M.; Nackenhorst, O.; Nadal, J.; Nagai, K.; Nagai, R.; Nagai, Y.; Nagano, K.; Nagarkar, A.; Nagasaka, Y.; Nagel, M.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakamura, T.; Nakano, I.; Namasivayam, H.; Nanava, G.; Napier, A.; Narayan, R.; Nash, M.; Nattermann, T.; Naumann, T.; Navarro, G.; Neal, H. A.; Nechaeva, P. Yu.; Neep, T. J.; Negri, A.; Negri, G.; Negrini, M.; Nektarijevic, S.; Nelson, A.; Nelson, T. K.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Neubauer, M. S.; Neumann, M.; Neusiedl, A.; Neves, R. M.; Nevski, P.; Newcomer, F. M.; Newman, P. R.; Nguyen, D. H.; Nguyen Thi Hong, V.; Nickerson, R. B.; Nicolaidou, R.; Nicquevert, B.; Nielsen, J.; Nikiforou, N.; Nikiforov, A.; Nikolaenko, V.; Nikolic-Audit, I.; Nikolics, K.; Nikolopoulos, K.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nisius, R.; Nobe, T.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Norberg, S.; Nordberg, M.; Novakova, J.; Nozaki, M.; Nozka, L.; Ntekas, K.; Nuncio-Quiroz, A.-E.; Nunes Hanninger, G.; Nunnemann, T.; Nurse, E.; O'Brien, B. J.; O'Grady, F.; O'Neil, D. C.; O'Shea, V.; Oakes, L. B.; Oakham, F. G.; Oberlack, H.; Ocariz, J.; Ochi, A.; Ochoa, M. I.; Oda, S.; Odaka, S.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohshima, T.; Okamura, W.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olariu, A.; Olchevski, A. G.; Olivares Pino, S. A.; Oliveira, M.; Oliveira Damazio, D.; Oliver Garcia, E.; Olivito, D.; Olszewski, A.; Olszowska, J.; Onofre, A.; Onyisi, P. U. E.; Oram, C. J.; Oreglia, M. J.; Oren, Y.; Orestano, D.; Orlando, N.; Oropeza Barrera, C.; Orr, R. S.; Osculati, B.; Ospanov, R.; Otero y Garzon, G.; Otono, H.; Ouchrif, M.; Ouellette, E. A.; Ould-Saada, F.; Ouraou, A.; Oussoren, K. P.; Ouyang, Q.; Ovcharova, A.; Owen, M.; Owen, S.; Ozcan, V. E.; Ozturk, N.; Pachal, K.; Pacheco Pages, A.; Padilla Aranda, C.; Pagan Griso, S.; Paganis, E.; Pahl, C.; Paige, F.; Pais, P.; Pajchel, K.; Palacino, G.; Palestini, S.; Pallin, D.; Palma, A.; Palmer, J. D.; Pan, Y. 
B.; Panagiotopoulou, E.; Panduro Vazquez, J. G.; Pani, P.; Panikashvili, N.; Panitkin, S.; Pantea, D.; Papadopoulou, Th. D.; Papageorgiou, K.; Paramonov, A.; Paredes Hernandez, D.; Parker, M. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pashapour, S.; Pasqualucci, E.; Passaggio, S.; Passeri, A.; Pastore, F.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Patel, N. D.; Pater, J. R.; Patricelli, S.; Pauly, T.; Pearce, J.; Pedersen, M.; Pedraza Lopez, S.; Pedro, R.; Peleganchuk, S. V.; Pelikan, D.; Peng, H.; Penning, B.; Penwell, J.; Perepelitsa, D. V.; Perez Cavalcanti, T.; Perez Codina, E.; Pérez García-Estañ, M. T.; Perez Reale, V.; Perini, L.; Pernegger, H.; Perrino, R.; Peschke, R.; Peshekhonov, V. D.; Peters, K.; Peters, R. F. Y.; Petersen, B. A.; Petersen, J.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petrolo, E.; Petrucci, F.; Petteni, M.; Pezoa, R.; Phillips, P. W.; Piacquadio, G.; Pianori, E.; Picazio, A.; Piccaro, E.; Piccinini, M.; Piec, S. M.; Piegaia, R.; Pignotti, D. T.; Pilcher, J. E.; Pilkington, A. D.; Pina, J.; Pinamonti, M.; Pinder, A.; Pinfold, J. L.; Pingel, A.; Pinto, B.; Pizio, C.; Pleier, M.-A.; Pleskot, V.; Plotnikova, E.; Plucinski, P.; Poddar, S.; Podlyski, F.; Poettgen, R.; Poggioli, L.; Pohl, D.; Pohl, M.; Polesello, G.; Policicchio, A.; Polifka, R.; Polini, A.; Pollard, C. S.; Polychronakos, V.; Pomeroy, D.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Popovic, D. S.; Poppleton, A.; Portell Bueso, X.; Pospelov, G. E.; Pospisil, S.; Potamianos, K.; Potrap, I. N.; Potter, C. J.; Potter, C. T.; Poulard, G.; Poveda, J.; Pozdnyakov, V.; Prabhu, R.; Pralavorio, P.; Pranko, A.; Prasad, S.; Pravahan, R.; Prell, S.; Price, D.; Price, J.; Price, L. 
E.; Prieur, D.; Primavera, M.; Proissl, M.; Prokofiev, K.; Prokoshin, F.; Protopapadaki, E.; Protopopescu, S.; Proudfoot, J.; Prudent, X.; Przybycien, M.; Przysiezniak, H.; Psoroulas, S.; Ptacek, E.; Pueschel, E.; Puldon, D.; Purohit, M.; Puzo, P.; Pylypchenko, Y.; Qian, J.; Quadt, A.; Quarrie, D. R.; Quayle, W. B.; Quilty, D.; Radeka, V.; Radescu, V.; Radhakrishnan, S. K.; Radloff, P.; Ragusa, F.; Rahal, G.; Rajagopalan, S.; Rammensee, M.; Rammes, M.; Randle-Conde, A. S.; Rangel-Smith, C.; Rao, K.; Rauscher, F.; Rave, T. C.; Ravenscroft, T.; Raymond, M.; Read, A. L.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Reinsch, A.; Reisin, H.; Reisinger, I.; Relich, M.; Rembser, C.; Ren, Z. L.; Renaud, A.; Rescigno, M.; Resconi, S.; Resende, B.; Reznicek, P.; Rezvani, R.; Richter, R.; Ridel, M.; Rieck, P.; Rijssenbeek, M.; Rimoldi, A.; Rinaldi, L.; Ritsch, E.; Riu, I.; Rivoltella, G.; Rizatdinova, F.; Rizvi, E.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robson, A.; Rocha de Lima, J. G.; Roda, C.; Roda Dos Santos, D.; Rodrigues, L.; Roe, S.; Røhne, O.; Rolli, S.; Romaniouk, A.; Romano, M.; Romeo, G.; Romero Adam, E.; Rompotis, N.; Roos, L.; Ros, E.; Rosati, S.; Rosbach, K.; Rose, A.; Rose, M.; Rosendahl, P. L.; Rosenthal, O.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rosten, R.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubbo, F.; Rubinskiy, I.; Rud, V. I.; Rudolph, C.; Rudolph, M. S.; Rühr, F.; Ruiz-Martinez, A.; Rumyantsev, L.; Rurikova, Z.; Rusakovich, N. A.; Ruschke, A.; Rutherfoord, J. P.; Ruthmann, N.; Ruzicka, P.; Ryabov, Y. F.; Rybar, M.; Rybkin, G.; Ryder, N. C.; Saavedra, A. F.; Sacerdoti, S.; Saddique, A.; Sadeh, I.; Sadrozinski, H. F.-W.; Sadykov, R.; Safai Tehrani, F.; Sakamoto, H.; Sakurai, Y.; Salamanna, G.; Salamon, A.; Saleem, M.; Salek, D.; Sales De Bruin, P. H.; Salihagic, D.; Salnikov, A.; Salt, J.; Salvachua Ferrando, B. 
M.; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sampsonidis, D.; Sanchez, A.; Sánchez, J.; Sanchez Martinez, V.; Sandaker, H.; Sander, H. G.; Sanders, M. P.; Sandhoff, M.; Sandoval, T.; Sandoval, C.; Sandstroem, R.; Sankey, D. P. C.; Sansoni, A.; Santoni, C.; Santonico, R.; Santos, H.; Santoyo Castillo, I.; Sapp, K.; Sapronov, A.; Saraiva, J. G.; Sarkisyan-Grinbaum, E.; Sarrazin, B.; Sartisohn, G.; Sasaki, O.; Sasaki, Y.; Sasao, N.; Satsounkevitch, I.; Sauvage, G.; Sauvan, E.; Sauvan, J. B.; Savard, P.; Savinov, V.; Savu, D. O.; Sawyer, C.; Sawyer, L.; Saxon, D. H.; Saxon, J.; Sbarra, C.; Sbrizzi, A.; Scanlon, T.; Scannicchio, D. A.; Scarcella, M.; Schaarschmidt, J.; Schacht, P.; Schaefer, D.; Schaelicke, A.; Schaepe, S.; Schaetzel, S.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schamberger, R. D.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Scherzer, M. I.; Schiavi, C.; Schieck, J.; Schillo, C.; Schioppa, M.; Schlenker, S.; Schmidt, E.; Schmieden, K.; Schmitt, C.; Schmitt, C.; Schmitt, S.; Schneider, B.; Schnellbach, Y. J.; Schnoor, U.; Schoeffel, L.; Schoening, A.; Schoenrock, B. D.; Schorlemmer, A. L. S.; Schott, M.; Schouten, D.; Schovancova, J.; Schram, M.; Schramm, S.; Schreyer, M.; Schroeder, C.; Schroer, N.; Schuh, N.; Schultens, M. J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwartzman, A.; Schwegler, Ph.; Schwemling, Ph.; Schwienhorst, R.; Schwindling, J.; Schwindt, T.; Schwoerer, M.; Sciacca, F. G.; Scifo, E.; Sciolla, G.; Scott, W. G.; Scutti, F.; Searcy, J.; Sedov, G.; Sedykh, E.; Seidel, S. C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Sekula, S. J.; Selbach, K. E.; Seliverstov, D. M.; Sellers, G.; Seman, M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Serkin, L.; Serre, T.; Seuster, R.; Severini, H.; Sforza, F.; Sfyrla, A.; Shabalina, E.; Shamim, M.; Shan, L. Y.; Shank, J. T.; Shao, Q. T.; Shapiro, M.; Shatalov, P. 
B.; Shaw, K.; Sherwood, P.; Shimizu, S.; Shimojima, M.; Shin, T.; Shiyakova, M.; Shmeleva, A.; Shochet, M. J.; Short, D.; Shrestha, S.; Shulga, E.; Shupe, M. A.; Shushkevich, S.; Sicho, P.; Sidorov, D.; Sidoti, A.; Siegert, F.; Sijacki, Dj.; Silbert, O.; Silva, J.; Silver, Y.; Silverstein, D.; Silverstein, S. B.; Simak, V.; Simard, O.; Simic, Lj.; Simion, S.; Simioni, E.; Simmons, B.; Simoniello, R.; Simonyan, M.; Sinervo, P.; Sinev, N. B.; Sipica, V.; Siragusa, G.; Sircar, A.; Sisakyan, A. N.; Sivoklokov, S. Yu.; Sjölin, J.; Sjursen, T. B.; Skinnari, L. A.; Skottowe, H. P.; Skovpen, K. Yu.; Skubic, P.; Slater, M.; Slavicek, T.; Sliwa, K.; Smakhtin, V.; Smart, B. H.; Smestad, L.; Smirnov, S. Yu.; Smirnov, Y.; Smirnova, L. N.; Smirnova, O.; Smith, K. M.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snidero, G.; Snow, J.; Snyder, S.; Sobie, R.; Socher, F.; Sodomka, J.; Soffer, A.; Soh, D. A.; Solans, C. A.; Solar, M.; Solc, J.; Soldatov, E. Yu.; Soldevila, U.; Solfaroli Camillocci, E.; Solodkov, A. A.; Solovyanov, O. V.; Solovyev, V.; Soni, N.; Sood, A.; Sopko, V.; Sopko, B.; Sosebee, M.; Soualah, R.; Soueid, P.; Soukharev, A. M.; South, D.; Spagnolo, S.; Spanò, F.; Spearman, W. R.; Spighi, R.; Spigo, G.; Spousta, M.; Spreitzer, T.; Spurlock, B.; St. Denis, R. D.; Stahlman, J.; Stamen, R.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stanescu-Bellu, M.; Stanitzki, M. M.; Stapnes, S.; Starchenko, E. A.; Stark, J.; Staroba, P.; Starovoitov, P.; Staszewski, R.; Stavina, P.; Steele, G.; Steinbach, P.; Steinberg, P.; Stekl, I.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stern, S.; Stewart, G. A.; Stillings, J. A.; Stockton, M. C.; Stoebe, M.; Stoerig, K.; Stoicea, G.; Stonjek, S.; Stradling, A. R.; Straessner, A.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strauss, E.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Stroynowski, R.; Stucci, S. A.; Stugu, B.; Stumer, I.; Stupak, J.; Sturm, P.; Styles, N. 
A.; Su, D.; Su, J.; Subramania, HS.; Subramaniam, R.; Succurro, A.; Sugaya, Y.; Suhr, C.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Susinno, G.; Sutton, M. R.; Suzuki, Y.; Svatos, M.; Swedish, S.; Swiatlowski, M.; Sykora, I.; Sykora, T.; Ta, D.; Tackmann, K.; Taenzer, J.; Taffard, A.; Tafirout, R.; Taiblum, N.; Takahashi, Y.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Takubo, Y.; Talby, M.; Talyshev, A. A.; Tam, J. Y. C.; Tamsett, M. C.; Tan, K. G.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tanaka, S.; Tanasijczuk, A. J.; Tani, K.; Tannoury, N.; Tapprogge, S.; Tarem, S.; Tarrade, F.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tashiro, T.; Tassi, E.; Tavares Delgado, A.; Tayalati, Y.; Taylor, C.; Taylor, F. E.; Taylor, G. N.; Taylor, W.; Teischinger, F. A.; Teixeira Dias Castanheira, M.; Teixeira-Dias, P.; Temming, K. K.; Ten Kate, H.; Teng, P. K.; Terada, S.; Terashi, K.; Terron, J.; Terzo, S.; Testa, M.; Teuscher, R. J.; Therhaag, J.; Theveneaux-Pelzer, T.; Thoma, S.; Thomas, J. P.; Thompson, E. N.; Thompson, P. D.; Thompson, P. D.; Thompson, A. S.; Thomsen, L. A.; Thomson, E.; Thomson, M.; Thong, W. M.; Thun, R. P.; Tian, F.; Tibbetts, M. J.; Tic, T.; Tikhomirov, V. O.; Tikhonov, Yu. A.; Timoshenko, S.; Tiouchichine, E.; Tipton, P.; Tisserant, S.; Todorov, T.; Todorova-Nova, S.; Toggerson, B.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tollefson, K.; Tomlinson, L.; Tomoto, M.; Tompkins, L.; Toms, K.; Topilin, N. D.; Torrence, E.; Torres, H.; Torró Pastor, E.; Toth, J.; Touchard, F.; Tovey, D. R.; Tran, H. L.; Trefzger, T.; Tremblet, L.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Tripiana, M. F.; Triplett, N.; Trischuk, W.; Trocmé, B.; Troncon, C.; Trottier-McDonald, M.; Trovatelli, M.; True, P.; Trzebinski, M.; Trzupek, A.; Tsarouchas, C.; Tseng, J. C.-L.; Tsiareshka, P. V.; Tsionou, D.; Tsipolitis, G.; Tsirintanis, N.; Tsiskaridze, S.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsukerman, I. 
I.; Tsulaia, V.; Tsung, J.-W.; Tsuno, S.; Tsybychev, D.; Tua, A.; Tudorache, A.; Tudorache, V.; Tuggle, J. M.; Tuna, A. N.; Tupputi, S. A.; Turchikhin, S.; Turecek, D.; Turk Cakir, I.; Turra, R.; Tuts, P. M.; Tykhonov, A.; Tylmad, M.; Tyndel, M.; Uchida, K.; Ueda, I.; Ueno, R.; Ughetto, M.; Ugland, M.; Uhlenbrock, M.; Ukegawa, F.; Unal, G.; Undrus, A.; Unel, G.; Ungaro, F. C.; Unno, Y.; Urbaniec, D.; Urquijo, P.; Usai, G.; Usanova, A.; Vacavant, L.; Vacek, V.; Vachon, B.; Valencic, N.; Valentinetti, S.; Valero, A.; Valery, L.; Valkar, S.; Valladolid Gallego, E.; Vallecorsa, S.; Valls Ferrer, J. A.; Van Berg, R.; Van Der Deijl, P. C.; van der Geer, R.; van der Graaf, H.; Van Der Leeuw, R.; van der Ster, D.; van Eldik, N.; van Gemmeren, P.; Van Nieuwkoop, J.; van Vulpen, I.; van Woerden, M. C.; Vanadia, M.; Vandelli, W.; Vaniachine, A.; Vankov, P.; Vannucci, F.; Vardanyan, G.; Vari, R.; Varnes, E. W.; Varol, T.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vassilakopoulos, V. I.; Vazeille, F.; Vazquez Schroeder, T.; Veatch, J.; Veloso, F.; Veneziano, S.; Ventura, A.; Ventura, D.; Venturi, M.; Venturi, N.; Venturini, A.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. C.; Viazlo, O.; Vichou, I.; Vickey, T.; Vickey Boeriu, O. E.; Viehhauser, G. H. A.; Viel, S.; Vigne, R.; Villa, M.; Villaplana Perez, M.; Vilucchi, E.; Vincter, M. G.; Vinogradov, V. B.; Virzi, J.; Vitells, O.; Viti, M.; Vivarelli, I.; Vives Vaque, F.; Vlachos, S.; Vladoiu, D.; Vlasak, M.; Vogel, A.; Vokac, P.; Volpi, G.; Volpi, M.; Volpini, G.; von der Schmitt, H.; von Radziewski, H.; von Toerne, E.; Vorobel, V.; Vos, M.; Voss, R.; Vossebeld, J. 
H.; Vranjes, N.; Vranjes Milosavljevic, M.; Vrba, V.; Vreeswijk, M.; Vu Anh, T.; Vuillermet, R.; Vukotic, I.; Vykydal, Z.; Wagner, W.; Wagner, P.; Wahrmund, S.; Wakabayashi, J.; Walch, S.; Walder, J.; Walker, R.; Walkowiak, W.; Wall, R.; Waller, P.; Walsh, B.; Wang, C.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, K.; Wang, R.; Wang, S. M.; Wang, T.; Wang, X.; Warburton, A.; Ward, C. P.; Wardrope, D. R.; Warsinsky, M.; Washbrook, A.; Wasicki, C.; Watanabe, I.; Watkins, P. M.; Watson, A. T.; Watson, I. J.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, A. T.; Waugh, B. M.; Webb, S.; Weber, M. S.; Weber, S. W.; Webster, J. S.; Weidberg, A. R.; Weigell, P.; Weingarten, J.; Weiser, C.; Weits, H.; Wells, P. S.; Wenaus, T.; Wendland, D.; Weng, Z.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, P.; Wessels, M.; Wetter, J.; Whalen, K.; White, A.; White, M. J.; White, R.; White, S.; Whiteson, D.; Whittington, D.; Wicke, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wijeratne, P. A.; Wildauer, A.; Wildt, M. A.; Wilhelm, I.; Wilkens, H. G.; Will, J. Z.; Williams, H. H.; Williams, S.; Willis, W.; Willocq, S.; Wilson, J. A.; Wilson, A.; Wingerter-Seez, I.; Winkelmann, S.; Winklmeier, F.; Wittgen, M.; Wittig, T.; Wittkowski, J.; Wollstadt, S. J.; Wolter, M. W.; Wolters, H.; Wong, W. C.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wozniak, K. W.; Wraight, K.; Wright, M.; Wu, S. L.; Wu, X.; Wu, Y.; Wulf, E.; Wyatt, T. R.; Wynne, B. M.; Xella, S.; Xiao, M.; Xu, C.; Xu, D.; Xu, L.; Yabsley, B.; Yacoob, S.; Yamada, M.; Yamaguchi, H.; Yamaguchi, Y.; Yamamoto, A.; Yamamoto, K.; Yamamoto, S.; Yamamura, T.; Yamanaka, T.; Yamauchi, K.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, H.; Yang, U. K.; Yang, Y.; Yanush, S.; Yao, L.; Yasu, Y.; Yatsenko, E.; Yau Wong, K. H.; Ye, J.; Ye, S.; Yen, A. L.; Yildirim, E.; Yilmaz, M.; Yoosoofmiya, R.; Yorita, K.; Yoshida, R.; Yoshihara, K.; Young, C.; Young, C. J. S.; Youssef, S.; Yu, D. 
R.; Yu, J.; Yu, J.; Yuan, L.; Yurkewicz, A.; Zabinski, B.; Zaidan, R.; Zaitsev, A. M.; Zaman, A.; Zambito, S.; Zanello, L.; Zanzi, D.; Zaytsev, A.; Zeitnitz, C.; Zeman, M.; Zemla, A.; Zengel, K.; Zenin, O.; Ženiš, T.; Zerwas, D.; Zevi della Porta, G.; Zhang, D.; Zhang, H.; Zhang, J.; Zhang, L.; Zhang, X.; Zhang, Z.; Zhao, Z.; Zhemchugov, A.; Zhong, J.; Zhou, B.; Zhou, L.; Zhou, N.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zibell, A.; Zieminska, D.; Zimine, N. I.; Zimmermann, C.; Zimmermann, R.; Zimmermann, S.; Zimmermann, S.; Zinonos, Z.; Ziolkowski, M.; Zitoun, R.; Zobernig, G.; Zoccoli, A.; zur Nedden, M.; Zurzolo, G.; Zutshi, V.; Zwalinski, L.

    2015-01-01

    The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton-proton collision data with a centre-of-mass energy of TeV corresponding to an integrated luminosity of . Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti- algorithm with distance parameters or , and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a boson, for and pseudorapidities . The effect of multiple proton-proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1 % is found in the central calorimeter region () for jets with . For central jets at lower , the uncertainty is about 3 %. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton-proton collisions and test-beam data, which also provide the estimate for TeV. The calibration of forward jets is derived from dijet balance measurements. The resulting uncertainty reaches its largest value of 6 % for low- jets at . Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5-3 %.
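
    When several independent JES uncertainty components such as these (in situ calibration, pile-up correction, event-topology terms) are combined, the total fractional uncertainty is conventionally taken as their sum in quadrature. A minimal sketch, with invented component names and values rather than the ATLAS numbers:

```python
import math

def combine_in_quadrature(components):
    """Total fractional uncertainty from independent fractional components."""
    return math.sqrt(sum(c * c for c in components))

# Hypothetical fractional JES components for one jet (illustrative values,
# not the ATLAS numbers):
components = {
    "in_situ_calibration": 0.006,
    "pileup_correction": 0.004,
    "flavour_composition": 0.005,
}
total = combine_in_quadrature(components.values())
print(f"total fractional JES uncertainty: {total:.4%}")
```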

  9. Electrons for Neutrinos: Using Electron Scattering to Develop New Energy Reconstruction for Future Deuterium-Based Neutrino Detectors

    NASA Astrophysics Data System (ADS)

    Silva, Adrian; Schmookler, Barak; Papadopoulou, Afroditi; Schmidt, Axel; Hen, Or; Khachatryan, Mariana; Weinstein, Lawrence

    2017-09-01

    Using wide phase-space electron scattering data, we study a novel technique for neutrino energy reconstruction for future neutrino oscillation experiments. Accelerator-based neutrino oscillation experiments require detailed understanding of neutrino-nucleus interactions, which are complicated by the underlying nuclear physics that governs the process. One area of concern is that neutrino energy must be reconstructed event-by-event from the final-state kinematics. In charged-current quasielastic scattering, Fermi motion of nucleons prevents exact energy reconstruction. However, in scattering from deuterium, the momenta of the electron and proton constrain the neutrino energy exactly, offering a new avenue for reducing systematic uncertainties. To test this approach, we analyzed d(e,e'p) data taken with the CLAS detector at Jefferson Lab Hall B and made kinematic selection cuts to obtain quasielastic events. We estimated the remaining inelastic background by using d(e,e'pπ⁻) events to produce a simulated dataset of events with an undetected π⁻. These results demonstrate the feasibility of energy reconstruction in a hypothetical future deuterium-based neutrino detector. Supported by the Paul E. Gray UROP Fund, MIT.
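
    The exact energy constraint from deuterium mentioned above can be made concrete: for e + d → e' + p + n with the deuteron at rest and the beam along +z, momentum balance fixes the undetected neutron, and energy conservation then gives the beam energy in closed form. A minimal kinematic sketch (GeV units, electron mass neglected; not the CLAS analysis code):

```python
import math

M_D = 1.875613   # deuteron mass [GeV]
M_P = 0.938272   # proton mass  [GeV]
M_N = 0.939565   # neutron mass [GeV]

def reconstruct_beam_energy(e_prime, proton):
    """Beam energy for e + d -> e' + p + n (deuteron at rest, beam along +z),
    from the measured electron and proton four-momenta (E, px, py, pz) [GeV].
    Derivation: E + M_D - Ee - Ep = E_n, with the neutron momentum fixed by
    momentum balance; squaring gives an equation linear in E."""
    Ee, pex, pey, pez = e_prime
    Ep, ppx, ppy, ppz = proton
    pt2 = (pex + ppx) ** 2 + (pey + ppy) ** 2   # neutron transverse momentum^2
    a = M_D - Ee - Ep                           # energy balance minus the neutron
    b = pez + ppz                               # measured longitudinal momentum
    return (M_N ** 2 + pt2 + b ** 2 - a ** 2) / (2.0 * (a + b))

# Hypothetical measured momenta (GeV), electron treated as massless:
pe = (0.2, -0.1, 2.9)
Ee = math.sqrt(sum(c * c for c in pe))
pp = (0.1, 0.05, 0.8)
Ep = math.sqrt(M_P ** 2 + sum(c * c for c in pp))
E_beam = reconstruct_beam_energy((Ee, *pe), (Ep, *pp))
print(f"reconstructed beam energy: {E_beam:.4f} GeV")
```

The reconstruction is an exact algebraic inversion, so the recovered energy closes energy-momentum conservation to machine precision for any on-shell inputs.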

  10. Reconstruction of droughts in India using multiple land-surface models (1951-2015)

    NASA Astrophysics Data System (ADS)

    Mishra, Vimal; Shah, Reepal; Azhar, Syed; Shah, Harsh; Modi, Parth; Kumar, Rohini

    2018-04-01

    India has witnessed some of the most severe historical droughts in the current decade, and severity, frequency, and areal extent of droughts have been increasing. As a large part of the population of India is dependent on agriculture, soil moisture drought affecting agricultural activities (crop yields) has significant impacts on socio-economic conditions. Due to limited observations, soil moisture is generally simulated using land-surface hydrological models (LSMs); however, these LSM outputs have uncertainty due to many factors, including errors in forcing data and model parameterization. Here we reconstruct agricultural drought events over India during the period of 1951-2015 based on simulated soil moisture from three LSMs, the Variable Infiltration Capacity (VIC), the Noah, and the Community Land Model (CLM). Based on simulations from the three LSMs, we find that major drought events occurred in 1987, 2002, and 2015 during the monsoon season (June through September). During the Rabi season (November through February), major soil moisture droughts occurred in 1966, 1973, 2001, and 2003. Soil moisture droughts estimated from the three LSMs are comparable in terms of their spatial coverage; however, differences are found in drought severity. Moreover, we find a higher uncertainty in simulated drought characteristics over a large part of India during the major crop-growing season (Rabi season, November to February: NDJF) compared to those of the monsoon season (June to September: JJAS). Furthermore, uncertainty in drought estimates is higher for severe and localized droughts. Higher uncertainty in the soil moisture droughts is largely due to the difference in model parameterizations (especially soil depth), resulting in different persistence of soil moisture simulated by the three LSMs. 
Our study highlights the importance of accounting for the LSMs' uncertainty and consideration of the multi-model ensemble system for the real-time monitoring and prediction of drought over India.
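
    The percentile-based drought definition and the multi-model agreement discussed above can be sketched in a few lines. The percentile threshold, model names, and all numbers below are synthetic stand-ins, not the study's data:

```python
import random

def moisture_percentile(climatology, value):
    """Empirical percentile of `value` within a model's soil-moisture climatology."""
    below = sum(1 for x in climatology if x <= value)
    return 100.0 * below / len(climatology)

def drought_agreement(climatologies, current, threshold=20.0):
    """Fraction of LSMs placing the current soil moisture below the drought
    percentile threshold -- a simple gauge of multi-model (un)certainty."""
    flags = [moisture_percentile(clim, current[name]) < threshold
             for name, clim in climatologies.items()]
    return sum(flags) / len(flags)

# Hypothetical 65-season climatologies for the three LSMs (synthetic numbers):
rng = random.Random(0)
climatologies = {name: [rng.gauss(mu, 0.05) for _ in range(65)]
                 for name, mu in [("VIC", 0.30), ("Noah", 0.32), ("CLM", 0.28)]}
current = {"VIC": 0.20, "Noah": 0.25, "CLM": 0.20}
print(drought_agreement(climatologies, current))
```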

  11. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; de Moel, H.

    2016-01-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage functions and maximum damages can have large effects on flood damage estimates. This explanation is then used to quantify the uncertainty in the damage estimates with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from seven different flood damage models. The paper shows that the resulting uncertainties in estimated damages are on the order of a factor of 2 to 5. The uncertainty is typically larger for flood events with small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
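
    The Monte Carlo treatment described above, sampling both a damage-function library and an uncertain maximum damage, can be sketched as follows. The three depth-damage curves and the ±50% maximum-damage range are invented placeholders for the 272-function library:

```python
import random

def estimate_damage(depth_m, damage_function, max_damage):
    """Damage for one object class: damage fraction at depth x maximum damage."""
    return damage_function(depth_m) * max_damage

# A tiny stand-in for the 272-function library: each entry maps inundation
# depth [m] to a damage fraction in [0, 1].
library = [
    lambda d: min(1.0, 0.30 * d),
    lambda d: min(1.0, 0.45 * d),
    lambda d: min(1.0, d / (d + 1.0)),
]

def monte_carlo_damage(depth_m, max_damage, n=10_000, seed=42):
    """Sample a damage function and a +/-50% maximum-damage factor per draw."""
    rng = random.Random(seed)
    draws = sorted(estimate_damage(depth_m, rng.choice(library),
                                   max_damage * rng.uniform(0.5, 1.5))
                   for _ in range(n))
    return draws[n // 2], draws[n // 40], draws[-(n // 40)]  # median, ~2.5%, ~97.5%

median, lo, hi = monte_carlo_damage(depth_m=0.5, max_damage=100_000.0)
print(f"median {median:,.0f}; 95% range [{lo:,.0f}, {hi:,.0f}]")
```

Even this toy library reproduces the qualitative finding: the resulting 95% range spans a multiple of the median, driven jointly by function choice and maximum-damage uncertainty.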

  12. Uncertainty evaluation of a regional real-time system for rain-induced landslides

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Stanley, Thomas; Yatheendradas, Soni

    2015-04-01

    A new prototype regional model and evaluation framework have been developed over Central America and the Caribbean region using satellite-based information including precipitation estimates, modeled soil moisture, topography, soils, as well as regionally available datasets such as road networks and distance to fault zones. The algorithm framework incorporates three static variables: a susceptibility map; a 24-hr rainfall triggering threshold; and an antecedent soil moisture variable threshold, which have been calibrated using historic landslide events. The thresholds are regionally heterogeneous and are based on the percentile distribution of the rainfall or antecedent moisture time series. A simple decision tree algorithm framework integrates all three variables with the rainfall and soil moisture time series and generates a landslide nowcast in real-time based on the previous 24 hours over this region. This system has been evaluated using several available landslide inventories over the Central America and Caribbean region. Spatiotemporal uncertainty and evaluation metrics of the model are presented here based on available landslide reports. This work also presents a probabilistic representation of potential landslide activity over the region which can be used to further refine and improve the real-time landslide hazard assessment system as well as better identify and characterize the uncertainties inherent in this type of regional approach. The landslide algorithm provides a flexible framework to improve hazard estimation and reduce uncertainty at any spatial and temporal scale.
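
    The three-variable decision-tree framework described above can be sketched as a few nested conditions. The thresholds and the two-level nowcast output here are illustrative assumptions, not the calibrated regional values:

```python
def landslide_nowcast(susceptibility, rain_24h_pct, antecedent_sm_pct,
                      susc_min=0.5, rain_pct=95.0, sm_pct=90.0):
    """Minimal decision tree in the spirit of the framework described above:
    a pixel must first be susceptible; a nowcast is then issued when the 24-h
    rainfall and/or the antecedent soil moisture exceed their percentile
    thresholds. Thresholds and output levels are illustrative assumptions."""
    if susceptibility < susc_min:
        return "none"
    rain_hit = rain_24h_pct >= rain_pct
    sm_hit = antecedent_sm_pct >= sm_pct
    if rain_hit and sm_hit:
        return "high"
    if rain_hit or sm_hit:
        return "moderate"
    return "none"

print(landslide_nowcast(0.8, 97.0, 92.0))   # susceptible, both triggers fire
```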

  13. Extreme risk assessment based on normalized historic loss data

    NASA Astrophysics Data System (ADS)

    Eichner, Jan

    2017-04-01

    Natural hazard risk assessment and risk management focus on the expected loss magnitudes of rare and extreme events. Such large-scale loss events typically comprise all aspects of compound events and accumulate losses from multiple sectors (including knock-on effects). Utilizing Munich Re's NatCatSERVICE direct economic loss data, we briefly recap a novel methodology for peril-specific loss data normalization, which improves the stationarity properties of highly non-stationary historic loss data (non-stationary due to the socio-economic growth of assets prone to destructive forces). We then perform extreme value analysis (peaks-over-threshold method) to derive return-level estimates, e.g. for 100-yr loss event scenarios, for various types of perils, globally or per continent, and discuss the uncertainty in the results.
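
    For the peaks-over-threshold step, the N-year return level follows in closed form once a Generalised Pareto distribution has been fitted to the threshold exceedances. A sketch with hypothetical parameters (not NatCatSERVICE results):

```python
import math

def gpd_return_level(u, sigma, xi, rate_per_year, return_period_years):
    """Return level for a peaks-over-threshold model: exceedances of the
    threshold u follow a Generalised Pareto distribution (scale sigma,
    shape xi) and occur `rate_per_year` times per year on average."""
    m = return_period_years * rate_per_year   # expected exceedances in the period
    if abs(xi) < 1e-12:                       # xi -> 0 limit: exponential tail
        return u + sigma * math.log(m)
    return u + (sigma / xi) * (m ** xi - 1.0)

# Hypothetical normalized-loss tail: threshold 1.0 bn USD, 3 exceedances/yr:
level_100 = gpd_return_level(u=1.0, sigma=0.8, xi=0.3,
                             rate_per_year=3.0, return_period_years=100.0)
print(f"100-yr loss level: {level_100:.2f} bn USD")
```

A positive shape parameter xi gives the heavy tail typical of loss data, so return levels grow quickly with the return period.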

  14. Awe, uncertainty, and agency detection.

    PubMed

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events.

  15. Multivariate Probabilistic Analysis of an Hydrological Model

    NASA Astrophysics Data System (ADS)

    Franceschini, Samuela; Marani, Marco

    2010-05-01

    Model predictions derived based on rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses on the hydrologic response to selected meteorological events, in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and a quite heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially-distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response to probabilistic methods. In particular we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This makes it possible to reproduce results satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. 
LiM is less computationally demanding than MCS, but has limited applicability especially when the model response is highly nonlinear. Higher-order approximations can provide more accurate estimations, but reduce the numerical advantage of the LiM. The results of the uncertainty analysis identify the main sources of uncertainty in the computation of river discharge. In this particular case the spatial variability of rainfall and the model parameters uncertainty are shown to have the greatest impact on discharge evaluation. This, in turn, highlights the need to support any estimated hydrological response with probability information and risk analysis results in order to provide a robust, systematic framework for decision making.
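
    The contrast drawn above between point-estimate methods and Monte Carlo simulation can be illustrated with Rosenblueth's classical two-point scheme, a close relative of Li's method: for a mildly nonlinear toy response it matches a 200,000-sample Monte Carlo mean using only four model evaluations. All numbers below are invented for illustration:

```python
import itertools
import random

def two_point_estimate(g, means, sigmas):
    """Rosenblueth-style two-point estimate (a classical relative of Li's
    point-estimate method) for independent, symmetric inputs: evaluate g at
    every (mu +/- sigma) corner with equal weight 2**-n."""
    n = len(means)
    w = 2.0 ** -n
    vals = [g([m + s * e for m, s, e in zip(means, sigmas, signs)])
            for signs in itertools.product((-1.0, 1.0), repeat=n)]
    mean = w * sum(vals)
    var = w * sum(v * v for v in vals) - mean ** 2
    return mean, var

def g(x):
    """A mildly nonlinear toy 'hydrologic response' (hypothetical)."""
    return x[0] * x[1] + 0.1 * x[0] ** 2

means, sigmas = [2.0, 3.0], [0.2, 0.5]
pem_mean, pem_var = two_point_estimate(g, means, sigmas)   # 4 evaluations of g

rng = random.Random(1)
mc = [g([rng.gauss(m, s) for m, s in zip(means, sigmas)]) for _ in range(200_000)]
mc_mean = sum(mc) / len(mc)
print(pem_mean, mc_mean)   # both close to the analytic mean 6.404
```

For responses that are nearly quadratic the two-point mean is exact, which is precisely where the method's advantage over MCS lies; for strongly nonlinear responses the low-order approximation degrades, as noted above.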

  16. Uncertainty quantification in downscaling procedures for effective decisions in energy systems

    NASA Astrophysics Data System (ADS)

    Constantinescu, E. M.

    2010-12-01

    Weather is a major driver both of energy supply and demand, and with the massive adoption of renewable energy sources and changing economic and producer-consumer paradigms, the management of the next-generation energy systems is becoming ever more challenging. The operational and planning decisions in energy systems are guided by efficiency and reliability, and therefore a central role in these decisions will be played by the ability to obtain weather condition forecasts with accurate uncertainty estimates. The appropriate temporal and spatial resolutions needed for effective decision-making, be it operational or planning, are not clear. It is arguably certain, however, that temporal scales such as hourly variations of temperature or wind conditions, and ramp events, are essential in this process. Planning activities involve decade- or decades-long projections of weather. One sensible way to achieve this is to embed regional weather models in a global climate system. This strategy acts as a downscaling procedure. Uncertainty modeling techniques must be developed in order to quantify and minimize forecast errors, as well as to target the variables that most affect the decision-making process. We discuss the challenges of obtaining a realistic uncertainty quantification estimate using mathematical algorithms based on scalable matrix-free computations and physics-based statistical models. The process of making decisions for energy management systems based on future weather scenarios is a very complex problem. We shall focus on the challenges in generating wind power predictions based on regional weather predictions, and discuss the implications of making the common assumptions about the uncertainty models.
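
    One reason the choice of uncertainty model matters for wind power is that the turbine power curve is nonlinear, so the expected power over a forecast ensemble differs from the power computed at the ensemble-mean wind speed. A self-contained sketch with an idealised power curve and a synthetic Gaussian ensemble (all values assumed):

```python
import random

def power_curve(v, cut_in=3.0, rated_v=12.0, cut_out=25.0, rated_p=2.0):
    """Idealised turbine power curve [MW]: cubic ramp between cut-in and
    rated speed, flat at rated power, zero outside the operating range.
    (Illustrative shape, not a specific turbine's curve.)"""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_v:
        return rated_p
    return rated_p * ((v - cut_in) / (rated_v - cut_in)) ** 3

# A Gaussian wind-speed "ensemble" stands in for a downscaled forecast:
rng = random.Random(7)
ensemble = [max(0.0, rng.gauss(10.0, 2.5)) for _ in range(50_000)]   # m/s
mean_speed = sum(ensemble) / len(ensemble)
power_of_mean = power_curve(mean_speed)                        # plug-in forecast
mean_power = sum(map(power_curve, ensemble)) / len(ensemble)   # ensemble mean
print(f"power of mean speed {power_of_mean:.2f} MW, mean power {mean_power:.2f} MW")
```

With these assumed parameters the plug-in estimate understates the expected power, illustrating why propagating the full forecast distribution, rather than a point value, changes the answer.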

  17. Intensity, magnitude, location and attenuation in India for felt earthquakes since 1762

    USGS Publications Warehouse

    Szeliga, Walter; Hough, Susan; Martin, Stacey; Bilham, Roger

    2010-01-01

    A comprehensive, consistently interpreted new catalog of felt intensities for India (Martin and Szeliga, 2010, this issue) includes intensities for 570 earthquakes; instrumental magnitudes and locations are available for 100 of these events. We use the intensity values for 29 of the instrumentally recorded events to develop new intensity versus attenuation relations for the Indian subcontinent and the Himalayan region. We then use these relations to determine the locations and magnitudes of 234 historical events, using the method of Bakun and Wentworth (1997). For the remaining 336 events, intensity distributions are too sparse to determine magnitude or location. We evaluate magnitude and location accuracy of newly located events by comparing the instrumental- with the intensity-derived location for 29 calibration events, for which more than 15 intensity observations are available. With few exceptions, most intensity-derived locations lie within a fault length of the instrumentally determined location. For events in which the azimuthal distribution of intensities is limited, we conclude that the formal error bounds from the regression of Bakun and Wentworth (1997) do not reflect the true uncertainties. We also find that the regression underestimates the uncertainties of the location and magnitude of the 1819 Allah Bund earthquake, for which a location has been inferred from mapped surface deformation. Comparing our inferred attenuation relations to those developed for other regions, we find that attenuation for Himalayan events is comparable to intensity attenuation in California (Bakun and Wentworth, 1997), while intensity attenuation for cratonic events is higher than intensity attenuation reported for central/eastern North America (Bakun et al., 2003). 
Further, we present evidence that intensities of intraplate earthquakes have a nonlinear dependence on magnitude such that attenuation relations based largely on small-to-moderate earthquakes may significantly overestimate the magnitudes of historical earthquakes.
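
    The Bakun and Wentworth (1997) procedure referenced above amounts to a grid search: at each trial epicentre the best-fitting magnitude is the one minimising the intensity misfit, and the preferred location is the grid point with the smallest misfit. The sketch below uses a hypothetical attenuation relation and synthetic intensities, not the calibrated Indian or Himalayan coefficients:

```python
import math

def predicted_intensity(m, r_km, a=1.7, b=1.5, c=3.0):
    """Hypothetical attenuation relation of the Bakun-Wentworth form
    I = a + b*M - c*log10(r); the coefficients are illustrative only."""
    return a + b * m - c * math.log10(max(r_km, 1.0))

def dist_km(epi_lat, epi_lon, lat, lon):
    """Small-angle epicentral distance (adequate for a sketch)."""
    return 111.0 * math.hypot(lat - epi_lat,
                              (lon - epi_lon) * math.cos(math.radians(epi_lat)))

def grid_search(observations, lat_range, lon_range, step=0.1, b=1.5):
    """For each trial epicentre, choose the magnitude minimising the RMS
    intensity misfit; return the best (lat, lon, M, rms). `observations`
    is a list of (lat, lon, intensity) tuples."""
    best = None
    nlat = int(round((lat_range[1] - lat_range[0]) / step)) + 1
    nlon = int(round((lon_range[1] - lon_range[0]) / step)) + 1
    for i in range(nlat):
        for j in range(nlon):
            lat = lat_range[0] + i * step
            lon = lon_range[0] + j * step
            pred0 = [predicted_intensity(0.0, dist_km(lat, lon, olat, olon))
                     for olat, olon, _ in observations]
            # Closed-form best magnitude for this epicentre (least squares):
            m = sum(o[2] - p for o, p in zip(observations, pred0)) / (b * len(observations))
            rms = math.sqrt(sum((o[2] - p - b * m) ** 2
                                for o, p in zip(observations, pred0)) / len(observations))
            if best is None or rms < best[3]:
                best = (lat, lon, m, rms)
    return best

# Synthetic check: intensities generated from the same relation for a
# hypothetical M 6.0 event at (20.0 N, 78.0 E) are inverted correctly.
sites = [(20.5, 78.0), (20.0, 78.5), (19.6, 77.8), (20.2, 78.3)]
obs = [(la, lo, predicted_intensity(6.0, dist_km(20.0, 78.0, la, lo)))
       for la, lo in sites]
best = grid_search(obs, (19.5, 20.5), (77.5, 78.5))
print(best)
```

With real, noisy intensity data the misfit surface around the minimum is what yields the location and magnitude uncertainty bounds discussed in the abstract.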

  18. Intensity, magnitude, location, and attenuation in India for felt earthquakes since 1762

    USGS Publications Warehouse

    Szeliga, W.; Hough, S.; Martin, S.; Bilham, R.

    2010-01-01

    A comprehensive, consistently interpreted new catalog of felt intensities for India (Martin and Szeliga, 2010, this issue) includes intensities for 570 earthquakes; instrumental magnitudes and locations are available for 100 of these events. We use the intensity values for 29 of the instrumentally recorded events to develop new intensity versus attenuation relations for the Indian subcontinent and the Himalayan region. We then use these relations to determine the locations and magnitudes of 234 historical events, using the method of Bakun and Wentworth (1997). For the remaining 336 events, intensity distributions are too sparse to determine magnitude or location. We evaluate magnitude and location accuracy of newly located events by comparing the instrumental- with the intensity-derived location for 29 calibration events, for which more than 15 intensity observations are available. With few exceptions, most intensity-derived locations lie within a fault length of the instrumentally determined location. For events in which the azimuthal distribution of intensities is limited, we conclude that the formal error bounds from the regression of Bakun and Wentworth (1997) do not reflect the true uncertainties. We also find that the regression underestimates the uncertainties of the location and magnitude of the 1819 Allah Bund earthquake, for which a location has been inferred from mapped surface deformation. Comparing our inferred attenuation relations to those developed for other regions, we find that attenuation for Himalayan events is comparable to intensity attenuation in California (Bakun and Wentworth, 1997), while intensity attenuation for cratonic events is higher than intensity attenuation reported for central/eastern North America (Bakun et al., 2003). 
Further, we present evidence that intensities of intraplate earth-quakes have a nonlinear dependence on magnitude such that attenuation relations based largely on small-to-moderate earthquakes may significantly overestimate the magnitudes of historical earthquakes.

  19. Learning rational temporal eye movement strategies.

    PubMed

    Hoppe, David; Rothkopf, Constantin A

    2016-07-19

    During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.

  20. Robustness for slope stability modelling under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work, a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.
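The robust-decision-making idea in this record can be sketched with a minimax-regret rule, a common robustness criterion; the strategy names, scenarios, and payoff values below are invented purely for illustration and are not from the CHASM study.

```python
# Hypothetical payoff table: rows = slope-management strategies, columns =
# deeply uncertain future scenarios (e.g. climate projections); entries =
# performance scores (higher is better). All values are invented.
payoffs = {
    "do_nothing": [0.9, 0.4, 0.1],
    "drainage":   [0.7, 0.7, 0.6],
    "regrading":  [0.6, 0.6, 0.65],
}

def minimax_regret(payoffs):
    """Robust (not necessarily optimal) choice: minimize the worst-case
    regret relative to the best strategy in each scenario."""
    n_scenarios = len(next(iter(payoffs.values())))
    best = [max(v[s] for v in payoffs.values()) for s in range(n_scenarios)]
    worst_regret = {
        name: max(best[s] - vals[s] for s in range(n_scenarios))
        for name, vals in payoffs.items()
    }
    return min(worst_regret, key=worst_regret.get), worst_regret

choice, regrets = minimax_regret(payoffs)
```

Note that the robust choice ("drainage" here) is not the best strategy in any single scenario; it simply performs adequately across all of them, which is the point of the approach.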

  1. Representing uncertainty in a spatial invasion model that incorporates human-mediated dispersal

    Treesearch

    Frank H. Koch; Denys Yemshanov; Robert A. Haack

    2013-01-01

    Most modes of human-mediated dispersal of invasive species are directional and vector-based. Classical spatial spread models usually depend on probabilistic dispersal kernels that emphasize distance over direction and have limited ability to depict rare but influential long-distance dispersal events. These aspects are problematic if such models are used to estimate...

  2. Evaluation of pollutant loads from stormwater BMPs to receiving water using load frequency curves with uncertainty analysis.

    PubMed

    Park, Daeryong; Roesner, Larry A

    2012-12-15

This study examined pollutant loads released to receiving water from a typical urban watershed in the Los Angeles (LA) Basin of California by applying a best management practice (BMP) performance model that includes uncertainty. This BMP performance model uses the k-C model and incorporates uncertainty analysis and the first-order second-moment (FOSM) method to assess the effectiveness of BMPs for removing stormwater pollutants. Uncertainties were considered for the influent event mean concentration (EMC) and the areal removal rate constant of the k-C model. The storage treatment overflow and runoff model (STORM) was used to simulate the flow volume from the watershed, the bypass flow volume and the flow volume that passes through the BMP. Detention basins and total suspended solids (TSS) were chosen as the representative stormwater BMP and pollutant, respectively. This paper applies load frequency curves (LFCs), which replace the exceedance percentage with an exceedance frequency as an alternative to load duration curves (LDCs), to evaluate the effectiveness of BMPs. An evaluation method based on uncertainty analysis is suggested because it applies a water quality standard exceedance based on frequency and magnitude. As a result, the incorporation of uncertainty in the estimates of pollutant loads can assist stormwater managers in determining the degree of total maximum daily load (TMDL) compliance that could be expected from a given BMP in a watershed. Copyright © 2012 Elsevier Ltd. All rights reserved.
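The FOSM propagation described in this abstract can be sketched generically as follows. The first-order (often written k-C*) performance form and every parameter value below are assumptions for illustration only, not the calibrated model of the study.

```python
import math

def kcstar(c_in, k, q=40.0, c_star=5.0):
    """Assumed first-order (k-C*) BMP performance form: effluent
    concentration decays toward a background C*; q is the hydraulic
    loading rate. All parameter values are illustrative."""
    return c_star + (c_in - c_star) * math.exp(-k / q)

def fosm(f, means, sds, h=1e-5):
    """First-order second-moment (FOSM) propagation: evaluate the mean at
    the input means; approximate the variance as the sum of squared
    sensitivities times the input variances (inputs assumed independent)."""
    mu = f(*means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, sds)):
        hi = list(means); hi[i] = m + h
        lo = list(means); lo[i] = m - h
        grad = (f(*hi) - f(*lo)) / (2.0 * h)  # central-difference sensitivity
        var += (grad * s) ** 2
    return mu, math.sqrt(var)

# Uncertain inputs: influent EMC (mg/L) and areal rate constant k (m/yr).
mu_out, sd_out = fosm(kcstar, means=(120.0, 30.0), sds=(40.0, 8.0))
```

The same two-line recipe (mean at the means, variance from squared sensitivities) applies to any smooth performance model with independent uncertain inputs.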

  3. Pyroclastic density current hazard maps at Campi Flegrei caldera (Italy): the effects of event scale, vent location and time forecasts.

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Esposti Ongaro, Tomaso; Isaia, Roberto; Flandoli, Franco; Bisson, Marina

    2016-04-01

Today hundreds of thousands of people live inside the Campi Flegrei caldera (Italy) and in the adjacent part of the city of Naples, making a future eruption of this volcano an event with huge consequences. Very high risks are associated with the occurrence of pyroclastic density currents (PDCs). Mapping of background or long-term PDC hazard in the area is a great challenge due to the unknown eruption time, scale and vent location of the next event as well as the complex dynamics of the flow over the caldera topography. This is additionally complicated by the remarkable epistemic uncertainty on the eruptive record, affecting the time of past events, the location of vents as well as the PDC areal extent estimates. First probability maps of PDC invasion were produced combining a vent-opening probability map, statistical estimates concerning the eruptive scales and a Cox-type temporal model including self-excitement effects, based on the eruptive record of the last 15 kyr. Maps were produced by using a Monte Carlo approach and adopting a simplified inundation model based on the "box model" integral approximation tested with 2D transient numerical simulations of flow dynamics. In this presentation we illustrate the independent effects of eruption scale, vent location and time of forecast of the next event. Specific focus was given to the remarkable differences between the eastern and western sectors of the caldera and their effects on the hazard maps. The analysis allowed us to identify areas with elevated probabilities of flow invasion as a function of the diverse assumptions made. With the quantification of some sources of uncertainty in relation to the system, we were also able to provide mean and percentile maps of PDC hazard levels.
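The Monte Carlo construction of an invasion-probability map can be sketched in miniature. Everything below (the vent-opening distribution, the mapping from eruption scale to runout, the target cells) is an invented stand-in for the study's calibrated distributions and box-model inundation calculation.

```python
import random

random.seed(0)

def invasion_probability_map(cells, n_draws=20000):
    """Toy Monte Carlo sketch of a PDC invasion-probability map: sample a
    vent location and an eruption scale (here mapped directly to a runout
    distance), flag the cells reached, and average the indicator over
    draws. Distributions are illustrative only."""
    counts = [0] * len(cells)
    for _ in range(n_draws):
        vx, vy = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)  # vent draw
        runout = random.lognormvariate(0.5, 0.4)                 # scale draw
        for i, (x, y) in enumerate(cells):
            if (x - vx) ** 2 + (y - vy) ** 2 <= runout ** 2:
                counts[i] += 1
    return [c / n_draws for c in counts]

cells = [(0.0, 0.0), (2.0, 0.0), (6.0, 0.0)]  # three target locations (km)
pmap = invasion_probability_map(cells)
```

Per-cell probabilities fall off with distance from the high-probability vent area, which is the basic shape such hazard maps take; mean and percentile maps follow by repeating this over alternative model formulations.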

  4. SU-F-T-185: Study of the Robustness of a Proton Arc Technique Based On PBS Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Z; Zheng, Y

Purpose: One potential technique to realize proton arc is through using PBS beams from many directions to form overlaid Bragg peak (OBP) spots and placing these OBP spots throughout the target volume to achieve the desired dose distribution. In this study, we analyzed the robustness of this proton arc technique. Methods: We used a cylindrical water phantom of 20 cm in radius in our robustness analysis. To study the range uncertainty effect, we changed the density of the phantom by ±3%. To study the setup uncertainty effect, we shifted the phantom by 3 and 5 mm. We also combined the range and setup uncertainties (3 mm/±3%). For each test plan, we performed dose calculation for the nominal and 6 disturbed scenarios. Two test plans were used, one with a single OBP spot and the other consisting of 121 OBP spots covering a 10×10 cm² area. We compared the dose profiles between the nominal and disturbed scenarios to estimate the impact of the uncertainties. Dose calculation was performed with Gate/GEANT based Monte Carlo software in a cloud computing environment. Results: For each of the 7 scenarios, we simulated 100k and 10M events for the plans consisting of a single OBP spot and 121 OBP spots, respectively. For the single OBP spot, the setup uncertainty had minimal impact on the spot's dose profile while the range uncertainty had significant impact. For the plan consisting of 121 OBP spots, a similar effect was observed but the extent of the disturbance was much smaller than for the single OBP spot. Conclusion: For the PBS arc technique, range uncertainty has significantly more impact than setup uncertainty. Although a single OBP spot can be severely disturbed by the range uncertainty, the overall effect is much smaller when a large number of OBP spots are used. Robustness optimization for the PBS arc technique should consider range uncertainty with priority.

  5. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
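Step (4), ensemble modelling of epistemic uncertainty, can be sketched as follows; the three alternative "model formulations", their hazard values, and their credibility weights are invented for illustration.

```python
import random

random.seed(1)

# Hypothetical alternative model formulations from the event tree: each
# yields an exceedance probability for some tsunami intensity at the
# target site, with an epistemic credibility weight (weights sum to 1).
models = [(0.12, 0.5), (0.20, 0.3), (0.35, 0.2)]  # (hazard value, weight)

def ensemble_stats(models, q=0.84, n=100_000):
    """Ensemble modelling of epistemic uncertainty: the weighted mean plus
    an empirical percentile, obtained by resampling model outcomes in
    proportion to their weights."""
    mean = sum(h * w for h, w in models)
    draws = sorted(random.choices([h for h, _ in models],
                                  weights=[w for _, w in models], k=n))
    return mean, draws[int(q * n)]

mean_hazard, p84 = ensemble_stats(models)
```

The spread between the mean and the upper percentile is exactly the epistemic uncertainty the ensemble is meant to expose; with a logic tree, only the weighted mean would typically be reported.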

  6. ADVANCED SEISMIC BASE ISOLATION METHODS FOR MODULAR REACTORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Blanford; E. Keldrauk; M. Laufer

    2010-09-20

Advanced technologies for structural design and construction have the potential for major impact not only on nuclear power plant construction time and cost, but also on the design process and on the safety, security and reliability of the next generation of nuclear power plants. In future Generation IV (Gen IV) reactors, structural and seismic design should be much more closely integrated with the design of nuclear and industrial safety systems, physical security systems, and international safeguards systems. Overall reliability will be increased through the use of replaceable and modular equipment, and through design to facilitate on-line monitoring, in-service inspection, maintenance, replacement, and decommissioning. Economics will also receive high design priority, through integrated engineering efforts to optimize building arrangements to minimize building heights and footprints. Finally, the licensing approach will be transformed by becoming increasingly performance based and technology neutral, using best-estimate simulation methods with uncertainty and margin quantification. In this context, two structural engineering technologies, seismic base isolation and modular steel-plate/concrete composite structural walls, are investigated. These technologies have major potential to (1) enable standardized reactor designs to be deployed across a wider range of sites, (2) reduce the impact of uncertainties related to site-specific seismic conditions, and (3) alleviate reactor equipment qualification requirements. For Gen IV reactors the potential for deliberate crashes of large aircraft must also be considered in design. This report concludes that base-isolated structures should be decoupled from the reactor external event exclusion system. As an example, a scoping analysis is performed for a rectangular, decoupled external event shell designed as a grillage.
This report also reviews modular construction technology, particularly steel-plate/concrete construction using factory prefabricated structural modules, for application to external event shell and base isolated structures.

  7. Search for gamma-ray events in the BATSE data base

    NASA Technical Reports Server (NTRS)

    Lewin, Walter

    1994-01-01

We find large location errors and error radii in the locations of channel 1 Cygnus X-1 events. These errors and their associated uncertainties are a result of low signal-to-noise ratios (a few sigma) in the two brightest detectors for each event. The untriggered events suffer from similarly low signal-to-noise ratios, and their location errors are expected to be at least as large as those found for Cygnus X-1 with a given signal-to-noise ratio. The statistical error radii are consistent with those found for Cygnus X-1 and with the published estimates. We therefore expect approximately 20-30 deg location errors for the untriggered events. Hence, many of the untriggered events occurring within a few months of the triggered activity from SGR 1900+14 are indeed consistent with the SGR source location, although Cygnus X-1 is also a good candidate.

  8. Simulation Based Earthquake Forecasting with RSQSim

    NASA Astrophysics Data System (ADS)

    Gilchrist, J. J.; Jordan, T. H.; Dieterich, J. H.; Richards-Dinger, K. B.

    2016-12-01

We are developing a physics-based forecasting model for earthquake ruptures in California. We employ the 3D boundary element code RSQSim to generate synthetic catalogs with millions of events that span up to a million years. The simulations incorporate rate-state fault constitutive properties in complex, fully interacting fault systems. The Unified California Earthquake Rupture Forecast Version 3 (UCERF3) model and data sets are used for calibration of the catalogs and specification of fault geometry. Fault slip rates match the UCERF3 geologic slip rates and catalogs are tuned such that earthquake recurrence matches the UCERF3 model. Utilizing the Blue Waters Supercomputer, we produce a suite of million-year catalogs to investigate the epistemic uncertainty in the physical parameters used in the simulations. In particular, values of the rate- and state-friction parameters a and b, the initial shear and normal stress, as well as the earthquake slip speed, are varied over several simulations. In addition to testing multiple models with homogeneous values of the physical parameters, the parameters a, b, and the normal stress are varied with depth as well as in heterogeneous patterns across the faults. Cross validation of UCERF3 and RSQSim is performed within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM) to determine the effect of the uncertainties in physical parameters observed in the field and measured in the lab on the uncertainties in probabilistic forecasting. We are particularly interested in the short-term hazards of multi-event sequences due to complex faulting and multi-fault ruptures.

  9. Climate Change Extreme Events: Meeting the Information Needs of Water Resource Managers

    NASA Astrophysics Data System (ADS)

    Quay, R.; Garfin, G. M.; Dominguez, F.; Hirschboeck, K. K.; Woodhouse, C. A.; Guido, Z.; White, D. D.

    2013-12-01

Information about climate has long been used by water managers to develop short-term and long-term plans and strategies for regional and local water resources. Inherent within longer-term forecasts is an element of uncertainty, which is particularly evident in global climate model results for precipitation. For example, in the Southwest, estimates of the flow of the Colorado River based on GCM results indicate changes ranging from 120% to 60% of current flow. Many water resource managers are now using downscaled global climate model estimates as indications of potential climate change as part of that planning. They are addressing the uncertainty within these estimates by using an anticipatory planning approach that looks at a range of possible futures. One aspect of climate that is important for such planning is estimates of future extreme storm (short-term) and drought (long-term) events. However, the climate science of future possible changes in extreme events is less mature than general climate change science. At a recent workshop among climate scientists and water managers in the Southwest, it was concluded that the science of climate change extreme events is at least a decade away from being robust enough to be useful for water managers in their water resource management activities. However, it was proposed that there are existing estimates and records of past flooding and drought events that could be combined with general climate change science to create possible future events. These derived events could be of sufficient detail to be used by water resource managers until such time that the science of extreme events is able to provide more detailed estimates.
Based on the results of this workshop and other work being done by the Decision Center for a Desert City at Arizona State University and the Climate Assessment for the Southwest center at the University of Arizona, this article will 1) review the extreme event data needs of water resource managers in the Southwest, 2) review the current state of extreme event climate science, 3) review what information is available about past extreme events in the Southwest, 4) report the results of the 2012 workshop on climate change and extreme events, and 5) propose a method for combining this past information with current climate science information to produce estimates of possible future extreme events in sufficient detail to be useful to water resource managers.

  10. Using uncertainty to link and rank evidence from biomedical literature for model curation

    PubMed Central

    Zerva, Chrysoula; Batista-Navarro, Riza; Day, Philip; Ananiadou, Sophia

    2017-01-01

Motivation: In recent years, there has been great progress in the field of automated curation of biomedical networks and models, aided by text mining methods that provide evidence from literature. Such methods must not only extract snippets of text that relate to model interactions, but also be able to contextualize the evidence and provide additional confidence scores for the interaction in question. Although various approaches calculating confidence scores have focused primarily on the quality of the extracted information, there has been little work on exploring the textual uncertainty conveyed by the author. Despite textual uncertainty being acknowledged in biomedical text mining as an attribute of text mined interactions (events), it is significantly understudied as a means of providing a confidence measure for interactions in pathways or other biomedical models. In this work, we focus on improving identification of textual uncertainty for events and explore how it can be used as an additional measure of confidence for biomedical models. Results: We present a novel method for extracting uncertainty from the literature using a hybrid approach that combines rule induction and machine learning. Variations of this hybrid approach are then discussed, alongside their advantages and disadvantages. We use subjective logic theory to combine multiple uncertainty values extracted from different sources for the same interaction. Our approach achieves F-scores of 0.76 and 0.88 based on the BioNLP-ST and Genia-MK corpora, respectively, making considerable improvements over previously published work. Moreover, we evaluate our proposed system on pathways related to two different areas, namely leukemia and melanoma cancer research. Availability and implementation: The leukemia pathway model used is available in Pathway Studio while the Ras model is available via PathwayCommons. Online demonstration of the uncertainty extraction system is available for research purposes at http://argo.nactem.ac.uk/test. The related code is available at https://github.com/c-zrv/uncertainty_components.git. Details on the above are available in the Supplementary Material. Contact: sophia.ananiadou@manchester.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29036627

  11. Using uncertainty to link and rank evidence from biomedical literature for model curation.

    PubMed

    Zerva, Chrysoula; Batista-Navarro, Riza; Day, Philip; Ananiadou, Sophia

    2017-12-01

    In recent years, there has been great progress in the field of automated curation of biomedical networks and models, aided by text mining methods that provide evidence from literature. Such methods must not only extract snippets of text that relate to model interactions, but also be able to contextualize the evidence and provide additional confidence scores for the interaction in question. Although various approaches calculating confidence scores have focused primarily on the quality of the extracted information, there has been little work on exploring the textual uncertainty conveyed by the author. Despite textual uncertainty being acknowledged in biomedical text mining as an attribute of text mined interactions (events), it is significantly understudied as a means of providing a confidence measure for interactions in pathways or other biomedical models. In this work, we focus on improving identification of textual uncertainty for events and explore how it can be used as an additional measure of confidence for biomedical models. We present a novel method for extracting uncertainty from the literature using a hybrid approach that combines rule induction and machine learning. Variations of this hybrid approach are then discussed, alongside their advantages and disadvantages. We use subjective logic theory to combine multiple uncertainty values extracted from different sources for the same interaction. Our approach achieves F-scores of 0.76 and 0.88 based on the BioNLP-ST and Genia-MK corpora, respectively, making considerable improvements over previously published work. Moreover, we evaluate our proposed system on pathways related to two different areas, namely leukemia and melanoma cancer research. The leukemia pathway model used is available in Pathway Studio while the Ras model is available via PathwayCommons. Online demonstration of the uncertainty extraction system is available for research purposes at http://argo.nactem.ac.uk/test. 
The related code is available at https://github.com/c-zrv/uncertainty_components.git. Details on the above are available in the Supplementary Material. Contact: sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
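The subjective-logic combination step mentioned in the two records above can be sketched with the standard cumulative fusion rule for binomial opinions (belief b, disbelief d, uncertainty u, with b + d + u = 1). The opinion values below are invented, and the rule shown is the generic textbook operator, not necessarily the exact variant the authors used.

```python
def fuse(op1, op2):
    """Cumulative fusion of two binomial subjective-logic opinions
    (belief, disbelief, uncertainty), assuming a shared base rate and
    non-dogmatic opinions (u1 + u2 - u1*u2 > 0)."""
    b1, d1, u1 = op1
    b2, d2, u2 = op2
    k = u1 + u2 - u1 * u2
    return ((b1 * u2 + b2 * u1) / k,
            (d1 * u2 + d2 * u1) / k,
            (u1 * u2) / k)

# Hypothetical uncertainty opinions about the same extracted interaction,
# e.g. one from the rule-induction component and one from the ML component.
b, d, u = fuse((0.6, 0.1, 0.3), (0.5, 0.2, 0.3))
```

Fusing two independent opinions always reduces the uncertainty mass u, which is why combining evidence from multiple sources yields a sharper confidence measure for the interaction.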

  12. Sensitivity analysis and uncertainty estimation in ash concentration simulations and tephra deposit daily forecasted at Mt. Etna, in Italy

    NASA Astrophysics Data System (ADS)

    Prestifilippo, Michele; Scollo, Simona; Tarantola, Stefano

    2015-04-01

The uncertainty in volcanic ash forecasts may depend on our knowledge of the model input parameters and our capability to represent the dynamics of an incoming eruption. Forecasts help governments to reduce risks associated with volcanic eruptions, and for this reason different kinds of analysis that help to understand the effect that each input parameter has on model outputs are necessary. We present an iterative approach based on the sequential combination of sensitivity analysis, a parameter estimation procedure and Monte Carlo-based uncertainty analysis, applied to the Lagrangian volcanic ash dispersal model PUFF. We modify the main input parameters, such as the total mass, the total grain-size distribution, the plume thickness, the shape of the eruption column, the sedimentation models and the diffusion coefficient, perform thousands of simulations and analyze the results. The study is carried out on two different Etna scenarios: the sub-plinian eruption of 22 July 1998, which formed an eruption column rising 12 km above sea level and lasted a few minutes, and the lava fountain eruptions with features similar to the 2011-2013 events, which produced eruption columns up to several kilometers above sea level and lasted some hours. Sensitivity analyses and uncertainty estimation results help us to identify the measurements that volcanologists should perform during a volcanic crisis to reduce the model uncertainty.
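The "thousands of simulations" workflow in this record can be sketched as Monte Carlo sensitivity screening: sample the uncertain inputs, run the model, and rank inputs by their correlation with the output. The model function below is an invented stand-in for PUFF (not its physics), and all input ranges are illustrative.

```python
import random

random.seed(3)

def ash_load(total_mass, column_height, diffusion):
    """Stand-in for a dispersal-model output (ground ash load at a fixed
    point); the functional form is invented purely to demonstrate the
    sampling workflow, not the physics."""
    return 1e-3 * total_mass * column_height ** 0.5 / (1.0 + diffusion)

def correlation_sensitivity(n=5000):
    """Monte Carlo sensitivity sketch: sample the uncertain inputs, run
    the model, and report each input's correlation with the output."""
    samples = {"total_mass": [], "column_height": [], "diffusion": []}
    outputs = []
    for _ in range(n):
        m = random.uniform(1e9, 5e9)       # total mass, kg (illustrative)
        hgt = random.uniform(5.0, 12.0)    # column height, km
        dif = random.uniform(10.0, 100.0)  # diffusion coefficient
        samples["total_mass"].append(m)
        samples["column_height"].append(hgt)
        samples["diffusion"].append(dif)
        outputs.append(ash_load(m, hgt, dif))

    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

    return {name: corr(vals, outputs) for name, vals in samples.items()}

sens = correlation_sensitivity()
```

The sign and magnitude of each correlation indicate which measurements would most reduce forecast uncertainty, which is the practical conclusion the record draws.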

  13. Impact of Synoptic-Scale Factors on Rainfall Forecast in Different Stages of a Persistent Heavy Rainfall Event in South China

    NASA Astrophysics Data System (ADS)

    Zhang, Murong; Meng, Zhiyong

    2018-04-01

This study investigates the stage-dependent rainfall forecast skills and the associated synoptic-scale features in a persistent heavy rainfall event in south China, Guangdong Province, during 29-31 March 2014, using operational global ensemble forecasts from the European Centre for Medium-Range Weather Forecasts. This persistent rainfall was divided into two stages, with a better precipitation forecast skill in Stage 2 (S2) than in Stage 1 (S1) although S2 had a longer lead time. Using ensemble-based sensitivity analysis, key synoptic-scale factors that affected the rainfall were diagnosed by correlating the accumulated precipitation of each stage to atmospheric state variables in the middle of the respective stage. The precipitation in both stages was found to be significantly correlated with the midlevel trough, the low-level vortex, and particularly the low-level jet on the southeast flank of the vortex and its associated moisture transport. The rainfall forecast skill was mainly determined by the forecast accuracy in the location of the low-level jet, which was possibly related to the different juxtapositions between the direction of the movement of the low-level vortex and the orientation of the low-level jet. The uncertainty in rainfall forecast in S1 was mainly from the location uncertainty of the low-level jet, while the uncertainty in rainfall forecast in S2 was mainly from the width uncertainty of the low-level jet with the relatively accurate location of the low-level jet.
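The core of the ensemble-based sensitivity analysis in this record is a regression of a forecast metric onto a state variable across ensemble members. The five-member ensemble values below are invented for illustration.

```python
def ensemble_sensitivity(J, x):
    """Ensemble-based sensitivity: the least-squares slope of a forecast
    metric J onto a model state variable x across ensemble members,
    cov(J, x) / var(x)."""
    n = len(J)
    mJ, mx = sum(J) / n, sum(x) / n
    cov = sum((a - mJ) * (b - mx) for a, b in zip(J, x)) / (n - 1)
    var = sum((b - mx) ** 2 for b in x) / (n - 1)
    return cov / var

# Hypothetical 5-member ensemble: stage-accumulated rainfall (mm) against
# low-level jet wind speed (m/s) in mid-stage; values are invented.
rain = [20.0, 35.0, 31.0, 45.0, 50.0]
jet = [8.0, 11.0, 10.0, 13.0, 14.0]
slope = ensemble_sensitivity(rain, jet)  # mm of rain per m/s of jet speed
```

A large positive slope would flag the low-level jet as a key factor for the rainfall forecast, as found in the study; real applications apply a significance test to the underlying correlation.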

  14. Real-time Social Internet Data to Guide Forecasting Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Del Valle, Sara Y.

Our goal is to improve decision support by monitoring and forecasting events using social media, mathematical models, and quantifying model uncertainty. Our approach is real-time, data-driven forecasts with quantified uncertainty: not just for weather anymore. Information flow from human observations of events through an Internet system and classification algorithms is used to produce quantitatively uncertain forecasts. In summary, we want to develop new tools to extract useful information from Internet data streams, develop new approaches to assimilate real-time information into predictive models, and validate approaches by forecasting events; our ultimate goal is to develop an event forecasting system using mathematical approaches and heterogeneous data streams.

  15. [Modeling in value-based medicine].

    PubMed

    Neubauer, A S; Hirneiss, C; Kampik, A

    2010-03-01

Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology, the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verifying internal validity, comparison with other models (external validity) and, ideally, validation of its predictive properties. The uncertainty that exists with any modeling should be clearly stated. This is true for economic modeling in VBM as well as when using disease risk models to support clinical decisions. In economic modeling, uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better informed decisions than without this additional information.
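The Markov model type mentioned in this record can be sketched as a minimal two-state cohort simulation. All transition probabilities, costs, and utilities below are invented for illustration and are not from any published study.

```python
# Sketch of a two-state Markov cohort model of the kind used in economic
# modelling for value-based medicine; all numbers are illustrative.
P = {  # yearly transition probabilities between health states
    "stable":     {"stable": 0.85, "progressed": 0.15},
    "progressed": {"stable": 0.00, "progressed": 1.00},
}
cost = {"stable": 1000.0, "progressed": 4000.0}  # cost per yearly cycle
utility = {"stable": 0.9, "progressed": 0.5}     # QALY weight per cycle

def run_cohort(cycles=10, discount=0.03):
    """Propagate a cohort through the Markov states, accumulating
    discounted costs and quality-adjusted life years (QALYs)."""
    state = {"stable": 1.0, "progressed": 0.0}   # cohort proportions
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + discount) ** t         # discount factor
        total_cost += df * sum(state[s] * cost[s] for s in state)
        total_qaly += df * sum(state[s] * utility[s] for s in state)
        state = {s2: sum(state[s1] * P[s1][s2] for s1 in P) for s2 in P}
    return total_cost, total_qaly

total_cost, total_qaly = run_cohort()
```

A univariate sensitivity analysis of the kind the record describes would simply re-run this with one input (say, the progression probability) swept over a range, feeding a tornado plot.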

  16. Pharmacological Fingerprints of Contextual Uncertainty

    PubMed Central

    Ruge, Diane; Stephan, Klaas E.

    2016-01-01

    Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses. PMID:27846219

  17. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of design fires is specified based on different fire growth rates, so that the uncertainty of the onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant premovement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and of the onset time to untenable conditions, and fire risk to life safety can be evaluated from the occurrence probability and consequences of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.

  18. The Orbit of Transneptunian Binary Manwe and Thorondor and Their Upcoming Mutual Events

    NASA Technical Reports Server (NTRS)

    Grundy, W. M.; Benecchi, S. D.; Porter, S. B.; Noll, K. S.

    2014-01-01

    A new Hubble Space Telescope observation of the 7:4 resonant transneptunian binary system (385446) Manwe has shown that, of two previously reported solutions for the orbit of its satellite Thorondor, the prograde one is correct. The orbit has a period of 110.18 +/- 0.02 days, a semimajor axis of 6670 +/- 40 km, and an eccentricity of 0.563 +/- 0.007. It will be viewable edge-on from the inner Solar System during 2015-2017, presenting opportunities to observe mutual occultation and eclipse events. However, the number of observable events will be small, owing to the long orbital period and the expected small sizes of the bodies relative to their separation. This paper presents predictions for events observable from Earth-based telescopes and discusses the associated uncertainties and challenges.

  19. Uncertainty Forecasts Improve Weather-Related Decisions and Attenuate the Effects of Forecast Error

    ERIC Educational Resources Information Center

    Joslyn, Susan L.; LeClerc, Jared E.

    2012-01-01

    Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather…

  20. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis were performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.

  1. Limiting global warming to 2°C is unlikely to save most coral reefs

    NASA Astrophysics Data System (ADS)

    Frieler, K.; Meinshausen, M.; Golly, A.; Mengel, M.; Lebek, K.; Donner, S. D.; Hoegh-Guldberg, O.

    2013-02-01

    Mass coral bleaching events have become a widespread phenomenon causing serious concerns with regard to the survival of corals. Triggered by high ocean temperatures, bleaching events are projected to increase in frequency and intensity. Here, we provide a comprehensive global study of coral bleaching in terms of global mean temperature change, based on an extended set of emissions scenarios and models. We show that preserving >10% of coral reefs worldwide would require limiting warming to below 1.5°C (atmosphere-ocean general circulation models (AOGCMs) range: 1.3-1.8°C) relative to pre-industrial levels. Even under optimistic assumptions regarding corals' thermal adaptation, one-third (9-60%, 68% uncertainty range) of the world's coral reefs are projected to be subject to long-term degradation under the most optimistic new IPCC emissions scenario, RCP3-PD. Under RCP4.5 this fraction increases to two-thirds (30-88%, 68% uncertainty range). Possible effects of ocean acidification reducing thermal tolerance are assessed within a sensitivity experiment.

  2. Behavior coordination of mobile robotics using supervisory control of fuzzy discrete event systems.

    PubMed

    Jayasiri, Awantha; Mann, George K I; Gosine, Raymond G

    2011-10-01

    In order to incorporate the uncertainty and impreciseness present in real-world event-driven asynchronous systems, fuzzy discrete event systems (DESs) (FDESs) have been proposed as an extension to crisp DESs. In this paper, first, we propose an extension to the supervisory control theory of FDES by redefining fuzzy controllable and uncontrollable events. The proposed supervisor is capable of enabling feasible uncontrollable and controllable events with different possibilities. Then, the extended supervisory control framework of FDES is employed to model and control several navigational tasks of a mobile robot using the behavior-based approach. The robot has limited sensory capabilities, and the navigations have been performed in several unmodeled environments. The reactive and deliberative behaviors of the mobile robotic system are weighted through fuzzy uncontrollable and controllable events, respectively. By employing the proposed supervisory controller, a command-fusion-type behavior coordination is achieved. The observability of fuzzy events is incorporated to represent the sensory imprecision. As a systematic analysis of the system, a fuzzy-state-based controllability measure is introduced. The approach is implemented in both simulation and real time. A performance evaluation is performed to quantitatively estimate the validity of the proposed approach over its counterparts.

  3. Probabilistic description of probable maximum precipitation

    NASA Astrophysics Data System (ADS)

    Ben Alaya, Mohamed Ali; Zwiers, Francis W.; Zhang, Xuebin

    2017-04-01

    Probable Maximum Precipitation (PMP) is the key parameter used to estimate the Probable Maximum Flood (PMF). Both are important for dam safety and civil engineering purposes. Although current knowledge of storm mechanisms remains insufficient to properly evaluate limiting values of extreme precipitation, PMP estimation methods are still based on deterministic considerations and give only single values. This study aims to provide a probabilistic description of the PMP based on the most commonly used estimation method, so-called moisture maximization. To this end, a probabilistic bivariate extreme-value model is proposed to address the limitations of traditional PMP estimates via moisture maximization, namely: (i) the inability to evaluate uncertainty and to provide a range of PMP values, (ii) the interpretation of the maximum of a data series as a physical upper limit, and (iii) the assumption that a PMP event has maximum moisture availability. Results from simulation outputs of the Canadian Regional Climate Model CanRCM4 over North America reveal the high uncertainties inherent in PMP estimates and the non-validity of the assumption that PMP events have maximum moisture availability. This latter assumption leads to overestimation of the PMP by an average of about 15% over North America, which may have serious implications for engineering design.
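
    Moisture maximization, the method the study builds on, conventionally scales each observed storm's precipitation by the ratio of the maximum climatological precipitable water to the precipitable water available during the storm, and takes the largest maximized storm as the deterministic PMP. A minimal sketch of that convention (all names and numbers are illustrative, not from the study):

```python
def moisture_maximized_precip(storm_precip_mm, storm_pw_mm, max_pw_mm):
    """Scale an observed storm's precipitation by the precipitable-water ratio.

    storm_precip_mm : observed storm precipitation depth
    storm_pw_mm     : precipitable water available during the storm
    max_pw_mm       : maximum climatological precipitable water for the
                      location and season
    """
    if storm_pw_mm <= 0:
        raise ValueError("storm precipitable water must be positive")
    return storm_precip_mm * (max_pw_mm / storm_pw_mm)

# Deterministic PMP estimate: the largest maximized storm in the record.
storms = [(120.0, 40.0), (95.0, 25.0), (150.0, 55.0)]  # (precip, PW) pairs
pmp = max(moisture_maximized_precip(p, pw, 60.0) for p, pw in storms)
```

    The study's criticism applies directly to the last line: taking a single maximum of the record yields one number with no uncertainty attached, and implicitly assumes the maximized storm coincided with maximum moisture availability.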

  4. A three-stage birandom program for unit commitment with wind power uncertainty.

    PubMed

    Zhang, Na; Li, Weidong; Liu, Rao; Lv, Quan; Sun, Liang

    2014-01-01

    The integration of large-scale wind power adds significant uncertainty to power system planning and operation. Wind forecast error decreases with a shorter forecast horizon, particularly from one day ahead to several hours ahead. Integrating an intraday unit commitment (UC) adjustment process based on updated ultra-short-term wind forecast information is one way to improve the dispatching results. A novel three-stage UC decision method is presented, in which the day-ahead UC decisions are determined in the first stage, the intraday UC adjustment decisions of sub-fast-start units are determined in the second stage, and the UC decisions of fast-start units and the dispatching decisions are determined in the third stage. Accordingly, a three-stage birandom UC model is presented, in which the intraday hours-ahead forecasted wind power is formulated as a birandom variable, and the intraday UC adjustment event is formulated as a birandom event. An equilibrium chance constraint is employed to ensure the reliability requirement. A birandom-simulation-based hybrid genetic algorithm is designed to solve the proposed model. Computational results indicate that the proposed model provides UC decisions with lower expected total costs.

  5. Risk assessment for construction projects of transport infrastructure objects

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris

    2017-10-01

    The paper analyzes and compares different methods of risk assessment for construction projects of transport objects. Managing projects of this type demands special probabilistic methods because of the large uncertainty involved in their implementation; risk management in such projects therefore requires probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that allow obtaining reliable risk assessments in projects. The robust approach is based on the maximum-likelihood principle and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. Applying robust procedures makes it possible to carry out a quantitative assessment of the main risk indicators of projects when managing innovation-investment projects. Any competent specialist can calculate the damage from the onset of a risk event, but assessing the probability of occurrence of a risk event requires special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.

  6. Dealing with uncertainty in the probability of overtopping of a flood mitigation dam

    NASA Astrophysics Data System (ADS)

    Michailidi, Eleni Maria; Bacchi, Baldassare

    2017-05-01

    In recent years, copula multivariate functions have been used to model, probabilistically, the most important variables of flood events: discharge peak, flood volume, and duration. However, in most cases the sampling uncertainty from which small samples suffer is neglected. In this paper, considering a real reservoir controlled by a dam as a case study, we apply a structure-based approach to estimate the probability of reaching specific reservoir levels, taking into account the key components of an event (flood peak, volume, hydrograph shape) and of the reservoir (rating curve, volume-water depth relation). Additionally, we improve the information about the peaks from historical data and reports through a Bayesian framework, allowing the incorporation of supplementary knowledge from different sources and its associated error. As shown here, the extra information can result in a very different inferred parameter set, and consequently this is reflected as strong variability in the reservoir level associated with a given return period. Most importantly, the sampling uncertainty is accounted for in both scenarios (single-site, and multi-site with historical information), and Monte Carlo confidence intervals for the maximum water level are calculated. It is shown that water levels of specific return periods often overlap, making risk assessment without confidence intervals misleading.
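
    Monte Carlo confidence intervals of the kind calculated above can be illustrated with a generic parametric bootstrap: refit the distribution to many synthetic samples drawn from the fitted model and read off percentiles of the resulting return-level estimates. The sketch below uses a Gumbel distribution fitted by the method of moments purely as an illustrative stand-in for the paper's copula-based model:

```python
import math
import random

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_fit(sample):
    """Method-of-moments fit of a Gumbel distribution (illustrative only)."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # scale
    mu = mean - GAMMA * beta                # location
    return mu, beta

def gumbel_quantile(mu, beta, p):
    return mu - beta * math.log(-math.log(p))

def return_level_ci(sample, T=100, n_boot=1000, alpha=0.1, seed=42):
    """Parametric-bootstrap confidence interval for the T-year return level."""
    rng = random.Random(seed)
    mu, beta = gumbel_fit(sample)
    p = 1.0 - 1.0 / T
    levels = []
    for _ in range(n_boot):
        # Draw a synthetic sample from the fitted model and refit it.
        resample = [gumbel_quantile(mu, beta, rng.random() or 1e-12)
                    for _ in sample]
        m, b = gumbel_fit(resample)
        levels.append(gumbel_quantile(m, b, p))
    levels.sort()
    lo = levels[int(n_boot * alpha / 2)]
    hi = levels[int(n_boot * (1 - alpha / 2)) - 1]
    return gumbel_quantile(mu, beta, p), (lo, hi)

# Illustrative annual-maximum sample (exact quantiles of a Gumbel(10, 2)):
sample = [gumbel_quantile(10.0, 2.0, (i + 0.5) / 40) for i in range(40)]
est, (lo, hi) = return_level_ci(sample)
```

    The width of (lo, hi) is the sampling uncertainty the paper warns about: when the intervals of two return periods overlap, quoting the point estimates alone is misleading.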

  7. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    USGS Publications Warehouse

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
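
    The elicitation step described above, in which the observer assigns a probability to each potential cause of death, fits naturally into a data-augmentation scheme: at each iteration the latent cause is drawn with the elicited probabilities acting as a prior, weighted by the current cause-specific mortality rates. A schematic sketch of that single step (not the authors' implementation; all names and values are illustrative):

```python
import random

def augment_causes(elicited, cause_rates, rng):
    """One data-augmentation step: draw a latent cause of death for each
    mortality event, weighting the observer's elicited cause probabilities
    by the current cause-specific mortality rates."""
    draws = []
    for probs in elicited:
        weights = [p * r for p, r in zip(probs, cause_rates)]
        total = sum(weights)
        u, cum, cause = rng.random() * total, 0.0, 0
        for j, w in enumerate(weights):
            cum += w
            if u < cum:
                cause = j
                break
        draws.append(cause)
    return draws

# Two events: the first confidently attributed to cause 0 (e.g. predation),
# the second uncertain between causes 0 and 1.
elicited = [[1.0, 0.0], [0.5, 0.5]]
causes = augment_causes(elicited, cause_rates=[0.3, 0.7], rng=random.Random(7))
```

    In a full Gibbs-style sampler, the drawn causes would then be used to update the cause-specific rates, and the two steps would alternate; events with diffuse elicited probabilities contribute extra posterior variability, matching the wider parameter uncertainty the authors report.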

  8. High cumulants of conserved charges and their statistical uncertainties

    NASA Astrophysics Data System (ADS)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)

  9. Data Analysis and Statistical Methods for the Assessment and Interpretation of Geochronologic Data

    NASA Astrophysics Data System (ADS)

    Reno, B. L.; Brown, M.; Piccoli, P. M.

    2007-12-01

    Ages are traditionally reported as a weighted mean with an uncertainty based on least squares analysis of analytical error on individual dates. This method does not take into account geological uncertainties and cannot accommodate asymmetries in the data. In most instances, it will understate the uncertainty on a given age, which may lead to over-interpretation of age data. Geologic uncertainty is difficult to quantify but is typically greater than analytical uncertainty. These factors make traditional statistical approaches inadequate to fully evaluate geochronologic data. We propose a protocol to assess populations within multi-event datasets and to calculate age and uncertainty from each population of dates interpreted to represent a single geologic event using robust and resistant statistical methods. To assess whether populations thought to represent different events are statistically separate, exploratory data analysis is undertaken using a box plot, where the range of the data is represented by a 'box' of length given by the interquartile range, divided at the median of the data, with 'whiskers' that extend to the furthest datapoint that lies within 1.5 times the interquartile range beyond the box. If the boxes representing the populations do not overlap, they are interpreted to represent statistically different sets of dates. Ages are calculated from statistically distinct populations using a robust tool such as the tanh method of Kelsey et al. (2003, CMP, 146, 326-340), which is insensitive to any assumptions about the underlying probability distribution from which the data are drawn. This method therefore takes into account the full range of the data and is not drastically affected by outliers. The interquartile range of each population of dates gives a first pass at expressing uncertainty, which accommodates asymmetry in the dataset; outliers have only a minor effect on the uncertainty. To better quantify the uncertainty, a resistant tool that is insensitive to local misbehavior of data is preferred, such as the normalized median absolute deviation proposed by Powell et al. (2002, Chem Geol, 185, 191-204). We illustrate the method using a dataset of 152 monazite dates determined using EPMA chemical data from a single sample from the Neoproterozoic Brasília Belt, Brazil. Results are compared with ages and uncertainties calculated using traditional methods to demonstrate the differences. The dataset was manually culled into three populations representing discrete compositional domains within chemically-zoned monazite grains. The weighted mean ages and least squares uncertainties for these populations are 633±6 (2σ) Ma for a core domain, 614±5 (2σ) Ma for an intermediate domain, and 595±6 (2σ) Ma for a rim domain. Probability distribution plots indicate asymmetric distributions for all populations, which cannot be accounted for with traditional statistical tools. The three domains record distinct ages outside the interquartile range for each population of dates, with the core domain lying in the subrange 642-624 Ma, the intermediate domain 617-609 Ma, and the rim domain 606-589 Ma. The tanh estimator yields ages of 631±7 (2σ) Ma for the core domain, 616±7 (2σ) Ma for the intermediate domain, and 601±8 (2σ) Ma for the rim domain. Whereas the uncertainties derived using a resistant statistical tool are larger than those derived from traditional statistical tools, the method yields more realistic uncertainties that better reflect the spread in the dataset and account for asymmetry in the data.
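
    The box-plot screening step of the protocol, interquartile boxes that must not overlap for two populations of dates to be called statistically distinct, can be sketched as follows (the quartile convention and the date values are illustrative, not the authors' code or data):

```python
def quartiles(data):
    """Median-split quartiles (one of several common conventions)."""
    s = sorted(data)
    n = len(s)
    def median(x):
        m = len(x)
        return x[m // 2] if m % 2 else 0.5 * (x[m // 2 - 1] + x[m // 2])
    lower = s[: n // 2]          # half below the median
    upper = s[(n + 1) // 2 :]    # half above the median
    return median(lower), median(s), median(upper)

def box_range(data):
    """Edges of the quartile 'box' of a Tukey box plot; whiskers would
    extend to the furthest point within 1.5 * IQR beyond the box."""
    q1, _, q3 = quartiles(data)
    return q1, q3

def boxes_overlap(a, b):
    lo_a, hi_a = box_range(a)
    lo_b, hi_b = box_range(b)
    return not (hi_a < lo_b or hi_b < lo_a)

# Illustrative date populations (Ma), echoing the core/rim subranges above:
core = [642, 638, 635, 633, 630, 627, 624]
rim = [606, 602, 599, 595, 591, 589]
distinct = not boxes_overlap(core, rim)
```

    Non-overlapping boxes, as for these two illustrative populations, support treating the domains as records of separate geologic events before robust age estimation is applied.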

  10. Uncertainties in future-proof decision-making: the Dutch Delta Model

    NASA Astrophysics Data System (ADS)

    IJmker, Janneke; Snippen, Edwin; Ruijgh, Erik

    2013-04-01

    In 1953, a number of European countries experienced flooding after a major storm event coming from the northwest. Over 2100 people died in the resulting floods, 1800 of them Dutch. This gave rise to the development of the so-called Delta Works and Zuiderzee Works, which strongly reduced the flood risk in the Netherlands. These measures were a response to a large flooding event. As boundary conditions have changed (increasing population, increasing urban development, etc.), the flood risk should be evaluated continuously, and measures should be taken if necessary. The Delta Programme was designed to prepare for future changes and to limit the flood risk, taking into account economics, nature, landscape, residence, and recreation. To support decisions in the Delta Programme, the Delta Model was developed. By using four different input scenarios (extremes in climate and economics) and variations in system setup, the outcomes of the Delta Model represent a range of possible outcomes for the hydrological situation in 2050 and 2100. These results flow into effect models that give insight into the integrated effects on freshwater supply (including navigation, industry, and ecology) and flood risk. As the long-term water management policy of the Netherlands for the next decades will be based on these results, they have to be reliable. Therefore, a study was carried out to investigate the impact of uncertainties on the model outcomes. The study focused on "known unknowns": uncertainties in the boundary conditions, in the parameterization, and in the model itself. It showed that for different parts of the Netherlands the total uncertainty is on the order of meters. Nevertheless, (1) the total uncertainty is dominated by uncertainties in the boundary conditions; internal model uncertainties are subordinate to these. Furthermore, (2) the model responses develop in a logical way, such that the exact model outcomes may be uncertain, but the outcomes of different model runs are reliable relative to each other. The Delta Model is therefore a reliable instrument for finding the optimal water management policy for the future. As the exact model outcomes show a high degree of uncertainty, the model analysis will be based on a large number of model runs to gain insight into the sensitivity of the model to different setups and boundary conditions. The results allow fast investigation of the (relative) effects of measures. Furthermore, they help to identify bottlenecks in the system. To summarize, the Delta Model is a tool that lets policy makers base their strategies on quantitative rather than qualitative information. It can be applied to the current and future situation, and it feeds the political discussion. The uncertainty of the model has no determinative effect on the analyses that can be done with the Delta Model.

  11. Quantifying uncertainties in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Patlakas, Platon; Galanis, George; Kallos, George

    2015-04-01

    The constant rise of wind energy production and its subsequent penetration into global energy markets during the last decades has resulted in new site-selection problems of various types. Such problems arise from the variability and the uncertainty of wind speed. Studying the lower and upper tails of the wind speed distribution can support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or on periods below the energy production threshold, are necessary for better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potential non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, turbine failures, the time needed for repairs, as well as the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed based on tools of Extreme Value Theory. In particular, the study focuses on extreme wind speed return periods and the persistence of no energy production, based on a 10-year hindcast dataset from a weather modeling system. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two different methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The obtained results show that the proposed approaches converge, at least in the main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
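
    The persistence of no energy production reduces to finding runs of consecutive time steps with wind speed below the turbine cut-in threshold; the run lengths can then feed a duration-frequency analysis. A minimal sketch (the threshold and the data are illustrative):

```python
def below_threshold_runs(wind_speeds, cut_in=3.5):
    """Lengths of consecutive spells with wind speed below the turbine
    cut-in threshold, i.e. periods of no energy production."""
    runs, current = [], 0
    for v in wind_speeds:
        if v < cut_in:
            current += 1
        elif current:
            runs.append(current)
            current = 0
    if current:
        runs.append(current)
    return runs

# Illustrative daily mean wind speeds (m/s): two no-production spells.
series = [5.0, 2.0, 1.0, 6.0, 0.0, 0.0, 0.0, 7.0, 4.0]
spells = below_threshold_runs(series, cut_in=3.5)
```

    Applied year by year, the annual maxima of these run lengths are exactly the kind of duration statistic the Annual Maxima method can then model.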

  12. History, rare, and multiple events of mechanical unfolding of repeat proteins

    NASA Astrophysics Data System (ADS)

    Sumbul, Fidan; Marchesi, Arin; Rico, Felix

    2018-03-01

    Mechanical unfolding of proteins consisting of repeat domains is an excellent tool to obtain large statistics. Force spectroscopy experiments using atomic force microscopy on proteins presenting multiple domains have revealed that unfolding forces depend on the number of folded domains (history) and have reported intermediate states and rare events. However, the common use of unspecific attachment approaches to pull the protein of interest imposes important limitations on studying unfolding history and may lead to discarding rare and multiple probing events due to the presence of unspecific adhesion and uncertainty about the pulling site. Site-specific methods that have recently emerged minimize this uncertainty and would be excellent tools to probe unfolding history and rare events. However, detailed characterization of these approaches is required to identify their advantages and limitations. Here, we characterize a site-specific binding approach based on the ultrastable complex dockerin/cohesin III, revealing its advantages and limitations in assessing the unfolding history and investigating rare and multiple events during the unfolding of repeated domains. We show that this approach is more robust, reproducible, and provides larger statistics than conventional unspecific methods. We show that the method is optimal to reveal the history of unfolding from the very first domain and to detect rare events, while being more limited to assess intermediate states. Finally, we quantify the forces required to unfold two molecules pulled in parallel, which is difficult when using unspecific approaches. The proposed method represents a step forward toward more reproducible measurements to probe protein unfolding history and opens the door to systematic probing of rare and multiple molecule unfolding mechanisms.

  13. Uncertainty and extreme events in future climate and hydrologic projections for the Pacific Northwest: providing a basis for vulnerability and core/corridor assessments

    USGS Publications Warehouse

    Littell, Jeremy S.; Mauger, Guillaume S.; Salathe, Eric P.; Hamlet, Alan F.; Lee, Se-Yeun; Stumbaugh, Matt R.; Elsner, Marketa; Norheim, Robert; Lutz, Eric R.; Mantua, Nathan J.

    2014-01-01

    The purpose of this project was to (1) provide an internally consistent set of downscaled projections across the Western U.S., (2) include information about projection uncertainty, and (3) assess projected changes in hydrologic extremes. These objectives were designed to address decision support needs for climate adaptation and resource management actions. Specifically, understanding of uncertainty in climate projections, in particular for extreme events, is currently a key scientific and management barrier to adaptation planning and vulnerability assessment. The new dataset fills in the Northwest domain to cover a key gap in the previous dataset, adds additional projections (both from other global climate models and a comparison with dynamical downscaling), and includes an assessment of changes to flow and soil moisture extremes. This new information can be used to assess variations in impacts across the landscape, uncertainty in projections, and how these differ as a function of region, variable, and time period. In this project, existing University of Washington Climate Impacts Group (UW CIG) products were extended to develop a comprehensive data archive that accounts, in a rigorous and physically based way, for climate model uncertainty in future climate and hydrologic scenarios. These products can be used to determine likely impacts on vegetation and aquatic habitat in the Pacific Northwest (PNW) region, including WA, OR, ID, northwest MT to the continental divide, northern CA, NV, UT, and the Columbia Basin portion of western WY. New data series and summaries produced for this project include: (1) extreme statistics for surface hydrology (e.g., frequency of soil moisture and summer water deficit) and streamflow (e.g., the 100-year flood and extreme 7-day low flows with a 10-year recurrence interval); (2) snowpack vulnerability as indicated by the ratio of April 1 snow water to cool-season precipitation; and (3) uncertainty analyses for multiple climate scenarios.

  14. A probabilistic tornado wind hazard model for the continental United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hossain, Q; Kimball, J; Mensing, R

    A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the CONUS is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically based on the observed historical events within the CONUS. The hazard model is an areal probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
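
    The Poisson-process core of such a hazard model is simple: if tornadoes strike at rate λ per unit area per year and the facility presents an effective target area A, the probability of at least one strike in t years is 1 − exp(−λAt). A generic sketch of just that core (the published model's geometric and intensity-classification details are far richer; the numbers below are illustrative):

```python
import math

def strike_probability(rate_per_km2_yr, effective_area_km2, years):
    """P(at least one tornado strike) under a homogeneous Poisson model."""
    expected_hits = rate_per_km2_yr * effective_area_km2 * years
    return 1.0 - math.exp(-expected_hits)

# Example: 1e-4 tornadoes per km^2 per year, 2 km^2 effective target area
# (facility footprint enlarged by the damage-area size), 50-year lifetime.
p = strike_probability(1e-4, 2.0, 50.0)
```

    In the full model, the effective area itself depends on the facility's size and orientation and on the intensity-dependent damage rectangle, and the rate carries epistemic uncertainty from the expert panel.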

  15. Model Uncertainties for Valencia RPA Effect for MINERvA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gran, Richard

    2017-05-08

    This technical note describes the application of the Valencia RPA multi-nucleon effect and its uncertainty to QE reactions from the GENIE neutrino event generator. The analysis of MINERvA neutrino data in the Rodrigues et al. PRL 116 071802 (2016) paper makes clear the need for an RPA suppression, especially at very low momentum and energy transfer. That published analysis does not constrain the magnitude of the effect; it only tests models with and without the effect against the data. Other MINERvA analyses need an expression of the model uncertainty in the RPA effect. A well-described uncertainty can be used for systematics for unfolding, for model errors in the analysis of non-QE samples, and as input for fitting exercises for model testing or constraining backgrounds. This prescription takes uncertainties on the parameters in the Valencia RPA model and adds a (not-as-tight) constraint from muon capture data. For MINERvA we apply it as a 2D ($q_0$, $q_3$) weight to GENIE events, in lieu of generating full beyond-Fermi-gas quasielastic events. Because it is a weight, it can be applied to the generated and fully Geant4-simulated events used in analysis without a special GENIE sample. For some limited uses, it could be cast as a 1D $Q^2$ weight without much trouble. This procedure is a suitable starting point for NOvA and DUNE, where the energy dependence is modest, but is probably not adequate for T2K or MicroBooNE.
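    The mechanics of applying a 2D ($q_0$, $q_3$) weight to generated events can be sketched as a simple table lookup. The grid and weight values below are invented placeholders, not the Valencia RPA calculation.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical weight table on a (q0, q3) grid; in the real analysis this
    # would encode the RPA suppression/enhancement. Values are made up.
    q0_edges = np.linspace(0.0, 1.0, 6)   # GeV
    q3_edges = np.linspace(0.0, 1.2, 7)   # GeV
    weights_2d = rng.uniform(0.5, 1.5, size=(5, 6))

    def rpa_weight(q0, q3):
        """Look up a per-event weight from the 2D (q0, q3) table.
        Events outside the table get weight 1 (no correction)."""
        i = np.digitize(q0, q0_edges) - 1
        j = np.digitize(q3, q3_edges) - 1
        inside = (i >= 0) & (i < weights_2d.shape[0]) & (j >= 0) & (j < weights_2d.shape[1])
        w = np.ones_like(np.asarray(q0, dtype=float))
        w[inside] = weights_2d[i[inside], j[inside]]
        return w

    # Toy "generated events": reweight instead of regenerating.
    q0_ev = rng.uniform(0.0, 1.1, size=10_000)
    q3_ev = rng.uniform(0.0, 1.3, size=10_000)
    w_ev = rpa_weight(q0_ev, q3_ev)
    ```

    Because the correction enters only as a per-event weight, any downstream histogram can simply be filled with `w_ev` instead of unit weights.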

  16. An image-based model of brain volume biomarker changes in Huntington's disease.

    PubMed

    Wijeratne, Peter A; Young, Alexandra L; Oxtoby, Neil P; Marinescu, Razvan V; Firth, Nicholas C; Johnson, Eileanoir B; Mohan, Amrita; Sampaio, Cristina; Scahill, Rachael I; Tabrizi, Sarah J; Alexander, Daniel C

    2018-05-01

    Determining the sequence in which Huntington's disease biomarkers become abnormal can provide important insights into the disease progression and a quantitative tool for patient stratification. Here, we construct and present a uniquely fine-grained model of temporal progression of Huntington's disease from premanifest through to manifest stages. We employ a probabilistic event-based model to determine the sequence of appearance of atrophy in brain volumes, learned from structural MRI in the Track-HD study, as well as to estimate the uncertainty in the ordering. We use longitudinal and phenotypic data to demonstrate the utility of the patient staging system that the resulting model provides. The model recovers the following order of detectable changes in brain region volumes: putamen, caudate, pallidum, insula white matter, nonventricular cerebrospinal fluid, amygdala, optic chiasm, third ventricle, posterior insula, and basal forebrain. This ordering is mostly preserved even under cross-validation of the uncertainty in the event sequence. Longitudinal analysis performed using 6 years of follow-up data from baseline confirms efficacy of the model, as subjects consistently move to later stages with time, and significant correlations are observed between the estimated stages and nonimaging phenotypic markers. We used a data-driven method to provide new insight into Huntington's disease progression as well as new power to stage and predict conversion. Our results highlight the potential of disease progression models, such as the event-based model, to provide new insight into Huntington's disease progression and to support fine-grained patient stratification for future precision medicine in Huntington's disease.

  17. A retrospective streamflow ensemble forecast for an extreme hydrologic event: a case study of Hurricane Irene on the Hudson River basin

    NASA Astrophysics Data System (ADS)

    Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie

    2016-07-01

    This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional-scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin (~36,000 km2) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were used to force HEC-HMS to generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. The uncertainties in weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty), are reduced 48 h pre-event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts of short-term streamflow forecasts at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.
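    Deriving an ensemble median, a spread band, and threshold exceedance probabilities from a set of member hydrographs can be sketched as follows, using a synthetic 21-member ensemble rather than actual HEC-HMS output.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy ensemble: 21 members x 96 hourly discharge values (m^3/s). In the
    # study these come from HEC-HMS forced with 21 GEFS/R precipitation members.
    n_members, n_hours = 21, 96
    t = np.arange(n_hours)
    base = 500 + 4000 * np.exp(-0.5 * ((t - 60) / 10.0) ** 2)   # synthetic flood wave
    ensemble = base * rng.lognormal(mean=0.0, sigma=0.2, size=(n_members, 1))

    median = np.median(ensemble, axis=0)
    p10, p90 = np.percentile(ensemble, [10, 90], axis=0)
    spread = p90 - p10                       # a simple uncertainty band

    # Probability of exceeding a flood threshold at each hour = fraction of
    # members above the threshold (cf. a flood threshold exceedance diagram).
    threshold = 3000.0
    p_exceed = (ensemble > threshold).mean(axis=0)

    peak_hours = ensemble.argmax(axis=1)     # per-member time of peak
    ```

    The exceedance fractions are exactly what a flood threshold exceedance diagram plots against lead time.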

  18. Measuring System Value in the Ares 1 Rocket Using an Uncertainty-Based Coupling Analysis Approach

    NASA Astrophysics Data System (ADS)

    Wenger, Christopher

    Coupling of physics in large-scale complex engineering systems must be correctly accounted for during the systems engineering process to ensure no unanticipated behaviors or unintended consequences arise in the system during operation. Structural vibration of large segmented solid rocket motors, known as thrust oscillation, is a well-documented problem that can affect the health and safety of any crew onboard. Within the Ares 1 rocket, larger-than-anticipated vibrations were recorded during late-stage flight that propagated from the engine chamber to the Orion crew module. Upon investigation, engineers found the root cause to be feedback from the rocket's structure onto the fluid flow within the engine. The goal of this paper is to showcase a coupling strength analysis from the field of Multidisciplinary Design Optimization to identify the major interactions that caused the thrust oscillation event in the Ares 1. Once these are identified, an uncertainty analysis of the coupled system, using an uncertainty-based optimization technique, is used to estimate the likelihood of occurrence of these strong or weak interactions.

  19. A probabilistic strategy for parametric catastrophe insurance

    NASA Astrophysics Data System (ADS)

    Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin

    2017-04-01

    Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. 
The resulting model is able, for any given day, to issue probabilities of occurrence of loss events. Due to the nature of parametric programmes, it is still necessary to clearly define when a payout is due or not, and so a decision threshold probability above which a loss event is considered to occur must be set, effectively converting the issued probabilities into deterministic binary outcomes. Model skill and value are evaluated over the range of possible threshold probabilities, with the objective of defining the optimal one. The predictive ability of the model is assessed. In terms of value assessment, a decision model is proposed, allowing users to quantify monetarily their expected expenses when different combinations of model event triggering and actual event occurrence take place, directly tackling the problem of basis risk.
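    The core of such a probabilistic trigger, issuing a loss probability from a logistic model and converting it to a binary payout decision at a cost-optimal threshold, might be sketched like this. The coefficients, loss-index values, and basis-risk costs are all hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy loss-index variable (e.g. transformed rainfall) and hypothetical
    # fitted logistic coefficients; the real model is calibrated to loss data.
    beta0, beta1 = -6.0, 8.0
    index = rng.uniform(0.0, 1.5, size=5_000)
    p_loss = sigmoid(beta0 + beta1 * index)          # daily loss-event probability
    loss_event = rng.random(5_000) < p_loss          # synthetic "truth"

    # A parametric payout needs a binary decision: trigger if p_loss >= threshold.
    # Choose the threshold minimizing expected cost, where basis risk has two
    # (made-up) prices: a false alarm costs c_fa, a missed event costs c_miss.
    c_fa, c_miss = 1.0, 5.0
    thresholds = np.linspace(0.05, 0.95, 19)
    costs = []
    for th in thresholds:
        trigger = p_loss >= th
        false_alarms = np.sum(trigger & ~loss_event)
        misses = np.sum(~trigger & loss_event)
        costs.append(c_fa * false_alarms + c_miss * misses)
    best_threshold = thresholds[int(np.argmin(costs))]
    ```

    Scanning skill or cost over the full threshold range, as in the study, makes the false-alarm/miss trade-off behind basis risk explicit.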

  20. Evaluation of vector coastline features extracted from 'structure from motion'-derived elevation data

    USGS Publications Warehouse

    Kinsman, Nicole; Gibbs, Ann E.; Nolan, Matt

    2015-01-01

    For extensive and remote coastlines, the absence of high-quality elevation models—for example, those produced with lidar—leaves some coastal populations lacking one of the essential elements for mapping shoreline positions or flood extents. Here, we compare seven different elevation products in a low-lying area in western Alaska to establish their appropriateness for coastal mapping applications that require the delineation of elevation-based vectors. We further investigate the effective use of a Structure from Motion (SfM)-derived surface model (vertical RMSE < 20 cm) by generating a tidal datum-based shoreline and an inundation extent map for a 2011 flood event. Our results suggest that SfM-derived elevation products can yield elevation-based vector features that have horizontal positional uncertainties comparable to those derived from other techniques. We also provide a rule-of-thumb equation to aid in the selection of minimum elevation model specifications based on terrain slope, vertical uncertainties, and desired horizontal accuracy.

  1. Quantifying rainfall-derived inflow and infiltration in sanitary sewer systems based on conductivity monitoring

    NASA Astrophysics Data System (ADS)

    Zhang, Mingkai; Liu, Yanchen; Cheng, Xun; Zhu, David Z.; Shi, Hanchang; Yuan, Zhiguo

    2018-03-01

    Quantifying rainfall-derived inflow and infiltration (RDII) in a sanitary sewer is difficult when RDII and overflow occur simultaneously. This study proposes a novel conductivity-based method for estimating RDII. The method separately decomposes rainfall-derived inflow (RDI) and rainfall-induced infiltration (RII) on the basis of conductivity data. Fast Fourier transform was adopted to analyze variations in the flow and water quality during dry weather. Nonlinear curve fitting based on the least squares algorithm was used to optimize parameters in the proposed RDII model. The method was successfully applied to real-life case studies, in which inflow and infiltration were successfully estimated for three typical rainfall events with total rainfall volumes of 6.25 mm (light), 28.15 mm (medium), and 178 mm (heavy). Uncertainties of model parameters were estimated using the generalized likelihood uncertainty estimation (GLUE) method and were found to be acceptable. Compared with traditional flow-based methods, the proposed approach exhibits distinct advantages in estimating RDII and overflow, particularly when the two processes happen simultaneously.
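    The GLUE procedure mentioned above can be illustrated on a toy one-parameter model: sample parameters from a prior range, score each set with an informal likelihood, retain the "behavioral" sets, and read off likelihood-weighted uncertainty bounds. Everything below is a minimal sketch, not the paper's RDII model.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Toy model y = a * x with synthetic noisy observations.
    x = np.linspace(0.0, 10.0, 25)
    y_obs = 2.0 * x + rng.normal(0.0, 1.0, size=x.size)

    n_samples = 10_000
    a_samples = rng.uniform(0.0, 5.0, size=n_samples)      # prior parameter range
    sse = ((a_samples[:, None] * x[None, :] - y_obs[None, :]) ** 2).sum(axis=1)

    # Informal (Nash-Sutcliffe-like) likelihood; keep sets above a cutoff.
    likelihood = 1.0 - sse / ((y_obs - y_obs.mean()) ** 2).sum()
    behavioral = likelihood > 0.8
    a_behavioral = a_samples[behavioral]

    # GLUE uncertainty bounds: likelihood-weighted parameter quantiles.
    w = likelihood[behavioral]
    order = np.argsort(a_behavioral)
    cdf = np.cumsum(w[order]) / w[order].sum()
    a_sorted = a_behavioral[order]
    a_lo = a_sorted[np.searchsorted(cdf, 0.05)]
    a_hi = a_sorted[np.searchsorted(cdf, 0.95)]
    ```

    The behavioral cutoff and likelihood measure are subjective choices in GLUE, which is why the resulting bounds are described as "acceptable" rather than formal confidence intervals.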

  2. Determination of the number of J/ψ events with inclusive J/ψ decays

    DOE PAGES

    Ablikim, M.; Achasov, M. N.; Ai, X. C.; ...

    2016-08-26

    A measurement of the number of J/ψ events collected with the BESIII detector in 2009 and 2012 is performed using inclusive decays of the J/ψ. The number of J/ψ events taken in 2009 is recalculated to be (223.7 ± 1.4) × 10^6, which is in good agreement with the previous measurement, but with significantly improved precision due to improvements in the BESIII software. The number of J/ψ events taken in 2012 is determined to be (1086.9 ± 6.0) × 10^6. In total, the number of J/ψ events collected with the BESIII detector is measured to be (1310.6 ± 7.0) × 10^6, where the uncertainty is dominated by systematic effects and the statistical uncertainty is negligible.

  3. Determination of the number of J/ψ events with inclusive J/ψ decays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ablikim, M.; Achasov, M. N.; Ai, X. C.

    A measurement of the number of J/ψ events collected with the BESIII detector in 2009 and 2012 is performed using inclusive decays of the J/ψ. The number of J/ψ events taken in 2009 is recalculated to be (223.7 ± 1.4) × 10^6, which is in good agreement with the previous measurement, but with significantly improved precision due to improvements in the BESIII software. The number of J/ψ events taken in 2012 is determined to be (1086.9 ± 6.0) × 10^6. In total, the number of J/ψ events collected with the BESIII detector is measured to be (1310.6 ± 7.0) × 10^6, where the uncertainty is dominated by systematic effects and the statistical uncertainty is negligible.

  4. Uncertainty and Surprise: Ideas from the Open Discussion

    NASA Astrophysics Data System (ADS)

    Jordan, Michelle E.

    Approximately one hundred participants met for three days at a conference entitled "Uncertainty and Surprise: Questions on Working with the Unexpected and Unknowable." The participants were diverse, ranging from researchers in the natural and social sciences (business professors, physicists, ethnographers, nursing school deans) to practitioners and executives in public policy and management (business owners, health care managers, high-tech executives), all of whom had varying levels of experience and expertise in dealing with uncertainty and surprise. One group held the traditional, statistical view that uncertainty comes from variance and from events that are described by a usually unimodal probability law. A second group was comfortable on the one hand with phase diagrams and the phase transitions that come from systems with multimodal distributions, and on the other hand with deterministic chaos. A third group was comfortable with emergent events from evolutionary processes that may not obey any probability laws at all.

  5. Cancer Risk Assessment for Space Radiation

    NASA Technical Reports Server (NTRS)

    Richmond, Robert C.; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    Predicting the occurrence of human cancer following exposure to any agent causing genetic damage is a difficult task. This is because the uncertainty of uniform exposure to the damaging agent, and the uncertainty of uniform processing of that damage within a complex set of biological variables, degrade the confidence of predicting the delayed expression of cancer as a relatively rare event within any given clinically normal individual. The radiation health research priorities for enabling long-duration human exploration of space were established in the 1996 NRC Report entitled "Radiation Hazards to Crews of Interplanetary Missions: Biological Issues and Research Strategies". This report emphasized that a 15-fold uncertainty in predicting radiation-induced cancer incidence must be reduced before NASA can commit humans to extended interplanetary missions. That report concluded that the great majority of this uncertainty is biologically based, while a minority is physically based due to uncertainties in radiation dosimetry and radiation transport codes. Since that report, the biologically based uncertainty has remained large, and the relatively small uncertainty associated with radiation dosimetry has increased due to the considerations raised by concepts of microdosimetry. In a practical sense, however, the additional uncertainties introduced by microdosimetry are encouraging since they are in a direction of lowered effective dose absorbed through infrequent interactions of any given cell with the high energy particle component of space radiation. The biological uncertainty in predicting cancer risk for space radiation derives from two primary facts. 1) One animal tumor study has been reported that includes a relevant spectrum of particle radiation energies, and that is the Harderian gland model in mice. Fact #1: Extension of cancer risk from animal models, and especially from a single study in an animal model, to humans is inherently uncertain. 
2) One human database is predominantly used for assessing cancer risk caused by space radiation, and that is the Japanese atomic bomb survivors. Fact #2: The atomic-bomb-survivor database, itself a remarkable achievement, contains uncertainties. These include the actual exposure to each individual, the radiation quality of that exposure, and the fact that the exposure was to acute doses of predominantly low-LET radiation, not to chronic exposures of high-LET radiation expected on long-duration interplanetary manned missions.

  6. Occultation of 2UCAC 42376428 by (423) Diotima on 2005 March 06

    NASA Astrophysics Data System (ADS)

    Vasundhara, R.; Kuppuswamy, K.; Ramamoorthy, S.; Velu, C.; Venkataramana, A. K.

    2006-03-01

    Observations of the occultation of the star 2UCAC 42376428 by (423) Diotima on 2005 March 06 at the Vainu Bappu Observatory are reported. The observed mid-time of the event, at 15:12:25.1 UT, occurred 3.4 s later than the predicted time but within the one-sigma uncertainty of 4.3 s of the predictions by IOTA. The event duration of 4.2 s was found to be shorter than predicted, even allowing for a one-sigma uncertainty in the impact parameter. This implies a narrower projected width of the asteroid along the occultation track at the time of the event.

  7. Sampling procedures for throughfall monitoring: A simulation study

    NASA Astrophysics Data System (ADS)

    Zimmermann, Beate; Zimmermann, Alexander; Lark, Richard Murray; Elsenbeer, Helmut

    2010-01-01

    What is the most appropriate sampling scheme to estimate event-based average throughfall? A satisfactory answer to this seemingly simple question has yet to be found, a failure which we attribute to previous efforts' dependence on empirical studies. Here we try to answer this question by simulating stochastic throughfall fields based on parameters for statistical models of large monitoring data sets. We subsequently sampled these fields with different sampling designs and variable sample supports. We evaluated the performance of a particular sampling scheme with respect to the uncertainty of possible estimated means of throughfall volumes. Even for a relative error limit of 20%, an impractically large number of small, funnel-type collectors would be required to estimate mean throughfall, particularly for small events. While stratification of the target area is not superior to simple random sampling, cluster random sampling involves the risk of being less efficient. A larger sample support, e.g., the use of trough-type collectors, considerably reduces the necessary sample sizes and eliminates the sensitivity of the mean to outliers. Since the gain in time associated with the manual handling of troughs versus funnels depends on the local precipitation regime, the employment of automatically recording clusters of long troughs emerges as the most promising sampling scheme. Even so, a relative error of less than 5% appears out of reach for throughfall under heterogeneous canopies. We therefore suspect a considerable uncertainty of input parameters for interception models derived from measured throughfall, in particular, for those requiring data of small throughfall events.
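    The simulation idea, sampling a stochastic throughfall field with varying numbers of collectors and checking the resulting error of the estimated mean, can be sketched as follows. The lognormal field parameters and sample sizes are illustrative assumptions, not the study's fitted models.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic throughfall field (mm) on a grid; a stand-in for the
    # model-based stochastic fields simulated in the study.
    side = 200
    field = rng.lognormal(mean=1.0, sigma=0.6, size=(side, side))
    true_mean = field.mean()

    def relative_error(n_collectors, n_repeats=2_000):
        """Monte Carlo 90th-percentile relative error of the sample mean
        for n point (funnel-type) collectors at random locations."""
        rows = rng.integers(0, side, size=(n_repeats, n_collectors))
        cols = rng.integers(0, side, size=(n_repeats, n_collectors))
        est = field[rows, cols].mean(axis=1)
        return np.percentile(np.abs(est - true_mean) / true_mean, 90)

    err_10 = relative_error(10)
    err_100 = relative_error(100)
    ```

    Larger sample supports (troughs) would be modeled by averaging over several adjacent cells per collector, which shrinks the error at fixed n, the effect the study exploits.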

  8. How Historical Information Can Improve Extreme Value Analysis of Coastal Water Levels

    NASA Astrophysics Data System (ADS)

    Le Cozannet, G.; Bulteau, T.; Idier, D.; Lambert, J.; Garcin, M.

    2016-12-01

    The knowledge of extreme coastal water levels is useful for coastal flooding studies or the design of coastal defences. While deriving such extremes with standard analyses using tide gauge measurements, one often needs to deal with limited effective duration of observation which can result in large statistical uncertainties. This is even truer when one faces outliers, those particularly extreme values distant from the others. In a recent work (Bulteau et al., 2015), we investigated how historical information of past events reported in archives can reduce statistical uncertainties and relativize such outlying observations. We adapted a Bayesian Markov Chain Monte Carlo method, initially developed in the hydrology field (Reis and Stedinger, 2005), to the specific case of coastal water levels. We applied this method to the site of La Rochelle (France), where the storm Xynthia in 2010 generated a water level considered so far as an outlier. Based on 30 years of tide gauge measurements and 8 historical events since 1890, the results showed a significant decrease in statistical uncertainties on return levels when historical information is used. Also, Xynthia's water level no longer appeared as an outlier and we could have reasonably predicted the annual exceedance probability of that level beforehand (predictive probability for 2010 based on data until the end of 2009 of the same order of magnitude as the standard estimative probability using data until the end of 2010). Such results illustrate the usefulness of historical information in extreme value analyses of coastal water levels, as well as the relevance of the proposed method to integrate heterogeneous data in such analyses.

  9. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    NASA Astrophysics Data System (ADS)

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    An increased frequency of extreme precipitation events, especially those with multiday durations, is responsible for recent urban floods and associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimates of the frequency of occurrence of such extreme precipitation events are thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to 1-day, 2-day, and 5-day annual maximum precipitation series due to its advantages over at-site estimation. The regional frequency approach pools the information from climatologically similar sites to make reliable estimates of quantiles, given that the pooling group is homogeneous and of reasonable size. We have used the region of influence (ROI) approach along with a homogeneity measure based on L-moments to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (i.e., Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type III, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and historical data are quantified using the Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as to explore spatio-temporal variations of extreme precipitation and associated risk.
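    Sample L-moments, the building blocks of the regional frequency analysis described above, can be computed from an annual-maximum series via probability weighted moments. The data below are synthetic.

    ```python
    import numpy as np

    def sample_lmoments(data):
        """First three sample L-moments via probability weighted moments
        (the standard unbiased estimators used in regional frequency analysis)."""
        x = np.sort(np.asarray(data, dtype=float))
        n = x.size
        i = np.arange(1, n + 1)
        b0 = x.mean()
        b1 = np.sum((i - 1) / (n - 1) * x) / n
        b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
        l1 = b0                      # L-location (mean)
        l2 = 2 * b1 - b0             # L-scale
        l3 = 6 * b2 - 6 * b1 + b0    # third L-moment
        return l1, l2, l3 / l2       # mean, L-scale, L-skewness (t3)

    # Toy annual-maximum precipitation series (mm/day); values illustrative only.
    rng = np.random.default_rng(6)
    amax = rng.gumbel(loc=100.0, scale=30.0, size=50)
    l1, l2, t3 = sample_lmoments(amax)
    ```

    Ratios such as t3, computed per site and compared across a pooling group, are what the L-moment homogeneity measures operate on.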

  10. Understanding London's Water Supply Tradeoffs When Scheduling Interventions Under Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.

    2015-12-01

    Water supply planning in many major world cities faces several challenges, associated with but not limited to climate change, population growth, and insufficient land availability for infrastructure development. Long-term plans to maintain the supply-demand balance and ecosystem services require careful consideration of the uncertainties associated with future conditions. The current approach for London's water supply planning utilizes least-cost optimization of future intervention schedules with limited consideration of uncertainty. Recently, the focus of long-term plans has shifted from solely least-cost performance to robustness and resilience of the system. Identifying robust scheduling of interventions requires optimizing over a statistically representative sample of stochastic inputs, which may be computationally difficult to achieve. In this study we optimize schedules using an ensemble of plausible scenarios and assess how manipulating that ensemble influences the different Pareto-approximate intervention schedules. We investigate how a major stress event's location in time, as well as the optimization problem formulation, influences the Pareto-approximate schedules. A bootstrapping method that respects the non-stationary trend of climate change scenarios and ensures the even distribution of the major stress event in the scenario ensemble is proposed. Different bootstrapped hydrological scenario ensembles are assessed using many-objective scenario optimization of London's future water supply and demand intervention scheduling. However, such a "fixed" scheduling of interventions does not aim to embed flexibility or adapt effectively as the future unfolds. Alternatively, making decisions based on observations of conditions as they occur could help planners who prefer adaptive planning. We will show how rules to guide the implementation of interventions based on observations may result in more flexible strategies.

  11. EnKF with closed-eye period - bridging intermittent model structural errors in soil hydrology

    NASA Astrophysics Data System (ADS)

    Bauser, Hannes H.; Jaumann, Stefan; Berg, Daniel; Roth, Kurt

    2017-04-01

    The representation of soil water movement exposes uncertainties in all model components, namely dynamics, forcing, subscale physics and the state itself. Especially model structural errors in the description of the dynamics are difficult to represent and can lead to an inconsistent estimation of the other components. We address the challenge of a consistent aggregation of information for a manageable specific hydraulic situation: a 1D soil profile with TDR-measured water contents during a time period of less than 2 months. We assess the uncertainties for this situation and detect initial condition, soil hydraulic parameters, small-scale heterogeneity, upper boundary condition, and (during rain events) the local equilibrium assumption by the Richards equation as the most important ones. We employ an iterative Ensemble Kalman Filter (EnKF) with an augmented state. Based on a single rain event, we are able to reduce all uncertainties directly, except for the intermittent violation of the local equilibrium assumption. We detect these times by analyzing the temporal evolution of estimated parameters. By introducing a closed-eye period - during which we do not estimate parameters, but only guide the state based on measurements - we can bridge these times. The introduced closed-eye period ensured constant parameters, suggesting that they resemble the believed true material properties. The closed-eye period improves predictions during periods when the local equilibrium assumption is met, but consequently worsens predictions when the assumption is violated. Such a prediction requires a description of the dynamics during local non-equilibrium phases, which remains an open challenge.
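    A minimal EnKF with an augmented state (state plus parameter) can be sketched on a toy scalar system. The decay model below is a stand-in for illustration, not the Richards-equation setup of the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Toy scalar "water content" theta decaying at unknown rate k; the
    # augmented state is [theta, k], so the filter estimates both jointly.
    def model(theta, k, dt=1.0):
        return theta - k * theta * dt

    k_true, theta_true = 0.1, 0.4
    n_ens, n_steps = 100, 30
    obs_err = 0.005

    # Ensemble of augmented states: column 0 = theta, column 1 = k.
    ens = np.column_stack([
        rng.normal(0.35, 0.05, n_ens),     # uncertain initial state
        rng.uniform(0.01, 0.3, n_ens),     # uncertain parameter
    ])

    for _ in range(n_steps):
        theta_true = model(theta_true, k_true)
        obs = theta_true + rng.normal(0.0, obs_err)
        # Forecast: propagate each member (the parameter is static).
        ens[:, 0] = model(ens[:, 0], ens[:, 1])
        # Analysis: Kalman update of the augmented state using the ensemble
        # covariance between [theta, k] and the observed theta.
        y = ens[:, 0]
        C = np.cov(ens.T, ddof=1)              # 2x2 augmented covariance
        K = C[:, 0] / (C[0, 0] + obs_err**2)   # gain for observing theta
        perturbed = obs + rng.normal(0.0, obs_err, n_ens)
        ens += np.outer(perturbed - y, K)

    k_est = ens[:, 1].mean()
    ```

    A "closed-eye period" in this setting would simply skip the parameter rows of the update (freezing column 1) while still guiding the state with observations.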

  12. Evaluation of rainfall structure on hydrograph simulation: Comparison of radar and interpolated methods, a study case in a tropical catchment

    NASA Astrophysics Data System (ADS)

    Velasquez, N.; Ochoa, A.; Castillo, S.; Hoyos Ortiz, C. D.

    2017-12-01

    The skill of river discharge simulation using hydrological models strongly depends on the quality and spatio-temporal representativeness of precipitation during storm events. All precipitation measurement strategies have their own strengths and weaknesses that translate into discharge simulation uncertainties. Distributed hydrological models are based on evolving rainfall fields in the same time scale as the hydrological simulation. In general, rainfall measurements from a dense and well maintained rain gauge network provide a very good estimation of the total volume for each rainfall event, however, the spatial structure relies on interpolation strategies introducing considerable uncertainty in the simulation process. On the other hand, rainfall retrievals from radar reflectivity achieve a better spatial structure representation but with higher uncertainty in the surface precipitation intensity and volume depending on the vertical rainfall characteristics and radar scan strategy. To assess the impact of both rainfall measurement methodologies on hydrological simulations, and in particular the effects of the rainfall spatio-temporal variability, a numerical modeling experiment is proposed including the use of a novel QPE (Quantitative Precipitation Estimation) method based on disdrometer data in order to estimate surface rainfall from radar reflectivity. The experiment is based on the simulation of 84 storms, the hydrological simulations are carried out using radar QPE and two different interpolation methods (IDW and TIN), and the assessment of simulated peak flow. Results show significant rainfall differences between radar QPE and the interpolated fields, evidencing a poor representation of storms in the interpolated fields, which tend to miss the precise location of the intense precipitation cores, and to artificially generate rainfall in some areas of the catchment. 
Regarding streamflow modelling, the potential improvement achieved by using radar QPE depends on the density of the rain gauge network and its distribution relative to the precipitation events. The results for the 84 storms show a better model skill using radar QPE than the interpolated fields. Results using interpolated fields are highly affected by the dominant rainfall type and the basin scale.
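    One of the interpolation methods compared above, inverse distance weighting (IDW), is easy to sketch. Gauge locations and rainfall amounts below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    def idw(xy_gauges, values, xy_targets, power=2.0, eps=1e-12):
        """Inverse-distance-weighted interpolation of gauge rainfall to points."""
        d = np.linalg.norm(xy_targets[:, None, :] - xy_gauges[None, :, :], axis=2)
        w = 1.0 / (d + eps) ** power
        return (w * values[None, :]).sum(axis=1) / w.sum(axis=1)

    # Five hypothetical gauges in a 10 km x 10 km catchment (km, mm).
    gauges = rng.uniform(0.0, 10.0, size=(5, 2))
    rain = np.array([12.0, 30.0, 5.0, 18.0, 22.0])

    # Interpolate to a coarse grid of target points.
    gx, gy = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
    targets = np.column_stack([gx.ravel(), gy.ravel()])
    field = idw(gauges, rain, targets)
    ```

    Because IDW is a convex combination of gauge values, it can never reproduce intensity peaks between gauges, one reason interpolated fields miss localized convective cores that a radar QPE can capture.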

  13. Satellite Re-entry Modeling and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Horsley, M.

    2012-09-01

    LEO trajectory modeling is a fundamental aerospace capability and has applications in many areas of aerospace, such as maneuver planning, sensor scheduling, re-entry prediction, collision avoidance, risk analysis, and formation flying. Somewhat surprisingly, modeling the trajectory of an object in low Earth orbit is still a challenging task. This is primarily due to the large uncertainty in the upper atmospheric density, about 15-20% (1-sigma) for most thermosphere models. Other contributions come from our inability to precisely model future solar and geomagnetic activity, the potentially unknown shape, material construction, and attitude history of the satellite, and intermittent, noisy tracking data. Current methods to predict a satellite's re-entry trajectory typically involve making a single prediction, with the uncertainty dealt with in an ad hoc manner, usually based on past experience. However, due to the extreme speed of a LEO satellite, even small uncertainties in the re-entry time translate into a very large uncertainty in the location of the re-entry event. Currently, most methods simply update the re-entry estimate on a regular basis. This results in a wide range of estimates that are literally spread over the entire globe. With no understanding of the underlying distribution of potential impact points, the sequence of impact points predicted by the current methodology is largely useless until just a few hours before re-entry. This paper will discuss the development of a set of High Performance Computing (HPC)-based capabilities to support near-real-time quantification of the uncertainty inherent in uncontrolled satellite re-entries. An appropriate management of the uncertainties is essential for a rigorous treatment of the re-entry/LEO trajectory problem. The development of HPC-based tools for re-entry analysis is important as it will allow a rigorous and robust approach to risk assessment by decision makers in an operational setting. 
Uncertainty quantification results from the recent uncontrolled re-entry of the Phobos-Grunt satellite will be presented and discussed. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
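As a rough illustration of the abstract's central point (a sketch with assumed typical LEO values, not numbers from the paper), even a small re-entry *time* uncertainty maps into an enormous *along-track* position uncertainty:

```python
# Illustrative back-of-envelope sketch; the speed and period below are
# assumed typical LEO values, not values from the paper.
ORBITAL_SPEED_KM_S = 7.8   # typical LEO ground-track speed, km/s
ORBIT_PERIOD_MIN = 92.0    # typical LEO orbital period, minutes

def along_track_spread_km(time_sigma_minutes: float) -> float:
    """1-sigma along-track position spread implied by a re-entry time sigma."""
    return ORBITAL_SPEED_KM_S * time_sigma_minutes * 60.0

# Even a modest +/-10 minute re-entry window spans thousands of kilometres
# of ground track, which is why single-point predictions are of limited use:
spread = along_track_spread_km(10.0)             # ~4680 km at 1 sigma
orbits_covered = (10.0 * 2) / ORBIT_PERIOD_MIN   # fraction of an orbit in the 2-sigma window
```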

  14. Large Pt processes in ppbar collisions at 2 TeV: measurement of ttbar production cross section in ppbar collisions at s**(1/2) = 1.96 TeV in the dielectron final states at the D0 experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Ashish; /Delhi U.

    2005-10-01

    The measurement of the top-antitop pair production cross section in p{bar p} collisions at {radical}s = 1.96 TeV in the dielectron decay channel using 384 pb{sup -1} of D0 data yields a t{bar t} production cross section of {sigma}{sub t{bar t}} = 7.9{sub -3.8}{sup +5.2}(stat){sub -1.0}{sup +1.3}(syst) {+-} 0.5 (lumi) pb. This measurement [98] is based on 5 observed events with a prediction of 1.04 background events. The cross section assumes a top mass of 175 GeV and is in good agreement with the Standard Model expectation of 6.77 {+-} 0.42 pb based on next-to-next-to-leading-order (NNLO) perturbative QCD calculations [78]. This analysis shows significant improvement over our previous cross-section measurement in this channel [93] with a 230 pb{sup -1} dataset, in terms of a significantly better signal-to-background ratio and smaller uncertainties on the measured cross section. Combination of all the dilepton final states [98] yields a t{bar t} cross section of {sigma}{sub t{bar t}} = 8.6{sub -2.0}{sup +2.3}(stat){sub -1.0}{sup +1.2}(syst) {+-} 0.6(lumi) pb, which again is in good agreement with theoretical predictions and with measurements in other final states. Hence, these results show no discernible deviation from the Standard Model. Fig. 6.1 shows the summary of cross-section measurements in different final states by D0 in Run II. This measurement in the dilepton channels is the best dilepton result from D0 to date. The previous D0 result based on analysis of 230 pb{sup -1} of data (currently under publication in Physics Letters B) is {sigma}{sub t{bar t}} = 8.6{sub -2.7}{sup +3.2}(stat){sub -1.1}{sup +1.1}(syst) {+-} 0.6(lumi) pb. It can be seen that the present measurement has a smaller statistical uncertainty. This result is also quite consistent with the CDF collaboration's result of {sigma}{sub t{bar t}} = 8.6{sub -2.4}{sup +2.5}(stat){sub -1.1}{sup +1.1}(syst) pb. 
These results were presented as D0's preliminary results at high energy physics conferences in the summer of 2005 (Hadron Collider Physics Symposium, European Physical Society Conference, etc.). The uncertainty on the cross section is still dominated by statistics due to the small number of observed events, though the statistical uncertainties are now becoming comparable to the systematic ones. Future measurements of the cross section will benefit from considerably more integrated luminosity, leading to a smaller statistical error; thus the next generation of measurements will be limited by systematic uncertainties. Monte Carlo samples with higher statistics are also being generated in order to decrease the uncertainty on the background estimation. In addition, as the jet energy scale, the electron energy scale, the detector resolutions, and the luminosity measurement are fine-tuned, the systematic uncertainties will continue to decrease.
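A minimal sketch of how quoted uncertainty components like those above are conventionally combined: independent components added in quadrature, with asymmetric (stat) errors treated side by side. The numbers are the dilepton-combination values quoted in the record; the per-side treatment is a common simplification, not necessarily the collaboration's exact procedure.

```python
import math

def combine_quadrature(*components):
    """Combine independent uncertainty components in quadrature."""
    return math.sqrt(sum(c * c for c in components))

# Dilepton combination quoted above: 8.6 +2.3/-2.0 (stat) +1.2/-1.0 (syst) +/-0.6 (lumi) pb.
stat_up, stat_dn = 2.3, 2.0
syst_up, syst_dn = 1.2, 1.0
lumi = 0.6

# Treat each side of the asymmetric errors separately (a common simplification):
total_up = combine_quadrature(stat_up, syst_up, lumi)  # ~2.66 pb
total_dn = combine_quadrature(stat_dn, syst_dn, lumi)  # ~2.32 pb
```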

  15. Detection and Attribution of Simulated Climatic Extreme Events and Impacts: High Sensitivity to Bias Correction

    NASA Astrophysics Data System (ADS)

    Sippel, S.; Otto, F. E. L.; Forkel, M.; Allen, M. R.; Guillod, B. P.; Heimann, M.; Reichstein, M.; Seneviratne, S. I.; Kirsten, T.; Mahecha, M. D.

    2015-12-01

    Understanding, quantifying and attributing the impacts of climatic extreme events and variability is crucial for societal adaptation in a changing climate. However, climate model simulations generated for this purpose typically exhibit pronounced biases in their output that hinder any straightforward assessment of impacts. To overcome this issue, various bias correction strategies are routinely used to alleviate climate model deficiencies, most of which have been criticized for physical inconsistency and non-preservation of the multivariate correlation structure. We assess how biases and their correction affect the quantification and attribution of simulated extremes and variability in i) climatological variables and ii) impacts on ecosystem functioning as simulated by a terrestrial biosphere model. Our study demonstrates that assessments of simulated climatic extreme events and impacts in the terrestrial biosphere are highly sensitive to bias correction schemes, with major implications for the detection and attribution of these events. We introduce a novel ensemble-based resampling scheme based on a large regional climate model ensemble generated by the distributed weather@home setup [1], which fully preserves the physical consistency and multivariate correlation structure of the model output. We use extreme value statistics to show that this procedure considerably improves the representation of climatic extremes and variability. Subsequently, biosphere-atmosphere carbon fluxes are simulated using a terrestrial ecosystem model (LPJ-GSI) to further demonstrate the sensitivity of ecosystem impacts to the methodology of bias correcting climate model output. We find that uncertainties arising from bias correction schemes are comparable in magnitude to model structural and parameter uncertainties. 
The present study constitutes a first attempt to alleviate climate model biases in a physically consistent way and demonstrates that this yields improved simulations of climate extremes and associated impacts. [1] http://www.climateprediction.net/weatherathome/
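For context, a minimal sketch of the kind of routine univariate bias correction the abstract criticizes: empirical quantile mapping, which matches the model's distribution to the observed one variable by variable. Applied independently per variable it does not preserve the multivariate correlation structure, which motivates the resampling alternative. The toy data and shapes are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for biased climate model output and observations.
model = rng.normal(loc=2.0, scale=1.5, size=5000)   # warm, too-wide bias
obs = rng.normal(loc=0.0, scale=1.0, size=5000)

def quantile_map(x, model_sample, obs_sample):
    """Empirical quantile mapping: map each model value to the observed
    value at the same quantile. A standard *univariate* correction; done
    variable by variable it breaks multivariate correlations."""
    quantiles = np.searchsorted(np.sort(model_sample), x) / len(model_sample)
    quantiles = np.clip(quantiles, 0.0, 1.0)
    return np.quantile(obs_sample, quantiles)

corrected = quantile_map(model, model, obs)
# After correction the marginal distribution matches the observations
# (mean ~0, std ~1), regardless of the original model bias.
```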

  16. Perception-based Impact upon Community Resilience in the Aftermath of Natural Disasters

    NASA Astrophysics Data System (ADS)

    Fernandez, S.; Li, H.

    2008-05-01

    In the event of natural disasters, recovery from the direct and indirect effects of the disaster phenomena is a topic of most community emergency response plans. In addition to the direct and indirect impacts that represent changes in activity that can be tied directly to an event, additional perception-based impacts are possible. Usually these perception-based impacts are larger and more difficult to measure or mitigate than direct impacts. These impacts are based primarily on the affected population's changes in attitudes toward a particular neighborhood or region, based on fear of a future event or future losses. These impacts can be motivated by fear of future storms or man-caused events, lingering toxic contamination that may or may not have been removed, and any other behavior by individuals that cannot be explained by actual events or calculated measures of risk or uncertainty. Perception-based impacts are often difficult to estimate directly. In many instances, case studies of comparable events are used to develop judgmental estimates of possible future impacts on the area in question. For example, impacts from such events as Love Canal, the Three Mile Island nuclear accident, the September 11 attacks, and the Goiânia radioactive material spill are used to get a sense of the severity and duration of possible perception-based impacts of a particular event. Perception-based impacts can include additional losses in property value and losses in population (or reduced population growth due to lower migration rates) that cannot be attributed to actual economic and demographic changes tied directly to the event. Additional perception-based impacts can include long-term worker absenteeism by an asymptomatic public (i.e., the worried well), losses in tourism, and losses in cargo at transportation hubs due to fears by shippers and recipients who choose alternative modes of transportation for shipping goods into the affected area. 
Another proxy for perception-based impacts from man-caused events can be additional security expenditures by government, the private sector, and individuals. This presentation will describe potential methodologies for estimating or anticipating these potential impacts for generic planning scenarios.

  17. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
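A minimal sketch of the idea of a Pc *distribution* rather than a Pc *point estimate*: sample the uncertain inputs (a covariance scale factor and the hard-body radius) and propagate each draw through a Pc formula. The small-HBR 2-D approximation and all numerical inputs below are illustrative assumptions, not CARA's operational algorithm or values.

```python
import numpy as np

rng = np.random.default_rng(42)

def pc_point_estimate(miss_x, miss_y, sx, sy, hbr):
    """Common small-HBR approximation of the 2-D collision probability:
    combined Gaussian density at the miss vector times the hard-body disc
    area (valid when hbr << sx, sy)."""
    density = np.exp(-0.5 * ((miss_x / sx) ** 2 + (miss_y / sy) ** 2)) / (2 * np.pi * sx * sy)
    return np.pi * hbr ** 2 * density

# Illustrative encounter-plane inputs (assumed), metres:
miss_x, miss_y = 300.0, 200.0
sx, sy = 400.0, 250.0
hbr_nominal = 20.0

# Sample the *input* uncertainties: a lognormal covariance realism factor
# and a spread on the hard-body radius, then propagate to Pc.
scale = rng.lognormal(mean=0.0, sigma=0.3, size=20000)
hbr = rng.normal(hbr_nominal, 3.0, size=20000).clip(min=1.0)
pc_samples = pc_point_estimate(miss_x, miss_y, sx * np.sqrt(scale), sy * np.sqrt(scale), hbr)

# The output is a PDF of Pc; its percentiles, not one number, drive risk posture:
p50, p95 = np.percentile(pc_samples, [50, 95])
```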

  18. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention; so far, however, epistemic location uncertainty has not been the focus of much research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
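The core idea can be sketched as follows: risk items whose coordinates are unknown are sampled over their admissible region, and the portfolio loss is recomputed per sample so that location uncertainty shows up as loss variability. Everything below (the box region, exponential vulnerability proxy, values) is an illustrative assumption, not the paper's framework.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy setup: an event epicentre, three risk items of known value but unknown
# coordinates, each sampled uniformly over an admissible box (standing in for
# e.g. a postal-code polygon).
epicentre = np.array([0.0, 0.0])
values = np.array([10.0, 5.0, 8.0])        # insured values per risk item
region_lo, region_hi = -50.0, 50.0         # km, admissible box for unknown locations

def mean_damage_ratio(distance_km):
    """Crude distance-decay stand-in for ground-motion attenuation + vulnerability."""
    return np.exp(-distance_km / 30.0)

def sampled_portfolio_loss(n_samples=10000):
    losses = np.empty(n_samples)
    for i in range(n_samples):
        coords = rng.uniform(region_lo, region_hi, size=(len(values), 2))
        dist = np.linalg.norm(coords - epicentre, axis=1)
        losses[i] = np.sum(values * mean_damage_ratio(dist))
    return losses

losses = sampled_portfolio_loss()
loss_mean, loss_std = losses.mean(), losses.std()  # location uncertainty -> loss variability
```

Variance reduction techniques (stratified or quasi-random location sampling, for instance) aim to shrink the number of samples needed for a stable `loss_std`.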

  19. A Bootstrap-Based Probabilistic Optimization Method to Explore and Efficiently Converge in Solution Spaces of Earthquake Source Parameter Estimation Problems: Application to Volcanic and Tectonic Earthquakes

    NASA Astrophysics Data System (ADS)

    Dahm, T.; Heimann, S.; Isken, M.; Vasyura-Bathke, H.; Kühn, D.; Sudhaus, H.; Kriegerowski, M.; Daout, S.; Steinberg, A.; Cesca, S.

    2017-12-01

    Seismic source and moment tensor waveform inversion is often ill-posed or non-unique if station coverage is poor or signals are weak. The interpretation of moment tensors can therefore become difficult if the full model space, including all its trade-offs and uncertainties, is not explored. This is especially true for non-double-couple components of weak or shallow earthquakes, as for instance found in volcanic, geothermal or mining environments. We developed a bootstrap-based probabilistic optimization scheme (Grond), which is based on pre-calculated Green's function full-waveform databases (e.g. the fomosto tool, doi.org/10.5880/GFZ.2.1.2017.001). Grond is able to efficiently explore the full model space, the trade-offs and the uncertainties of source parameters. The program is highly flexible with respect to adaptation to specific problems, the design of objective functions, and the diversity of empirical datasets. It uses integrated, robust waveform data processing based on a newly developed Python toolbox for seismology (Pyrocko, see Heimann et al., 2017, http://doi.org/10.5880/GFZ.2.1.2017.001), and allows for visual inspection of many aspects of the optimization problem. Grond has been applied to CMT moment tensor inversion using W-phases, to nuclear explosions in Korea, to meteorite atmospheric explosions, to volcano-tectonic events during caldera collapse, and to intra-plate volcanic and tectonic crustal events. Grond can be used to simultaneously optimize seismological waveforms, amplitude spectra and static displacements from geodetic data such as InSAR and GPS (e.g. KITE, Isken et al., 2017, http://doi.org/10.5880/GFZ.2.1.2017.002). We present examples of Grond optimizations to demonstrate the advantage of a full exploration of source parameter uncertainties for interpretation.
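The bootstrap idea behind such schemes can be illustrated in miniature: randomly reweight the per-station contributions to the objective function and re-estimate the source parameter under each weighting; the spread of the resulting estimates maps the parameter uncertainty. This one-parameter toy is an assumption-laden sketch, not Grond's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for per-station waveform misfits: one noisy observation of a
# single source parameter m per station, estimated by a weighted mean.
true_m = 3.0
n_stations = 25
obs = true_m + rng.normal(0.0, 0.5, size=n_stations)

def weighted_estimate(weights):
    return np.sum(weights * obs) / np.sum(weights)

# Each bootstrap chain draws random station weights (here exponential, a
# common choice for a Bayesian-bootstrap-style reweighting); the spread of
# the chain optima approximates the parameter uncertainty.
n_chains = 2000
estimates = np.array([
    weighted_estimate(rng.exponential(1.0, size=n_stations)) for _ in range(n_chains)
])
m_best, m_sigma = estimates.mean(), estimates.std()
```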

  20. Using a High-Resolution Ensemble Modeling Method to Inform Risk-Based Decision-Making at Taylor Park Dam, Colorado

    NASA Astrophysics Data System (ADS)

    Mueller, M.; Mahoney, K. M.; Holman, K. D.

    2015-12-01

    The Bureau of Reclamation (Reclamation) is responsible for the safety of Taylor Park Dam, located in central Colorado at an elevation of 9300 feet. A key aspect of dam safety is anticipating extreme precipitation, runoff, and the associated inflow of water to the reservoir within a probabilistic framework for risk analyses. The Cooperative Institute for Research in Environmental Sciences (CIRES) has partnered with Reclamation to improve understanding and estimation of precipitation in the western United States, including the Taylor Park watershed. A significant challenge is that Taylor Park Dam is located in a relatively data-sparse region, surrounded by mountains exceeding 12,000 feet. To better estimate heavy precipitation events in this basin, a high-resolution modeling approach is used. The Weather Research and Forecasting (WRF) model is employed to simulate events that have produced observed peaks in streamflow at the location of interest. Importantly, an ensemble of model simulations is run for each event so that uncertainty bounds (i.e., forecast error) can be provided and the model outputs used more effectively in Reclamation's risk assessment framework. Model estimates of precipitation (and the uncertainty thereof) are then used in rainfall-runoff models to determine the probability of inflows to the reservoir for use in Reclamation's dam safety risk analyses.
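The way ensemble output turns into uncertainty bounds can be sketched simply: each member yields one precipitation total for the event, and percentiles of the member spread give the bounds passed downstream. The member count and synthetic totals below are assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for a multi-member WRF run of one storm event: each member
# produces a basin-average precipitation total (mm).
n_members = 20
member_precip_mm = rng.normal(80.0, 12.0, size=n_members).clip(min=0.0)

# A central estimate plus percentile bounds from the ensemble spread are
# what a rainfall-runoff / risk framework can consume.
best_estimate = np.median(member_precip_mm)
lo, hi = np.percentile(member_precip_mm, [10, 90])  # uncertainty bounds
```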

  1. Jet energy measurement and its systematic uncertainty in proton-proton collisions at [Formula: see text] TeV with the ATLAS detector.

    PubMed

    Aad, G; Abajyan, T; Abbott, B; et al. (ATLAS Collaboration)
S; Merritt, H; Messina, A; Metcalfe, J; Mete, A S; Meyer, C; Meyer, C; Meyer, J-P; Meyer, J; Meyer, J; Michal, S; Middleton, R P; Migas, S; Mijović, L; Mikenberg, G; Mikestikova, M; Mikuž, M; Miller, D W; Mills, C; Milov, A; Milstead, D A; Milstein, D; Minaenko, A A; Miñano Moya, M; Minashvili, I A; Mincer, A I; Mindur, B; Mineev, M; Ming, Y; Mir, L M; Mirabelli, G; Mitani, T; Mitrevski, J; Mitsou, V A; Mitsui, S; Miyagawa, P S; Mjörnmark, J U; Moa, T; Moeller, V; Mohapatra, S; Mohr, W; Molander, S; Moles-Valls, R; Molfetas, A; Mönig, K; Monini, C; Monk, J; Monnier, E; Montejo Berlingen, J; Monticelli, F; Monzani, S; Moore, R W; Mora Herrera, C; Moraes, A; Morange, N; Morel, J; Moreno, D; Moreno Llácer, M; Morettini, P; Morgenstern, M; Morii, M; Moritz, S; Morley, A K; Mornacchi, G; Morris, J D; Morvaj, L; Moser, H G; Mosidze, M; Moss, J; Mount, R; Mountricha, E; Mouraviev, S V; Moyse, E J W; Mudd, R D; Mueller, F; Mueller, J; Mueller, K; Mueller, T; Mueller, T; Muenstermann, D; Munwes, Y; Murillo Quijada, J A; Murray, W J; Mussche, I; Musto, E; Myagkov, A G; Myska, M; Nackenhorst, O; Nadal, J; Nagai, K; Nagai, R; Nagai, Y; Nagano, K; Nagarkar, A; Nagasaka, Y; Nagel, M; Nairz, A M; Nakahama, Y; Nakamura, K; Nakamura, T; Nakano, I; Namasivayam, H; Nanava, G; Napier, A; Narayan, R; Nash, M; Nattermann, T; Naumann, T; Navarro, G; Neal, H A; Nechaeva, P Yu; Neep, T J; Negri, A; Negri, G; Negrini, M; Nektarijevic, S; Nelson, A; Nelson, T K; Nemecek, S; Nemethy, P; Nepomuceno, A A; Nessi, M; Neubauer, M S; Neumann, M; Neusiedl, A; Neves, R M; Nevski, P; Newcomer, F M; Newman, P R; Nguyen, D H; Nguyen Thi Hong, V; Nickerson, R B; Nicolaidou, R; Nicquevert, B; Nielsen, J; Nikiforou, N; Nikiforov, A; Nikolaenko, V; Nikolic-Audit, I; Nikolics, K; Nikolopoulos, K; Nilsson, P; Ninomiya, Y; Nisati, A; Nisius, R; Nobe, T; Nodulman, L; Nomachi, M; Nomidis, I; Norberg, S; Nordberg, M; Novakova, J; Nozaki, M; Nozka, L; Ntekas, K; Nuncio-Quiroz, A-E; Nunes Hanninger, G; Nunnemann, 
T; Nurse, E; O'Brien, B J; O'Grady, F; O'Neil, D C; O'Shea, V; Oakes, L B; Oakham, F G; Oberlack, H; Ocariz, J; Ochi, A; Ochoa, M I; Oda, S; Odaka, S; Ogren, H; Oh, A; Oh, S H; Ohm, C C; Ohshima, T; Okamura, W; Okawa, H; Okumura, Y; Okuyama, T; Olariu, A; Olchevski, A G; Olivares Pino, S A; Oliveira, M; Oliveira Damazio, D; Oliver Garcia, E; Olivito, D; Olszewski, A; Olszowska, J; Onofre, A; Onyisi, P U E; Oram, C J; Oreglia, M J; Oren, Y; Orestano, D; Orlando, N; Oropeza Barrera, C; Orr, R S; Osculati, B; Ospanov, R; Otero Y Garzon, G; Otono, H; Ouchrif, M; Ouellette, E A; Ould-Saada, F; Ouraou, A; Oussoren, K P; Ouyang, Q; Ovcharova, A; Owen, M; Owen, S; Ozcan, V E; Ozturk, N; Pachal, K; Pacheco Pages, A; Padilla Aranda, C; Pagan Griso, S; Paganis, E; Pahl, C; Paige, F; Pais, P; Pajchel, K; Palacino, G; Palestini, S; Pallin, D; Palma, A; Palmer, J D; Pan, Y B; Panagiotopoulou, E; Panduro Vazquez, J G; Pani, P; Panikashvili, N; Panitkin, S; Pantea, D; Papadopoulou, Th D; Papageorgiou, K; Paramonov, A; Paredes Hernandez, D; Parker, M A; Parodi, F; Parsons, J A; Parzefall, U; Pashapour, S; Pasqualucci, E; Passaggio, S; Passeri, A; Pastore, F; Pastore, Fr; Pásztor, G; Pataraia, S; Patel, N D; Pater, J R; Patricelli, S; Pauly, T; Pearce, J; Pedersen, M; Pedraza Lopez, S; Pedro, R; Peleganchuk, S V; Pelikan, D; Peng, H; Penning, B; Penwell, J; Perepelitsa, D V; Perez Cavalcanti, T; Perez Codina, E; Pérez García-Estañ, M T; Perez Reale, V; Perini, L; Pernegger, H; Perrino, R; Peschke, R; Peshekhonov, V D; Peters, K; Peters, R F Y; Petersen, B A; Petersen, J; Petersen, T C; Petit, E; Petridis, A; Petridou, C; Petrolo, E; Petrucci, F; Petteni, M; Pezoa, R; Phillips, P W; Piacquadio, G; Pianori, E; Picazio, A; Piccaro, E; Piccinini, M; Piec, S M; Piegaia, R; Pignotti, D T; Pilcher, J E; Pilkington, A D; Pina, J; Pinamonti, M; Pinder, A; Pinfold, J L; Pingel, A; Pinto, B; Pizio, C; Pleier, M-A; Pleskot, V; Plotnikova, E; Plucinski, P; Poddar, S; Podlyski, F; Poettgen, R; 
Poggioli, L; Pohl, D; Pohl, M; Polesello, G; Policicchio, A; Polifka, R; Polini, A; Pollard, C S; Polychronakos, V; Pomeroy, D; Pommès, K; Pontecorvo, L; Pope, B G; Popeneciu, G A; Popovic, D S; Poppleton, A; Portell Bueso, X; Pospelov, G E; Pospisil, S; Potamianos, K; Potrap, I N; Potter, C J; Potter, C T; Poulard, G; Poveda, J; Pozdnyakov, V; Prabhu, R; Pralavorio, P; Pranko, A; Prasad, S; Pravahan, R; Prell, S; Price, D; Price, J; Price, L E; Prieur, D; Primavera, M; Proissl, M; Prokofiev, K; Prokoshin, F; Protopapadaki, E; Protopopescu, S; Proudfoot, J; Prudent, X; Przybycien, M; Przysiezniak, H; Psoroulas, S; Ptacek, E; Pueschel, E; Puldon, D; Purohit, M; Puzo, P; Pylypchenko, Y; Qian, J; Quadt, A; Quarrie, D R; Quayle, W B; Quilty, D; Radeka, V; Radescu, V; Radhakrishnan, S K; Radloff, P; Ragusa, F; Rahal, G; Rajagopalan, S; Rammensee, M; Rammes, M; Randle-Conde, A S; Rangel-Smith, C; Rao, K; Rauscher, F; Rave, T C; Ravenscroft, T; Raymond, M; Read, A L; Rebuzzi, D M; Redelbach, A; Redlinger, G; Reece, R; Reeves, K; Reinsch, A; Reisin, H; Reisinger, I; Relich, M; Rembser, C; Ren, Z L; Renaud, A; Rescigno, M; Resconi, S; Resende, B; Reznicek, P; Rezvani, R; Richter, R; Ridel, M; Rieck, P; Rijssenbeek, M; Rimoldi, A; Rinaldi, L; Ritsch, E; Riu, I; Rivoltella, G; Rizatdinova, F; Rizvi, E; Robertson, S H; Robichaud-Veronneau, A; Robinson, D; Robinson, J E M; Robson, A; Rocha de Lima, J G; Roda, C; Roda Dos Santos, D; Rodrigues, L; Roe, S; Røhne, O; Rolli, S; Romaniouk, A; Romano, M; Romeo, G; Romero Adam, E; Rompotis, N; Roos, L; Ros, E; Rosati, S; Rosbach, K; Rose, A; Rose, M; Rosendahl, P L; Rosenthal, O; Rossetti, V; Rossi, E; Rossi, L P; Rosten, R; Rotaru, M; Roth, I; Rothberg, J; Rousseau, D; Royon, C R; Rozanov, A; Rozen, Y; Ruan, X; Rubbo, F; Rubinskiy, I; Rud, V I; Rudolph, C; Rudolph, M S; Rühr, F; Ruiz-Martinez, A; Rumyantsev, L; Rurikova, Z; Rusakovich, N A; Ruschke, A; Rutherfoord, J P; Ruthmann, N; Ruzicka, P; Ryabov, Y F; Rybar, M; Rybkin, G; Ryder, 
N C; Saavedra, A F; Sacerdoti, S; Saddique, A; Sadeh, I; Sadrozinski, H F-W; Sadykov, R; Safai Tehrani, F; Sakamoto, H; Sakurai, Y; Salamanna, G; Salamon, A; Saleem, M; Salek, D; Sales De Bruin, P H; Salihagic, D; Salnikov, A; Salt, J; Salvachua Ferrando, B M; Salvatore, D; Salvatore, F; Salvucci, A; Salzburger, A; Sampsonidis, D; Sanchez, A; Sánchez, J; Sanchez Martinez, V; Sandaker, H; Sander, H G; Sanders, M P; Sandhoff, M; Sandoval, T; Sandoval, C; Sandstroem, R; Sankey, D P C; Sansoni, A; Santoni, C; Santonico, R; Santos, H; Santoyo Castillo, I; Sapp, K; Sapronov, A; Saraiva, J G; Sarkisyan-Grinbaum, E; Sarrazin, B; Sartisohn, G; Sasaki, O; Sasaki, Y; Sasao, N; Satsounkevitch, I; Sauvage, G; Sauvan, E; Sauvan, J B; Savard, P; Savinov, V; Savu, D O; Sawyer, C; Sawyer, L; Saxon, D H; Saxon, J; Sbarra, C; Sbrizzi, A; Scanlon, T; Scannicchio, D A; Scarcella, M; Schaarschmidt, J; Schacht, P; Schaefer, D; Schaelicke, A; Schaepe, S; Schaetzel, S; Schäfer, U; Schaffer, A C; Schaile, D; Schamberger, R D; Scharf, V; Schegelsky, V A; Scheirich, D; Schernau, M; Scherzer, M I; Schiavi, C; Schieck, J; Schillo, C; Schioppa, M; Schlenker, S; Schmidt, E; Schmieden, K; Schmitt, C; Schmitt, C; Schmitt, S; Schneider, B; Schnellbach, Y J; Schnoor, U; Schoeffel, L; Schoening, A; Schoenrock, B D; Schorlemmer, A L S; Schott, M; Schouten, D; Schovancova, J; Schram, M; Schramm, S; Schreyer, M; Schroeder, C; Schroer, N; Schuh, N; Schultens, M J; Schultz-Coulon, H-C; Schulz, H; Schumacher, M; Schumm, B A; Schune, Ph; Schwartzman, A; Schwegler, Ph; Schwemling, Ph; Schwienhorst, R; Schwindling, J; Schwindt, T; Schwoerer, M; Sciacca, F G; Scifo, E; Sciolla, G; Scott, W G; Scutti, F; Searcy, J; Sedov, G; Sedykh, E; Seidel, S C; Seiden, A; Seifert, F; Seixas, J M; Sekhniaidze, G; Sekula, S J; Selbach, K E; Seliverstov, D M; Sellers, G; Seman, M; Semprini-Cesari, N; Serfon, C; Serin, L; Serkin, L; Serre, T; Seuster, R; Severini, H; Sforza, F; Sfyrla, A; Shabalina, E; Shamim, M; Shan, L Y; 
Shank, J T; Shao, Q T; Shapiro, M; Shatalov, P B; Shaw, K; Sherwood, P; Shimizu, S; Shimojima, M; Shin, T; Shiyakova, M; Shmeleva, A; Shochet, M J; Short, D; Shrestha, S; Shulga, E; Shupe, M A; Shushkevich, S; Sicho, P; Sidorov, D; Sidoti, A; Siegert, F; Sijacki, Dj; Silbert, O; Silva, J; Silver, Y; Silverstein, D; Silverstein, S B; Simak, V; Simard, O; Simic, Lj; Simion, S; Simioni, E; Simmons, B; Simoniello, R; Simonyan, M; Sinervo, P; Sinev, N B; Sipica, V; Siragusa, G; Sircar, A; Sisakyan, A N; Sivoklokov, S Yu; Sjölin, J; Sjursen, T B; Skinnari, L A; Skottowe, H P; Skovpen, K Yu; Skubic, P; Slater, M; Slavicek, T; Sliwa, K; Smakhtin, V; Smart, B H; Smestad, L; Smirnov, S Yu; Smirnov, Y; Smirnova, L N; Smirnova, O; Smith, K M; Smizanska, M; Smolek, K; Snesarev, A A; Snidero, G; Snow, J; Snyder, S; Sobie, R; Socher, F; Sodomka, J; Soffer, A; Soh, D A; Solans, C A; Solar, M; Solc, J; Soldatov, E Yu; Soldevila, U; Solfaroli Camillocci, E; Solodkov, A A; Solovyanov, O V; Solovyev, V; Soni, N; Sood, A; Sopko, V; Sopko, B; Sosebee, M; Soualah, R; Soueid, P; Soukharev, A M; South, D; Spagnolo, S; Spanò, F; Spearman, W R; Spighi, R; Spigo, G; Spousta, M; Spreitzer, T; Spurlock, B; St Denis, R D; Stahlman, J; Stamen, R; Stanecka, E; Stanek, R W; Stanescu, C; Stanescu-Bellu, M; Stanitzki, M M; Stapnes, S; Starchenko, E A; Stark, J; Staroba, P; Starovoitov, P; Staszewski, R; Stavina, P; Steele, G; Steinbach, P; Steinberg, P; Stekl, I; Stelzer, B; Stelzer, H J; Stelzer-Chilton, O; Stenzel, H; Stern, S; Stewart, G A; Stillings, J A; Stockton, M C; Stoebe, M; Stoerig, K; Stoicea, G; Stonjek, S; Stradling, A R; Straessner, A; Strandberg, J; Strandberg, S; Strandlie, A; Strauss, E; Strauss, M; Strizenec, P; Ströhmer, R; Strom, D M; Stroynowski, R; Stucci, S A; Stugu, B; Stumer, I; Stupak, J; Sturm, P; Styles, N A; Su, D; Su, J; Subramania, Hs; Subramaniam, R; Succurro, A; Sugaya, Y; Suhr, C; Suk, M; Sulin, V V; Sultansoy, S; Sumida, T; Sun, X; Sundermann, J E; Suruliz, K; 
Susinno, G; Sutton, M R; Suzuki, Y; Svatos, M; Swedish, S; Swiatlowski, M; Sykora, I; Sykora, T; Ta, D; Tackmann, K; Taenzer, J; Taffard, A; Tafirout, R; Taiblum, N; Takahashi, Y; Takai, H; Takashima, R; Takeda, H; Takeshita, T; Takubo, Y; Talby, M; Talyshev, A A; Tam, J Y C; Tamsett, M C; Tan, K G; Tanaka, J; Tanaka, R; Tanaka, S; Tanaka, S; Tanasijczuk, A J; Tani, K; Tannoury, N; Tapprogge, S; Tarem, S; Tarrade, F; Tartarelli, G F; Tas, P; Tasevsky, M; Tashiro, T; Tassi, E; Tavares Delgado, A; Tayalati, Y; Taylor, C; Taylor, F E; Taylor, G N; Taylor, W; Teischinger, F A; Teixeira Dias Castanheira, M; Teixeira-Dias, P; Temming, K K; Ten Kate, H; Teng, P K; Terada, S; Terashi, K; Terron, J; Terzo, S; Testa, M; Teuscher, R J; Therhaag, J; Theveneaux-Pelzer, T; Thoma, S; Thomas, J P; Thompson, E N; Thompson, P D; Thompson, P D; Thompson, A S; Thomsen, L A; Thomson, E; Thomson, M; Thong, W M; Thun, R P; Tian, F; Tibbetts, M J; Tic, T; Tikhomirov, V O; Tikhonov, Yu A; Timoshenko, S; Tiouchichine, E; Tipton, P; Tisserant, S; Todorov, T; Todorova-Nova, S; Toggerson, B; Tojo, J; Tokár, S; Tokushuku, K; Tollefson, K; Tomlinson, L; Tomoto, M; Tompkins, L; Toms, K; Topilin, N D; Torrence, E; Torres, H; Torró Pastor, E; Toth, J; Touchard, F; Tovey, D R; Tran, H L; Trefzger, T; Tremblet, L; Tricoli, A; Trigger, I M; Trincaz-Duvoid, S; Tripiana, M F; Triplett, N; Trischuk, W; Trocmé, B; Troncon, C; Trottier-McDonald, M; Trovatelli, M; True, P; Trzebinski, M; Trzupek, A; Tsarouchas, C; Tseng, J C-L; Tsiareshka, P V; Tsionou, D; Tsipolitis, G; Tsirintanis, N; Tsiskaridze, S; Tsiskaridze, V; Tskhadadze, E G; Tsukerman, I I; Tsulaia, V; Tsung, J-W; Tsuno, S; Tsybychev, D; Tua, A; Tudorache, A; Tudorache, V; Tuggle, J M; Tuna, A N; Tupputi, S A; Turchikhin, S; Turecek, D; Turk Cakir, I; Turra, R; Tuts, P M; Tykhonov, A; Tylmad, M; Tyndel, M; Uchida, K; Ueda, I; Ueno, R; Ughetto, M; Ugland, M; Uhlenbrock, M; Ukegawa, F; Unal, G; Undrus, A; Unel, G; Ungaro, F C; Unno, Y; Urbaniec, D; 
Urquijo, P; Usai, G; Usanova, A; Vacavant, L; Vacek, V; Vachon, B; Valencic, N; Valentinetti, S; Valero, A; Valery, L; Valkar, S; Valladolid Gallego, E; Vallecorsa, S; Valls Ferrer, J A; Van Berg, R; Van Der Deijl, P C; van der Geer, R; van der Graaf, H; Van Der Leeuw, R; van der Ster, D; van Eldik, N; van Gemmeren, P; Van Nieuwkoop, J; van Vulpen, I; van Woerden, M C; Vanadia, M; Vandelli, W; Vaniachine, A; Vankov, P; Vannucci, F; Vardanyan, G; Vari, R; Varnes, E W; Varol, T; Varouchas, D; Vartapetian, A; Varvell, K E; Vassilakopoulos, V I; Vazeille, F; Vazquez Schroeder, T; Veatch, J; Veloso, F; Veneziano, S; Ventura, A; Ventura, D; Venturi, M; Venturi, N; Venturini, A; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Viazlo, O; Vichou, I; Vickey, T; Vickey Boeriu, O E; Viehhauser, G H A; Viel, S; Vigne, R; Villa, M; Villaplana Perez, M; Vilucchi, E; Vincter, M G; Vinogradov, V B; Virzi, J; Vitells, O; Viti, M; Vivarelli, I; Vives Vaque, F; Vlachos, S; Vladoiu, D; Vlasak, M; Vogel, A; Vokac, P; Volpi, G; Volpi, M; Volpini, G; von der Schmitt, H; von Radziewski, H; von Toerne, E; Vorobel, V; Vos, M; Voss, R; Vossebeld, J H; Vranjes, N; Vranjes Milosavljevic, M; Vrba, V; Vreeswijk, M; Vu Anh, T; Vuillermet, R; Vukotic, I; Vykydal, Z; Wagner, W; Wagner, P; Wahrmund, S; Wakabayashi, J; Walch, S; Walder, J; Walker, R; Walkowiak, W; Wall, R; Waller, P; Walsh, B; Wang, C; Wang, H; Wang, H; Wang, J; Wang, J; Wang, K; Wang, R; Wang, S M; Wang, T; Wang, X; Warburton, A; Ward, C P; Wardrope, D R; Warsinsky, M; Washbrook, A; Wasicki, C; Watanabe, I; Watkins, P M; Watson, A T; Watson, I J; Watson, M F; Watts, G; Watts, S; Waugh, A T; Waugh, B M; Webb, S; Weber, M S; Weber, S W; Webster, J S; Weidberg, A R; Weigell, P; Weingarten, J; Weiser, C; Weits, H; Wells, P S; Wenaus, T; Wendland, D; Weng, Z; Wengler, T; Wenig, S; Wermes, N; Werner, M; Werner, P; Wessels, M; Wetter, J; Whalen, K; White, A; White, M J; White, R; White, S; Whiteson, D; 
Whittington, D; Wicke, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wienemann, P; Wiglesworth, C; Wiik-Fuchs, L A M; Wijeratne, P A; Wildauer, A; Wildt, M A; Wilhelm, I; Wilkens, H G; Will, J Z; Williams, H H; Williams, S; Willis, W; Willocq, S; Wilson, J A; Wilson, A; Wingerter-Seez, I; Winkelmann, S; Winklmeier, F; Wittgen, M; Wittig, T; Wittkowski, J; Wollstadt, S J; Wolter, M W; Wolters, H; Wong, W C; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wraight, K; Wright, M; Wu, S L; Wu, X; Wu, Y; Wulf, E; Wyatt, T R; Wynne, B M; Xella, S; Xiao, M; Xu, C; Xu, D; Xu, L; Yabsley, B; Yacoob, S; Yamada, M; Yamaguchi, H; Yamaguchi, Y; Yamamoto, A; Yamamoto, K; Yamamoto, S; Yamamura, T; Yamanaka, T; Yamauchi, K; Yamazaki, Y; Yan, Z; Yang, H; Yang, H; Yang, U K; Yang, Y; Yanush, S; Yao, L; Yasu, Y; Yatsenko, E; Yau Wong, K H; Ye, J; Ye, S; Yen, A L; Yildirim, E; Yilmaz, M; Yoosoofmiya, R; Yorita, K; Yoshida, R; Yoshihara, K; Young, C; Young, C J S; Youssef, S; Yu, D R; Yu, J; Yu, J; Yuan, L; Yurkewicz, A; Zabinski, B; Zaidan, R; Zaitsev, A M; Zaman, A; Zambito, S; Zanello, L; Zanzi, D; Zaytsev, A; Zeitnitz, C; Zeman, M; Zemla, A; Zengel, K; Zenin, O; Ženiš, T; Zerwas, D; Zevi Della Porta, G; Zhang, D; Zhang, H; Zhang, J; Zhang, L; Zhang, X; Zhang, Z; Zhao, Z; Zhemchugov, A; Zhong, J; Zhou, B; Zhou, L; Zhou, N; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zibell, A; Zieminska, D; Zimine, N I; Zimmermann, C; Zimmermann, R; Zimmermann, S; Zimmermann, S; Zinonos, Z; Ziolkowski, M; Zitoun, R; Zobernig, G; Zoccoli, A; Zur Nedden, M; Zurzolo, G; Zutshi, V; Zwalinski, L

    The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton-proton collision data with a centre-of-mass energy of √s = 7 TeV corresponding to an integrated luminosity of 4.7 fb⁻¹. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ pT(jet) < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton-proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1 % is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ pT(jet) < 500 GeV. For central jets at lower pT, the uncertainty is about 3 %. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton-proton collisions and test-beam data, which also provide the estimate for pT(jet) > 1 TeV. The calibration of forward jets is derived from dijet pT balance measurements. The resulting uncertainty reaches its largest value of 6 % for low-pT jets at |η| = 4.5. Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed.
The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5-3 %.

  2. Jet energy measurement and its systematic uncertainty in proton–proton collisions at √s = 7 TeV with the ATLAS detector

    DOE PAGES

    Aad, G.

    2015-01-15

    The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton–proton collision data with a centre-of-mass energy of √s = 7 TeV corresponding to an integrated luminosity of 4.7 fb⁻¹. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ pT(jet) < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton–proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1 % is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ pT(jet) < 500 GeV. For central jets at lower pT, the uncertainty is about 3 %. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton–proton collisions and test-beam data, which also provide the estimate for pT(jet) > 1 TeV. The calibration of forward jets is derived from dijet pT balance measurements. The resulting uncertainty reaches its largest value of 6 % for low-pT jets at |η| = 4.5.
In addition, JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5–3 %.
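Abstracts like this one list many independent in situ uncertainty components; for a given jet these are typically combined in quadrature into a total JES uncertainty. A minimal sketch of that combination follows, with hypothetical component names and magnitudes (they are not values from the paper):

```python
import math

# Hypothetical fractional JES uncertainty components for one (pT, eta) bin.
# The names and magnitudes are illustrative, not taken from the measurement.
components = {
    "gamma_jet_balance": 0.008,    # photon + jet pT balance in situ technique
    "Z_jet_balance": 0.006,        # Z + jet pT balance in situ technique
    "pileup_offset": 0.004,        # multiple proton-proton interactions
    "flavour_composition": 0.005,  # quark/gluon content of the event sample
}

def total_jes_uncertainty(comps):
    """Combine statistically independent components in quadrature."""
    return math.sqrt(sum(v * v for v in comps.values()))

total = total_jes_uncertainty(components)  # about 0.012, i.e. a 1.2 % JES uncertainty
```

The quadrature sum assumes the components are uncorrelated; correlated sources would instead require a linear sum or a full covariance treatment.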

  3. Testing the Reviewed Event Bulletin of the International Data Centre Using Waveform Cross Correlation: Repeat Events at Aitik Copper Mine, Sweden

    NASA Astrophysics Data System (ADS)

    Kitov, I. O.; Rozhkov, N.; Bobrov, D.; Rozhkov, M.; Yedlin, M. J.

    2016-12-01

    The quality of the Reviewed Event Bulletin (REB) issued by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is crucial for the Member States as well as for the seismological community. One of the most efficient methods of testing REB quality is to use repeat events having very accurate absolute locations. Hundreds of quarry blasts detonated at the Aitik copper mine (central point of active mining: 67.08°N, 20.95°E) were recorded by several seismic arrays of the International Monitoring System (IMS), found by IDC automatic processing, and then confirmed by analysts as REB events. The size of the quarry is approximately 1 km, so the uncertainty in the absolute coordinates of the studied events can be taken as less than 0.5 km as measured from the central point. In the REB, the corresponding epicenters are almost uniformly scattered over the territory from 67.0°N to 67.3°N and from 20.7°E to 21.5°E. These REB locations are based on the measured arrival times as well as azimuth and slowness estimates at several IMS stations, with the main input from ARCES, NOA, FINES, and HFS. The much larger scatter of the REB locations is caused by uncertainty in the measurements and in the velocity model. Seismological methods based on waveform cross correlation allow very accurate relative location of repeat events. Here we test the level of similarity between signals from these events. We found that the IMS primary array station ARCES demonstrates the highest similarity, as expressed by the cross-correlation coefficient (CC) and the signal-to-noise ratio (SNR) calculated from the CC traces. The small-aperture array FINES is second best, while the large-aperture array NOA demonstrates mediocre performance, likely due to its size and the loss of coherency of the high-frequency, relatively low-velocity signals from the mine. During the last five years, station ARCES has been upgraded from a vertical-component array to a three-component (3-C) one.
This transformation has improved the performance of the CC technique as applied to the Aitik mine events. We have also applied Principal Component Analysis to estimate the level of variability in the signals and to build the best waveform template for effective detection and identification of all blasts conducted at the Aitik mine.
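The repeat-event similarity measure described above can be sketched as a sliding normalized cross-correlation of a master-event template against a continuous trace; the synthetic waveform and noise levels below are illustrative:

```python
import numpy as np

def normalized_cc(template, trace):
    """Sliding normalized cross-correlation of a template against a longer trace.

    Each window of the trace is standardized before the dot product, so the
    output is a Pearson correlation coefficient in [-1, 1] at every lag.
    """
    n = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(trace) - n + 1)
    for lag in range(len(cc)):
        w = trace[lag:lag + n]
        w = (w - w.mean()) / w.std()
        cc[lag] = np.dot(t, w) / n
    return cc

# Synthetic check: a master-event waveform buried between noise segments
# should produce a CC value near 1 exactly at its known offset.
rng = np.random.default_rng(0)
master = rng.standard_normal(100)
trace = np.concatenate([0.1 * rng.standard_normal(50), master,
                        0.1 * rng.standard_normal(50)])
cc = normalized_cc(master, trace)
best_lag = int(np.argmax(cc))  # recovered offset of the repeat event
```

At an array, the CC traces from individual channels would be stacked before picking the maximum, which is what makes small-aperture, coherent arrays such as ARCES effective for this technique.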

  4. On the Timing of Glacial Terminations in the Equatorial Pacific

    NASA Astrophysics Data System (ADS)

    Khider, D.; Ahn, S.; Lisiecki, L. E.; Lawrence, C.; Kienast, M.

    2015-12-01

    Understanding the mechanisms through which the climate system responds to orbital insolation changes requires establishing the timing of events imprinted on the geological record. In this study, we investigate the relative timing of the glacial terminations across the equatorial Pacific in order to identify a possible mechanism through which the tropics may have influenced a global climate response. The relative termination timing between the eastern and western equatorial Pacific was assessed from 15 published SST records based on Globigerinoides ruber Mg/Ca or alkenone thermometry. The novelty of our study lies in accounting for the various sources of uncertainty inherent to paleoclimate reconstruction and timing analysis. Specifically, we use a Monte Carlo procedure that samples possible realizations of the time series as functions of the uncertainty in the alignment of benthic δ18O to a global benthic curve, of the SST uncertainty, and of the uncertainty in the change point, which we use to define the termination timing. We find that the uncertainty on the relative timing estimates is on the order of several thousand years, stemming from age-model uncertainty (60%) and from the uncertainty in change-point detection (40%). Random sources of uncertainty are the main contributors, and, therefore, averaging over larger datasets and/or higher-resolution records should yield more precise and accurate estimates of the relative lead-lag. However, at this time, the number of records is not sufficient to identify any significant differences in the timing of the last three glacial terminations in SST records from the eastern and western tropical Pacific.
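The Monte Carlo treatment of age-model and proxy uncertainty in a change-point timing estimate can be sketched as follows; the record resolution, noise levels, and the two-segment change-point model are illustrative assumptions, not the study's actual algorithm:

```python
import numpy as np

def change_point(ages, values):
    """Age that best splits the series into two constant-mean segments
    (minimum residual sum of squares); a deliberately minimal change-point model."""
    best_i, best_rss = None, np.inf
    for i in range(2, len(values) - 2):
        rss = values[:i].var() * i + values[i:].var() * (len(values) - i)
        if rss < best_rss:
            best_i, best_rss = i, rss
    return ages[best_i]

rng = np.random.default_rng(1)
ages0 = np.linspace(140.0, 120.0, 41)           # kyr BP, 0.5 kyr resolution (hypothetical)
true_sst = np.where(ages0 > 130.0, 24.0, 27.0)  # 3 degC warming step at 130 kyr (illustrative)

# Monte Carlo: perturb the age model (shared shift) and the SST proxy
# (per-sample noise), then re-detect the change point in every realization.
timings = []
for _ in range(500):
    ages = ages0 + rng.normal(0.0, 1.0)                # 1 kyr age-model shift
    sst = true_sst + rng.normal(0.0, 0.3, ages0.size)  # 0.3 degC proxy noise
    timings.append(change_point(ages, sst))
timings = np.array(timings)
spread = timings.std()  # timing uncertainty from both error sources combined
```

The spread of the recovered timings then quantifies how much of the termination-timing uncertainty each error source contributes, which is the kind of partitioning (age model vs. change-point detection) reported in the abstract.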

  5. Estimating groundwater recharge uncertainty from joint application of an aquifer test and the water-table fluctuation method

    NASA Astrophysics Data System (ADS)

    Delottier, H.; Pryet, A.; Lemieux, J. M.; Dupuy, A.

    2018-05-01

    Specific yield and groundwater recharge of unconfined aquifers are both essential parameters for groundwater modeling and sustainable groundwater development, yet the collection of reliable estimates of these parameters remains challenging. Here, a joint approach combining an aquifer test with application of the water-table fluctuation (WTF) method is presented to estimate these parameters and quantify their uncertainty. The approach requires two wells: an observation well instrumented with a pressure probe for long-term monitoring and a pumping well, located in the vicinity, for the aquifer test. The derivative of observed drawdown levels highlights the necessity of representing delayed drainage from the unsaturated zone when interpreting the aquifer test results. Groundwater recharge is estimated with an event-based WTF method in order to minimize the transient effects of flow dynamics in the unsaturated zone. The uncertainty on groundwater recharge is obtained by propagating the uncertainties on the specific yield (from Bayesian inference) and on the groundwater recession dynamics (from regression analysis) through the WTF equation. A major portion of the uncertainty on groundwater recharge originates from the uncertainty on the specific yield. The approach was applied to a site in Bordeaux (France). Groundwater recharge was estimated to be 335 mm with an associated uncertainty of 86.6 mm at 2σ. Given its cost-effective instrumentation and parsimonious methods of interpretation, replication of such a joint approach should be encouraged to provide reliable estimates of specific yield and groundwater recharge over a region of interest. This is necessary to reduce the predictive uncertainty of groundwater management models.
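The propagation step can be illustrated with first-order (Gaussian) error propagation through the WTF equation R = Sy × ΔH; the numbers below are illustrative stand-ins chosen only to echo the order of magnitude reported, not the study's actual inputs:

```python
import math

def wtf_recharge(sy, dh):
    """Event-based water-table fluctuation estimate: R = Sy * dH."""
    return sy * dh

def recharge_uncertainty(sy, sigma_sy, dh, sigma_dh):
    """First-order propagation of independent uncertainties through
    R = Sy * dH: sigma_R^2 = (dH*sigma_Sy)^2 + (Sy*sigma_dH)^2."""
    return math.hypot(dh * sigma_sy, sy * sigma_dh)

# Illustrative numbers only (not the study's values): a specific yield of
# 0.05 +/- 0.01 and a cumulative recession-corrected rise of 6.7 +/- 0.5 m.
sy, sigma_sy = 0.05, 0.01
dh, sigma_dh = 6.7, 0.5

recharge_mm = 1000.0 * wtf_recharge(sy, dh)                          # m -> mm
sigma_mm = 1000.0 * recharge_uncertainty(sy, sigma_sy, dh, sigma_dh)
```

With these numbers the specific-yield term (ΔH·σ_Sy = 0.067 m) dominates the recession term (Sy·σ_ΔH = 0.025 m), mirroring the abstract's observation that most of the recharge uncertainty originates from the specific yield.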

  6. Reliable estimates of predictive uncertainty for an Alpine catchment using a non-parametric methodology

    NASA Astrophysics Data System (ADS)

    Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.

    2017-04-01

    Uncertainty affects hydrological modelling efforts at every stage, from the measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies are computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods when errors exhibit non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric, data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. It is also capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt- and rainfall-driven events in the same catchment). Finally, the training and operation of the model are very fast, making it particularly well suited to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland.
Results are presented and discussed in terms of their reliability and resolution.
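
    As a loose illustration of the non-parametric, data-driven idea (a generic k-nearest-neighbour stand-in, not the authors' Pareto-optimality method), prediction intervals can be read directly off archived forecast-observation pairs: take empirical quantiles of the observations whose forecasts were closest to the new forecast. All values below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy archive of past forecast-observation pairs (synthetic values)
forecast = rng.gamma(2.0, 50.0, 2000)             # streamflow forecasts, m3/s
obs = forecast * rng.lognormal(0.0, 0.3, 2000)    # heteroscedastic "truth"

def empirical_interval(f_new, forecast, obs, k=200, levels=(5, 95)):
    """Non-parametric interval: quantiles of the observations among the
    k archived cases whose forecasts are closest to the new forecast."""
    idx = np.argsort(np.abs(forecast - f_new))[:k]
    return np.percentile(obs[idx], levels)

lo, hi = empirical_interval(120.0, forecast, obs)
print(lo < 120.0 < hi)
```

    Because the interval is built from observed errors conditional on the forecast value, it widens automatically where past errors were large, with no distributional assumption.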

  7. The effects of vent location, event scale and time forecasts on pyroclastic density current hazard maps at Campi Flegrei caldera (Italy)

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Bisson, Marina; Esposti Ongaro, Tomaso; Flandoli, Franco; Isaia, Roberto; Rosi, Mauro; Vitale, Stefano

    2017-09-01

    This study presents a new method for producing long-term hazard maps for pyroclastic density currents (PDC) originating at the Campi Flegrei caldera. The method is based on a doubly stochastic approach and is able to combine uncertainty assessments on the spatial location of the volcanic vent, the size of the flow and the expected time of such an event. The results are obtained by using a Monte Carlo approach and adopting a simplified invasion model based on the box model integral approximation. Temporal assessments are modelled through a Cox-type process including self-excitement effects, based on the eruptive record of the last 15 kyr. Mean and percentile maps of PDC invasion probability are produced, exploring their sensitivity to some sources of uncertainty and to the effects of the dependence between PDC scales and the caldera sector where they originate. Conditional maps representative of PDC originating inside limited zones of the caldera, or of PDC with a limited range of scales, are also produced. Finally, the effect of assuming different time windows for the hazard estimates is explored, also including the potential occurrence of a sequence of multiple events. Assuming that the last eruption of Monte Nuovo (A.D. 1538) marked the beginning of a new epoch of activity similar to the previous ones, results of the statistical analysis indicate a mean probability of PDC invasion above 5% in the next 50 years on almost the entire caldera (with a probability peak of 25% in the central part of the caldera). In contrast, probability values reduce by a factor of about 3 if the entire eruptive record over the last 15 kyr is considered, i.e. including both eruptive epochs and quiescent periods.
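
    The doubly stochastic idea can be sketched in a few lines: sample both the vent location and the flow scale, evaluate a (here trivially simplified, one-dimensional) invasion footprint for each draw, and average the invasion indicators into a probability map. Every distribution and number below is invented for illustration and bears no relation to the actual Campi Flegrei assessment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1D caldera transect: 30 cells of 1 km each
x = np.arange(-15.0, 15.0, 1.0)

n_sim = 20000
# Uncertain vent location: Gaussian around the caldera centre (assumed)
vents = rng.normal(loc=0.0, scale=4.0, size=n_sim)
# Uncertain PDC scale: lognormal runout distance in km (assumed)
runouts = rng.lognormal(mean=1.0, sigma=0.6, size=n_sim)

# A cell is "invaded" if it lies within the sampled runout of the vent
invaded = np.abs(x[None, :] - vents[:, None]) <= runouts[:, None]
p_invasion = invaded.mean(axis=0)   # Monte Carlo invasion probability per cell

print(round(p_invasion.max(), 3))
```

    The probability map peaks near the centre of the vent distribution and decays outward, which is the qualitative behaviour the mean maps in the study capture with a physically meaningful invasion model.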

  8. Ionospheric Response to Extremes in the Space Environment: Establishing Benchmarks for the Space Weather Action Plan.

    NASA Astrophysics Data System (ADS)

    Viereck, R. A.; Azeem, S. I.

    2017-12-01

    One of the goals of the National Space Weather Action Plan is to establish extreme event benchmarks. These benchmarks are estimates of environmental parameters that impact technologies and systems during extreme space weather events. Quantitative assessment of anticipated conditions during these extreme space weather events will enable operators and users of affected technologies to develop plans for mitigating space weather risks and improve preparedness. The ionosphere is one of the most important regions of space because so many applications either depend on ionospheric space weather for their operation (HF communication, over-the-horizon radars) or can be deleteriously affected by ionospheric conditions (e.g., GNSS navigation and timing, UHF satellite communications, synthetic aperture radar, HF communications). Since the processes that influence the ionosphere vary over time scales from seconds to years, it continues to be a challenge to adequately predict its behavior in many circumstances. Estimates with large uncertainties, in excess of 100%, may result in operators of impacted technologies over- or under-preparing for such events. The goal of the next phase of the benchmarking activity is to reduce these uncertainties. In this presentation, we will focus on the sources of uncertainty in the ionospheric response to extreme geomagnetic storms. We will then discuss the research efforts required to better understand the underlying processes of ionospheric variability and how the uncertainties in the ionospheric response to extreme space weather could be reduced and the estimates improved.

  9. From products to processes: Academic events to foster interdisciplinary and iterative dialogue in a changing climate

    NASA Astrophysics Data System (ADS)

    Addor, Nans; Ewen, Tracy; Johnson, Leigh; Çöltekin, Arzu; Derungs, Curdin; Muccione, Veruska

    2015-08-01

    In the context of climate change, both climate researchers and decision makers deal with uncertainties, but these uncertainties differ in fundamental ways. They stem from different sources, cover different temporal and spatial scales, might or might not be reducible or quantifiable, and are generally difficult to characterize and communicate. Hence, a mutual understanding between current and future climate researchers and decision makers must evolve for adaptation strategies and planning to progress. Iterative two-way dialogue can help to improve the decision making process by bridging current top-down and bottom-up approaches. One way to cultivate such interactions is by providing venues for these actors to interact and exchange views on the uncertainties they face. We use a workshop-seminar series involving academic researchers, students, and decision makers as an opportunity to put this idea into practice and evaluate it. Seminars, case studies, and a round table allowed participants to reflect upon and experiment with uncertainties. An opinion survey conducted before and after the workshop-seminar series allowed us to qualitatively evaluate its influence on the participants. We find that the event stimulated new perspectives on research products and communication processes, and we suggest that similar events may ultimately contribute to the midterm goal of improving support for decision making in a changing climate. We therefore recommend integrating bridging events into university curricula to foster interdisciplinary and iterative dialogue among researchers, decision makers, and students.

  10. Modeling flow and solute transport at a tile drain field site by explicit representation of preferential flow structures: Equifinality and uncertainty

    NASA Astrophysics Data System (ADS)

    Zehe, E.; Klaus, J.

    2011-12-01

    Rapid flow in connected preferential flow paths is crucial for fast transport of water and solutes through soils, especially at tile-drained field sites. The present study tests whether an explicit treatment of worm burrows is feasible for modeling water flow, bromide and pesticide transport in structured heterogeneous soils with a 2-dimensional Richards-based model. The essence is to represent worm burrows as morphologically connected paths of low flow resistance and low retention capacity in the spatially highly resolved model domain. The underlying extensive database to test this approach was collected during an irrigation experiment, which investigated transport of bromide and the herbicide Isoproturon at a 900 m² tile-drained field site. In a first step we investigated whether the inherent uncertainty in key data causes equifinality, i.e., whether there are several spatial model setups that reproduce tile drain event discharge in an acceptable manner. We found considerable equifinality in the spatial setup of the model when key parameters such as the areal density of worm burrows and the maximum volumetric water flows inside these macropores were varied within the ranges of either our measurement errors or measurements reported in the literature. Thirteen model runs yielded a Nash-Sutcliffe coefficient of more than 0.9. The flow volumes were also in good accordance, and peak timing errors were less than or equal to 20 min. In a second step we thus investigated whether this equifinality in spatial model setups may be reduced by including the bromide tracer data in the model falsification process. We simulated transport of bromide for the 13 spatial model setups that performed best in reproducing tile drain event discharge, without any further calibration. Four of these 13 model setups allowed modeling bromide transport within fixed limits of acceptability. Parameter uncertainty and equifinality could thus be reduced. 
Thirdly, we selected one of those four setups for simulating transport of Isoproturon, which was applied the day before the irrigation experiment, and tested different parameter combinations to characterise adsorption according to the footprint database. The simulations could, however, only reproduce the observed event-based leaching behaviour when we allowed for retardation coefficients very close to one. This finding is consistent with various field observations. We conclude: a) a realistic representation of the dominant structures and their topology is of key importance for predicting preferential water and mass flows at tile-drained hillslopes; b) parameter uncertainty and equifinality could be reduced, but a system-inherent equifinality in a 2-dimensional Richards-based model has to be accepted.

  11. Glycemic control and macrovascular disease in types 1 and 2 diabetes mellitus: Meta-analysis of randomized trials.

    PubMed

    Stettler, Christoph; Allemann, Sabin; Jüni, Peter; Cull, Carole A; Holman, Rury R; Egger, Matthias; Krähenbühl, Stephan; Diem, Peter

    2006-07-01

    Uncertainty persists concerning the effect of improved long-term glycemic control on macrovascular disease in diabetes mellitus (DM). We performed a systematic review and meta-analysis of randomized controlled trials comparing interventions to improve glycemic control with conventional treatment in type 1 and type 2 diabetes. Outcomes included the incidence rate ratios for any macrovascular event, cardiac events, stroke, and peripheral arterial disease, and the number needed to treat intensively during 10 years to prevent one macrovascular event. The analysis was based on 8 randomized comparisons including 1800 patients with type 1 DM (134 macrovascular events, 40 cardiac events, 88 peripheral vascular events, 6 cerebrovascular events, 11293 person-years of follow-up) and 6 comparisons including 4472 patients with type 2 DM (1587 macrovascular events, 1197 cardiac events, 87 peripheral vascular events, 303 cerebrovascular events, 43607 person-years). Combined incidence rate ratios for any macrovascular event were 0.38 (95% CI 0.26-0.56) in type 1 and 0.81 (0.73-0.91) in type 2 DM. In type 1 DM, the effect was mainly due to reductions in cardiac and peripheral vascular events; in type 2 DM, to reductions in stroke and peripheral vascular events. Effects appear to be particularly important in younger patients with shorter duration of diabetes. Our data suggest that attempts to improve glycemic control reduce the incidence of macrovascular events both in type 1 and type 2 DM. In absolute terms, benefits are comparable, although effects on specific manifestations of macrovascular disease differ.
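
    The pooling step behind combined incidence rate ratios like these is typically fixed-effect inverse-variance weighting on the log scale. A minimal sketch with invented trial-level numbers (not the data of this study):

```python
import numpy as np

# Illustrative per-trial incidence rate ratios and 95% CIs (assumed values)
irr = np.array([0.45, 0.30, 0.55, 0.40])
ci_lo = np.array([0.25, 0.15, 0.30, 0.20])
ci_hi = np.array([0.81, 0.60, 1.01, 0.80])

# Work on the log scale: recover each trial's standard error from its CI
log_irr = np.log(irr)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)

# Fixed-effect inverse-variance weights and pooled estimate
w = 1.0 / se**2
pooled = np.exp(np.sum(w * log_irr) / np.sum(w))
se_pooled = 1.0 / np.sqrt(np.sum(w))
ci = np.exp(np.log(pooled) + np.array([-1.96, 1.96]) * se_pooled)

print(round(pooled, 2), np.round(ci, 2))
```

    The pooled interval is narrower than any single trial's interval, which is the point of combining trials: individual uncertainties shrink as evidence accumulates.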

  12. Uncertainties in Past and Future Global Water Availability

    NASA Astrophysics Data System (ADS)

    Sheffield, J.; Kam, J.

    2014-12-01

    Understanding how water availability changes on inter-annual to decadal time scales, and how it may change in the future under climate change, is a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carries large uncertainties in their physical representation and from lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land surface model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and the incorporation of water management in the form of reservoirs and irrigation.
The results indicate that model uncertainty is important for short-term drought events, and forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.

  13. Computing quality scores and uncertainty for approximate pattern matching in geospatial semantic graphs

    DOE PAGES

    Stracuzzi, David John; Brost, Randolph C.; Phillips, Cynthia A.; ...

    2015-09-26

    Geospatial semantic graphs provide a robust foundation for representing and analyzing remote sensor data. In particular, they support a variety of pattern search operations that capture the spatial and temporal relationships among the objects and events in the data. However, in the presence of large data corpora, even a carefully constructed search query may return a large number of unintended matches. This work considers the problem of calculating a quality score for each match to the query, given that the underlying data are uncertain. We present a preliminary evaluation of three methods for determining both match quality scores and associated uncertainty bounds, illustrated in the context of an example based on overhead imagery data.

  14. Structural Design Methodology Based on Concepts of Uncertainty

    NASA Technical Reports Server (NTRS)

    Lin, K. Y.; Du, Jiaji; Rusk, David

    2000-01-01

    In this report, an approach to damage-tolerant aircraft structural design is proposed based on the concept of an equivalent "Level of Safety" that incorporates past service experience in the design of new structures. The discrete "Level of Safety" for a single inspection event is defined as the complement of the probability that a single flaw size larger than the critical flaw size for residual strength of the structure exists, and that the flaw will not be detected. The cumulative "Level of Safety" for the entire structure is the product of the discrete "Level of Safety" values for each flaw of each damage type present at each location in the structure. Based on the definition of "Level of Safety", a design procedure was identified and demonstrated on a composite sandwich panel for various damage types, with results showing the sensitivity of the structural sizing parameters to the relative safety of the design. The "Level of Safety" approach has broad potential application to damage-tolerant aircraft structural design with uncertainty.
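
    The definition above translates directly into arithmetic: each discrete value is the complement of "a critical flaw exists AND goes undetected", and the cumulative value is the product over all flaws. The probabilities below are assumed purely for illustration.

```python
import numpy as np

# Hypothetical per-flaw inputs: probability a critical flaw exists, and
# probability of detecting it at inspection (assumed values)
p_exist = np.array([1e-3, 5e-4, 2e-3])
p_detect = np.array([0.90, 0.80, 0.95])

# Discrete Level of Safety: complement of "flaw exists AND goes undetected"
los_discrete = 1.0 - p_exist * (1.0 - p_detect)

# Cumulative Level of Safety: product over all flaws/locations
los_cumulative = np.prod(los_discrete)
print(los_cumulative)
```

    With these numbers each flaw happens to contribute a discrete value of 0.9999, so the cumulative value is 0.9999³ ≈ 0.9997.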

  15. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  16. Detection and localization capability of an urban seismic sinkhole monitoring network

    NASA Astrophysics Data System (ADS)

    Becker, Dirk; Dahm, Torsten; Schneider, Fabian

    2017-04-01

    Microseismic events linked to underground processes in sinkhole areas might serve as precursors to larger mass dislocation or rupture events, which can cause felt ground shaking or even structural damage. To identify these weak and shallow events, a sensitive local seismic monitoring network is needed. In an urban environment, the performance of local monitoring networks is severely compromised by the high anthropogenic noise level. We study the detection and localization capability of such a network, which is already partly installed in the urban area of the city of Hamburg, Germany, within the joint project SIMULTAN (http://www.gfz-potsdam.de/en/section/near-surface-geophysics/projects/simultan/). SIMULTAN aims to monitor a known sinkhole structure and gain a better understanding of the underlying processes. The current network consists of six surface stations installed in the basements of private houses and in underground structures of a research facility (DESY - Deutsches Elektronen Synchrotron). Since the monitoring campaign started in 2015, no microseismic events could be unambiguously attributed to the sinkholes. To estimate the detection and location capability of the network, we calculate synthetic waveforms based on the location and mechanism of former events in the area. These waveforms are combined with the urban seismic noise recorded at the station sites. As detection algorithms, a simple STA/LTA trigger and a more sophisticated phase detector are used. The STA/LTA detector delivers stable results and is able to detect events with a moment magnitude as low as 0.35 at a distance of 1.3 km from the source even under the present high-noise conditions, whereas the phase detector is more sensitive but less stable. It should be stressed that, due to the local near-surface conditions of wave propagation, detections are generally performed on S- or surface waves and not on P-waves, which have a significantly lower amplitude.
Because of the often emergent onsets of the seismic phases of sinkhole events and the high noise conditions, the localization capability of the network is assessed by a stacking approach of characteristic waveforms (STA/LTA traces) in addition to traditional estimates based on travel time uncertainties and network geometry. The effect of a vertical array of borehole sensors, as well as of a small-scale surface array, on the location accuracy is also investigated. Because of the expected rather low-frequency character of the seismic signals, arrays with a small aperture (required by the close proximity to the source) exhibit considerable uncertainty in determining the azimuth of the incoming wavefront, but can still contribute to better constraining the event location. Future borehole stations, apart from significantly reducing the detection threshold, would also significantly reduce the location uncertainty. In addition, the synthetic data sets created for this study can be used to better constrain the magnitudes of the microseismic events by deriving attenuation relations for the surface waves of shallow events encountered in the sinkhole environment. This work has been funded by the German 'Geotechnologien' project SIMULTAN (BMBF03G0737A).
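
    A minimal STA/LTA trigger of the kind used as a baseline detector can be sketched in a few lines. Window lengths, the threshold, and the synthetic event below are illustrative choices, not the SIMULTAN settings.

```python
import numpy as np

def sta_lta(trace, n_sta, n_lta):
    """Ratio of short-term to long-term average signal energy
    (centred running means, a simplification of the causal variant)."""
    energy = trace ** 2
    sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
    return sta / np.maximum(lta, 1e-12)

rng = np.random.default_rng(1)
fs = 100                                   # sampling rate in Hz (assumed)
trace = rng.normal(0.0, 1.0, 60 * fs)      # 60 s of unit-variance noise
trace[3000:3200] += 6.0 * rng.normal(0.0, 1.0, 200)  # weak synthetic "event"

ratio = sta_lta(trace, n_sta=50, n_lta=1000)
# Ignore the edges, where the running windows are zero-padded
triggered = np.where(ratio[1000:-1000] > 3.0)[0] + 1000
print(triggered.min(), triggered.max())
```

    The ratio stays near 1 in stationary noise and jumps when short-term energy rises faster than the long-term background, which is why the method remains usable even under high urban noise.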

  17. A tiered approach for integrating exposure and dosimetry with ...

    EPA Pesticide Factsheets

    High-throughput (HT) risk screening approaches apply in vitro dose-response data to estimate potential health risks that arise from exposure to chemicals. However, much uncertainty is inherent in relating bioactivities observed in an in vitro system to the perturbations of biological mechanisms that lead to apical adverse health outcomes in living organisms. The chemical-agnostic Adverse Outcome Pathway (AOP) framework addresses this uncertainty by acting as a scaffold onto which pathway-based data can be arranged to aid in the understanding of in vitro toxicity testing results. In addition, risk estimation also requires reconciling chemical concentrations sufficient to produce bioactivity in vitro with concentrations that trigger a molecular initiating event (MIE) at the relevant biological target in vivo. Such target site exposures (TSEs) can be estimated using computational models to integrate exposure information with a chemical’s absorption, distribution, metabolism, and elimination (ADME) processes. In this presentation, the utility of a tiered approach for integrating exposure, ADME, and hazard into risk-based decision making will be demonstrated using several case studies, along with the investigation of how uncertainties in exposure and ADME might impact risk estimates. These case studies involve 1) identifying and prioritizing chemicals capable of altering biological pathways based on their potential to reach an in vivo target; 2) evaluating the infl

  18. Robust photometric invariant features from the color tensor.

    PubMed

    van de Weijer, Joost; Gevers, Theo; Smeulders, Arnold W M

    2006-01-01

    Luminance-based features are widely used as low-level input for computer vision applications, even when color data is available. The extension of feature detection to the color domain prevents information loss due to isoluminance and allows us to exploit the photometric information. To fully exploit the extra information in the color data, the vector nature of color data has to be taken into account and a sound framework is needed to combine feature and photometric invariance theory. In this paper, we focus on the structure tensor, or color tensor, which adequately handles the vector nature of color images. Further, we combine the features based on the color tensor with photometric invariant derivatives to arrive at photometric invariant features. We circumvent the drawback of unstable photometric invariants by deriving an uncertainty measure to accompany the photometric invariant derivatives. The uncertainty is incorporated in the color tensor, thereby allowing the computation of robust photometric invariant features. The combination of the photometric invariance theory and tensor-based features allows for detection of a variety of features such as photometric invariant edges, corners, optical flow, and curvature. The proposed features are tested for noise characteristics and robustness to photometric changes. Experiments show that the proposed features are robust to scene incidental events and that the proposed uncertainty measure improves the applicability of full invariants.

  19. Characterization of uncertainty in ETMS flight events predictions and its effect on traffic demand predictions

    DOT National Transportation Integrated Search

    2008-07-11

    This report presents the results of analysis and characterization of uncertainty in traffic demand predictions using ETMS data and probabilistic representation of the predictions. Our previous research, described in two prior reports, was focused on ...

  20. Testing stress shadowing effects at the South American subduction zone

    NASA Astrophysics Data System (ADS)

    Roth, F.; Dahm, T.; Hainzl, S.

    2017-11-01

    The seismic gap hypothesis assumes that a characteristic earthquake is followed by a long period with a reduced occurrence probability for the next large event on the same fault segment, as a consequence of the induced stress shadow. The gap model is commonly accepted by geologists and is often used for time-dependent seismic hazard estimations. However, systematic and rigorous tests to verify the seismic gap model have often failed so far, which might be partially related to limited data and overly tight model assumptions. In this study, we relax the assumption of a characteristic size and location of repeating earthquakes and analyse one of the best available data sets, namely the historical record of major earthquakes along a 3000 km long linear segment of the South American subduction zone. To test whether a stress shadow effect is observable, we compiled a comprehensive catalogue of mega-thrust earthquakes along this plate boundary from 1520 to 2015, containing 174 earthquakes with Mw > 6.5. In our new testing approach, we analyse the time span between an earthquake and the last event that ruptured the epicentre location, taking into account the uncertainties of epicentres and rupture extents. Assuming uniform boundary conditions along the trench, we compare the distribution of these recurrence times with simple recurrence models. We find that the distribution is in all cases almost exponential, corresponding to a random (Poissonian) process, despite some tendency towards clustering of the Mw ≥ 7 events and a weak quasi-periodicity of the Mw ≥ 8 earthquakes.
To verify whether the absence of a clear stress shadow signal is related to physical assumptions or to data uncertainties, we perform simulations of a physics-based stochastic earthquake model considering rate- and state-dependent earthquake nucleation, adapted to the observations with regard to the number of events, spatial extent, size distribution and involved uncertainties. Our simulations show that the catalogue uncertainties lead to a significant blurring of the theoretically peaked distribution, but the distribution would still be distinguishable from the observed one for Mw ≥ 7 events. However, considering the stress transfer to adjacent fault segments and a heterogeneous instead of constant stress drop within the rupture zone can explain the observed recurrence time distribution. We conclude that simplified recurrence models, ignoring the complexity of the underlying physical process, cannot be applied for forecasting Mw ≥ 7 earthquake occurrence at this plate boundary.
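
    The core statistical check, comparing recurrence times against an exponential (memoryless) reference, can be sketched as follows. The synthetic catalogue only mimics the rough size of the record (174 events over 1520-2015); the data themselves are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "catalogue": 174 event times from a homogeneous Poisson
# process over ~495 years (sizes chosen to resemble the record)
rate = 174 / 495.0
times = np.cumsum(rng.exponential(1.0 / rate, size=174))

recurrence = np.sort(np.diff(times))      # inter-event (recurrence) times
n = recurrence.size

# One-sample KS statistic against an exponential with fitted mean;
# a small statistic is consistent with a random (Poissonian) process
ecdf_hi = np.arange(1, n + 1) / n
ecdf_lo = np.arange(0, n) / n
model_cdf = 1.0 - np.exp(-recurrence / recurrence.mean())
d = max(np.max(ecdf_hi - model_cdf), np.max(model_cdf - ecdf_lo))
print(round(d, 3))
```

    A gap-model catalogue would instead produce a peaked recurrence-time distribution, and the same statistic would flag the departure from the exponential reference.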

  1. Stochastic evaluation of annual micropollutant loads and their uncertainties in separate storm sewers.

    PubMed

    Hannouche, Ali; Chebbo, Ghassan; Joannis, Claude; Gasperi, Johnny; Gromaire, Marie-Christine; Moilleron, Régis; Barraud, Sylvie; Ruban, Véronique

    2017-12-01

    This article describes a stochastic method for calculating annual pollutant loads, and its application over several years at the outlets of three catchments drained by separate storm sewers. A stochastic methodology using Monte Carlo simulations is proposed for assessing annual pollutant loads, as well as the associated uncertainties, from a few event sampling campaigns and/or continuous turbidity measurements (representative of the total suspended solids (TSS) concentration). Indeed, in the latter case, the proposed method takes into account the correlation between pollutants and TSS. The method was applied to data acquired within the French research project "INOGEV" (innovations for a sustainable management of urban water) at the outlets of three urban catchments drained by separate storm sewers. Ten or so event sampling campaigns for a large range of pollutants (46 pollutants and 2 conventional water quality parameters: TSS and total organic carbon (TOC)) are combined with hundreds of rainfall events for which at least one of three continuously monitored parameters (rainfall intensity, flow rate, and turbidity) is available. Results obtained for the three catchments show that the annual pollutant loads can be estimated with uncertainties ranging from 10 to 60%, and the added value of turbidity monitoring for lowering the uncertainty is demonstrated. A low inter-annual and inter-site variability of pollutant loads, for many of the studied pollutants, is observed with respect to the estimated uncertainties, and can be explained mainly by annual precipitation.
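
    A schematic of the Monte Carlo load estimation: resample event mean concentrations from the few sampling campaigns, perturb event volumes by an assumed measurement uncertainty, and accumulate each realisation into an annual load. All inputs below are invented; the actual method additionally exploits the pollutant-turbidity correlation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical inputs: measured EMCs (mg/L) from ~10 sampling campaigns
# and per-event runoff volumes (m^3) for one year of monitored events
emc_samples = np.array([92, 110, 75, 130, 88, 101, 97, 120, 83, 105.0])
event_volumes = rng.lognormal(mean=5.0, sigma=0.8, size=60)

n_mc = 10000
annual_loads = np.empty(n_mc)
for i in range(n_mc):
    # Bootstrap an EMC for every event and perturb volumes by an
    # assumed 10% relative measurement uncertainty
    emc = rng.choice(emc_samples, size=event_volumes.size, replace=True)
    vol = event_volumes * rng.normal(1.0, 0.10, size=event_volumes.size)
    # mg/L x m^3 = g; divide by 1e6 to express the load in tonnes
    annual_loads[i] = np.sum(emc * vol) / 1e6

lo, med, hi = np.percentile(annual_loads, [5, 50, 95])
print(round(med, 2))
```

    The spread of `annual_loads` directly quantifies the uncertainty of the annual estimate, which is how figures like the 10-60% ranges reported above are obtained.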

  2. How Confident can we be in Flood Risk Assessments?

    NASA Astrophysics Data System (ADS)

    Merz, B.

    2017-12-01

    Flood risk management should be based on risk analyses quantifying the risk and its reduction under different risk reduction strategies. However, validating risk estimates by comparing model simulations with past observations is hardly possible, since the assessment typically encompasses extreme events and impacts that have not been observed before. Hence, risk analyses rely strongly on assumptions and expert judgement. This situation opens the door for cognitive biases, such as the `illusion of certainty', `overconfidence' or `recency bias'. Such biases operate especially in complex situations with many factors involved, when uncertainty is high and events are probabilistic, or when close learning feedback loops are missing - aspects that all apply to risk analyses. This contribution discusses how confident we can be in flood risk assessments and reflects on more rigorous approaches to their validation.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendes, J.; Bessa, R.J.; Keko, H.

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty.
Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events to happen. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
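
    Quantile forecasts such as those produced by QR or KDF methods are typically scored with the pinball (quantile) loss, which the true conditional quantile minimises in expectation. A self-contained sketch with synthetic data (not the report's wind farm data):

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    """Average pinball (quantile) loss for quantile level tau."""
    diff = y - q_pred
    return np.mean(np.maximum(tau * diff, (tau - 1) * diff))

rng = np.random.default_rng(4)
# Toy wind-power "observations" with heteroscedastic noise (illustrative)
point = rng.uniform(0.2, 0.8, 500)                # point forecasts, normalised
obs = point + rng.normal(0.0, 0.05 + 0.1 * point) # larger errors at high power

# Two candidate 90% quantile forecasts: one matching the assumed error
# model (sharp) and one overly conservative (wide)
q90_sharp = point + 1.2816 * (0.05 + 0.1 * point)
q90_wide = point + 0.5

print(round(pinball_loss(obs, q90_sharp, 0.9), 4),
      round(pinball_loss(obs, q90_wide, 0.9), 4))
```

    The sharp forecast scores better: the pinball loss rewards quantiles that are both calibrated and as tight as the data allow, which is why it is the standard benchmark metric for probabilistic wind power forecasts.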

  4. Reply to comment by Rainer Facius et al. on "U.S. Government shutdown degrades aviation radiation monitoring during solar radiation storm"

    NASA Astrophysics Data System (ADS)

    Tobiska, W. Kent; Gersey, Brad; Wilkins, Richard; Mertens, Chris; Atwell, William; Bailey, Justin

    2014-05-01

    The premise of this comment perpetuates an unfortunate trend among some radiation researchers to minimize potential risks to human tissue from low-radiation sources. In fact, this discussion on the risk uncertainties of low-dose radiation further illustrates the need for more measurements and a program of active monitoring, especially when solar eruptive events can substantially elevate the radiation environment. This debate also highlights a bigger problem: how do we as professionals act with due diligence to take the immense body of knowledge of space weather radiation effects on human tissue and distil it into ideas that regulatory agencies can use to maximize the safety of a population at risk? The focus of our article on radiation risks due to solar energetic particle events starts with our best assessment of risks and is based on the body of scientific knowledge while, at the same time, erring on the side of public safety. The uncertainty inherent in our assessment is accepted and described with this same philosophy in mind.

  5. Isotropic source terms of San Jacinto fault zone earthquakes based on waveform inversions with a generalized CAP method

    NASA Astrophysics Data System (ADS)

    Ross, Z. E.; Ben-Zion, Y.; Zhu, L.

    2015-02-01

    We analyse source tensor properties of seven Mw > 4.2 earthquakes in the complex trifurcation area of the San Jacinto Fault Zone, CA, with a focus on isotropic radiation that may be produced by rock damage in the source volumes. The earthquake mechanisms are derived with generalized `Cut and Paste' (gCAP) inversions of three-component waveforms typically recorded by >70 stations at regional distances. The gCAP method includes parameters ζ and χ representing, respectively, the relative strength of the isotropic and CLVD source terms. The possible errors in the isotropic and CLVD components due to station variability are quantified with bootstrap resampling for each event. The results indicate statistically significant explosive isotropic components for at least six of the events, corresponding to ~0.4-8 per cent of the total potency/moment of the sources. In contrast, the CLVD components for most events are not found to be statistically significant. Trade-off and correlation between the isotropic and CLVD components are studied using synthetic tests with realistic station configurations. The associated uncertainties are found to be generally smaller than the observed isotropic components. Two different tests with velocity model perturbation are conducted to quantify the uncertainty due to inaccuracies in the Green's functions. Applications of the Mann-Whitney U test indicate statistically significant explosive isotropic terms for most events consistent with brittle damage production at the source.
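The bootstrap step used to quantify station-dependent errors can be sketched generically as a percentile bootstrap over per-station estimates; the function name and defaults below are assumptions, not the gCAP implementation.

```python
import numpy as np

def bootstrap_ci(values, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the mean of
    per-station estimates (e.g. isotropic source components)."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    # Resample stations with replacement and recompute the mean each time.
    means = np.array([
        rng.choice(values, size=values.size, replace=True).mean()
        for _ in range(n_boot)
    ])
    # Central (1 - alpha) interval of the bootstrap means.
    return np.quantile(means, [alpha / 2.0, 1.0 - alpha / 2.0])
```

An isotropic component whose interval excludes zero would be deemed statistically significant at roughly the 1 - alpha level.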

  6. Extreme Events in China under Climate Change: Uncertainty and related impacts (CSSP-FOREX)

    NASA Astrophysics Data System (ADS)

    Leckebusch, Gregor C.; Befort, Daniel J.; Hodges, Kevin I.

    2016-04-01

    Suitable adaptation strategies or the timely initiation of related mitigation efforts in East Asia will strongly depend on robust and comprehensive information about future near-term as well as long-term potential changes in the climate system. Therefore, understanding the driving mechanisms associated with the East Asian climate is of major importance. The FOREX project (Fostering Regional Decision Making by the Assessment of Uncertainties of Future Regional Extremes and their Linkage to Global Climate System Variability for China and East Asia) focuses on the investigation of extreme wind- and rainfall-related events over Eastern Asia and their possible future changes. Here, analyses focus on the link between local extreme events and their driving weather systems. This includes the coupling between local rainfall extremes and tropical cyclones, the Meiyu frontal system, extra-tropical teleconnections and monsoonal activity. Furthermore, the relation between these driving weather systems and large-scale variability modes, e.g. NAO, PDO, ENSO, is analysed. Thus, besides analysing future changes of local extreme events, the temporal variability of their driving weather systems and related large-scale variability modes will be assessed in current CMIP5 global model simulations to obtain more robust results. Beyond an overview of FOREX itself, first results regarding the link between local extremes and their steering weather systems, based on observational and reanalysis data, are shown. Special focus is placed on the contribution of monsoonal activity, tropical cyclones and the Meiyu frontal system to the inter-annual variability of East Asian summer rainfall.

  7. Multiple-Event Seismic Location Using the Markov-Chain Monte Carlo Technique

    NASA Astrophysics Data System (ADS)

    Myers, S. C.; Johannesson, G.; Hanley, W.

    2005-12-01

    We develop a new multiple-event location algorithm (MCMCloc) that utilizes the Markov-Chain Monte Carlo (MCMC) method. Unlike most inverse methods, the MCMC approach produces a suite of solutions, each of which is consistent with observations and prior estimates of data and model uncertainties. Model parameters in MCMCloc consist of event hypocenters and travel-time predictions. Data are arrival time measurements and phase assignments. Posterior estimates of event locations, path corrections, pick errors, and phase assignments are made through analysis of the posterior suite of acceptable solutions. Prior uncertainty estimates include correlations between travel-time predictions, correlations between measurement errors, the probability of misidentifying one phase for another, and the probability of spurious data. Inclusion of prior constraints on location accuracy allows direct utilization of ground-truth locations or well-constrained location parameters (e.g. from InSAR) that aid in the accuracy of the solution. Implementation of a correlation structure for travel-time predictions allows MCMCloc to operate over arbitrarily large geographic areas. The transition in behavior between a multiple-event locator for tightly clustered events and a single-event locator for solitary events is controlled by the spatial correlation of travel-time predictions. We test the MCMC locator on a regional data set of Nevada Test Site nuclear explosions. Event locations and origin times are known for these events, allowing us to test the features of MCMCloc using a high-quality ground truth data set. Preliminary tests suggest that MCMCloc provides excellent relative locations, often outperforming traditional multiple-event location algorithms, and excellent absolute locations are attained when constraints from one or more ground truth events are included.
When phase assignments are switched, we find that MCMCloc properly corrects the error when predicted arrival times are separated by several seconds. In cases where the predicted arrival times are within the combined uncertainty of prediction and measurement errors, MCMCloc determines the probability of one or the other phase assignment and propagates this uncertainty into all model parameters. We find that MCMCloc is a promising method for simultaneously locating large, geographically distributed data sets. Because we incorporate prior knowledge on many parameters, MCMCloc is ideal for combining trusted data with data of unknown reliability. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-ABS-215048
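A minimal Metropolis sampler conveys the flavor of the MCMC approach for a single event. This toy assumes a homogeneous velocity model and unit-variance Gaussian pick errors; it is in no way the MCMCloc implementation, and every name in it is hypothetical.

```python
import numpy as np

def metropolis_location(stations, arrivals, v, n_iter=5000, step=0.5, seed=0):
    """Toy Metropolis sampler for a 2-D epicentre and origin time,
    given arrival times at `stations` (N x 2 coordinates, km),
    a homogeneous velocity `v` (km/s) and 1-second pick errors."""
    rng = np.random.default_rng(seed)

    def log_like(x, y, t0):
        # Predicted arrivals: origin time plus straight-ray travel time.
        pred = t0 + np.hypot(stations[:, 0] - x, stations[:, 1] - y) / v
        return -0.5 * np.sum((arrivals - pred) ** 2)  # sigma = 1 s picks

    state = np.zeros(3)            # (x, y, t0) starting guess
    ll = log_like(*state)
    samples = []
    for _ in range(n_iter):
        prop = state + rng.normal(scale=step, size=3)
        ll_prop = log_like(*prop)
        # Metropolis accept/reject step.
        if np.log(rng.random()) < ll_prop - ll:
            state, ll = prop, ll_prop
        samples.append(state.copy())
    # The suite of acceptable solutions approximates the posterior.
    return np.array(samples)
```

Posterior location uncertainty is then read directly from the spread of the sample suite rather than from a linearized error ellipse.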

  8. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    NASA Astrophysics Data System (ADS)

    Saha, Debasish; Kemanian, Armen R.; Rau, Benjamin M.; Adler, Paul R.; Montes, Felipe

    2017-04-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (corn-soybean rotation), College Station, TX (corn-vetch rotation), Fort Collins, CO (irrigated corn), and Pullman, WA (winter wheat), representing diverse agro-ecoregions of the United States. Fertilization source, rate, and timing were site-specific. These simulated fluxes served as surrogates for daily measurements in the analysis. We "sampled" the fluxes using a fixed interval (1-32 days) or a rule-based (decision tree-based) sampling method. Two types of decision trees were built: a high-input tree (HI) that included soil inorganic nitrogen (SIN) as a predictor variable, and a low-input tree (LI) that excluded SIN. Other predictor variables were identified with Random Forest. The decision trees were inverted to be used as rules for sampling a representative number of members from each terminal node. The uncertainty of the annual N2O flux estimation increased along with the fixed interval length. A 4- and 8-day fixed sampling interval was required at College Station and Ames, respectively, to yield ±20% accuracy in the flux estimate; a 12-day interval rendered the same accuracy at Fort Collins and Pullman. Both the HI and the LI rule-based methods provided the same accuracy as that of the fixed interval method with up to a 60% reduction in sampling events, particularly at locations with greater temporal flux variability. For instance, at Ames, the HI rule-based and the fixed interval methods required 16 and 91 sampling events, respectively, to achieve the same absolute bias of 0.2 kg N ha-1 yr-1 in estimating cumulative N2O flux.
These results suggest that using simulation models along with decision trees can reduce the cost and improve the accuracy of the estimations of cumulative N2O fluxes using the discrete chamber-based method.
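The effect of a fixed sampling interval on cumulative-flux accuracy is easy to reproduce on a simulated daily series; `interval_bias` below is a hypothetical helper for illustration, not code from the study.

```python
import numpy as np

def interval_bias(daily_flux, interval):
    """Relative error of a cumulative annual flux estimated from
    fixed-interval 'chamber visits' on a simulated daily series."""
    daily_flux = np.asarray(daily_flux, dtype=float)
    sampled = daily_flux[::interval]             # visit every `interval` days
    estimate = sampled.mean() * daily_flux.size  # scale mean up to the year
    true_total = daily_flux.sum()
    return (estimate - true_total) / true_total
```

Episodic series (e.g. a single post-fertilization pulse) show how coarse intervals can miss most of the annual flux, which is why the rule-based schemes that target high-flux conditions need far fewer visits.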

  9. Global and Regional 3D Tomography for Improved Seismic Event Location and Uncertainty in Explosion Monitoring

    NASA Astrophysics Data System (ADS)

    Downey, N.; Begnaud, M. L.; Hipp, J. R.; Ballard, S.; Young, C. S.; Encarnacao, A. V.

    2017-12-01

    The SALSA3D global 3D velocity model of the Earth was developed to improve the accuracy and precision of seismic travel time predictions for a wide suite of regional and teleseismic phases. Recently, the global SALSA3D model was updated to include additional body wave phases, including mantle phases, core phases, reflections off the core-mantle boundary, and underside reflections off the surface of the Earth. We show that this update improves travel time predictions and leads directly to significant improvements in the accuracy and precision of seismic event locations as compared to locations computed using standard 1D velocity models like ak135, or 2½D models like RSTT. A key feature of our inversions is that the path-specific model uncertainty of travel time predictions is calculated using the full 3D model covariance matrix computed during tomography, which results in more realistic uncertainty ellipses that directly reflect tomographic data coverage. Application of this method can also be done at a regional scale: we present a velocity model with uncertainty obtained using data from the University of Utah Seismograph Stations. These results show a reduction in travel-time residuals for re-located events compared with those obtained using previously published models.
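The path-specific travel-time uncertainty described here is, in essence, a quadratic form in the model covariance. A minimal sketch, assuming `g` holds the ray's sensitivity to the model parameters (both names are ours, not SALSA3D's):

```python
import numpy as np

def path_tt_uncertainty(g, C):
    """Travel-time prediction variance sigma^2 = g^T C g for one path,
    where g is the ray's sensitivity to the model parameters and C is
    the model covariance matrix from tomography."""
    g = np.asarray(g, dtype=float)
    C = np.asarray(C, dtype=float)
    return float(g @ C @ g)
```

Paths crossing well-sampled regions (small covariance entries) thus get tighter predicted uncertainties, which is what makes the resulting location ellipses reflect data coverage.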

  10. Inter-model variability in hydrological extremes projections for Amazonian sub-basins

    NASA Astrophysics Data System (ADS)

    Andres Rodriguez, Daniel; Garofolo, Lucas; Lázaro de Siqueira Júnior, José; Samprogna Mohor, Guilherme; Tomasella, Javier

    2014-05-01

    Irreducible uncertainties due to the limitations of knowledge, the chaotic nature of the climate system, and human decision-making processes drive uncertainties in climate change projections. Such uncertainties affect impact studies, especially those concerned with extreme events, and complicate the decision-making process aimed at mitigation and adaptation. At the same time, they open the possibility of exploratory analyses of a system's vulnerability to different scenarios. Using projections from several climate models allows these uncertainty issues to be addressed through multiple runs that explore a wide range of potential impacts and their implications for potential vulnerabilities. Statistical approaches for the analysis of extreme values are usually based on stationarity assumptions. However, nonstationarity is relevant at the time scales considered for extreme value analysis and can have great implications in dynamic complex systems, particularly under climate change. It is therefore necessary to allow for nonstationarity in the statistical distribution parameters. We carried out a study of the dispersion in projections of hydrological extremes, using climate change projections from several climate models to feed the Distributed Hydrological Model of the National Institute for Space Research, MHD-INPE, applied to Amazonian sub-basins. This large-scale hydrological model uses a TopModel approach to solve runoff generation processes at the grid-cell scale. MHD-INPE was calibrated for 1970-1990 using observed meteorological data, comparing observed and simulated discharges with several performance coefficients. Hydrological model integrations were performed for the historical period (1970-1990) and for the future period (2010-2100). Because climate models simulate the variability of the climate system in statistical terms rather than reproducing the historical behavior of climate variables, the performance of the model runs during the historical period, when fed with climate model data, was tested using descriptors of the flow duration curves. The analysis of projected extreme values was carried out considering nonstationarity of the GEV distribution parameters and compared with extreme events in the present climate. Results show that inter-model variability produces a broad dispersion in projected extreme values. Such dispersion implies different degrees of socio-economic impact associated with extreme hydrological events. Although no single optimum result exists, this variability allows the analysis of adaptation strategies and their potential vulnerabilities.
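A nonstationary GEV of the kind described, with a linear trend in the location parameter, can be fit by maximum likelihood in a few lines. This sketch uses SciPy's `genextreme` (whose shape convention is c = -ξ) and is purely illustrative of the technique, not the MHD-INPE analysis.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

def fit_nonstationary_gev(t, x):
    """Fit a GEV with time-varying location mu(t) = mu0 + mu1 * t and
    constant scale/shape by maximizing the log-likelihood.
    Returns (mu0, mu1, log_sigma, c) in SciPy's shape convention."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    slope, intercept = np.polyfit(t, x, 1)      # crude starting trend
    resid = x - (intercept + slope * t)
    p0 = np.array([intercept, slope, np.log(resid.std() + 1e-9), 0.1])

    def nll(p):
        mu0, mu1, log_sigma, c = p
        mu = mu0 + mu1 * t
        return -genextreme.logpdf(x, c, loc=mu, scale=np.exp(log_sigma)).sum()

    res = minimize(nll, p0, method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-8})
    return res.x
```

The fitted trend `mu1` quantifies how the distribution of annual maxima drifts over the projection period, which a stationary fit would misattribute to scatter.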

  11. Location error uncertainties - an advanced using of probabilistic inverse theory

    NASA Astrophysics Data System (ADS)

    Debski, Wojciech

    2016-04-01

    The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analyzed in many branches of physics, including seismology and oceanology. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms and the accuracy of the achieved results. While estimating the location of an earthquake focus is relatively simple, quantitative estimation of the location accuracy is a truly challenging task, even when a probabilistic inverse method is used, because it requires knowledge of the statistics of observational, modelling, and a priori uncertainties. In this presentation we address this task for the case when the statistics of observational and/or modelling errors are unknown. This common situation requires the introduction of a priori constraints on the likelihood (misfit) function, which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we illustrate an approach based on the Shannon entropy of the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries information on the uncertainties of the solution found.
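The Shannon entropy of a gridded a posteriori location PDF can serve as a scalar summary of solution uncertainty. A minimal sketch of that computation (discrete entropy over grid cells; the function name is ours):

```python
import numpy as np

def posterior_entropy(pdf):
    """Shannon entropy H = -sum p ln p of a gridded a posteriori PDF.
    Larger H means a more diffuse, hence less certain, solution."""
    p = np.asarray(pdf, dtype=float).ravel()
    p = p / p.sum()          # normalize to a discrete distribution
    p = p[p > 0.0]           # the 0 * ln 0 terms contribute nothing
    return float(-(p * np.log(p)).sum())
```

A perfectly peaked posterior has H = 0, while a uniform posterior over N cells has the maximum H = ln N, so H ranks solutions by how well the data constrain them.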

  12. Estimating winter wheat phenological parameters: Implications for crop modeling

    USDA-ARS?s Scientific Manuscript database

    Crop parameters, such as the timing of developmental events, are critical for accurate simulation results in crop simulation models, yet uncertainty often exists in determining the parameters. Factors contributing to the uncertainty include: a) sources of variation within a plant (i.e., within diffe...

  13. Addendum to ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’

    NASA Astrophysics Data System (ADS)

    Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis

    2018-02-01

    This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between the highly technical risk-management discussion and the public risk-aversion debate. We also propose that the framework could be used for stress-testing resilience.

  14. Effects of relational uncertainty in heightening national identification and reactive approach motivation of Japanese.

    PubMed

    Terashima, Yuto; Takai, Jiro

    2017-03-23

    This study investigated whether relational uncertainty poses an uncertainty threat that causes compensatory behaviours among Japanese. We hypothesised that Japanese, as collectivists, would perceive relational uncertainty as an uncertainty threat. In two experiments, we manipulated relational uncertainty and confirmed that participants exhibited compensatory reactions to reduce the aversive feelings it produced. In Study 1, we conducted a direct comparison between relational uncertainty, independent self-uncertainty and control conditions. The results revealed that participants who were instructed to imagine events pertaining to relational uncertainty showed heightened national identification, as compensation, compared with participants in the control condition, whereas independent self-uncertainty did not provoke such effects. In Study 2, we again manipulated relational uncertainty; however, we also manipulated participants' individualism-collectivism cultural orientation through priming, and the analyses yielded a significant interaction effect between these variables. Relational uncertainty evoked reactive approach motivation, a cause of compensatory behaviours, among participants primed with collectivism, but not with individualism. It was concluded that the effect of uncertainty on compensatory behaviour is influenced by cultural priming, and that relational uncertainty is important to Japanese. © 2017 International Union of Psychological Science.

  15. Space Launch System Booster Separation Aerodynamic Database Development and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Chan, David T.; Pinier, Jeremy T.; Wilcox, Floyd J., Jr.; Dalle, Derek J.; Rogers, Stuart E.; Gomez, Reynaldo J.

    2016-01-01

    The development of the aerodynamic database for the Space Launch System (SLS) booster separation environment has presented many challenges because of the complex physics of the flow around three independent bodies due to proximity effects and jet interactions from the booster separation motors and the core stage engines. This aerodynamic environment is difficult to simulate in a wind tunnel experiment and also difficult to simulate with computational fluid dynamics. The database is further complicated by the high dimensionality of the independent variable space, which includes the orientation of the core stage, the relative positions and orientations of the solid rocket boosters, and the thrust levels of the various engines. Moreover, the clearance between the core stage and the boosters during the separation event is sensitive to the aerodynamic uncertainties of the database. This paper will present the development process for Version 3 of the SLS booster separation aerodynamic database and the statistics-based uncertainty quantification process for the database.

  16. Properties of Extreme Precipitation and Their Uncertainties in 3-year GPM Precipitation Radar Data

    NASA Astrophysics Data System (ADS)

    Liu, N.; Liu, C.

    2017-12-01

    Extremely high precipitation rates are often related to flash floods and have devastating impacts on human society and the environment. To better understand these rare events, 3-year Precipitation Features (PFs) are defined by grouping the contiguous areas with nonzero near-surface precipitation derived using the Global Precipitation Measurement (GPM) Ku-band Precipitation Radar (KuPR). The properties of PFs with extreme precipitation rates greater than 20, 50, and 100 mm/hr, such as the geographical distribution, volumetric precipitation contribution, and seasonal and diurnal variations, are examined. In addition to showing large seasonal and regional variations, the rare extreme precipitation rates often make a large contribution to the local total precipitation. Extreme precipitation rates occur more often over land than over ocean. The challenges in the retrieval of extreme precipitation might stem from the attenuation correction and from large uncertainties in the Z-R relationships used to convert near-surface radar reflectivity to precipitation rates. These potential uncertainties are examined using collocated ground-based radar reflectivity and precipitation retrievals.
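The Z-R step where much of this retrieval uncertainty enters is a simple power law. A sketch using the classic Marshall-Palmer coefficients for illustration; KuPR's actual retrieval is far more involved than this.

```python
import numpy as np

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert a Z-R power law Z = a * R**b (Z in mm^6 m^-3, R in mm/hr).
    Defaults are the classic Marshall-Palmer values, for illustration."""
    z = 10.0 ** (dbz / 10.0)        # dBZ -> linear reflectivity
    return (z / a) ** (1.0 / b)
```

Because R depends on Z through the uncertain coefficients a and b (and on attenuation-corrected Z itself), errors in either propagate strongly into the highest retrieved rates.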

  17. Characterization of a neutron sensitive MCP/Timepix detector for quantitative image analysis at a pulsed neutron source

    NASA Astrophysics Data System (ADS)

    Watanabe, Kenichi; Minniti, Triestino; Kockelmann, Winfried; Dalgliesh, Robert; Burca, Genoveva; Tremsin, Anton S.

    2017-07-01

    The uncertainties and the stability of a neutron sensitive MCP/Timepix detector operating in the event timing mode for quantitative image analysis at a pulsed neutron source were investigated. The dominant contribution to the uncertainty arises from the counting statistics. The contribution of the overlap correction to the uncertainty was concluded to be negligible, from considerations based on error propagation, even when the pixel occupation probability exceeds 50%. We additionally took the multiple-counting effect into account when treating the counting statistics. Furthermore, the detection efficiency of this detector system changes under relatively high neutron fluxes due to ageing effects of the microchannel plates. Since this efficiency change is position-dependent, it induces a memory image. The memory effect can be significantly reduced with correction procedures using the rate equations describing the permanent gain degradation and the scrubbing effect on the inner surfaces of the MCP pores.

  18. The longevity of lava dome eruptions

    NASA Astrophysics Data System (ADS)

    Wolpert, Robert L.; Ogburn, Sarah E.; Calder, Eliza S.

    2016-02-01

    Understanding the duration of past, ongoing, and future volcanic eruptions is an important scientific goal and a key societal need. We present a new methodology for forecasting the duration of ongoing and future lava dome eruptions based on a database (DomeHaz) recently compiled by the authors. The database includes duration and composition for 177 such eruptions, with "eruption" defined as the period encompassing individual episodes of dome growth along with associated quiescent periods during which extrusion pauses but unrest continues. In a key finding, we show that probability distributions for dome eruption durations are both heavy tailed and composition dependent. We construct objective Bayesian statistical models featuring heavy-tailed Generalized Pareto distributions with composition-specific parameters to make forecasts about the durations of new and ongoing eruptions that depend on both eruption duration to date and composition. Our Bayesian predictive distributions reflect both uncertainty about model parameter values (epistemic uncertainty) and the natural variability of the geologic processes (aleatoric uncertainty). The results are illustrated by presenting likely trajectories for 14 dome-building eruptions ongoing in 2015. Full representation of the uncertainty is presented for two key eruptions, Soufrière Hills Volcano in Montserrat (10-139 years, median 35 years) and Sinabung, Indonesia (1-17 years, median 4 years). Uncertainties are high but, importantly, quantifiable. This work provides for the first time a quantitative and transferable method and rationale on which to base long-term planning decisions for lava dome-forming volcanoes, with wide potential use and transferability to forecasts of other types of eruptions and other adverse events across the geohazard spectrum.
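The practical value of a heavy-tailed Generalized Pareto duration model is the residual-duration forecast it yields for an ongoing eruption. A sketch using SciPy's `genpareto`; the parameter values below are placeholders, not the paper's composition-specific fits.

```python
import numpy as np
from scipy.stats import genpareto

def residual_duration_prob(t_elapsed, s_ahead, c, scale):
    """P(duration > t_elapsed + s_ahead | duration > t_elapsed)
    for a Generalized Pareto duration model with shape c and scale."""
    sf = genpareto.sf  # survival function P(D > t)
    return float(sf(t_elapsed + s_ahead, c, scale=scale)
                 / sf(t_elapsed, c, scale=scale))
```

For a heavy tail (c > 0) the conditional probability of continuing grows with elapsed duration, which is precisely why such forecasts must depend on eruption duration to date.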

  19. A fully probabilistic approach to extreme rainfall modeling

    NASA Astrophysics Data System (ADS)

    Coles, Stuart; Pericchi, Luis Raúl; Sisson, Scott

    2003-03-01

    It is an embarrassingly frequent experience that statistical practice fails to foresee historical disasters. It is all too easy to blame global trends or some sort of external intervention, but in this article we argue that statistical methods that do not take comprehensive account of the uncertainties involved in both model and predictions are bound to produce an over-optimistic appraisal of future extremes that is often contradicted by observed hydrological events. Based on the annual and daily rainfall data on the central coast of Venezuela, different modeling strategies and inference approaches show that the 1999 rainfall which caused the worst environmentally related tragedy in Venezuelan history was extreme, but not implausible given the historical evidence. We follow in turn a classical likelihood and Bayesian approach, arguing that the latter is the most natural approach for taking into account all uncertainties. In each case we emphasize the importance of making inference on predicted levels of the process rather than model parameters. Our most detailed model comprises seasons with unknown starting points and durations for the extremes of daily rainfall, whose behavior is described using a standard threshold model. Based on a Bayesian analysis of this model, so that both prediction uncertainty and process heterogeneity are properly modeled, we find that the 1999 event has a sizeable probability which implies that such an occurrence within a reasonably short time horizon could have been anticipated. Finally, since accumulation of extreme rainfall over several days is an additional difficulty—and indeed, the catastrophe of 1999 was exaggerated by heavy rainfall on successive days—we examine the effect of timescale on our broad conclusions, finding results to be broadly similar across different choices.
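The "standard threshold model" mentioned here leads to the usual peaks-over-threshold return-level formula. A sketch in the common parameterization (u threshold, sigma scale, xi shape, rate = mean exceedances per year); this is the textbook plug-in formula, not the paper's Bayesian machinery.

```python
import numpy as np

def gpd_return_level(u, sigma, xi, rate, m_years):
    """m-year return level for a peaks-over-threshold (GPD) model:
    x_m = u + (sigma / xi) * ((m * rate)**xi - 1), with the xi -> 0
    limit x_m = u + sigma * ln(m * rate)."""
    if abs(xi) < 1e-9:
        return u + sigma * np.log(m_years * rate)
    return u + (sigma / xi) * ((m_years * rate) ** xi - 1.0)
```

A Bayesian treatment of the kind the authors advocate would average this quantity over the posterior of (sigma, xi) rather than plugging in point estimates, which is exactly what makes its appraisal of future extremes less over-optimistic.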

  20. A new 1649-1884 catalog of destructive earthquakes near Tokyo and implications for the long-term seismic process

    USGS Publications Warehouse

    Grunewald, E.D.; Stein, R.S.

    2006-01-01

    In order to assess the long-term character of seismicity near Tokyo, we construct an intensity-based catalog of damaging earthquakes that struck the greater Tokyo area between 1649 and 1884. Models for 15 historical earthquakes are developed using calibrated intensity attenuation relations that quantitatively convey uncertainties in event location and magnitude, as well as their covariance. The historical catalog is most likely complete for earthquakes M ≥ 6.7; the largest earthquake in the catalog is the 1703 M ≈ 8.2 Genroku event. Seismicity rates from 80 years of instrumental records, which include the 1923 M = 7.9 Kanto shock, as well as interevent times estimated from the past ~7000 years of paleoseismic data, are combined with the historical catalog to define a frequency-magnitude distribution for 4.5 ≤ M ≤ 8.2, which is well described by a truncated Gutenberg-Richter relation with a b value of 0.96 and a maximum magnitude of 8.4. Large uncertainties associated with the intensity-based catalog are propagated by a Monte Carlo simulation to estimations of the scalar moment rate. The resulting best estimate of moment rate during 1649-2003 is 1.35 × 10^26 dyn cm yr-1 with considerable uncertainty at the 1σ level: (-0.11, +0.20) × 10^26 dyn cm yr-1. Comparison with geodetic models of the interseismic deformation indicates that the geodetic moment accumulation and likely moment release rate are roughly balanced over the catalog period. This balance suggests that the extended catalog is representative of long-term seismic processes near Tokyo and so can be used to assess earthquake probabilities. The resulting Poisson (or time-averaged) 30-year probability for M ≥ 7.9 earthquakes is 7-11%.
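The final step, turning a Gutenberg-Richter rate into a time-averaged 30-year probability, is compact enough to sketch. The a-value below is a hypothetical placeholder; the paper derives its rates from the catalog itself, not from an a-value quoted here.

```python
import numpy as np

def gr_rate(a, b, m):
    """Annual rate of events with magnitude >= m from the
    Gutenberg-Richter relation log10 N = a - b * m."""
    return 10.0 ** (a - b * m)

def poisson_prob(annual_rate, horizon_years):
    """Time-averaged (Poisson) probability of at least one event
    in the horizon: P = 1 - exp(-rate * T)."""
    return 1.0 - np.exp(-annual_rate * horizon_years)
```

For example, an annual rate of roughly 1/300 per year yields a 30-year Poisson probability in the 7-11% range reported for M ≥ 7.9 near Tokyo.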

  1. CISN ShakeAlert: Accounting for site amplification effects and quantifying time and spatial dependence of uncertainty estimates in the Virtual Seismologist earthquake early warning algorithm

    NASA Astrophysics Data System (ADS)

    Caprio, M.; Cua, G. B.; Wiemer, S.; Fischer, M.; Heaton, T. H.; CISN EEW Team

    2011-12-01

    The Virtual Seismologist (VS) earthquake early warning (EEW) algorithm is one of 3 EEW approaches being incorporated into the California Integrated Seismic Network (CISN) ShakeAlert system, a prototype EEW system being tested in real-time in California. The VS algorithm, implemented by the Swiss Seismological Service at ETH Zurich, is a Bayesian approach to EEW, wherein the most probable source estimate at any given time is a combination of contributions from a likelihood function that evolves in response to incoming data from the ongoing earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS codes have been running in real-time at the Southern California Seismic Network (SCSN) since July 2008, and at the Northern California Seismic Network (NCSN) since February 2009. With the aim of improving the convergence of real-time VS magnitude estimates to network magnitudes, we evaluate various empirical and Vs30-based approaches to accounting for site amplification. Empirical station corrections for SCSN stations are derived from M>3.0 events from 2005 through 2009. We evaluate the performance of the various approaches using an independent 2010 dataset. In addition, we analyze real-time VS performance from 2008 to the present to quantify the time and spatial dependence of VS uncertainty estimates. We also summarize real-time VS performance for significant 2011 events in California. Improved magnitude and uncertainty estimates potentially increase the utility of EEW information for end-users, particularly those intending to automate damage-mitigating actions based on real-time information.

  2. Pricing index-based catastrophe bonds: Part 2: Object-oriented design issues and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Unger, André J. A.

    2010-02-01

    This work is the second installment in a two-part series, and focuses on object-oriented programming methods to implement an augmented-state variable approach to aggregate the PCS index and introduce the Bermudan-style call feature into the proposed CAT bond model. The PCS index is aggregated quarterly using a discrete Asian running-sum formulation. The resulting aggregate PCS index augmented-state variable is used to specify the payoff (principal) on the CAT bond based on reinsurance layers. The purpose of the Bermudan-style call option is to allow the reinsurer to minimize their interest rate risk exposure on making fixed coupon payments under prevailing interest rates. A sensitivity analysis is performed to determine the impact of uncertainty in the frequency and magnitude of hurricanes on the price of the CAT bond. Results indicate that while the CAT bond is highly sensitive to the natural variability in the frequency of landfalling hurricanes between El Niño and non-El Niño years, it remains relatively insensitive to uncertainty in the magnitude of damages. In addition, results indicate that the maximum price of the CAT bond is insensitive to whether it is engineered to cover low frequency high magnitude events in a 'high' reinsurance layer relative to high frequency low magnitude events in a 'low' reinsurance layer. Also, while it is possible for the reinsurer to minimize their interest rate risk exposure on the fixed coupon payments, the impact of this risk on the price of the CAT bond appears small relative to the natural variability in the CAT bond price, and consequently catastrophic risk, due to uncertainty in the frequency and magnitude of landfalling hurricanes.

  3. A fast and robust method for moment tensor and depth determination of shallow seismic events in CTBT related studies.

    NASA Astrophysics Data System (ADS)

    Baker, Ben; Stachnik, Joshua; Rozhkov, Mikhail

    2017-04-01

    The International Data Centre is required to conduct expert technical analysis and special studies to improve event parameters and to assist State Parties in identifying the source of specific events, in accordance with the Protocol to the Comprehensive Nuclear-Test-Ban Treaty. Determination of a seismic event's source mechanism and depth is closely related to these tasks. This is typically done through a linearized inversion of the waveforms for a complete or partial set of source parameters, or through a similarly defined grid search over precomputed Green's functions created for particular source models. In this presentation we demonstrate preliminary results obtained with the latter approach from an improved software design. In this development we aimed to be compliant with the different modes of the CTBT monitoring regime: to cover a wide range of source-receiver distances (regional to teleseismic), resolve shallow source depths, provide full moment tensor solutions based on body- and surface-wave recordings, be fast enough for both on-demand studies and automatic processing, properly incorporate observed waveforms and any a priori uncertainties, and accurately estimate a posteriori uncertainties. Posterior distributions of moment tensor parameters show narrow peaks where a significant number of reliable surface wave observations are available. For the earthquake examples, fault orientation (strike, dip, and rake) posterior distributions also provide results consistent with published catalogues. Inclusion of observations on horizontal components will provide further constraints. In addition, the calculation of teleseismic P-wave Green's functions is improved through prior analysis to determine an appropriate attenuation parameter for each source-receiver path. The implemented HDF5-based Green's function pre-packaging allows much greater flexibility in utilizing different software packages and methods for computation.
Further additions will include rapid incorporation of Instaseis/AXISEM full-waveform synthetics into the pre-computed Green's function archive. Along with traditional post-processing analysis of waveform misfits through several objective functions and variance reduction, we follow a probabilistic approach to assess the robustness of the moment tensor solution. In the course of this project, full moment tensor and depth estimates are determined for the DPRK events and for shallow earthquakes using a new implementation of teleseismic P-wave waveform fitting. A full grid search over the entire moment tensor space is used to appropriately sample all possible solutions, employing the method of Tape & Tape (2012) to discretize the complete moment tensor space from a geometric perspective. Probabilistic uncertainty estimates on the moment tensor parameters lend robustness to the solution.

  4. A survey of resilience, burnout, and tolerance of uncertainty in Australian general practice registrars

    PubMed Central

    2013-01-01

    Background Burnout and intolerance of uncertainty have been linked to low job satisfaction and lower quality patient care. While resilience is related to these concepts, no study has examined these three concepts in a cohort of doctors. The objective of this study was to measure resilience, burnout, compassion satisfaction, personal meaning in patient care and intolerance of uncertainty in Australian general practice (GP) registrars. Methods We conducted a paper-based cross-sectional survey of GP registrars in Australia from June to July 2010, recruited from a newsletter item or registrar education events. Survey measures included the Resilience Scale-14, a single-item scale for burnout, Professional Quality of Life (ProQOL) scale, Personal Meaning in Patient Care scale, Intolerance of Uncertainty-12 scale, and Physician Response to Uncertainty scale. Results 128 GP registrars responded (response rate 90%). Fourteen percent of registrars were found to be at risk of burnout using the single-item scale for burnout, but none met the criteria for burnout using the ProQOL scale. Secondary traumatic stress, general intolerance of uncertainty, anxiety due to clinical uncertainty and reluctance to disclose uncertainty to patients were associated with being at higher risk of burnout, but sex, age, practice location, training duration, years since graduation, and reluctance to disclose uncertainty to physicians were not. Only ten percent of registrars had high resilience scores. Resilience was positively associated with compassion satisfaction and personal meaning in patient care. Resilience was negatively associated with burnout, secondary traumatic stress, inhibitory anxiety, general intolerance to uncertainty, concern about bad outcomes and reluctance to disclose uncertainty to patients. Conclusions GP registrars in this survey showed a lower level of burnout than in other recent surveys of the broader junior doctor population in both Australia and overseas. 
Resilience was also lower than might be expected of a satisfied and professionally successful cohort. PMID:23294479

  5. A survey of resilience, burnout, and tolerance of uncertainty in Australian general practice registrars.

    PubMed

    Cooke, Georga P E; Doust, Jenny A; Steele, Michael C

    2013-01-07

    Burnout and intolerance of uncertainty have been linked to low job satisfaction and lower quality patient care. While resilience is related to these concepts, no study has examined these three concepts in a cohort of doctors. The objective of this study was to measure resilience, burnout, compassion satisfaction, personal meaning in patient care and intolerance of uncertainty in Australian general practice (GP) registrars. We conducted a paper-based cross-sectional survey of GP registrars in Australia from June to July 2010, recruited from a newsletter item or registrar education events. Survey measures included the Resilience Scale-14, a single-item scale for burnout, Professional Quality of Life (ProQOL) scale, Personal Meaning in Patient Care scale, Intolerance of Uncertainty-12 scale, and Physician Response to Uncertainty scale. 128 GP registrars responded (response rate 90%). Fourteen percent of registrars were found to be at risk of burnout using the single-item scale for burnout, but none met the criteria for burnout using the ProQOL scale. Secondary traumatic stress, general intolerance of uncertainty, anxiety due to clinical uncertainty and reluctance to disclose uncertainty to patients were associated with being at higher risk of burnout, but sex, age, practice location, training duration, years since graduation, and reluctance to disclose uncertainty to physicians were not. Only ten percent of registrars had high resilience scores. Resilience was positively associated with compassion satisfaction and personal meaning in patient care. Resilience was negatively associated with burnout, secondary traumatic stress, inhibitory anxiety, general intolerance to uncertainty, concern about bad outcomes and reluctance to disclose uncertainty to patients. GP registrars in this survey showed a lower level of burnout than in other recent surveys of the broader junior doctor population in both Australia and overseas.
Resilience was also lower than might be expected of a satisfied and professionally successful cohort.

  6. Probabilistic assessment of landslide tsunami hazard for the northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Pampell-Manis, A.; Horrillo, J.; Shigihara, Y.; Parambath, L.

    2016-01-01

    The devastating consequences of recent tsunamis affecting Indonesia and Japan have prompted a scientific response to better assess unexpected tsunami hazards. Although much uncertainty exists regarding the recurrence of large-scale tsunami events in the Gulf of Mexico (GoM), geological evidence indicates that a tsunami is possible and would most likely come from a submarine landslide triggered by an earthquake. This study customizes for the GoM a first-order probabilistic landslide tsunami hazard assessment. Monte Carlo Simulation (MCS) is employed to determine landslide configurations based on distributions obtained from observational submarine mass failure (SMF) data. Our MCS approach incorporates a Cholesky decomposition method for correlated landslide size parameters to capture correlations seen in the data as well as uncertainty inherent in these events. Slope stability analyses are performed using landslide and sediment properties and regional seismic loading to determine landslide configurations which fail and produce a tsunami. The probability of each tsunamigenic failure is calculated based on the joint probability of slope failure and probability of the triggering earthquake. We are thus able to estimate sizes and return periods for probabilistic maximum credible landslide scenarios. We find that the Cholesky decomposition approach generates landslide parameter distributions that retain the trends seen in observational data, improving the statistical validity and relevancy of the MCS technique in the context of landslide tsunami hazard assessment. Estimated return periods suggest that probabilistic maximum credible SMF events in the north and northwest GoM have a recurrence of 5000-8000 years, in agreement with age dates of observed deposits.
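    The Cholesky step described above can be sketched in a few lines. The correlation value, lognormal moments, and parameter pair (length and thickness) below are hypothetical illustrations, not values from the study's SMF dataset: independent standard normals are transformed by the Cholesky factor of an assumed correlation matrix, preserving the target correlation in the sampled landslide parameters.

```python
import numpy as np

# Minimal sketch: draw correlated lognormal landslide size parameters
# whose underlying correlation matches an assumed estimate (rho = 0.7).
rng = np.random.default_rng(0)
rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])
L = np.linalg.cholesky(cov)                 # lower-triangular factor

z = rng.standard_normal((2, 100_000))       # independent standard normals
x = L @ z                                   # correlated standard normals
length = np.exp(6.0 + 0.5 * x[0])           # lognormal length (assumed moments)
thickness = np.exp(3.0 + 0.3 * x[1])        # lognormal thickness (assumed)

sample_rho = np.corrcoef(x[0], x[1])[0, 1]
# sample_rho recovers the target correlation to within sampling error
```

This is the mechanism that lets the MCS retain the parameter correlations seen in the observational data.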

  7. Better Than Counting: Density Profiles from Force Sampling

    NASA Astrophysics Data System (ADS)

    de las Heras, Daniel; Schmidt, Matthias

    2018-05-01

    Calculating one-body density profiles in equilibrium via particle-based simulation methods involves counting events of particle occurrence at (histogram-resolved) space points. Here, we investigate an alternative method based on a histogram of the local force density. Via an exact sum rule, the density profile is obtained with a simple spatial integration. The method circumvents the inherent ideal gas fluctuations. We have tested the method in Monte Carlo, Brownian dynamics, and molecular dynamics simulations. The results carry a statistical uncertainty smaller than that of the standard counting method, therefore reducing the computation time.
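    The integration step can be checked in a minimal, analytically solvable setting. The sketch below assumes an ideal gas in a uniform gravitational field, where the equilibrium force density obeys kT dρ/dx = f(x); integrating βf then recovers the barometric density profile. This illustrates only the spatial-integration idea, under that simplifying assumption, not the simulation method itself.

```python
import numpy as np

# Ideal gas in gravity (assumed test case): f(x) = -m*g*rho(x), and the
# sum rule kT * d(rho)/dx = f(x) means integrating beta*f over x recovers
# the density profile up to the boundary value rho(0).
beta_mg = 1.0                       # beta * m * g (arbitrary units)
x = np.linspace(0.0, 5.0, 501)
rho_exact = np.exp(-beta_mg * x)    # barometric profile
f = -beta_mg * rho_exact            # force density (external force only)

# cumulative trapezoid integration of beta * f, anchored at rho(0) = 1
dx = x[1] - x[0]
increments = (f[1:] + f[:-1]) * 0.5 * dx
rho_rec = 1.0 + np.concatenate(([0.0], np.cumsum(increments)))
max_err = np.abs(rho_rec - rho_exact).max()
```

In a real simulation the force density would come from a histogram of sampled forces rather than an analytic formula, but the integration is the same.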

  8. Forecast Based Financing for Managing Weather and Climate Risks to Reduce Potential Disaster Impacts

    NASA Astrophysics Data System (ADS)

    Arrighi, J.

    2017-12-01

    There is a critical window of time to reduce potential impacts of a disaster after a forecast for heightened risk is issued and before an extreme event occurs. The concept of Forecast-based Financing focuses on this window of opportunity. Through advanced preparation during system set-up, tailored methodologies are used to 1) analyze a range of potential extreme event forecasts, 2) identify emergency preparedness measures that can be taken when factoring in forecast lead time and inherent uncertainty and 3) develop standard operating procedures that are agreed on and tied to guaranteed funding sources to facilitate emergency measures led by the Red Cross or government actors when preparedness measures are triggered. This presentation will focus on a broad overview of the current state of theory and approaches used in developing forecast-based financing systems, with a specific focus on hydrologic events, case studies of success and challenges in various contexts where this approach is being piloted, as well as what is on the horizon to be further explored and developed from a research perspective as the application of this approach continues to expand.
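    The trigger logic at the heart of a standard operating procedure can be sketched as a simple rule. The thresholds below are hypothetical placeholders, not values from any operational system: funds are released only when the forecast is both confident enough and actionable within the available lead time.

```python
# Toy sketch of a forecast trigger in an FbF standard operating procedure
# (both thresholds are hypothetical): release pre-agreed funds when the
# forecast probability of exceeding a danger level crosses the trigger
# level AND the lead time still allows the preparedness measure.
def should_trigger(exceedance_prob, lead_time_days,
                   prob_threshold=0.6, max_lead_days=7):
    """Act only when the forecast is both confident and actionable in time."""
    return exceedance_prob >= prob_threshold and lead_time_days <= max_lead_days

# confident and timely -> trigger; uncertain or too early -> wait
decisions = [should_trigger(0.7, 5), should_trigger(0.5, 5),
             should_trigger(0.9, 10)]
```

Tying such a rule to guaranteed funding in advance is what removes the delay between forecast and action.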

  9. Multi-hazard Assessment and Scenario Toolbox (MhAST): A Framework for Analyzing Compounding Effects of Multiple Hazards

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Moftakhari, H.; AghaKouchak, A.

    2017-12-01

    Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is a common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST) that comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of scenarios of return period and design levels and their likelihoods. The contribution of this study is fourfold: 1. comprehensive analysis of marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas, 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities, 3. weighted average scenario analysis based on an expected event, and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
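    One copula-based scenario calculation of the kind described above can be sketched briefly. The Gumbel copula, the quantile level, and the dependence parameter below are illustrative assumptions, not MhAST's fitted choices; the sketch shows how dependence between two drivers shortens the joint "AND" return period relative to the independence case.

```python
import math

# Hypothetical sketch: joint return period of two drivers (say, river
# discharge and coastal water level) both exceeding their marginal p-th
# quantiles, under a Gumbel copula (theta = 1 is independence).
def gumbel_copula(u, v, theta):
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_and_return_period(p, theta, events_per_year=1.0):
    """Return period of {U > p AND V > p} for uniform marginals."""
    p_exceed = 1.0 - 2.0 * p + gumbel_copula(p, p, theta)
    return 1.0 / (events_per_year * p_exceed)

t_indep = joint_and_return_period(0.99, theta=1.0)   # independence
t_dep = joint_and_return_period(0.99, theta=2.0)     # positive dependence
# dependence makes joint exceedance far more frequent than independence
```

Under independence the joint 99th-percentile exceedance has a 10,000-year return period; with dependence it drops to a few hundred years, which is exactly why univariate analysis can badly understate compound risk.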

  10. The timing, two-pulsed nature, and variable climatic expression of the 4.2 ka event: A review and new high-resolution stalagmite data from Namibia

    NASA Astrophysics Data System (ADS)

    Railsback, L. Bruce; Liang, Fuyuan; Brook, G. A.; Voarintsoa, Ny Riavo G.; Sletten, Hillary R.; Marais, Eugene; Hardt, Ben; Cheng, Hai; Edwards, R. Lawrence

    2018-04-01

    The climatic event between 4.2 and 3.9 ka BP known as the "4.2 ka event" is commonly considered to be a synchronous global drought that happened as one pulse. However, careful comparison of records from around the world shows that synchrony is possible only if the published chronologies of the various records are shifted to the extent allowed by the uncertainties of their age data, that several records suggest a two-pulsed event, and that some records suggest a wet rather than dry event. The radiometric ages constraining those records have uncertainties of several decades if not hundreds of years, and in some records the event is represented by only one or two analyses. This paper reports a new record from Stalagmite DP1 from northeastern Namibia in which high 230Th/232Th activity ratios allow small age uncertainties ranging between only 10 and 28 years, and the event is documented by more than 35 isotopic analyses and by petrographic observation of a surface of dissolution. The ages from Stalagmite DP1 combine with results from 11 other records from around the world to suggest an event centered at about 4.07 ka BP with bracketing ages of 4.15 to 3.93 ka BP. The isotopic and petrographic results suggest a two-pulsed wet event in northeastern Namibia, which is in the Southern Hemisphere's summer rainfall zone where more rain presumably fell with southward migration of the Inter-Tropical Convergence Zone as the result of cooling in the Northern Hemisphere. Comparison with other records from outside the region of dryness from the Mediterranean to eastern Asia suggests that multiple climatic zones similarly moved southward during the event, in some cases bringing wetter conditions that contradict the notion of global drought.

  11. Cognitive caching promotes flexibility in task switching: evidence from event-related potentials.

    PubMed

    Lange, Florian; Seer, Caroline; Müller, Dorothea; Kopp, Bruno

    2015-12-08

    Time-consuming processes of task-set reconfiguration have been shown to contribute to the costs of switching between cognitive tasks. We describe and probe a novel mechanism serving to reduce the costs of task-set reconfiguration. We propose that when individuals are uncertain about the currently valid task, one task set is activated for execution while other task sets are maintained at a pre-active state in cognitive cache. We tested this idea by assessing an event-related potential (ERP) index of task-set reconfiguration in a three-rule task-switching paradigm involving varying degrees of task uncertainty. In high-uncertainty conditions, two viable tasks were equally likely to be correct whereas in low-uncertainty conditions, one task was more likely than the other. ERP and performance measures indicated substantial costs of task-set reconfiguration when participants were required to switch away from a task that had been likely to be correct. In contrast, task-set-reconfiguration costs were markedly reduced when the previous task set was chosen under high task uncertainty. These results suggest that cognitive caching of alternative task sets adds to human cognitive flexibility under high task uncertainty.

  12. Cognitive caching promotes flexibility in task switching: evidence from event-related potentials

    PubMed Central

    Lange, Florian; Seer, Caroline; Müller, Dorothea; Kopp, Bruno

    2015-01-01

    Time-consuming processes of task-set reconfiguration have been shown to contribute to the costs of switching between cognitive tasks. We describe and probe a novel mechanism serving to reduce the costs of task-set reconfiguration. We propose that when individuals are uncertain about the currently valid task, one task set is activated for execution while other task sets are maintained at a pre-active state in cognitive cache. We tested this idea by assessing an event-related potential (ERP) index of task-set reconfiguration in a three-rule task-switching paradigm involving varying degrees of task uncertainty. In high-uncertainty conditions, two viable tasks were equally likely to be correct whereas in low-uncertainty conditions, one task was more likely than the other. ERP and performance measures indicated substantial costs of task-set reconfiguration when participants were required to switch away from a task that had been likely to be correct. In contrast, task-set-reconfiguration costs were markedly reduced when the previous task set was chosen under high task uncertainty. These results suggest that cognitive caching of alternative task sets adds to human cognitive flexibility under high task uncertainty. PMID:26643146

  13. Volcanic ash dosage calculator: A proof-of-concept tool to support aviation stakeholders during ash events

    NASA Astrophysics Data System (ADS)

    Dacre, H.; Prata, A.; Shine, K. P.; Irvine, E.

    2017-12-01

    The volcanic ash clouds produced by Icelandic volcano Eyjafjallajökull in April/May 2010 resulted in `no fly zones' which paralysed European aircraft activity and cost the airline industry an estimated £1.1 billion. In response to the crisis, the Civil Aviation Authority (CAA), in collaboration with Rolls Royce, produced the `safe-to-fly' chart. As ash concentrations are the primary output of dispersion model forecasts, the chart was designed to illustrate how engine damage progresses as a function of ash concentration. Concentration thresholds were subsequently derived based on previous ash encounters. Research scientists and aircraft manufacturers have since recognised the importance of volcanic ash dosages: the concentration accumulated over time. Dosages are an improvement over concentrations as they can be used to identify pernicious situations where ash concentrations are acceptably low but the exposure time is long enough to cause damage to aircraft engines. Here we present a proof-of-concept volcanic ash dosage calculator; an innovative, web-based research tool, developed in close collaboration with operators and regulators, which utilises interactive data visualisation to communicate the uncertainty inherent in dispersion model simulations and subsequent dosage calculations. To calculate dosages, we use NAME (Numerical Atmospheric-dispersion Modelling Environment) to simulate several Icelandic eruption scenarios, which result in tephra dispersal across the North Atlantic, UK and Europe. Ash encounters are simulated based on flight-optimal routes derived from aircraft routing software. Key outputs of the calculator include: the along-flight dosage, exposure time and peak concentration. The design of the tool allows users to explore the key areas of uncertainty in the dosage calculation and to visualise how this changes as the planned flight path is varied.
We expect that this research will result in better-informed decisions from key stakeholders during volcanic ash events through a deeper understanding of the associated uncertainties in dosage calculations.
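    The along-flight dosage idea reduces to a time integral of encountered concentration. The numbers below are hypothetical, chosen only to show why a low but sustained concentration can still accumulate a substantial dose; this is a sketch of the concept, not the calculator's implementation.

```python
# Minimal sketch: dosage is the time-integral of the ash concentration
# encountered along the flight path, here via trapezoidal integration
# of (time, concentration) samples. All numbers are hypothetical.
def along_flight_dosage(times_s, concentrations_mg_m3):
    """Accumulated dose in mg s m^-3 from sampled concentration history."""
    dose = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        dose += 0.5 * (concentrations_mg_m3[i] + concentrations_mg_m3[i - 1]) * dt
    return dose

# 30 minutes in 0.2 mg/m^3 ash: the peak is low, but the dose builds up
times = [0.0, 600.0, 1200.0, 1800.0]
conc = [0.2, 0.2, 0.2, 0.2]
dose = along_flight_dosage(times, conc)   # 0.2 * 1800 = 360 mg s m^-3
peak = max(conc)
```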

  14. Do regional methods really help reduce uncertainties in flood frequency analyses?

    NASA Astrophysics Data System (ADS)

    Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric

    2013-04-01

    Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered as statistically homogeneous to build large regional data samples. Nevertheless, the advantage of the regional analyses, the important increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods to two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis incorporating existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distributions, number of sites and records) to evaluate to what extent the results obtained in these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches.
On the other hand, these results show that incorporating information on extreme events, either historical flood events at gauged sites or estimated extremes at ungauged sites in the considered region, is an efficient way to reduce uncertainties in flood frequency studies.
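    The sampling-uncertainty effect that regional pooling is meant to exploit can be shown with a toy Monte Carlo experiment. The Gumbel parent, its parameters, and the method-of-moments fit below are all illustrative assumptions and do not reproduce the paper's simulations; the point is only that a tenfold larger (perfectly homogeneous) sample markedly shrinks the spread of the estimated 100-year quantile.

```python
import numpy as np

# Toy experiment: estimate the 100-year flood quantile from many synthetic
# 30-year and 300-year records drawn from an assumed Gumbel parent, and
# compare the spread of the estimates.
rng = np.random.default_rng(2)

def q100_spread(n, trials=2000, loc=100.0, scale=30.0):
    samples = rng.gumbel(loc, scale, size=(trials, n))
    m = samples.mean(axis=1)
    s = samples.std(axis=1, ddof=1)
    scale_hat = s * np.sqrt(6.0) / np.pi          # method of moments
    loc_hat = m - 0.5772 * scale_hat
    q100 = loc_hat - scale_hat * np.log(-np.log(0.99))
    return q100.std()

spread_short = q100_spread(30)    # a single ~30-year gauged record
spread_long = q100_spread(300)    # pooled regional sample, if homogeneous
# more (homogeneous) data shrinks the sampling spread roughly as 1/sqrt(n)
```

The catch highlighted by the abstract is the "if homogeneous" clause: undetected heterogeneity in the pooled sample can erase this gain.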

  15. Effective Integration of Earth Observation Data and Flood Modeling for Rapid Disaster Response: The Texas 2015 Case

    NASA Astrophysics Data System (ADS)

    Schumann, G.

    2016-12-01

    Routinely obtaining real-time 2-D inundation patterns of a flood event at a meaningful spatial resolution and over large scales is at the moment only feasible with either operational aircraft flights or satellite imagery. Of course, having model simulations of floodplain inundation available to complement the remote sensing data is highly desirable, both for event re-analysis and for forecasting event inundation. Using the Texas 2015 flood disaster, we demonstrate the value of multi-scale EO data for large scale 2-D floodplain inundation modeling and forecasting. A dynamic re-analysis of the Texas 2015 flood disaster was run using a 2-D flood model developed for accurate large scale simulations. We simulated the major rivers entering the Gulf of Mexico and used flood maps produced from both optical and SAR satellite imagery to examine regional model sensitivities and assess associated performance. It was demonstrated that satellite flood maps can complement model simulations and add value, although this is largely dependent on a number of important factors, such as image availability, regional landscape topology, and model uncertainty. In the favorable case where model uncertainty is high, landscape topology is complex (e.g. an urbanized coastal area) and satellite flood maps are available (in the case of SAR, for instance), satellite data can significantly reduce model uncertainty by identifying the "best possible" model parameter set. More often, however, model uncertainty is low and spatially contiguous flooding can be mapped from satellites easily enough, such as in large rural inland river floodplains, so satellites add little value. Nevertheless, where a large number of flood maps are available, model credibility can be increased substantially. In the case presented here this was true for at least 60% of the many thousands of kilometers of simulated river flow length where satellite flood maps existed.
The next step of this project is to employ a technique termed the "targeted observation" approach, an assimilation-based procedure that quantifies the impact observations have on model predictions at the local scale and along the entire river system, when assimilated with the model at specific "overpass" locations.

  16. How to deal with climate change uncertainty in the planning of engineering systems

    NASA Astrophysics Data System (ADS)

    Spackova, Olga; Dittes, Beatrice; Straub, Daniel

    2016-04-01

    The effect of extreme events such as floods on infrastructure and the built environment is associated with significant uncertainties: these include the uncertain effect of climate change, uncertainty in extreme-event frequency estimation due to limited historic data and imperfect models, and, not least, uncertainty about future socio-economic developments, which determine the damage potential. One option for dealing with these uncertainties is the use of adaptable (flexible) infrastructure that can easily be adjusted in the future without excessive costs. The challenge lies in quantifying the value of adaptability and in finding the optimal sequence of decisions. Is it worth building a (potentially more expensive) adaptable system that can be adjusted in the future depending on future conditions? Or is it more cost-effective to make a conservative design without accounting for possible future changes to the system? What is the optimal timing of the decision to build/adjust the system? We develop a quantitative decision-support framework for evaluating alternative infrastructure designs under uncertainties, which:
    • probabilistically models the uncertain future (through a Bayesian approach)
    • includes the adaptability of the systems (the costs of future changes)
    • takes into account the fact that future decisions will be made under uncertainty as well (using pre-posterior decision analysis)
    • allows identification of the optimal capacity and the optimal timing to build/adjust the infrastructure.
    Application of the decision framework will be demonstrated on an example of flood mitigation planning in Bavaria.
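    The value-of-adaptability question can be made concrete with a stylized two-stage decision tree. Every cost and probability below is hypothetical, and this is far simpler than the paper's Bayesian pre-posterior framework; it only shows how adaptability can beat a cheaper rigid design in expectation.

```python
# Stylized two-stage decision sketch (all numbers hypothetical): compare a
# rigid design against an adaptable one that can be upgraded once the
# climate scenario is revealed.
P_HIGH = 0.3                      # assumed probability of the severe scenario

def expected_cost_rigid(build=100.0, damage_if_high=300.0):
    # rigid system cannot be adjusted; the severe scenario causes damage
    return build + P_HIGH * damage_if_high

def expected_cost_adaptable(build=120.0, upgrade=50.0):
    # adaptable system pays a premium now, but a cheap later upgrade
    # avoids the damage in the severe scenario
    return build + P_HIGH * upgrade

rigid = expected_cost_rigid()           # 100 + 0.3 * 300 = 190
adaptable = expected_cost_adaptable()   # 120 + 0.3 * 50  = 135
# the expected value of adaptability here is 190 - 135 = 55 cost units
```

The full framework additionally models the fact that the future decision itself will be made under (reduced but nonzero) uncertainty, which a simple tree like this ignores.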

  17. CARA Status and Upcoming Enhancements

    NASA Technical Reports Server (NTRS)

    Newman, Lauri

    2015-01-01

    • RIC Miss Values in Summary Table: tabular presentation of the miss vector in the Summary Section.
    • RIC Uncertainty Values in Details Section: numerical presentation of miss-component uncertainty values in the Details Section.
    • Green Events with Potentially Maneuverable Secondary Objects: all potentially maneuverable secondary objects will be reported out to 7 days prior to TCA for LEO events and 10 days for non-LEO events, regardless of risk (relates to MOWG Action Item 1309-11). All green events with potentially active secondary objects are included in Summary Reports, allowing more time for contacting the other O/O.
    • Black Box Fix: sometimes a black square appeared in the summary report where the ASW RIC time-history plot should be.
    • Appendix Orbit Regime / Mission Name Mismatch fix.
    • Pc 0 Plotting Bug: all Pc points less than 1e-10 (zero) are now plotted as 1e-10 (instead of not at all).
    • Maneuver Indication Fix: the maneuver indicator is now present even if the maneuver was in the past.

  18. Soft Water Level Sensors for Characterizing the Hydrological Behaviour of Agricultural Catchments

    PubMed Central

    Crabit, Armand; Colin, François; Bailly, Jean Stéphane; Ayroles, Hervé; Garnier, François

    2011-01-01

    An innovative soft water level sensor is proposed to characterize the hydrological behaviour of agricultural catchments by measuring rainfall and stream flows. This sensor works as a capacitor coupled with a capacitance-to-frequency converter and measures water level at an adjustable acquisition time step. It was designed to be handy, minimally invasive and optimized in terms of energy consumption and low-cost fabrication so as to multiply its use on several catchments under natural conditions. It was used as a stage recorder to measure water level dynamics in a channel during a runoff event and as a rain gauge to measure rainfall amount and intensity. Based on the Manning equation, a method allows estimation of water discharge with a given uncertainty, and hence runoff volume at an event or annual scale. The sensor was tested under controlled conditions in the laboratory and under real conditions in the field. Comparisons of the sensor to reference devices (tipping bucket rain gauge, hydrostatic pressure transmitter limnimeter, Venturi channels…) showed accurate results: rainfall intensities and dynamic responses were accurately reproduced and discharges were estimated with an uncertainty usually acceptable in hydrology. Hence, it was used to monitor eleven small agricultural catchments located in the Mediterranean region. Both catchment reactivity and water budget have been calculated. Dynamic response of the catchments has been studied at the event scale through rising-time determination and at the annual scale by calculating the frequency of occurrence of runoff events. It provided significant insight into catchment hydrological behaviour which could be useful for agricultural management perspectives involving pollutant transport, flood events and the global water balance. PMID:22163868
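    The Manning-equation step that turns a sensed water level into a discharge can be sketched briefly. The rectangular channel geometry, roughness coefficient, and slope below are hypothetical site parameters, not values from the study; in practice each carries uncertainty that propagates into the discharge estimate.

```python
# Sketch of the discharge step: Manning's equation for a rectangular
# channel, with water level h from the sensor (SI units). The roughness
# n and slope S here are assumed, site-specific values.
def manning_discharge(h, width, n=0.03, slope=0.005):
    """Q = (1/n) * A * R^(2/3) * sqrt(S) for a rectangular cross-section."""
    area = width * h                       # flow area, m^2
    wetted_perimeter = width + 2.0 * h     # m
    hydraulic_radius = area / wetted_perimeter
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

q = manning_discharge(h=0.4, width=1.5)    # m^3/s for a 0.4 m stage
```

Integrating such discharges over a measured stage hydrograph gives the event or annual runoff volume mentioned above.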

  19. Simulation of earthquake ground motions in the eastern United States using deterministic physics‐based and site‐based stochastic approaches

    USGS Publications Warehouse

    Rezaeian, Sanaz; Hartzell, Stephen; Sun, Xiaodan; Mendoza, Carlos

    2017-01-01

    Earthquake ground‐motion recordings are scarce in the central and eastern United States (CEUS) for large‐magnitude events and at close distances. We use two different simulation approaches, a deterministic physics‐based method and a site‐based stochastic method, to simulate ground motions over a wide range of magnitudes. Drawing on previous results for the modeling of recordings from the 2011 Mw 5.8 Mineral, Virginia, earthquake and using the 2001 Mw 7.6 Bhuj, India, earthquake as a tectonic analog for a large magnitude CEUS event, we are able to calibrate the two simulation methods over this magnitude range. Both models show a good fit to the Mineral and Bhuj observations from 0.1 to 10 Hz. Model parameters are then adjusted to obtain simulations for Mw 6.5, 7.0, and 7.6 events in the CEUS. Our simulations are compared with the 2014 U.S. Geological Survey weighted combination of existing ground‐motion prediction equations in the CEUS. The physics‐based simulations show comparable response spectral amplitudes and a fairly similar attenuation with distance. The site‐based stochastic simulations suggest a slightly faster attenuation of the response spectral amplitudes with distance for larger magnitude events and, as a result, slightly lower amplitudes at distances greater than 200 km. Both models are plausible alternatives and, given the few available data points in the CEUS, can be used to represent the epistemic uncertainty in modeling of postulated CEUS large‐magnitude events.

  20. A risk-based multi-objective model for optimal placement of sensors in water distribution system

    NASA Astrophysics Data System (ADS)

    Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein

    2018-02-01

    In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for optimal placement of sensors in a water distribution system (WDS). The model uses the CVaR approach to minimize the risk caused by simultaneous multi-point contamination injection in the WDS. CVaR represents the uncertainties of contamination injection as a probability distribution function and captures low-probability extreme events, i.e., the losses in the tail of the loss distribution. A four-objective optimization model based on the NSGA-II algorithm is developed to minimize the losses from contamination injection (through the CVaR of the affected population and of detection time) as well as the two other main sensor-placement criteria: the probability of undetected events and cost. Finally, to determine the best solution, the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a Multi-Criteria Decision Making (MCDM) approach, is used to rank the alternatives on the trade-off curve among the objective functions. A sensitivity analysis is also performed to investigate the influence of each criterion on the PROMETHEE results under three relative weighting scenarios. The effectiveness of the proposed methodology is examined by applying it to the Lamerd WDS in southwestern Iran. PROMETHEE suggests 6 suitably distributed sensors that cover approximately all regions of the WDS. For the best solution, the optimal values of the CVaR of affected population, the CVaR of detection time, and the probability of undetected events are 17,055 persons, 31 min and 0.045%, respectively.
The obtained results of the proposed methodology in Lamerd WDS show applicability of CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection in order to evaluate extreme value of losses in WDS.
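
    As a minimal illustration of the CVaR concept used above (not the authors' four-objective model), the empirical CVaR of a sample of losses is simply the mean of the worst (1 − α) tail:

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Empirical Conditional Value at Risk: the mean of the losses at or
    beyond the alpha-quantile (the Value at Risk) of the sample."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)     # Value at Risk threshold
    return losses[losses >= var].mean()  # mean loss in the tail
```

    By construction CVaR(α) ≥ VaR(α) ≥ mean loss, which is why it captures the low-probability extreme contamination events that an average-loss objective would hide.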

  1. A Hot-Deck Multiple Imputation Procedure for Gaps in Longitudinal Recurrent Event Histories

    PubMed Central

    Wang, Chia-Ning; Little, Roderick; Nan, Bin; Harlow, Siobán D.

    2012-01-01

    We propose a regression-based hot-deck multiple imputation method for gaps of missing data in longitudinal studies, where subjects experience a recurrent event process and a terminal event. Examples are repeated asthma episodes and death, or menstrual periods and the menopause, as in our motivating application. Research interest concerns the onset time of a marker event, defined by the recurrent-event process, or the duration from this marker event to the final event. Gaps in the recorded event history make it difficult to determine the onset time of the marker event and, hence, the duration from onset to the final event. Simple approaches such as jumping gap times or dropping cases with gaps have obvious limitations. We propose a procedure for imputing information in the gaps by substituting information from a matched individual with a completely recorded history in the corresponding interval. Predictive mean matching is used to incorporate information on longitudinal characteristics of the repeated process and the final event time. Multiple imputation is used to propagate imputation uncertainty. The procedure is applied to an important data set for assessing the timing and duration of the menopausal transition. The performance of the proposed method is assessed by a simulation study. PMID:21361886
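
    A toy sketch of predictive mean matching with hot-deck donors, using a single covariate; the actual procedure in the paper matches on longitudinal event histories, so the variable names and the simple linear regression here are illustrative assumptions only.

```python
import numpy as np

def pmm_impute(y_obs, x_obs, x_mis, k=5, seed=0):
    """Predictive mean matching: regress y on x over complete cases, find for
    each incomplete case the k complete cases with the closest predicted
    value, and impute by drawing one donor's observed y (the hot deck)."""
    y_obs = np.asarray(y_obs, dtype=float)
    x_obs = np.asarray(x_obs, dtype=float)
    x_mis = np.asarray(x_mis, dtype=float)
    rng = np.random.default_rng(seed)
    X = np.column_stack([np.ones_like(x_obs), x_obs])
    beta, *_ = np.linalg.lstsq(X, y_obs, rcond=None)
    pred_obs = X @ beta
    pred_mis = np.column_stack([np.ones_like(x_mis), x_mis]) @ beta
    out = []
    for p in pred_mis:
        donors = np.argsort(np.abs(pred_obs - p))[:k]  # k nearest predictions
        out.append(y_obs[rng.choice(donors)])          # hot-deck draw
    return np.array(out)
```

    Drawing an observed value from a donor, rather than using the regression prediction itself, keeps imputed values realistic; repeating the draw across multiple imputed data sets is what propagates the imputation uncertainty.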

  2. Dispersion analysis for baseline reference mission 1. [flight simulation and trajectory analysis for space shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Kuhn, A. E.

    1975-01-01

    A dispersion analysis considering 3 sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for the baseline reference mission (BRM) 1 of the space shuttle orbiter. The dispersion analysis is based on the nominal trajectory for the BRM 1. State vector and performance dispersions (or variations) which result from the indicated 3 sigma uncertainties were studied. The dispersions were determined at major mission events and fixed times from lift-off (time slices) and the results will be used to evaluate the capability of the vehicle to perform the mission within a 3 sigma level of confidence and to determine flight performance reserves. A computer program is given that was used for dynamic flight simulations of the space shuttle orbiter.

  3. SensibleSleep: A Bayesian Model for Learning Sleep Patterns from Smartphone Events

    PubMed Central

    Sekara, Vedran; Jonsson, Håkan; Larsen, Jakob Eg; Lehmann, Sune

    2017-01-01

    We propose a Bayesian model for extracting sleep patterns from smartphone events. Our method is able to identify individuals’ daily sleep periods and their evolution over time, and provides an estimation of the probability of sleep and wake transitions. The model is fitted to more than 400 participants from two different datasets, and we verify the results against ground truth from dedicated armband sleep trackers. We show that the model is able to produce reliable sleep estimates with an accuracy of 0.89, both at the individual and at the collective level. Moreover the Bayesian model is able to quantify uncertainty and encode prior knowledge about sleep patterns. Compared with existing smartphone-based systems, our method requires only screen on/off events, and is therefore much less intrusive in terms of privacy and more battery-efficient. PMID:28076375

  4. SensibleSleep: A Bayesian Model for Learning Sleep Patterns from Smartphone Events.

    PubMed

    Cuttone, Andrea; Bækgaard, Per; Sekara, Vedran; Jonsson, Håkan; Larsen, Jakob Eg; Lehmann, Sune

    2017-01-01

    We propose a Bayesian model for extracting sleep patterns from smartphone events. Our method is able to identify individuals' daily sleep periods and their evolution over time, and provides an estimation of the probability of sleep and wake transitions. The model is fitted to more than 400 participants from two different datasets, and we verify the results against ground truth from dedicated armband sleep trackers. We show that the model is able to produce reliable sleep estimates with an accuracy of 0.89, both at the individual and at the collective level. Moreover the Bayesian model is able to quantify uncertainty and encode prior knowledge about sleep patterns. Compared with existing smartphone-based systems, our method requires only screen on/off events, and is therefore much less intrusive in terms of privacy and more battery-efficient.

  5. Improving Project Management with Simulation and Completion Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2004-01-01

    Despite the critical importance of timely project completion, the management practices in place today remain inadequate for addressing the persistent problem of late project completion. A major culprit in late projects is uncertainty, to which most, if not all, projects are inherently subject. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation-based methodology for quantitative project completion-time risk analysis, called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e., the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion-time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the on-going NASA project to assemble the International Space Station.
Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. NASA project stakeholders participated in determining and managing completion distribution functions produced from PAST. The first result was that project stakeholders improved project completion risk awareness. Secondly, using PAST, mitigation options were analyzed to improve project completion performance and reduce total project cost.
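
    The core loop of such a simulation-based completion analysis can be sketched as a Monte Carlo pass over a precedence network; this is a simplification of what a tool like PAST does, and the three-activity project with triangular duration distributions is entirely hypothetical.

```python
import random

def completion_distribution(activities, n_runs=5000, seed=42):
    """Monte Carlo distribution of project completion time.
    activities: {name: (tuple_of_predecessors, duration_sampler)},
    listed in topological order (every predecessor appears earlier)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        finish = {}
        for name, (preds, sample) in activities.items():
            start = max((finish[p] for p in preds), default=0.0)
            finish[name] = start + sample(rng)
        totals.append(max(finish.values()))
    totals.sort()
    return totals

# Hypothetical three-activity project with triangular (low, high, mode) durations:
project = {
    "design": ((), lambda r: r.triangular(8, 14, 10)),
    "build":  (("design",), lambda r: r.triangular(20, 40, 25)),
    "test":   (("build",), lambda r: r.triangular(5, 15, 7)),
}
totals = completion_distribution(project)
p80 = totals[int(0.8 * len(totals))]  # 80th-percentile completion time
```

    Reading percentiles off the sorted totals gives exactly the kind of completion distribution function stakeholders can use to quote a completion date at a chosen confidence level.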

  6. Comparison of 2D numerical models for river flood hazard assessment: simulation of the Secchia River flood in January, 2014

    NASA Astrophysics Data System (ADS)

    Shustikova, Iuliia; Domeneghetti, Alessio; Neal, Jeffrey; Bates, Paul; Castellarin, Attilio

    2017-04-01

    Hydrodynamic modeling of inundation events still carries a large array of uncertainties. This effect is especially evident in models run over geographically large areas. Recent studies suggest using fully two-dimensional (2D) models at high resolution in order to avoid the uncertainties and limitations arising from an incorrect interpretation of flood dynamics and an unrealistic reproduction of the terrain topography. This, however, reduces computational efficiency, increasing running time and hardware demands. With this in mind, our study evaluates and compares numerical models of different complexity by testing them on a flood event that occurred in the basin of the Secchia River, Northern Italy, on 19th January, 2014. The event was characterized by a levee breach and the consequent flooding of over 75 km2 of the plain behind the dike within 48 hours, causing population displacement, one death and economic losses in excess of 400 million Euro. We test the well-established TELEMAC 2D and LISFLOOD-FP codes, together with the recently released HEC-RAS 5.0.3 (2D model); all models are implemented with grid sizes of 2-200 m derived from a 1 m resolution digital elevation model. TELEMAC is a fully 2D hydrodynamic model based on a finite-element or finite-volume approach, whereas HEC-RAS 5.0.3 and LISFLOOD-FP are both coupled 1D-2D models. All models are calibrated against observed inundation extent and maximum water depths, which are retrieved from remotely sensed data and field survey reports. Our study quantitatively compares the three modeling strategies, highlighting differences in ease of implementation, accuracy in representing hydraulic processes within floodplains, and computational efficiency. Additionally, we examine the different grid resolutions in terms of result accuracy and computation time.
Our study is a preliminary assessment that focuses on smaller areas in order to identify potential modeling schemes that would be efficient for simulating flooding scenarios for large and very large floodplains. This research aims at contributing to the reduction of uncertainties and limitations in hazard and risk assessment.

  7. Stress Drop and Its Relationship to Radiated Energy, Ground Motion and Uncertainty

    NASA Astrophysics Data System (ADS)

    Baltay, A.

    2014-12-01

    Despite the seemingly diverse circumstances under which crustal earthquakes occur, scale-invariant stress drop and apparent stress (the ratio of radiated seismic energy to moment) are observed. The magnitude-independence of these parameters is central to our understanding of both earthquake physics and strong ground-motion generation. Estimates of stress drop and radiated energy, however, display large scatter, potentially masking secondary trends in the data. We investigate sources of this uncertainty within the framework of constant stress drop and apparent stress. We first revisit estimates of energy and stress drop from a variety of earthquake observations and methods, for events ranging from magnitude ~2 to ~9. Using an empirical Green's function (eGf) deconvolution method, which removes path and site effects, radiated energy and Brune stress drop are estimated both for regional events in the western US and eastern Honshu, Japan (from the HiNet network), and for teleseismically recorded great earthquakes worldwide [Baltay et al., 2010, 2011, 2014]. In addition to eGf methods, ground-motion-based metrics for stress drop are considered, using both KikNet data from Japan [Baltay et al., 2013] and the NGA-West2 data, a very well curated ground-motion database. Both the eGf-based stress drop estimates and those from the NGA-West2 database show a marked decrease in scatter, allowing us to identify deterministic secondary trends in stress drop. We find both an increase in stress drop with depth and a stress drop about 30% larger, on average, for mainshock events than for on-fault aftershocks. While both of these effects are already included in some ground-motion prediction equations (GMPEs), many previous seismological studies have been unable to uncover these trends conclusively because of the considerable scatter.
Elucidating these effects in the context of reduced and quantified epistemic uncertainty can help both seismologists and engineers to understand the true aleatory variability of the earthquake source, which may be due to the complex and diverse circumstances under which these earthquakes occur and which we are as yet unable to model.
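
    For context, the Brune-type stress drop discussed above follows from two standard relations; this is the textbook formula (Brune, 1970), not the authors' eGf pipeline, and the shear-wave speed default is an illustrative assumption.

```python
def brune_stress_drop(m0, fc, beta=3500.0):
    """Brune (1970) stress drop in Pa from seismic moment m0 (N*m) and
    corner frequency fc (Hz). The source radius is r = 0.37 * beta / fc
    (beta: shear-wave speed, m/s), and stress drop = 7 * m0 / (16 * r**3)."""
    r = 0.37 * beta / fc
    return 7.0 * m0 / (16.0 * r ** 3)

def moment_from_mw(mw):
    """Seismic moment (N*m) from moment magnitude."""
    return 10.0 ** (1.5 * mw + 9.1)
```

    An Mw 4 event with a 1 Hz corner frequency then has a stress drop of roughly 0.25 MPa; because the estimate scales with fc cubed, a factor-of-two error in corner frequency changes the stress drop by a factor of eight, one reason single-event estimates scatter so widely.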

  8. A Model-Based Approach to Infer Shifts in Regional Fire Regimes Over Time Using Sediment Charcoal Records

    NASA Astrophysics Data System (ADS)

    Itter, M.; Finley, A. O.; Hooten, M.; Higuera, P. E.; Marlon, J. R.; McLachlan, J. S.; Kelly, R.

    2016-12-01

    Sediment charcoal records are used in paleoecological analyses to identify individual local fire events and to estimate fire frequency and regional biomass burned at centennial to millennial time scales. Methods to identify local fire events from sediment charcoal records have been well developed over the past 30 years; however, an integrated statistical framework for fire identification is still lacking. We build upon existing paleoecological methods to develop a hierarchical Bayesian point process model for local fire identification and estimation of fire return intervals. The model is unique in that it combines sediment charcoal records from multiple lakes across a region in a spatially explicit fashion, leading to estimation of a joint, regional fire return interval in addition to lake-specific local fire frequencies. Further, the model estimates a joint regional charcoal deposition rate, free from the effects of local fires, that can be used as a measure of regional biomass burned over time. Finally, the hierarchical Bayesian approach allows for tractable error propagation, such that estimates of fire return intervals reflect the full range of uncertainty in sediment charcoal records. Specific sources of uncertainty addressed include sediment age models, the separation of local versus regional charcoal sources, and the generation of a composite charcoal record. The model is applied to sediment charcoal records from a dense network of lakes in the Yukon Flats region of Alaska. The multivariate joint modeling approach results in improved estimates of regional charcoal deposition, with reduced uncertainty in the identification of individual fire events and local fire return intervals compared to individual-lake approaches. Modeled individual-lake fire return intervals range from 100 to 500 years, with a regional interval of roughly 200 years. Regional charcoal deposition to the network of lakes is correlated up to 50 kilometers.
Finally, the joint regional charcoal deposition rate exhibits changes over time coincident with major climatic and vegetation shifts over the past 10,000 years. Ongoing work will use the regional charcoal deposition rate to estimate changes in biomass burned as a function of climate variability and regional vegetation pattern.

  9. Using subseasonal-to-seasonal (S2S) extreme rainfall forecasts for extended-range flood prediction in Australia

    NASA Astrophysics Data System (ADS)

    White, C. J.; Franks, S. W.; McEvoy, D.

    2015-06-01

    Meteorological and hydrological centres around the world are looking at ways to improve their capacity to produce and deliver skilful and reliable forecasts of high-impact extreme rainfall and flooding events on a range of prediction timescales (e.g. sub-daily, daily, multi-week, seasonal). Making improvements to extended-range rainfall and flood forecast models, assessing forecast skill and uncertainty, and exploring how to apply flood forecasts and communicate their benefits to decision-makers are significant challenges facing the forecasting and water resources management communities. This paper presents some of the latest science and initiatives from Australia on the development, application and communication of extreme rainfall and flood forecasts on the extended-range "subseasonal-to-seasonal" (S2S) forecasting timescale, with a focus on risk-based decision-making, increasing flood risk awareness and preparedness, capturing uncertainty, understanding human responses to flood forecasts and warnings, and the growing adoption of "climate services". The paper also demonstrates how forecasts of flood events across a range of prediction timescales could benefit a range of sectors and society, most notably for disaster risk reduction (DRR) activities, emergency management and response, and strengthening community resilience. Extended-range S2S extreme flood forecasts, if presented as easily accessible, timely and relevant information, are a valuable resource to help society better prepare for, and subsequently cope with, extreme flood events.

  10. Evaluation and comparison of different RCMs simulations of the Mediterranean climate: a view on the impact of model resolution and Mediterranean sea coupling.

    NASA Astrophysics Data System (ADS)

    Panthou, Gérémy; Vrac, Mathieu; Drobinski, Philippe; Bastin, Sophie; Somot, Samuel; Li, Laurent

    2015-04-01

    As regularly stated by numerous authors, the Mediterranean climate is considered a major climate 'hot spot'. At least three reasons explain this statement. First, the region is regularly affected by extreme hydro-meteorological events (heavy precipitation and flash floods during autumn; droughts and heat waves during spring and summer). Second, the vulnerability of populations to these extreme events is expected to increase during the 21st century (at least due to the projected population growth in the region). Finally, Global Circulation Models project that this regional climate will be highly sensitive to climate change. Moreover, global warming is expected to intensify the hydrological cycle and thus increase the frequency of extreme hydro-meteorological events. In order to propose adaptation strategies, robust estimation of the future evolution of the Mediterranean climate and of the associated extreme hydro-meteorological events (in terms of intensity and frequency) is highly relevant. However, these projections are characterized by large uncertainties. Many components of the simulation chain contribute to these uncertainties: (i) uncertainties concerning the emission scenario; (ii) climate model simulations suffer from parametrization errors and uncertainties concerning the initial state of the climate; and (iii) additional uncertainties are introduced by the (dynamical or statistical) downscaling techniques and the impact model. Narrowing these uncertainties as far as possible is a major challenge of current climate research, and one way to do so is to reduce the uncertainties associated with each component. In this study, we evaluate the potential improvement offered by (i) coupled RCM simulations (with the Mediterranean Sea) in comparison with atmosphere-only (stand-alone) RCM simulations and (ii) RCM simulations at finer resolution in comparison with coarser resolution.
To this end, three different RCMs (WRF, ALADIN, LMDZ4) were run, forced by ERA-Interim reanalyses, within the MED-CORDEX experiment. For each RCM, different versions (coupled/stand-alone, high/low resolution) were realized. A large set of scores was developed and applied to evaluate the performance of these RCM simulations. The scores were applied to three variables (daily precipitation amount, mean daily air temperature and dry spell length). Particular attention was given to the RCMs' capability to reproduce the seasonal and spatial patterns of extreme statistics. Results show that the differences between coupled and stand-alone RCMs are localized very near the Mediterranean Sea and that model resolution has only a slight impact on the scores obtained. Overall, the main differences between the simulations come from the RCM used. Keywords: Mediterranean climate, extreme hydro-meteorological events, RCM simulations, evaluation of climate simulations

  11. Photon event distribution sampling: an image formation technique for scanning microscopes that permits tracking of sub-diffraction particles with high spatial and temporal resolutions.

    PubMed

    Larkin, J D; Publicover, N G; Sutko, J L

    2011-01-01

    In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum likelihood position of origin of each detected photon is acquired as a data set rather than binning photons in pixels. Subsequently, an intensity-related probability density function describing the uncertainty associated with the photon position measurement is applied to each position and individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit increased signal-to-noise and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts and permits use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles derived from a multi-variate normal distribution is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multi-variate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with acquisition of ~200 photons per frame. Single nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes with nanometre precision deep within specimens. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.
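
    A minimal sketch of the image-formation step, with an isotropic Gaussian kernel standing in for the paper's intensity-related probability density function; kernel shape and normalization here are simplifying assumptions.

```python
import numpy as np

def ped_image(xs, ys, sigma, shape):
    """Form an image by summing a Gaussian position-uncertainty kernel at
    each photon's maximum-likelihood position, instead of binning photons
    into pixels. sigma is the kernel width in pixels."""
    H, W = shape
    yy, xx = np.mgrid[0:H, 0:W]
    img = np.zeros(shape)
    for x, y in zip(xs, ys):
        img += np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2.0 * sigma ** 2))
    return img / (2.0 * np.pi * sigma ** 2)  # each photon integrates to ~1
```

    Because each photon contributes a smooth, normalized distribution rather than a single count in one pixel, structure remains visible at photon counts where a binned image would be dominated by shot noise.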

  12. Full moment tensors with uncertainties for the 2017 North Korea declared nuclear test and for a collocated, subsequent event

    NASA Astrophysics Data System (ADS)

    Alvizuri, C. R.; Tape, C.

    2017-12-01

    A seismic moment tensor is a 3×3 symmetric matrix that characterizes the far-field seismic radiation from a source, whether an earthquake, a volcanic event, or an explosion. We estimate full moment tensors and their uncertainties for the North Korea declared nuclear test and for a collocated event that occurred eight minutes later. The nuclear test and the subsequent event occurred on September 3, 2017 at around 03:30 and 03:38 UTC, respectively. We perform a grid search over the six-dimensional space of moment tensors, generating synthetic waveforms at each moment tensor grid point and then evaluating a misfit function between the observed and synthetic waveforms. The synthetic waveforms are computed using a 1-D structure model for the region; this approximation requires careful assessment of time shifts between data and synthetics, as well as careful choice of the bandpass for filtering. For each moment tensor we characterize its uncertainty in terms of waveform misfit, a probability function, and a confidence curve for the probability that the true moment tensor lies within the neighborhood of the optimal moment tensor. For each event we estimate the moment tensor using observed waveforms from all available seismic stations within a 2000-km radius. We use as much of the waveform as possible, including surface waves for all stations and body waves above 1 Hz for some of the closest stations. Our preliminary magnitude estimates are Mw 5.1-5.3 for the first event and Mw 4.7 for the second event. Our results show a dominantly positive isotropic moment tensor for the first event and a dominantly negative isotropic moment tensor for the subsequent event. As expected, the details of the probability density, waveform fit, and confidence curves are influenced by the structural model, the choice of filter frequencies, and the selection of stations.
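
    The grid-search logic is generic: evaluate a waveform misfit at every candidate source and map misfits to a probability function over the grid. This is a schematic one-parameter sketch; the actual study searches a six-dimensional moment-tensor space with time-shift handling, and the Gaussian misfit-to-probability mapping is an illustrative assumption.

```python
import numpy as np

def grid_search(observed, candidates, synth, sigma=1.0):
    """L2 waveform misfit for each candidate source, plus normalized
    probabilities p ~ exp(-misfit^2 / (2 sigma^2)) over the grid.
    synth(c) returns the synthetic waveform for candidate parameters c."""
    misfits = np.array([np.linalg.norm(observed - synth(c)) for c in candidates])
    w = np.exp(-0.5 * (misfits / sigma) ** 2)
    return misfits, w / w.sum()
```

    Confidence statements then follow by summing probability over the neighborhood of the best-fitting grid point.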

  13. Influence of uncertainty on framed decision-making with moral dilemma

    PubMed Central

    Mermillod, Martial; Le Pennec, Jean-Luc; Dutheil, Frédéric; Mondillon, Laurie

    2018-01-01

    In cases of impending natural disasters, most events are uncertain and emotionally relevant, both critical factors for decision-making. Moreover, for exposed individuals, the sensitivity to the framing of the consequences (gain or loss) and the moral judgments they have to perform (e.g., evacuate or help an injured person) constitute two central effects that have never been examined in the same context of decision-making. In a framed decision-making task with moral dilemma, we investigated whether uncertainty (i.e., unpredictability of events) and a threatening context would influence the framing effect (actions framed in loss are avoided in comparison to the ones framed in gain) and the personal intention effect (unintentional actions are more morally acceptable in comparison to intentional actions) on the perceived moral acceptability of taking action. Considering the impact of uncertainty and fear on the processes underlying these effects, we assumed that these emotions would lead to the negation of the two effects. Our results indicate that the exposure to uncertain events leads to the negation of the framing effect, but does not influence the moral acceptability and the effect of personal intention. We discuss our results in the light of dual-process models (i.e., systematic vs. heuristic), appraisal theories, and neurocognitive aspects. These elements highlight the importance of providing solutions to cope with uncertainty, both for scientists and local populations exposed to natural hazards. PMID:29847589

  14. Influence of uncertainty on framed decision-making with moral dilemma.

    PubMed

    Merlhiot, Gaëtan; Mermillod, Martial; Le Pennec, Jean-Luc; Dutheil, Frédéric; Mondillon, Laurie

    2018-01-01

    In cases of impending natural disasters, most events are uncertain and emotionally relevant, both critical factors for decision-making. Moreover, for exposed individuals, the sensitivity to the framing of the consequences (gain or loss) and the moral judgments they have to perform (e.g., evacuate or help an injured person) constitute two central effects that have never been examined in the same context of decision-making. In a framed decision-making task with moral dilemma, we investigated whether uncertainty (i.e., unpredictability of events) and a threatening context would influence the framing effect (actions framed in loss are avoided in comparison to the ones framed in gain) and the personal intention effect (unintentional actions are more morally acceptable in comparison to intentional actions) on the perceived moral acceptability of taking action. Considering the impact of uncertainty and fear on the processes underlying these effects, we assumed that these emotions would lead to the negation of the two effects. Our results indicate that the exposure to uncertain events leads to the negation of the framing effect, but does not influence the moral acceptability and the effect of personal intention. We discuss our results in the light of dual-process models (i.e., systematic vs. heuristic), appraisal theories, and neurocognitive aspects. These elements highlight the importance of providing solutions to cope with uncertainty, both for scientists and local populations exposed to natural hazards.

  15. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    NASA Astrophysics Data System (ADS)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

    The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project, "Towards a Good Ecological Status in the river Zenne (GESZ)", was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision-making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from uncertainties in the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. To assess rainfall measurement uncertainty, we first identified independent rainfall periods, based on the daily precipitation and stream flow observations, using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter to each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, given the high number of parameters of the SWAT model, we first screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique.
Subsequently, we only considered the most sensitive parameters for parameter optimization and UA. To explicitly account for stream flow uncertainty, we assumed that the stream flow measurement error increases linearly with the stream flow value. To assess the uncertainty and infer posterior distributions of the parameters, we used a Markov Chain Monte Carlo (MCMC) sampler, the Differential Evolution Adaptive Metropolis (DREAM), which samples from an archive of past states to generate candidate points in each individual chain. It is shown that the marginal posterior distributions of the rainfall multipliers vary widely between individual events, as a consequence of rainfall measurement errors and the spatial variability of the rain. Only a few of the rainfall events are well defined. The marginal posterior distributions of the SWAT model parameter values are well defined and identified by DREAM within their prior ranges. The posterior distributions of the output uncertainty parameter values also show that the stream flow data are highly uncertain. The approach of using rainfall multipliers to treat rainfall uncertainty for a complex model has an impact on the marginal posterior distributions of the model parameters and on the model results.
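
    The flow-proportional error assumption above translates directly into a heteroscedastic Gaussian log-likelihood; this minimal sketch uses an illustrative error coefficient a, not the study's calibrated value.

```python
import math

def hetero_loglik(q_obs, q_sim, a=0.1):
    """Gaussian log-likelihood in which the stream-flow measurement error
    standard deviation grows linearly with flow: sigma_t = a * q_obs_t."""
    ll = 0.0
    for qo, qs in zip(q_obs, q_sim):
        s = a * qo  # flow-proportional measurement error
        ll -= 0.5 * math.log(2.0 * math.pi * s * s) + (qo - qs) ** 2 / (2.0 * s * s)
    return ll
```

    Inside a sampler such as DREAM, this log-likelihood is evaluated at each proposed parameter set (including the latent rainfall multipliers), so high flows, which are measured less precisely, are automatically down-weighted.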

  16. Constraining shallow seismic event depth via synthetic modeling for Expert Technical Analysis at the IDC

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.; Bobrov, D.; Friberg, P. A.

    2015-12-01

    Depth of event is an important criterion of seismic event screening at the International Data Center, CTBTO. However, a thorough determination of the event depth can mostly be conducted only through special analysis, because the IDC's Event Definition Criteria are based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have their depth constrained to the surface. When the true origin depth is greater than that reasonable for a nuclear test (3 km, based on existing observations), this may result in a heavier workload to manually distinguish between shallow and deep events. Also, the IDC depth criterion is not applicable to events with a small t(pP-P) travel time difference, as is the case for nuclear tests. Since the shape of the first few seconds of signal from very shallow events is very sensitive to the presence of the depth phase, cross-correlation between observed and theoretical seismograms can provide an estimate of the event depth, and so extend the screening process. We exercised this approach mostly with events at teleseismic and partially at regional distances. We found that such an approach can be very efficient for the seismic event screening process, with certain caveats related mostly to poorly defined crustal models at source and receiver, which can shift the depth estimate. We used an adjustable t* teleseismic attenuation model for the synthetics, since this characteristic is not determined for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNE) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary phase approximation with Robert Herrmann's hudson96 program, and the regional modelling was done with the generalized ray technique by Vlastislav Cerveny, modified for complex source topography.
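    The depth sensitivity described above can be illustrated with a minimal 1-D sketch: a direct P pulse plus a surface-reflected pP phase delayed by roughly 2h/v, matched against synthetics over a grid of trial depths. The waveform shapes, velocity, and noise level are illustrative assumptions, not the actual hudson96 or generalized-ray modelling:

```python
import numpy as np

dt, v = 0.05, 6.0                       # sample interval (s), P velocity (km/s)
t = np.arange(0, 10, dt)

def pulse(t0):
    # simple Gaussian source pulse arriving at time t0
    return np.exp(-((t - t0) / 0.2) ** 2)

def synthetic(depth_km):
    delay = 2.0 * depth_km / v          # crude t(pP-P) delay for vertical takeoff
    return pulse(1.0) - 0.9 * pulse(1.0 + delay)   # pP with flipped polarity

# "observed" record: a 2 km deep event plus noise
obs = synthetic(2.0) + np.random.default_rng(1).normal(0, 0.02, t.size)

# grid search: correlate the observed record with synthetics for trial depths
depths = np.arange(0.5, 6.0, 0.25)
cc = [np.corrcoef(obs, synthetic(h))[0, 1] for h in depths]
best = depths[int(np.argmax(cc))]
print("best-fitting depth:", best, "km")
```

    In this toy case the correlation peaks at the true depth; with real data, the caveat noted in the abstract applies — errors in the assumed crustal model shift the pP delay and hence the recovered depth.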

  17. Event reweighting with the NuWro neutrino interaction generator

    NASA Astrophysics Data System (ADS)

    Pickering, Luke; Stowell, Patrick; Sobczyk, Jan

    2017-09-01

    Event reweighting has been implemented in the NuWro neutrino event generator for a number of free theory parameters in the interaction model. Event reweighting is a key analysis technique, used to efficiently study the effect of neutrino interaction model uncertainties. This opens up the possibility for NuWro to be used as a primary event generator by experimental analysis groups. A preliminary model tuning to ANL and BNL data of quasi-elastic and single pion production events was performed to validate the reweighting engine.
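    The core of event reweighting can be shown with a toy model (an exponential "cross-section shape" standing in for a real interaction model; this is not the NuWro interface): events generated once at a nominal parameter value are reused to predict a shifted model by weighting each event with the ratio of differential probabilities.

```python
import numpy as np

rng = np.random.default_rng(2)

def density(x, m):
    # toy differential probability: exponential with slope parameter m
    return np.exp(-x / m) / m

m_nominal, m_shifted = 1.0, 1.3
events = rng.exponential(m_nominal, 100_000)   # generated once, at nominal

# per-event weight = shifted probability / nominal probability
weights = density(events, m_shifted) / density(events, m_nominal)

# the weighted nominal sample reproduces the mean of the shifted model (~1.3)
print("weighted mean:", np.average(events, weights=weights))
```

    This is why reweighting is efficient for uncertainty studies: the expensive generation step runs once, and each parameter variation only costs a recomputation of weights.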

  18. Integrating uncertainty propagation in GNSS radio occultation retrieval: from excess phase to atmospheric bending angle profiles

    NASA Astrophysics Data System (ADS)

    Schwarz, Jakob; Kirchengast, Gottfried; Schwaerz, Marc

    2018-05-01

    Global Navigation Satellite System (GNSS) radio occultation (RO) observations are highly accurate, long-term stable data sets and are globally available as a continuous record from 2001. Essential climate variables for the thermodynamic state of the free atmosphere - such as pressure, temperature, and tropospheric water vapor profiles (involving background information) - can be derived from these records, which therefore have the potential to serve as climate benchmark data. However, to exploit this potential, atmospheric profile retrievals need to be very accurate and the remaining uncertainties quantified and traced throughout the retrieval chain from raw observations to essential climate variables. The new Reference Occultation Processing System (rOPS) at the Wegener Center aims to deliver such an accurate RO retrieval chain with integrated uncertainty propagation. Here we introduce and demonstrate the algorithms implemented in the rOPS for uncertainty propagation from excess phase to atmospheric bending angle profiles, for estimated systematic and random uncertainties, including vertical error correlations and resolution estimates. We estimated systematic uncertainty profiles with the same operators as used for the basic state profiles retrieval. The random uncertainty is traced through covariance propagation and validated using Monte Carlo ensemble methods. The algorithm performance is demonstrated using test day ensembles of simulated data as well as real RO event data from the satellite missions CHAllenging Minisatellite Payload (CHAMP); Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC); and Meteorological Operational Satellite A (MetOp). The results of the Monte Carlo validation show that our covariance propagation delivers correct uncertainty quantification from excess phase to bending angle profiles. The results from the real RO event ensembles demonstrate that the new uncertainty estimation chain performs robustly. 
Together with the other parts of the rOPS processing chain this part is thus ready to provide integrated uncertainty propagation through the whole RO retrieval chain for the benefit of climate monitoring and other applications.
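    The covariance-propagation-versus-Monte-Carlo validation described above can be sketched for a generic linearized retrieval step y = Ax. The operator and input covariance below are illustrative stand-ins (a differencing operator and an exponential error correlation), not the actual excess-phase-to-bending-angle operators of the rOPS:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
A = np.eye(n, k=1) - np.eye(n)                 # toy linear retrieval operator

# input covariance: 10% errors with a 5-level vertical correlation length
i = np.arange(n)
C_in = 0.1**2 * np.exp(-np.abs(i[:, None] - i[None, :]) / 5.0)

C_prop = A @ C_in @ A.T                        # analytic covariance propagation

# Monte Carlo validation: propagate an ensemble of correlated perturbations
L = np.linalg.cholesky(C_in + 1e-12 * np.eye(n))
ens = (A @ (L @ rng.standard_normal((n, 20000)))).T
C_mc = np.cov(ens, rowvar=False)

# compare propagated vs ensemble standard deviations, level by level
err = np.abs(np.sqrt(np.diag(C_mc)) - np.sqrt(np.diag(C_prop))).max()
print("max std-dev discrepancy:", err)
```

    For a linear operator the two estimates agree to within ensemble sampling noise, which is the kind of consistency check the Monte Carlo validation in the rOPS provides.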

  19. Uncertainty in measuring runoff from small watersheds using instrumented outlet-pond

    USDA-ARS?s Scientific Manuscript database

    This study quantified the uncertainty associated with event runoff quantity monitored at watershed outlet ponds. Inflow and outflow depth data were collected from 2004 to 2011 at seven instrumented monitoring stations at the outlet of watersheds ranging in size from 35.2 to 159.5 ha on the USDA-ARS ...

  20. Impact of intermediate and high energy nuclear data on the neutronic safety parameters of MYRRHA accelerator driven system

    NASA Astrophysics Data System (ADS)

    Stankovskiy, Alexey; Çelik, Yurdunaz; Eynde, Gert Van den

    2017-09-01

    Perturbation of the external neutron source can cause significant local power changes that translate into undesired safety-related events in an accelerator driven system. Therefore, for the accurate design of the MYRRHA sub-critical core it is important to evaluate the uncertainty of power responses caused by the uncertainties in the nuclear reaction models describing particle transport from the primary proton energy down to the evaluated nuclear data table range. Calculations with a set of models resulted in quite low uncertainty on the local power caused by significant perturbation of the primary neutron yield from proton interactions with lead and bismuth isotopes. The considered accidental event of a prescribed proton beam shape loss causes a drastic increase in local power but practically does not change the total core thermal power, making this effect difficult to detect. At the same time, the results demonstrate a correlation between perturbed local power responses in normal operation and misaligned beam conditions, indicating that generation of covariance data for proton- and neutron-induced neutron multiplicities for lead and bismuth isotopes is needed to obtain reliable uncertainties for local power responses.

  1. On the uncertainty of phenological responses to climate change and its implication for terrestrial biosphere models

    NASA Astrophysics Data System (ADS)

    Migliavacca, M.; Sonnentag, O.; Keenan, T. F.; Cescatti, A.; O'Keefe, J.; Richardson, A. D.

    2012-01-01

    Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Land surface models, however, are known to have systematic errors in the simulation of spring phenology, which could propagate into uncertainty in modeled responses to future climate change. Here, we analyzed the Harvard Forest phenology record to investigate and characterize the sources of uncertainty in phenological forecasts and the subsequent impacts on model forecasts of carbon and water cycling in the future. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 phenological models of different complexity to predict leaf bud-burst. The evaluation of the different phenological models indicated support for spring warming models with photoperiod limitations and, to a lesser extent, for chilling models based on the alternating model structure. We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. high CO2 emissions vs. low CO2 emissions). Parameter uncertainty was the smallest (average 95% CI: 2.4 day century-1 for scenario B1 and 4.5 day century-1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 day century-1 in the simulated trends). The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied somewhat among models (±7.7 day century-1 for A1fi, ±3.6 day century-1 for B1). The forecast sensitivity of bud-burst to temperature (i.e.
days bud-burst advanced per degree of warming) varied between 2.2 day °C-1 and 5.2 day °C-1 depending on model structure. We quantified the impact of uncertainties in bud-burst forecasts on simulated carbon and water fluxes using a process-based terrestrial biosphere model. Uncertainty in phenology model structure led to uncertainty in the description of the seasonality of processes, which accumulated to uncertainty in annual model estimates of gross primary productivity (GPP) and evapotranspiration (ET) of 9.6% and 2.9% respectively. A sensitivity analysis shows that a variation of ±10 days in bud-burst dates led to a variation of ±5.0% for annual GPP and about ±2.0% for ET. For phenology models, differences among future climate scenarios represent the largest source of uncertainty, followed by uncertainties related to model structure, and finally, uncertainties related to model parameterization. The uncertainties we have quantified will affect the description of the seasonality of processes and in particular the simulation of carbon uptake by forest ecosystems, with a larger impact of uncertainties related to phenology model structure, followed by uncertainties related to phenological model parameterization.
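    A spring-warming (thermal time) model of the kind compared above accumulates growing degree-days above a base temperature until a critical sum is reached. The temperature trace and parameter values below are illustrative, not fitted Harvard Forest values, but the sketch reproduces the kind of temperature sensitivity quoted (a few days of bud-burst advance per degree of warming):

```python
import numpy as np

def budburst_day(temps, t_base=5.0, f_crit=150.0):
    # accumulate degree-days above t_base until the critical sum is reached
    gdd = 0.0
    for day, t in enumerate(temps, start=1):
        gdd += max(t - t_base, 0.0)
        if gdd >= f_crit:
            return day
    return None                          # threshold never reached

# synthetic late-winter-to-spring temperature trend (days 1..180)
temps = -5.0 + 0.2 * np.arange(1, 181)

d0 = budburst_day(temps)
d1 = budburst_day(temps + 1.0)           # uniform +1 degC warming
print("bud-burst day:", d0, "-> with +1 degC warming:", d1)
```

    With this temperature trend (0.2 degC per day), a uniform +1 degC warming advances bud-burst by 5 days, in the same range as the 2.2-5.2 day per degC sensitivities reported for the different model structures.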

  2. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans

    2015-04-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify the uncertainty with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5. This uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investment.

  3. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; De Moel, H.

    2015-01-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo analysis. As input, the Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5. The resulting uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investment.

  4. The flood event of 10-12 November 2013 on the Tiber River basin (central Italy): real-time flood forecasting with uncertainty supporting risk management and decision-making

    NASA Astrophysics Data System (ADS)

    Berni, Nicola; Brocca, Luca; Barbetta, Silvia; Pandolfo, Claudia; Stelluti, Marco; Moramarco, Tommaso

    2014-05-01

    The Italian national hydro-meteorological early warning system is composed of 21 regional offices (Functional Centres, CF). The Umbria Region (central Italy) CF provides early warning for floods and landslides, real-time monitoring, and decision support systems (DSS) for the Civil Defence Authorities when significant events occur. The alert system is based on hydrometric and rainfall thresholds, with detailed procedures for the management of critical events in which the different roles of the authorities and institutions involved are defined. The real-time flood forecasting system is also based on different hydrological and hydraulic forecasting models. Among these, the MISDc rainfall-runoff model ("Modello Idrologico SemiDistribuito in continuo"; Brocca et al., 2011) and the flood routing model STAFOM-RCM (STAge Forecasting Model-Rating Curve Model; Barbetta et al., 2014) are continuously operative in real time, providing discharge and stage forecasts, respectively, with lead times of up to 24 hours (when quantitative precipitation forecasts are used) at several gauged river sections in the Upper-Middle Tiber River basin. Model results are published in real time on the open-source CF web platform: www.cfumbria.it. MISDc provides discharge and soil moisture forecasts for different sub-basins, while STAFOM-RCM provides stage forecasts at hydrometric sections. Moreover, through STAFOM-RCM the uncertainty of the forecast stage hydrograph is provided in terms of a 95% Confidence Interval (CI), assessed by analyzing the statistical properties of the model output in terms of lateral inflow. In the period 10th-12th November 2013, a severe flood event occurred in Umbria, mainly affecting the north-eastern area and causing significant economic damage, but fortunately no casualties. The territory was affected by intense and persistent rainfall; the hydro-meteorological monitoring network locally recorded rainfall depths of over 400 mm in 72 hours. In the most affected area, the recorded rainfall depths correspond approximately to a return period of 200 years. Most rivers in Umbria were involved, exceeding hydrometric thresholds and causing flooding (e.g. the Chiascio river). The flood event was continuously monitored at the Umbria Region CF and its possible evolution predicted and assessed on the basis of the model forecasts. The predictions provided by MISDc and STAFOM-RCM were found useful in supporting real-time decision-making for flood risk management. Moreover, the quantification of the uncertainty affecting the deterministic forecast stages was found consistent with the selected level of confidence and had practical utility, corroborating the need to couple deterministic forecasts with their uncertainty when model output is used to support decisions about flood management. REFERENCES Barbetta, S., Moramarco, T., Brocca, L., Franchini, M., Melone, F. (2014). Confidence interval of real-time forecast stages provided by the STAFOM-RCM model: the case study of the Tiber River (Italy). Hydrological Processes, 28(3), 729-743. Brocca, L., Melone, F., Moramarco, T. (2011). Distributed rainfall-runoff modelling for flood frequency estimation and flood forecasting. Hydrological Processes, 25(18), 2801-2813.

  5. Robust approaches to quantification of margin and uncertainty for sparse data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin

    Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
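    The tail-extrapolation risk can be made concrete with a small sketch (illustrative distributions and sample size, not the report's actual analysis): fit a normal distribution to a sparse sample that actually comes from a lognormal, then extrapolate the 99.9th percentile. The parametric choice, not the data, dominates the answer.

```python
import numpy as np

rng = np.random.default_rng(5)
sample = rng.lognormal(mean=0.0, sigma=0.8, size=30)   # sparse data

# fit a normal by moment matching, as a stand-in parametric assumption
mu, sd = sample.mean(), sample.std(ddof=1)
z999 = 3.090                                   # standard normal 99.9% quantile
q_normal = mu + z999 * sd                      # extrapolated 99.9% quantile

# the true underlying 99.9% quantile of the lognormal
q_true = np.exp(0.0 + z999 * 0.8)

print("extrapolated 99.9%% quantile (normal fit): %.2f" % q_normal)
print("true 99.9%% quantile (lognormal):          %.2f" % q_true)
```

    The normal fit severely underestimates the true tail quantile, illustrating why the report argues that tail extrapolation from unvalidated parametric models carries risk that statistical analysis alone cannot resolve.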

  6. Prediction Accuracy of Error Rates for MPTB Space Experiment

    NASA Technical Reports Server (NTRS)

    Buchner, S. P.; Campbell, A. B.; Davis, D.; McMorrow, D.; Petersen, E. L.; Stassinopoulos, E. G.; Ritter, J. C.

    1998-01-01

    This paper addresses the accuracy of radiation-induced upset-rate predictions in space using the results of ground-based measurements together with standard environmental and device models. The study is focused on two part types - 16 Mb NEC DRAMs (UPD4216) and 1 Kb SRAMs (AMD93L422) - both of which are currently in space on board the Microelectronics and Photonics Test Bed (MPTB). To date, ground-based measurements of proton-induced single event upset (SEU) cross sections as a function of energy have been obtained and combined with models of the proton environment to predict proton-induced error rates in space. The role played by uncertainties in the environmental models will be determined by comparing the modeled radiation environment with the actual environment measured aboard MPTB. Heavy-ion induced upsets have also been obtained from MPTB and will be compared with the "predicted" error rate following ground testing that will be done in the near future. These results should help identify sources of uncertainty in predictions of SEU rates in space.
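    The standard rate-prediction fold used in such studies integrates the measured cross section over the differential particle flux, R = ∫ σ(E) (dφ/dE) dE. All numbers below are illustrative placeholders (a Weibull-shaped cross section and a crude exponential flux), not the MPTB device or environment data:

```python
import numpy as np

E = np.linspace(10.0, 500.0, 1000)             # proton energy grid (MeV)
dE = E[1] - E[0]

# Weibull-shaped SEU cross section per bit (cm^2) with a ~20 MeV threshold
sigma_sat, E0, W, s = 1e-14, 20.0, 30.0, 1.5
x = np.clip(E - E0, 0.0, None)
sigma = sigma_sat * (1.0 - np.exp(-((x / W) ** s)))

# crude exponential stand-in for the differential proton flux (cm^-2 day^-1 MeV^-1)
dphi_dE = 1e5 * np.exp(-E / 100.0)

rate_per_bit = np.sum(sigma * dphi_dE) * dE    # trapezoid-like sum: upsets/bit/day
print("predicted rate: %.2e upsets/bit/day" % rate_per_bit)
print("for a 16 Mb device: %.2f upsets/day" % (rate_per_bit * 16 * 2**20))
```

    Uncertainty in either factor propagates directly into the predicted rate, which is why comparing the modeled environment with the environment actually measured aboard MPTB isolates the environmental contribution.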

  7. Uncertainty Of Stream Nutrient Transport Estimates Using Random Sampling Of Storm Events From High Resolution Water Quality And Discharge Data

    NASA Astrophysics Data System (ADS)

    Scholefield, P. A.; Arnscheidt, J.; Jordan, P.; Beven, K.; Heathwaite, L.

    2007-12-01

    The uncertainties associated with stream nutrient transport estimates are frequently overlooked and the sampling strategy is rarely, if ever, investigated. Indeed, the impact of sampling strategy and estimation method on the bias and precision of stream phosphorus (P) transport calculations is little understood, despite the use of such values in the calibration and testing of models of phosphorus transport. The objectives of this research were to investigate the variability and uncertainty in the estimates of total phosphorus transfers at an intensively monitored agricultural catchment. The Oona Water, which is located in the Irish border region, is part of a long-term monitoring program focusing on water quality. The Oona Water is a rural river catchment with grassland agriculture and scattered dwelling houses, and has been monitored for total phosphorus (TP) at 10 min resolution for several years (Jordan et al., 2007). Concurrent sensitive measurements of discharge are also collected. The water quality and discharge data were provided at 1 hour (averaged) resolution, which meant that a robust estimate of the annual flow-weighted concentration could be obtained by simple interpolation between points. A two-strata approach (Kronvang and Bruhn, 1996) was used to estimate flow-weighted concentrations using storm events randomly sampled from the 400 identified within the time series, together with base flow concentrations. Using a random stratified sampling approach for the selection of events, series ranging from 10 up to the full 400 events were used, each time generating a flow-weighted mean using a load-discharge relationship identified through log-log regression and Monte Carlo simulation. These values were then compared to the observed total phosphorus concentration for the catchment. Analysis of these results shows the impact of sampling strategy, the inherent bias in any estimate of phosphorus concentrations, and the uncertainty associated with such estimates. The estimates generated using the full time series underestimate the flow-weighted mean concentration of total phosphorus. This work complements other contemporary work on load estimation uncertainty in the UK (Johnes, 2007). References: Johnes, P. J., 2007. Uncertainties in annual riverine phosphorus load estimation: impact of load estimation methodology, sampling frequency, baseflow index and catchment population density. Journal of Hydrology 332(1-2), 241-258. Jordan, P., Arnscheidt, J., McGrogan, H. & McCormick, S., 2007. Characterising phosphorus transfers in rural catchments using a continuous bank-side analyser. Hydrology and Earth System Sciences 11, 372-381. Kronvang, B. & Bruhn, A. J., 1996. Choice of sampling strategy and estimation method for calculating nitrogen and phosphorus transport in small lowland streams. Hydrological Processes 10(11), 1483-1501.
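    The event-subsampling experiment can be sketched with synthetic data (the flow-concentration relationship and error scales below are assumptions, not the Oona Water record): estimate the flow-weighted mean concentration from random subsets of storm events and watch the spread of the estimate shrink as more events are included.

```python
import numpy as np

rng = np.random.default_rng(6)
n_events = 400
flow = rng.lognormal(1.0, 0.7, n_events)       # synthetic event flow volumes
conc = 0.05 + 0.02 * flow + rng.normal(0, 0.02, n_events)  # TP conc. (mg/l)

# full-record flow-weighted mean concentration as the reference value
fwmc_true = np.sum(conc * flow) / np.sum(flow)

for n in (10, 50, 200):
    est = []
    for _ in range(1000):
        idx = rng.choice(n_events, n, replace=False)
        est.append(np.sum(conc[idx] * flow[idx]) / np.sum(flow[idx]))
    est = np.array(est)
    print("n=%3d  mean=%.4f  95%% range=(%.4f, %.4f)"
          % (n, est.mean(), np.percentile(est, 2.5), np.percentile(est, 97.5)))
print("full-record FWMC: %.4f" % fwmc_true)
```

    Because flow and concentration are positively related here, small subsets that miss the largest events are biased low, mirroring the sampling-strategy effects discussed in the abstract.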

  8. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing the responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information, derived as index values from observed data, is known as hydrological signatures; these can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), flow variability, the flow duration curve, and the runoff ratio. Because hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitudes and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data-rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve.
The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around ±10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from ±10% to ±15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, and meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.
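    The Monte Carlo approach can be sketched for one signature, the runoff ratio. The synthetic daily series and error models below (a multiplicative gauge error on rainfall and a per-day rating-curve error on flow) are illustrative assumptions, not the Mahurangi or Brue error models:

```python
import numpy as np

rng = np.random.default_rng(7)
rain = rng.gamma(0.3, 10.0, 365)               # synthetic daily rainfall (mm)
flow = 0.4 * rain * (1 + rng.normal(0, 0.1, 365))  # synthetic daily runoff (mm)

def runoff_ratio(p, q):
    return q.sum() / p.sum()

base = runoff_ratio(rain, flow)

# Monte Carlo: perturb the data within the assumed error models, recompute
samples = []
for _ in range(5000):
    p = rain * rng.normal(1.0, 0.08)           # systematic gauge/areal error
    q = flow * rng.normal(1.0, 0.05, 365)      # independent per-day rating error
    samples.append(runoff_ratio(p, q))
samples = np.array(samples)

lo, hi = np.percentile(samples, [2.5, 97.5])
print("runoff ratio: %.3f, 95%% interval: (%.3f, %.3f)" % (base, lo, hi))
print("relative uncertainty: +/- %.0f%%" % (100 * (hi - lo) / (2 * base)))
```

    With these assumed error magnitudes the relative uncertainty lands in the ±10-40% range reported above; note that the systematic rainfall error dominates, since the independent per-day flow errors largely average out over the year.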

  9. Ensemble superparameterization versus stochastic parameterization: A comparison of model uncertainty representation in tropical weather prediction

    NASA Astrophysics Data System (ADS)

    Subramanian, Aneesh C.; Palmer, Tim N.

    2017-06-01

    Stochastic schemes to represent model uncertainty in the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system have helped improve its probabilistic forecast skill over the past decade, both by improving its reliability and by reducing the ensemble mean error. The largest uncertainties in the model arise from the model physics parameterizations. In the tropics, the parameterization of moist convection presents a major challenge for the accurate prediction of weather and climate. Superparameterization is a promising alternative strategy for including the effects of moist convection through explicit turbulent fluxes calculated from a cloud-resolving model (CRM) embedded within a global climate model (GCM). In this paper, we compare the impact of initial random perturbations in embedded CRMs, within the ECMWF ensemble prediction system, with the stochastically perturbed physical tendency (SPPT) scheme as a way to represent model uncertainty in medium-range tropical weather forecasts. We focus especially on forecasts of tropical convection and dynamics during MJO events in October-November 2011. These are well-studied events for MJO dynamics, as they were also heavily observed during the DYNAMO field campaign. We show that a multiscale ensemble modeling approach helps improve forecasts of certain aspects of tropical convection during the MJO events, while it also tends to deteriorate certain large-scale dynamic fields with respect to the stochastically perturbed physical tendency approach that is used operationally at ECMWF. Plain Language Summary: Probabilistic weather forecasting, especially for tropical weather, is still a significant challenge for global weather forecasting systems. Expressing uncertainty along with weather forecasts is important for informed decision making.
Hence, we explore the use of a relatively new approach in using super-parameterization, where a cloud resolving model is embedded within a global model, in probabilistic tropical weather forecasts at medium range. We show that this approach helps improve modeling uncertainty in forecasts of certain features such as precipitation magnitude and location better, but forecasts of tropical winds are not necessarily improved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22699331','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22699331"><span>Decision strategies for handling the uncertainty of future extreme rainfall under the influence of climate change.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Gregersen, I B; Arnbjerg-Nielsen, K</p> <p>2012-01-01</p> <p>Several extraordinary rainfall events have occurred in Denmark within the last few years. For each event, problems in urban areas occurred as the capacity of the existing drainage systems were exceeded. Adaptation to climate change is necessary but also very challenging as urban drainage systems are characterized by long technical lifetimes and high, unrecoverable construction costs. One of the most important barriers for the initiation and implementation of the adaptation strategies is therefore the uncertainty when predicting the magnitude of the extreme rainfall in the future. This challenge is explored through the application and discussion of three different theoretical decision support strategies: the precautionary principle, the minimax strategy and Bayesian decision support. 
The reviewed decision support strategies all proved valuable for addressing the identified uncertainties, at best applied together as they all yield information that improved decision making and thus enabled more robust decisions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19990040264','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19990040264"><span>Evaluation of Spacecraft Shielding Effectiveness for Radiation Protection</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Cucinotta, Francis A.; Wilson, John W.</p> <p>1999-01-01</p> <p>The potential for serious health risks from solar particle events (SPE) and galactic cosmic rays (GCR) is a critical issue in the NASA strategic plan for the Human Exploration and Development of Space (HEDS). The excess cost to protect against the GCR and SPE due to current uncertainties in radiation transmission properties and cancer biology could be exceedingly large based on the excess launch costs to shield against uncertainties. The development of advanced shielding concepts is an important risk mitigation area with the potential to significantly reduce risk below conventional mission designs. A key issue in spacecraft material selection is the understanding of nuclear reactions on the transmission properties of materials. High-energy nuclear particles undergo nuclear reactions in passing through materials and tissue altering their composition and producing new radiation types. Spacecraft and planetary habitat designers can utilize radiation transport codes to identify optimal materials for lowering exposures and to optimize spacecraft design to reduce astronaut exposures. 
To reach these objectives will require providing design engineers with accurate data bases and computationally efficient software for describing the transmission properties of space radiation in materials. Our program will reduce the uncertainty in the transmission properties of space radiation by improving the theoretical description of nuclear reactions and radiation transport, and provide accurate physical descriptions of the track structure of microscopic energy deposition.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25426631','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25426631"><span>Uncertainty and denial: a resource-rational model of the value of information.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Pierson, Emma; Goodman, Noah</p> <p>2014-01-01</p> <p>Classical decision theory predicts that people should be indifferent to information that is not useful for making decisions, but this model often fails to describe human behavior. Here we investigate one such scenario, where people desire information about whether an event (the gain/loss of money) will occur even though there is no obvious decision to be made on the basis of this information. We find a curious dual trend: if information is costless, as the probability of the event increases people want the information more; if information is not costless, people's desire for the information peaks at an intermediate probability. People also want information more as the importance of the event increases, and less as the cost of the information increases. 
We propose a model that explains these results, based on the assumption that people have limited cognitive resources and obtain information about which events will occur so they can determine whether to expend effort planning for them.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24280111','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24280111"><span>Decision-support information system to manage mass casualty incidents at a level 1 trauma center.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Bar-El, Yaron; Tzafrir, Sara; Tzipori, Idan; Utitz, Liora; Halberthal, Michael; Beyar, Rafael; Reisner, Shimon</p> <p>2013-12-01</p> <p>Mass casualty incidents are probably the greatest challenge to a hospital. When such an event occurs, hospitals are required to instantly switch from their routine activity to conditions of great uncertainty and confront needs that exceed resources. We describe an information system that was uniquely designed for managing mass casualty events. The web-based system is activated when a mass casualty event is declared; it displays relevant operating procedures, checklists, and a log book. The system automatically or semiautomatically initiates phone calls and public address announcements. It collects real-time data from computerized clinical and administrative systems in the hospital, and presents them to the managing team in a clear graphic display. It also generates periodic reports and summaries of available or scarce resources that are sent to predefined recipients. 
When the system was tested in a nationwide exercise, it proved to be an invaluable tool for informed decision making in demanding and overwhelming situations such as mass casualty events.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20150005487&hterms=discrete&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D20%26Ntt%3Ddiscrete','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20150005487&hterms=discrete&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D20%26Ntt%3Ddiscrete"><span>Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Dubos, Gregory F.; Cornford, Steven</p> <p>2012-01-01</p> <p>While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate the spacecraft development schedule as well as operational scenarios and on-orbit resources in the presence of uncertainty. 
It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4245129','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4245129"><span>Uncertainty and Denial: A Resource-Rational Model of the Value of Information</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Pierson, Emma; Goodman, Noah</p> <p>2014-01-01</p> <p>Classical decision theory predicts that people should be indifferent to information that is not useful for making decisions, but this model often fails to describe human behavior. Here we investigate one such scenario, where people desire information about whether an event (the gain/loss of money) will occur even though there is no obvious decision to be made on the basis of this information. We find a curious dual trend: if information is costless, as the probability of the event increases people want the information more; if information is not costless, people's desire for the information peaks at an intermediate probability. People also want information more as the importance of the event increases, and less as the cost of the information increases. We propose a model that explains these results, based on the assumption that people have limited cognitive resources and obtain information about which events will occur so they can determine whether to expend effort planning for them. 
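The F6 entry above argues that valuing a design requires explicitly simulating events such as delays and failures over time. A minimal discrete-event sketch of that idea follows; the serial task list, duration spreads, and rework probabilities are entirely hypothetical, not values from the F6 tool:

```python
import heapq
import random

def simulate_development(tasks, rng):
    """One discrete-event run over a serial task schedule: each completion
    event is pulled off a time-ordered queue and releases the next task."""
    events = [(0.0, 0)]            # (event time, task index) min-heap
    finish = 0.0
    while events:
        clock, i = heapq.heappop(events)
        nominal, spread, p_rework = tasks[i]
        duration = rng.uniform(nominal - spread, nominal + spread)
        if rng.random() < p_rework:  # a failure event forces a full rework
            duration *= 2.0
        finish = clock + duration
        if i + 1 < len(tasks):       # next task's start slips with this finish
            heapq.heappush(events, (finish, i + 1))
    return finish

rng = random.Random(42)
# (nominal days, +/- spread, rework probability) -- hypothetical values
tasks = [(30.0, 5.0, 0.10), (60.0, 10.0, 0.20), (20.0, 5.0, 0.05)]
runs = sorted(simulate_development(tasks, rng) for _ in range(2000))
median = runs[len(runs) // 2]
p95 = runs[int(0.95 * len(runs))]
print(f"median schedule: {median:.1f} days, 95th percentile: {p95:.1f} days")
```

Monte Carlo repetition of such runs yields a schedule distribution rather than a single date, which is what makes time-explicit trade studies possible.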
PMID:25426631</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1380216-performance-missing-transverse-momentum-reconstruction-proton-proton-collisions-sqrt-mbox-tev-atlas','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1380216-performance-missing-transverse-momentum-reconstruction-proton-proton-collisions-sqrt-mbox-tev-atlas"><span>Performance of missing transverse momentum reconstruction in proton-proton collisions at $$\\sqrt{s} = 7~\\mbox{TeV}$$ with ATLAS</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Aad, G.; Abbott, B.; Abdallah, J.; ...</p> <p>2012-01-03</p> <p>The measurement of missing transverse momentum in the ATLAS detector, described in this paper, makes use of the full event reconstruction and a calibration based on reconstructed physics objects. The performance of the missing transverse momentum reconstruction is evaluated using data collected in pp collisions at a centre-of-mass energy of 7 TeV in 2010. Minimum bias events and events with jets of hadrons are used from data samples corresponding to integrated luminosities of about 0.3 nb⁻¹ and 600 nb⁻¹ respectively, together with events containing a Z boson decaying to two leptons (electrons or muons) or a W boson decaying to a lepton (electron or muon) and a neutrino, from a data sample corresponding to an integrated luminosity of about 36 pb⁻¹. 
In conclusion, an estimate of the systematic uncertainty on the missing transverse momentum scale is presented.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27232520','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27232520"><span>Politeness and the communication of uncertainty.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Holtgraves, Thomas; Perdew, Audrey</p> <p>2016-09-01</p> <p>Ambiguity in language derives, in part, from the multiple motivations that underlie the choice to use any particular expression. The use of some lexical items, such as probability expressions and scalar terms, can be motivated by a desire to communicate uncertainty as well as a desire to be polite (i.e., manage face). Research has demonstrated that the interpretation of these items can be influenced by the existence of a potential politeness motive. In general, communications about negative events, relative to positive events, result in higher likelihood estimates whenever politeness can be discerned as a potential motive. With few exceptions, however, this research has focused only on the hearer. In the present research we focused on the dyad and examined whether speakers vary their messages as a function of politeness, and the effect that this has on subsequent judgments made by a recipient. In two experiments we presented participants with situations that varied in terms of face-threat and asked them how they would communicate potentially threatening information. Both experiments included a second set of participants who read these utterances and provided judgments as to the degree of uncertainty conveyed by the utterance. 
In both experiments, messages in the face-threatening condition conveyed greater uncertainty than messages in the non-face-threatening condition, and the probability estimates made by the second set of participants varied as a function of conveyed uncertainty. This research demonstrates that when examining speakers and hearers together, severe events may be judged less likely (rather than more likely), because speakers tend to hedge the certainty with which they communicate the information. Copyright © 2016 Elsevier B.V. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1910447B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1910447B"><span>A Bayesian Network approach for flash flood risk assessment</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Boutkhamouine, Brahim; Roux, Hélène; Pérès, François</p> <p>2017-04-01</p> <p>Climate change is contributing to the increase of natural disasters such as extreme weather events. Sometimes, these events lead to sudden flash floods causing devastating effects on life and property. Most recently, many regions of the French Mediterranean perimeter have endured such catastrophic flood events; Var (October 2015), Ardèche (November 2014), Nîmes (October 2014), Hérault, Gard and Languedoc (September 2014), and Pyrenees mountains (Jun 2013). Altogether, it resulted in dozens of victims and property damages amounting to millions of euros. With this heavy loss in mind, development of hydrological forecasting and warning systems is becoming an essential element in regional and national strategies. 
Flash flood forecasting, and even monitoring, is a difficult task because small ungauged catchments (<10 km²) are often the most destructive ones, as in the extreme flash flood event of September 2002 in the Cévennes region (France) (Ruin et al., 2008). The problem of measurement/prediction uncertainty is particularly crucial when attempting to develop operational flash-flood forecasting methods. Taking into account the uncertainty related to the model structure itself, to the model parametrization or to the model forcing (spatio-temporal rainfall, initial conditions) is crucial in hydrological modelling. Quantifying these uncertainties is of primary importance for risk assessment and decision making. Although significant improvements have been made in computational power and distributed hydrologic modelling, the issue of integrating uncertainties into flood forecasting remains up-to-date and challenging. In order to develop a framework which could handle these uncertainties and explain their propagation through the model, we propose to explore the potential of graphical models (GMs) and, more precisely, Bayesian Networks (BNs). These networks are Directed Acyclic Graphs (DAGs) in which knowledge of a certain phenomenon is represented by influencing variables. Each node of the graph corresponds to a variable and arcs represent the probabilistic dependencies between these variables. Both the quantification of the strength of these probabilistic dependencies and the computation of inferences are based on Bayes' theorem. In order to use BNs for the assessment of flooding risks, the modelling work is divided into two parts. The first is to identify all the factors controlling flood generation. The qualitative explanation of this issue is then reached by establishing the cause-and-effect relationships between these factors. These underlying relationships are represented in what we call Conditional Probability Tables (CPTs). 
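The BN machinery just described (nodes for influencing variables, arcs for probabilistic dependencies, CPTs, and inference via Bayes' theorem) can be sketched on a toy two-parent network; the variables and all probabilities below are purely illustrative, not values from the study:

```python
# Hypothetical two-parent network: Rain (R) -> Flood (F) <- Soil saturation (S).
P_R = {"low": 0.7, "high": 0.3}          # prior on rainfall intensity
P_S = {"dry": 0.6, "wet": 0.4}           # prior on soil saturation
# CPT for P(F=flood | R, S); numbers are illustrative, not calibrated
CPT_F = {("low", "dry"): 0.01, ("low", "wet"): 0.10,
         ("high", "dry"): 0.30, ("high", "wet"): 0.85}

def p_flood_given_rain(r):
    """Predictive inference: marginalize the unobserved parent S."""
    return sum(CPT_F[(r, s)] * P_S[s] for s in P_S)

def p_rain_given_flood(r):
    """Diagnostic inference via Bayes' theorem: P(R=r | F=flood)."""
    total = sum(P_R[q] * p_flood_given_rain(q) for q in P_R)
    return P_R[r] * p_flood_given_rain(r) / total

print(f"P(flood | heavy rain) = {p_flood_given_rain('high'):.3f}")
print(f"P(heavy rain | flood) = {p_rain_given_flood('high'):.3f}")
```

The same two operations (marginalization over unobserved parents, Bayesian inversion over observed children) scale to larger networks by enumeration or by dedicated inference algorithms.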
The next step is to estimate these CPTs using information coming from networks of sensors, databases and expertise. By using this basic cognitive structure, we will be able to estimate the magnitude of flood risk in a small geographical area with a homogeneous hydrological system. The second part of our work will be dedicated to the estimation of this risk on the scale of a basin. To do so, we will create a spatio-temporal model able to take into consideration both the spatial and temporal variability of all factors involved in flood generation. Key words: Flash flood forecasting - Uncertainty modelling - Flood risk management - Bayesian Networks.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1367850-branch-bound-algorithm-applied-uncertainty-quantification-boiling-water-reactor-station-blackout','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1367850-branch-bound-algorithm-applied-uncertainty-quantification-boiling-water-reactor-station-blackout"><span>Branch-and-Bound algorithm applied to uncertainty quantification of a Boiling Water Reactor Station Blackout</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Nielsen, Joseph; Tokuhiro, Akira; Hiromoto, Robert; ...</p> <p>2015-11-13</p> <p>Evaluation of the impacts of uncertainty and sensitivity in modeling presents a significant set of challenges, in particular for high fidelity modeling. Computational costs and validation of models create a need for cost-effective decision making with regard to experiment design. Experiments designed to validate computational models can be used to reduce uncertainty in the physical model. In some cases, large uncertainty in a particular aspect of the model may or may not have a large impact on the final results. 
For example, modeling of a relief valve may result in large uncertainty; however, the actual effects on final peak clad temperature in a reactor transient may be small, and the large uncertainty with respect to valve modeling may be considered acceptable. Additionally, the ability to determine the adequacy of a model and the validation supporting it should be considered within a risk-informed framework. Low fidelity modeling with large uncertainty may be considered adequate if the uncertainty is acceptable with respect to risk. In other words, models that are used to evaluate the probability of failure should be evaluated more rigorously, with the intent of increasing safety margin. Probabilistic risk assessment (PRA) techniques have traditionally been used to identify accident conditions and transients. Traditional event tree methods utilize analysts’ knowledge and experience to identify the important timing of events in coordination with thermal-hydraulic modeling. These methods lack the capability to evaluate complex dynamic systems. In these systems, time and energy scales associated with transient events may vary as a function of transition times and energies to arrive at a different physical state. Dynamic PRA (DPRA) methods provide a more rigorous analysis of complex dynamic systems. Unfortunately, DPRA methods introduce issues associated with combinatorial explosion of states. This study presents a methodology to address combinatorial explosion using a Branch-and-Bound algorithm applied to Dynamic Event Trees (DET), which utilize LENDIT (L – Length, E – Energy, N – Number, D – Distribution, I – Information, and T – Time) as well as set theory to describe system, state, resource, and response (S2R2) sets to create bounding functions for the DET. 
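The branch-and-bound idea applied to an event tree can be sketched as follows; the toy station-blackout tree, its branch probabilities, and the simple path-probability bound are hypothetical stand-ins for the study's LENDIT/S2R2 bounding functions:

```python
# Hypothetical station-blackout event tree: each node maps a branch outcome
# to (branch probability, child node); None marks an end state.
tree = {
    "diesel_ok":   (0.95, None),
    "diesel_fail": (0.05, {
        "battery_lasts": (0.70, None),
        "battery_dies":  (0.30, {
            "recovery":    (0.60, None),
            "core_damage": (0.40, None),
        }),
    }),
}

def expand(node, path=(), p=1.0, p_min=1e-3, out=None):
    """Depth-first expansion of the event tree. The bound: once the path
    probability falls below p_min, no descendant can exceed it, so the
    whole subtree is pruned without being enumerated."""
    if out is None:
        out = []
    if p < p_min:                    # bounding step: this subtree cannot matter
        return out
    if node is None:                 # leaf: record the sequence and its probability
        out.append((path, p))
        return out
    for outcome, (prob, child) in node.items():
        expand(child, path + (outcome,), p * prob, p_min, out)
    return out

for path, p in expand(tree, p_min=0.01):
    print(" -> ".join(path), f"p={p:.3f}")
```

With `p_min=0.01`, the two low-probability end states under `battery_dies` are cut off before expansion; lowering the bound recovers the full tree, whose leaf probabilities sum to one.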
The optimization of the DET in identifying high probability failure branches is extended to create a Phenomenological Identification and Ranking Table (PIRT) methodology to evaluate modeling parameters important to the safety of those failure branches that have a high probability of failure. The PIRT can then be used as a tool to identify and evaluate the need for experimental validation of models that have the potential to reduce risk. Finally, in order to demonstrate this methodology, a Boiling Water Reactor (BWR) Station Blackout (SBO) case study is presented.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015JHyd..529.1373W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015JHyd..529.1373W"><span>An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan</p> <p>2015-10-01</p> <p>Modelling glacial lake outburst floods (GLOFs) or 'jökulhlaups' necessarily involves the propagation of large and often stochastic uncertainties throughout the source to impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs with different spatial resolution. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. 
Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, results based on the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those based on the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those based on a finer-resolution Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) derived DEM. Overall, we contend that the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating potential socio-economic implications of a GLOF event. 
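The Monte Carlo least-cost-path idea — perturb the DEM many times, re-route the flow in each realization, and accumulate a per-cell routing likelihood as the uncertainty continuum — can be sketched as below. This is not the authors' distributed code: the toy DEM, Gaussian noise model, and elevation-as-cost criterion are assumptions for illustration.

```python
import heapq
import random

def least_cost_path(dem, start, goal):
    """Dijkstra least-cost path where stepping onto a cell costs its elevation."""
    rows, cols = len(dem), len(dem[0])
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols:
                nd = d + dem[nxt[0]][nxt[1]]
                if nd < dist.get(nxt, float("inf")):
                    dist[nxt], prev[nxt] = nd, (r, c)
                    heapq.heappush(pq, (nd, nxt))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path

# Toy 4x4 DEM (elevations) with a low valley down the second column.
base = [[5, 1, 5, 6],
        [6, 1, 5, 7],
        [7, 2, 4, 8],
        [8, 2, 3, 9]]
rng = random.Random(0)
counts = {}
for _ in range(200):   # stochastic DEM realizations (~0.5 unit vertical error)
    noisy = [[max(0.1, z + rng.gauss(0.0, 0.5)) for z in row] for row in base]
    for cell in least_cost_path(noisy, (0, 1), (3, 2)):
        counts[cell] = counts.get(cell, 0) + 1
likelihood = {cell: n / 200 for cell, n in counts.items()}  # routing continuum
```

Cells crossed in every realization get likelihood 1.0; cells the route only sometimes visits get intermediate values, which is the kind of continuum relative risk can be read from.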
Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_16 --> <div id="page_17" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="321"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..1614918K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..1614918K"><span>Seismic catalog condensation with applications to multifractal analysis of South Californian seismicity</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kamer, Yavor; Ouillon, Guy; Sornette, Didier; Wössner, Jochen</p> <p>2014-05-01</p> <p>Latest 
advances in the instrumentation field have increased station coverage and lowered event detection thresholds. This has resulted in a vast increase in the number of located events each year. The abundance of data comes as a double-edged sword: while it facilitates more robust statistics and provides better confidence intervals, it also paralyzes computations whose execution times grow exponentially with the number of data points. In this study, we present a novel method that assesses the relative importance of each data point and reduces the size of datasets while preserving their information content. For a given seismic catalog, the goal is to express the same spatial probability density distribution with fewer data points. To achieve this, we exploit the fact that seismic catalogs are not optimally encoded. This coding deficiency is the result of sequential data entry, where new events are added without taking into account previous ones. For instance, if there are several events with identical parameters occurring at the same location, these could be grouped together rather than occupying the same memory space as if they were distinct events. Following this reasoning, the proposed condensation methodology orders all events according to their overall variance. Starting from the group with the highest variance (worst location uncertainty), each event is sampled by a number of sample points; these points are then used to determine which better located events can express these probable locations with a higher likelihood. Based on these likelihood comparisons, weights from poorly located events are successively transferred to better located ones. As a result of the process, a large portion of the events (~30%) ends up with zero weights (thus being fully represented by events increasing their weights), while the information content (i.e., the sum of all weights) remains preserved. 
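The weight-transfer scheme just described can be sketched in one dimension; the Gaussian location model, the sample-quota transfer rule, and the three-event toy catalog are simplified assumptions, not the paper's full multivariate implementation:

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def condense(events, n_samples=400, seed=1):
    """events: list of {"x": location, "sigma": uncertainty, "w": weight}.
    Weight flows from poorly located events to better located ones that
    explain their probable locations with a higher likelihood."""
    rng = random.Random(seed)
    for i in sorted(range(len(events)), key=lambda k: -events[k]["sigma"]):
        better = [j for j in range(len(events))
                  if events[j]["sigma"] < events[i]["sigma"]]
        if not better:
            continue
        quota = events[i]["w"] / n_samples   # weight carried by each sample point
        for _ in range(n_samples):
            xs = rng.gauss(events[i]["x"], events[i]["sigma"])
            own = gauss_pdf(xs, events[i]["x"], events[i]["sigma"])
            j = max(better, key=lambda k: gauss_pdf(xs, events[k]["x"], events[k]["sigma"]))
            if gauss_pdf(xs, events[j]["x"], events[j]["sigma"]) > own:
                events[j]["w"] += quota      # transfer: total weight is conserved
                events[i]["w"] -= quota
    return events

# Three hypothetical events: two sharp locations and one poorly located copy.
catalog = [{"x": 0.0, "sigma": 0.5, "w": 1.0},
           {"x": 5.0, "sigma": 0.5, "w": 1.0},
           {"x": 0.0, "sigma": 3.0, "w": 1.0}]
condense(catalog)
print([round(e["w"], 2) for e in catalog])
```

The broad event's weight partially migrates to the two sharp events wherever their densities dominate, while the sum of all weights stays fixed, mirroring the information-preservation property claimed above.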
The resulting condensed catalog not only provides a more optimal encoding but is also regularized with respect to local information quality. By investigating the locations of mass enrichment and depletion at different scales, we observe that the areas of increased mass are in good agreement with reported surface fault traces. We also conduct multifractal spatial analysis on condensed catalogs and investigate different spatial scaling regimes made clearer by reducing the effect of location uncertainty.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017DPS....4950401B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017DPS....4950401B"><span>Overview of the strategies and results of the 2017 occultation campaigns involving (486958) 2014 MU69</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Buie, Marc W.; Porter, Simon Bernard; Terrell, Dirk; Tamblyn, Peter; Verbiscer, Anne J.; Soto, Alejandro; Wasserman, Lawrence H.; Zangari, Amanda Marie; Skrutskie, Michael F.; Parker, Alex; Young, Eliot F.; Benecchi, Susan; Stern, S. Alan; New Horizons MU69 Occultation Team</p> <p>2017-10-01</p> <p>Three stellar occultation opportunities were identified in 2017 involving the New Horizons extended mission target: (486958) 2014 MU69. The first event was on 2017 June 3 and predicted to be visible from southern South America and southern Africa with a somewhat faint star with g’=15.33. The second event was on 2017 July 10 under very difficult observing conditions just 16° from a full moon and the faintest star of the three with g’=15.57. The third event was on 2017 July 17 and predicted to be visible from southern Argentina with the brightest star of the three with g’=12.60. 
We pursued each of these events with an observing plan and strategy tuned to the constraints imposed by observing conditions and the anticipated prediction uncertainties. The first and third events were amenable to a ground-based telescope deployment and we fielded 25 telescopes. The second event was possible only with SOFIA (Stratospheric Observatory For Infrared Astronomy). The deployment for the first event involved splitting resources between two continents and a strategy optimized to prevent a null result for a D=40 km object. The second event was optimized for the search for dust and rings but had a 75% chance of a solid body event for a D=40 km size. The third event was driven by the need to prevent a null result on a D=10 km size and providing extra conservatism on the ground-track uncertainty while observing from the area of Comodoro Rivadavia, Argentina. All campaigns were successful in recording data essential for the constraint on dust or rings around MU69: June 3, 24 lightcurves; July 10, 1 lightcurve; July 17, 23 lightcurves. Only the last event was able to record solid-body chords from the object with 5 chords detected close to the predicted time and place. We will present an overview of the strategies and basic results of the campaigns. 
This work would not have been possible without the financial support of the New Horizons mission and NASA, astrometric support of the Gaia mission, and logistical support from Argentina and specifically Comodoro Rivadavia as well as assistance from the US Embassies in Buenos Aires and Cape Town.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1917758D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1917758D"><span>Hydrologic ensembles based on convection-permitting precipitation nowcasts for flash flood warnings</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Demargne, Julie; Javelle, Pierre; Organde, Didier; de Saint Aubin, Céline; Ramos, Maria-Helena</p> <p>2017-04-01</p> <p>In order to better anticipate flash flood events and provide timely warnings to communities at risk, the French national service in charge of flood forecasting (SCHAPI) is implementing a national flash flood warning system for small-to-medium ungauged basins. Based on a discharge-threshold flood warning method called AIGA (Javelle et al. 2014), the current version of the system runs a simplified hourly distributed hydrologic model with operational radar-gauge QPE grids from Météo-France at a 1-km2 resolution every 15 minutes. This produces real-time peak discharge estimates along the river network, which are subsequently compared to regionalized flood frequency estimates to provide warnings according to the AIGA-estimated return period of the ongoing event. To further extend the effective warning lead time while accounting for hydrometeorological uncertainties, the flash flood warning system is being enhanced to include Météo-France's AROME-NWC high-resolution precipitation nowcasts as time-lagged ensembles and multiple sets of hydrological regionalized parameters. 
The operational deterministic precipitation forecasts, from the nowcasting version of the AROME convection-permitting model (Auger et al. 2015), were provided at a 2.5-km resolution for a 6-hr forecast horizon for 9 significant rain events from September 2014 to June 2016. The time-lagged approach is a practical way of accounting for the atmospheric forecast uncertainty when no extensive forecast archive is available for statistical modelling. The evaluation on 781 French basins showed significant improvements in terms of flash flood event detection and effective warning lead-time, compared to warnings from the current AIGA setup (without any future precipitation). We also discuss how to effectively communicate verification information to help determine decision-relevant warning thresholds for flood magnitude and probability. Javelle, P., Demargne, J., Defrance, D., Arnaud, P., 2014. Evaluating flash flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system. Hydrological Sciences Journal, doi: 10.1080/02626667.2014.923970 Auger, L., Dupont, O., Hagelin, S., Brousseau, P., Brovelli, P., 2015. AROME-NWC: a new nowcasting tool based on an operational mesoscale forecasting system. 
Quarterly Journal of the Royal Meteorological Society, 141: 1603-1611, doi:10.1002/qj.2463</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1810119D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1810119D"><span>Flash flood warnings for ungauged basins based on high-resolution precipitation forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Demargne, Julie; Javelle, Pierre; Organde, Didier; de Saint Aubin, Céline; Janet, Bruno</p> <p>2016-04-01</p> <p>Early detection of flash floods, which are typically triggered by severe rainfall events, is still challenging due to large meteorological and hydrologic uncertainties at the spatial and temporal scales of interest. Also the rapid rising of waters necessarily limits the lead time of warnings to alert communities and activate effective emergency procedures. To better anticipate such events and mitigate their impacts, the French national service in charge of flood forecasting (SCHAPI) is implementing a national flash flood warning system for small-to-medium (up to 1000 km²) ungauged basins based on a discharge-threshold flood warning method called AIGA (Javelle et al. 2014). The current deterministic AIGA system has been run in real-time in the South of France since 2005 and has been tested in the RHYTMME project (rhytmme.irstea.fr/). It ingests the operational radar-gauge QPE grids from Météo-France to run a simplified hourly distributed hydrologic model at a 1-km² resolution every 15 minutes. This produces real-time peak discharge estimates along the river network, which are subsequently compared to regionalized flood frequency estimates to provide warnings according to the AIGA-estimated return period of the ongoing event. 
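The AIGA-style discharge-threshold comparison described above — real-time peak discharge against regionalized flood-frequency quantiles, extended with a time-lagged ensemble — can be sketched as follows; the quantile values, warning colours, and discharges are hypothetical:

```python
from collections import Counter

# Hypothetical regionalized flood quantiles (m^3/s) for one river cell.
quantiles = [(2, 40.0), (10, 75.0), (50, 120.0)]   # (return period T, Q_T)
colours = {2: "yellow", 10: "orange", 50: "red"}

def aiga_warning(peak_discharge):
    """Warning colour from the largest return-period threshold exceeded."""
    level = "green"
    for T, q in quantiles:                 # quantiles sorted by return period
        if peak_discharge >= q:
            level = colours[T]
    return level

def ensemble_warning(peaks):
    """Time-lagged ensemble: fraction of members reaching each level."""
    c = Counter(aiga_warning(p) for p in peaks)
    return {lvl: c[lvl] / len(peaks) for lvl in ("green", "yellow", "orange", "red")}

print(aiga_warning(50.0))                          # single deterministic run
print(ensemble_warning([30.0, 50.0, 80.0, 130.0])) # lagged members disagree
```

The ensemble variant is what turns lagged nowcast members into a probability attached to each warning level rather than a single colour.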
The calibration and regionalization of the hydrologic model have recently been enhanced for implementing the national flash flood warning system for the entire French territory by 2016. To further extend the effective warning lead time, the flash flood warning system is being enhanced to ingest Météo-France's AROME-NWC high-resolution precipitation nowcasts. The AROME-NWC system combines the most recent available observations with forecasts from the nowcasting version of the AROME convection-permitting model (Auger et al. 2015). AROME-NWC pre-operational deterministic precipitation forecasts, produced every hour at a 2.5-km resolution for a 6-hr forecast horizon, were provided for 3 significant rain events in September and November 2014 and ingested as time-lagged ensembles. The time-lagged approach is a practical way of accounting for the atmospheric forecast uncertainty when no extensive forecast archive is available for statistical modelling. The evaluation on 185 basins in the South of France showed significant improvements in terms of flash flood event detection and effective warning lead-time, compared to warnings from the current AIGA setup (without any future precipitation). Various verification metrics (e.g., Relative Mean Error, Brier Skill Score) show the skill of ensemble precipitation and flow forecasts compared to single-valued persistency benchmarks. Planned enhancements include integrating additional probabilistic NWP products (e.g., AROME precipitation ensembles on a longer forecast horizon), accounting for and reducing hydrologic uncertainties from the model parameters and initial conditions via data assimilation, and developing a comprehensive observational and post-event damage database to determine decision-relevant warning thresholds for flood magnitude and probability. Javelle, P., Demargne, J., Defrance, D., Arnaud, P., 2014. 
Evaluating flash flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system. Hydrological Sciences Journal, doi: 10.1080/02626667.2014.923970 Auger, L., Dupont, O., Hagelin, S., Brousseau, P., Brovelli, P., 2015. AROME-NWC: a new nowcasting tool based on an operational mesoscale forecasting system. Quarterly Journal of the Royal Meteorological Society, 141: 1603-1611, doi: 10.1002/qj.2463</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011AGUFMGC11A0885B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011AGUFMGC11A0885B"><span>Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Boslough, M.</p> <p>2011-12-01</p> <p>Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. 
We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market, whose aggregated price can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot be directly measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based on global temperature anomaly data published by NASA GISS. Typical climate contracts predict the probability of a specified future temperature, but not the probability density or best estimate. One way to generate a probability distribution would be to create a family of contracts over a range of specified temperatures and interpret the price of each contract as its exceedance probability. The resulting plot of probability vs. anomaly is the market-based cumulative distribution function. The best estimate can be determined by interpolation, and the market-based uncertainty estimate can be based on the spread. One requirement for an effective prediction market is liquidity. Climate contracts are currently considered somewhat of a novelty and often lack sufficient liquidity, but climate change has the potential to generate both tremendous losses for some (e.g. 
agricultural collapse and extreme weather events) and wealth for others (access to natural resources and trading routes). Use of climate markets by large stakeholders has the potential to generate the liquidity necessary to make them viable. Sandia is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20000120142','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20000120142"><span>Qualitative Discovery in Medical Databases</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Maluf, David A.</p> <p>2000-01-01</p> <p>Implication rules have been used in uncertainty reasoning systems to confirm and draw hypotheses or conclusions. However, a major bottleneck in developing such systems lies in the elicitation of these rules. This paper empirically examines the performance of evidential inferencing with implication networks generated using a rule induction tool called KAT. KAT utilizes an algorithm for the statistical analysis of empirical case data, and hence reduces the knowledge engineering effort and biases in subjective implication certainty assignment. The paper describes several experiments in which real-world diagnostic problems, namely medical diagnostics, were investigated. 
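The contract-family construction described in the prediction-market abstract above can be sketched as follows; the contract thresholds and prices are hypothetical:

```python
# Sketch: turn the prices of a family of "anomaly > T" contracts into a
# market-implied CDF and interpolate a best estimate (the median).
# Thresholds and prices below are hypothetical, not Intrade data.
import numpy as np

thresholds = np.array([0.45, 0.55, 0.65, 0.75, 0.85])   # deg C
exceed_prob = np.array([0.95, 0.80, 0.50, 0.20, 0.05])  # contract prices

cdf = 1.0 - exceed_prob  # P(anomaly <= T), increasing in T
# Best estimate: the threshold at which the CDF crosses 0.5
median = np.interp(0.5, cdf, thresholds)
print(round(float(median), 2))
```

The spread of the interpolated quantiles (e.g. the 5th to 95th percentile range) would then serve as the market-based uncertainty estimate.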
In particular, it attempts to show that: (1) with a limited number of case samples, KAT is capable of inducing implication networks useful for making evidential inferences based on partial observations, and (2) observation driven by a network entropy optimization mechanism is effective in reducing the uncertainty of predicted events.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/19927799','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/19927799"><span>[Impact of water pollution risk in water transfer project based on fault tree analysis].</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Liu, Jian-Chang; Zhang, Wei; Wang, Li-Min; Li, Dai-Qing; Fan, Xiu-Ying; Deng, Hong-Bing</p> <p>2009-09-15</p> <p>The methods to assess water pollution risk for medium water transfer are gradually being explored. The event-nature-proportion method was developed to evaluate the probability of a single event. Fault tree analysis, built on the single-event calculations, was employed to evaluate the extent of the whole water pollution risk for the channel water body. The result indicates that the risk of pollutants from towns and villages along the line of the water transfer project to the channel water body is at a high level, with a probability of 0.373, which will increase pollution to the channel water body at the rate of 64.53 mg/L COD, 4.57 mg/L NH4(+)-N and 0.066 mg/L volatile phenol, respectively. 
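A fault-tree combination of single-event probabilities can be sketched generically; the gate structure and the probabilities below are illustrative, not the study's tree, and independence of the basic events is assumed:

```python
# Sketch: top-event probability of a small fault tree from independent
# basic-event probabilities. Gate structure and numbers are illustrative.
from functools import reduce

def or_gate(probs):
    """Probability that at least one of the independent events occurs."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

def and_gate(probs):
    """Probability that all of the independent events occur together."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

# Top = (A or B) and C, with hypothetical basic-event probabilities
p_top = and_gate([or_gate([0.30, 0.10]), 0.25])
print(round(p_top, 4))
```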
The measurement of fault probability on the basis of the proportion method proves useful for assessing water pollution risk under substantial uncertainty.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..1511653R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..1511653R"><span>Paleoclimate networks: a concept meeting central challenges in the reconstruction of paleoclimate dynamics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rehfeld, Kira; Goswami, Bedartha; Marwan, Norbert; Breitenbach, Sebastian; Kurths, Jürgen</p> <p>2013-04-01</p> <p>Statistical analysis of dependencies amongst paleoclimate data helps to infer the climatic processes they reflect. Three key challenges have to be addressed, however: the datasets are heterogeneous in (i) time and (ii) space, and furthermore time itself is a variable that needs to be reconstructed, which (iii) introduces additional uncertainties. To address these issues in a flexible way we developed the paleoclimate network framework, inspired by the increasing application of complex networks in climate research. Each node in the paleoclimate network represents a paleoclimate archive and an associated time series. Links between nodes are assigned if their time series are significantly similar. The basis of the paleoclimate network is therefore formed by linear and nonlinear similarity estimators (Pearson correlation, mutual information and event synchronization) that quantify similarity from irregularly sampled time series. Age uncertainties are propagated into the final network analysis using time series ensembles which reflect the uncertainty. 
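The link-assignment step can be sketched as follows; as a simplification, plain Pearson correlation on regularly sampled toy series stands in for the irregular-sampling estimators, and a fixed correlation threshold stands in for a proper significance test:

```python
# Sketch: build a similarity network by linking archives whose time series
# correlate strongly. Pearson correlation on regular toy series is a
# stand-in for the irregular-sampling estimators; the 0.5 threshold is a
# stand-in for a significance test; age-model ensembles are omitted.
import numpy as np

rng = np.random.default_rng(0)
common = rng.normal(size=200)  # shared climate signal
records = {
    "A": common + 0.3 * rng.normal(size=200),
    "B": common + 0.3 * rng.normal(size=200),
    "C": rng.normal(size=200),  # independent archive
}

links = []
names = list(records)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r = np.corrcoef(records[a], records[b])[0, 1]
        if abs(r) > 0.5:
            links.append((a, b))
print(links)
```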
We discuss how spatial heterogeneity influences the results obtained from network measures, and demonstrate the power of the approach by inferring teleconnection variability of the Asian summer monsoon for the past 1000 years.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017hsn..book.2577S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017hsn..book.2577S"><span>The Hubble Constant from Supernovae</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Saha, Abhijit; Macri, Lucas M.</p> <p></p> <p>The decades-long quest to obtain a precise and accurate measurement of the local expansion rate of the universe (the Hubble Constant or H0) has greatly benefited from the use of supernovae (SNe). Starting from humble beginnings (dispersions of ~0.5 mag in the Hubble flow in the late 1960s/early 1970s), increasingly sophisticated understanding, classification, and analysis of these events turned type Ia SNe into the premier choice for a secondary distance indicator by the early 1990s. While some systematic uncertainties specific to SNe and to Cepheid-based distances to the calibrating host galaxies still contribute to the H0 error budget, the major emphasis over the past two decades has been on reducing the statistical uncertainty by obtaining ever-larger samples of distances to SN hosts. 
Building on early efforts with the first-generation instruments on the Hubble Space Telescope, recent observations with the latest instruments on this facility have reduced the estimated total uncertainty on H0 to 2.4% and shown a path to reach a 1% measurement by the end of the decade, aided by Gaia and the James Webb Space Telescope.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018JAfES.142..124J','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018JAfES.142..124J"><span>Synthesis of instrumentally and historically recorded earthquakes and studying their spatial statistical relationship (A case study: Dasht-e-Biaz, Eastern Iran)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Jalali, Mohammad; Ramazi, Hamidreza</p> <p>2018-06-01</p> <p>Earthquake catalogues are the main source of statistical seismology for long-term studies of earthquake occurrence. Studying spatiotemporal problems is therefore important to reduce the related uncertainties in statistical seismology studies. A statistical tool, the time normalization method, was applied to revise the time-frequency relationship in one of the most active regions of Asia, eastern Iran and western Afghanistan (a and b were calculated as approximately 8.84 and 1.99 on the exponential scale, not the logarithmic scale). A geostatistical simulation method was further utilized to reduce the uncertainties in the spatial domain. The geostatistical simulation produces a representative, synthetic catalogue with 5361 events to reduce spatial uncertainties. The synthetic database is classified using a Geographical Information System (GIS), based on simulated magnitudes, to reveal the underlying seismicity patterns. 
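Reading the quoted coefficients as an exponential-form frequency-magnitude relation, N(≥M) = exp(a − b·M), which is our assumption about the "exponential scale, not logarithmic scale" wording, gives:

```python
# Sketch: frequency-magnitude relation with the quoted coefficients,
# assuming the exponential form N(>=M) = exp(a - b*M). This reading of
# "exponential scale" is an assumption, not stated in the abstract.
import math

a, b = 8.84, 1.99

def n_events(magnitude):
    """Expected number of events at or above the given magnitude."""
    return math.exp(a - b * magnitude)

for m in (4.0, 5.0, 6.0):
    print(m, round(n_events(m), 3))
```

Each unit increase in magnitude divides the expected count by exp(b), roughly a factor of 7.3 for b = 1.99.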
Although some regions of high seismicity correspond to known faults, significantly, as far as seismic patterns are concerned, the new method highlights possible locations of interest that have not been previously identified. It also reveals some previously unrecognized lineations and clusters of likely future strain release.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1358084','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1358084"><span>A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael</p> <p></p> <p>The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easily extensible collection of analytical components. 
Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27386264','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27386264"><span>Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Xin, Cao; Chongshi, Gu</p> <p>2016-01-01</p> <p>Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. A stability failure risk ratio described jointly by probability and possibility is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. Gravity dam stability is viewed as a hybrid event, considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. The credibility distribution function is introduced as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to the risk calculation of both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. 
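A minimal sketch of a hybrid randomness-plus-fuzziness Monte Carlo, in the spirit of the approach described above but not the paper's actual model: the safety factor is random, "failure" is a fuzzy event with a triangular membership, and the averaged membership serves as a failure-chance index:

```python
# Sketch: hybrid randomness + fuzziness Monte Carlo. The safety factor SF
# is random; "dam fails" is a fuzzy event with a triangular membership
# around SF = 1. Averaging the membership over random draws gives a
# failure-chance index. Illustrative stand-in, not the paper's model.
import random

random.seed(42)

def failure_membership(sf, lo=0.9, hi=1.1):
    """Membership in the fuzzy event 'dam fails': 1 below lo,
    0 above hi, linear in between."""
    if sf <= lo:
        return 1.0
    if sf >= hi:
        return 0.0
    return (hi - sf) / (hi - lo)

n = 100_000
total = 0.0
for _ in range(n):
    sf = random.gauss(1.2, 0.15)  # random safety factor (hypothetical)
    total += failure_membership(sf)
print(round(total / n, 3))
```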
The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1810291K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1810291K"><span>Trans-dimensional and hierarchical Bayesian approaches toward rigorous estimation of seismic sources and structures in the Northeast Asia</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean</p> <p>2016-04-01</p> <p>A framework is presented within which we provide rigorous estimations of seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimation of models and their uncertainties based on the information in the data. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in the Bayesian inversions. Reliable estimation of model parameters and their uncertainties is hence possible, avoiding arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For the source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data of the North Korean nuclear explosion tests. 
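The hierarchical treatment of error statistics can be illustrated with a toy Metropolis-Hastings sampler in which the noise scale is estimated alongside the model parameter; this is a minimal sketch on a linear model, not the seismological implementation:

```python
# Sketch: hierarchical Bayesian sampling in which the data-error scale
# sigma is treated as an unknown alongside the model parameter m.
# Toy linear model with Metropolis-Hastings; flat priors (sigma > 0).
import math
import random

random.seed(1)
xs = [i / 10 for i in range(50)]
ys = [2.0 * x + random.gauss(0, 0.5) for x in xs]  # truth: m=2, sigma=0.5

def log_post(m, sigma):
    """Log posterior: Gaussian likelihood with unknown noise scale."""
    if sigma <= 0:
        return -math.inf
    return sum(-0.5 * ((y - m * x) / sigma) ** 2 - math.log(sigma)
               for x, y in zip(xs, ys))

m, sigma = 0.0, 1.0
trace = []
for step in range(20000):
    m_new = m + random.gauss(0, 0.05)      # symmetric random-walk proposal
    s_new = sigma + random.gauss(0, 0.05)
    if math.log(random.random()) < log_post(m_new, s_new) - log_post(m, sigma):
        m, sigma = m_new, s_new
    if step >= 5000:  # discard burn-in
        trace.append((m, sigma))

m_mean = sum(t[0] for t in trace) / len(trace)
s_mean = sum(t[1] for t in trace) / len(trace)
print(round(m_mean, 2), round(s_mean, 2))
```

The posterior spread of the trace, not just its mean, is what carries the "meaningful uncertainties" into the subsequent interpretation.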
By combining the new Bayesian techniques and the structural model, with meaningful uncertainties attached to each of the processes, more quantitative monitoring and discrimination of seismic events becomes possible.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ERL....12a4017A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ERL....12a4017A"><span>Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Abadie, Luis Maria; Galarraga, Ibon; Sainz de Murieta, Elisa</p> <p>2017-01-01</p> <p>A quantification of present and future mean annual losses due to extreme coastal events can be crucial for adequate decision making on adaptation to climate change in coastal areas around the globe. However, this approach is limited when uncertainty needs to be accounted for. In this paper, we assess coastal flood risk from sea-level rise and extreme events in 120 major cities around the world using an alternative stochastic approach that accounts for uncertainty. Probability distributions of future relative (local) sea-level rise have been used for each city, under three IPCC emission scenarios: RCP 2.6, 4.5 and 8.5. The approach allows a continuous stochastic function to be built to assess the yearly evolution of damages from 2030 to 2100. Additionally, we present two risk measures that put low-probability, high-damage events in the spotlight: the Value at Risk (VaR) and the Expected Shortfall (ES), which enable the damages to be estimated when a certain risk level is exceeded. This level of acceptable risk can be defined involving different stakeholders to guide progressive adaptation strategies. 
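The two risk measures can be sketched directly from a sample of simulated damages; the lognormal damage distribution below is illustrative only:

```python
# Sketch: Value at Risk and Expected Shortfall from a sample of simulated
# annual flood damages. The lognormal damage model is illustrative only.
import numpy as np

rng = np.random.default_rng(7)
damages = rng.lognormal(mean=3.0, sigma=1.0, size=100_000)  # damage units

alpha = 0.99
var = np.quantile(damages, alpha)     # loss exceeded only 1% of the time
es = damages[damages >= var].mean()   # mean loss in the tail beyond VaR
print(round(float(var), 1), round(float(es), 1))
```

ES is always at least as large as VaR at the same level, which is why it is the more conservative measure for low-probability, high-damage events.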
The method presented here is new in the field of the economics of adaptation and offers a much broader picture of the challenges related to dealing with climate impacts. Furthermore, it can be applied not only to assess adaptation needs but also to put adaptation into a timeframe in each city.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/897576','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/897576"><span></span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Gibson, Adam Paul</p> <p></p> <p>The authors present a measurement of the mass of the top quark. The event sample is selected from proton-antiproton collisions, at 1.96 TeV center-of-mass energy, observed with the CDF detector at Fermilab's Tevatron. They consider a 318 pb⁻¹ dataset collected between March 2002 and August 2004. They select events that contain one energetic lepton, large missing transverse energy, exactly four energetic jets, and at least one displaced-vertex b tag. The analysis uses leading-order t$\bar{t}$ and background matrix elements along with parameterized parton showering to construct event-by-event likelihoods as a function of top quark mass. From the 63 events observed with the 318 pb⁻¹ dataset they extract a top quark mass of 172.0 ± 2.6(stat) ± 3.3(syst) GeV/c² from the joint likelihood. The mean expected statistical uncertainty is 3.2 GeV/c² for m_t = 178 GeV/c² and 3.1 GeV/c² for m_t = 172.5 GeV/c². 
The systematic error is dominated by the uncertainty in the jet energy scale.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/19303128','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/19303128"><span>Assessment of annual pollutant loads in combined sewers from continuous turbidity measurements: sensitivity to calibration data.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Lacour, C; Joannis, C; Chebbo, G</p> <p>2009-05-01</p> <p>This article presents a methodology for assessing annual wet weather Suspended Solids (SS) and Chemical Oxygen Demand (COD) loads in combined sewers, along with the associated uncertainties, from continuous turbidity measurements. The proposed method is applied to data from various urban catchments in the cities of Paris and Nantes. The focus here concerns the impact of the number of rain events sampled for calibration (i.e. for establishing linear SS/turbidity or COD/turbidity relationships) on the uncertainty of annual pollutant load assessments. Two calculation methods are investigated, both of which rely on Monte Carlo simulations: random assignment of event-specific calibration relationships to each individual rain event, and the use of an overall relationship built from the entire available data set. Since results indicate a fairly low inter-event variability for calibration relationship parameters, an accurate assessment of pollutant loads can be derived, even when fewer than 10 events are sampled for calibration purposes. 
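The first calculation method, random assignment of event-specific calibration relationships, can be sketched as follows; the calibration lines, event turbidities and volumes are all hypothetical:

```python
# Sketch: annual SS load from turbidity with calibration uncertainty,
# propagated by randomly assigning one of several event-specific
# turbidity -> SS calibration lines to each rain event.
# All numbers are hypothetical, not the Paris/Nantes data.
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical event-specific calibration lines SS = a * turbidity + b
calibrations = [(1.10, 5.0), (0.95, 8.0), (1.05, 3.0), (1.00, 6.0)]
# Hypothetical per-event mean turbidity (NTU) and runoff volume (m3)
events = [(120.0, 5_000.0), (80.0, 12_000.0), (200.0, 3_000.0)]

loads = []
for _ in range(10_000):
    total = 0.0
    for turb, vol in events:
        a, b = calibrations[rng.integers(len(calibrations))]
        total += (a * turb + b) * vol / 1e6  # g/m3 * m3 -> tonnes
    loads.append(total)

loads = np.array(loads)
print(round(loads.mean(), 2), round(loads.std(), 3))
```

The spread of `loads` is the calibration-induced uncertainty of the annual load; the overall-relationship method would instead resample the parameters of a single regression.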
For operational applications, these results suggest that turbidity could provide a more precise evaluation of pollutant loads at lower cost than typical sampling methods.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010JHyd..387..176T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010JHyd..387..176T"><span>Assessment of initial soil moisture conditions for event-based rainfall-runoff modelling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tramblay, Yves; Bouvier, Christophe; Martin, Claude; Didon-Lescot, Jean-François; Todorovik, Dragana; Domergue, Jean-Marc</p> <p>2010-06-01</p> <p>Flash floods are the most destructive natural hazards occurring in the Mediterranean region. Rainfall-runoff models can be very useful for flash flood forecasting and prediction. Event-based models are very popular for operational purposes, but there is a need to reduce the uncertainties related to the estimation of initial moisture conditions prior to a flood event. This paper aims to compare several soil moisture indicators: local Time Domain Reflectometry (TDR) measurements of soil moisture, modelled soil moisture through the Interaction-Sol-Biosphère-Atmosphère (ISBA) component of the SIM model (Météo-France), antecedent precipitation and base flow. A modelling approach based on the Soil Conservation Service-Curve Number method (SCS-CN) is used to simulate the flood events in a small headwater catchment in the Cevennes region (France). The model involves two parameters: one for the runoff production, S, and one for the routing component, K. The S parameter can be interpreted as the maximal water retention capacity, and acts as the initial condition of the model, depending on the antecedent moisture conditions. 
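The runoff-production component, with S as the maximal retention, follows the standard SCS-CN form Q = (P − 0.2S)² / (P + 0.8S) for P > 0.2S; a minimal sketch:

```python
# Sketch: SCS-CN runoff production. S is the maximal water retention
# capacity; wetter antecedent conditions correspond to a smaller S.
def scs_cn_runoff(p, s, ia_ratio=0.2):
    """Event runoff depth Q (mm) from rainfall depth P (mm) and
    retention capacity S (mm), standard SCS-CN form with Ia = 0.2*S."""
    ia = ia_ratio * s  # initial abstraction
    if p <= ia:
        return 0.0
    return (p - ia) ** 2 / (p - ia + s)

# The same 100 mm storm under dry (large S) vs wet (small S) conditions
print(round(scs_cn_runoff(100.0, 150.0), 1))  # dry antecedent conditions
print(round(scs_cn_runoff(100.0, 40.0), 1))   # wet antecedent conditions
```

This is why predicting S from a moisture indicator such as deep-layer TDR amounts to setting the initial condition of the event model.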
The model was calibrated from a 20-flood sample, and led to a median Nash value of 0.9. The local TDR measurements in the deepest layers of soil (80-140 cm) were found to be the best predictors for the S parameter. TDR measurements averaged over the whole soil profile, outputs of the SIM model, and the logarithm of base flow also proved to be good predictors, whereas antecedent precipitation was found to be less efficient. The good correlations observed between the TDR predictors and the calibrated S values indicate that monitoring soil moisture could help set the initial conditions for simplified event-based models in small basins.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22333998','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22333998"><span>Design of the value of imaging in enhancing the wellness of your heart (VIEW) trial and the impact of uncertainty on power.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ambrosius, Walter T; Polonsky, Tamar S; Greenland, Philip; Goff, David C; Perdue, Letitia H; Fortmann, Stephen P; Margolis, Karen L; Pajewski, Nicholas M</p> <p>2012-04-01</p> <p>Although observational evidence has suggested that the measurement of coronary artery calcium (CAC) may improve risk stratification for cardiovascular events and thus help guide the use of lipid-lowering therapy, this contention has not been evaluated within the context of a randomized trial. The Value of Imaging in Enhancing the Wellness of Your Heart (VIEW) trial is proposed as a randomized study in participants at low intermediate risk of future coronary heart disease (CHD) events to evaluate whether CAC testing leads to improved patient outcomes. 
To describe the challenges encountered in designing a prototypical screening trial and to examine the impact of uncertainty on power. The VIEW trial was designed as an effectiveness clinical trial to examine the benefit of CAC testing to guide therapy on a primary outcome consisting of a composite of nonfatal myocardial infarction, probable or definite angina with revascularization, resuscitated cardiac arrest, nonfatal stroke (not transient ischemic attack (TIA)), CHD death, stroke death, other atherosclerotic death, or other cardiovascular disease (CVD) death. Many critical choices were faced in designing the trial, including (1) the choice of primary outcome, (2) the choice of therapy, (3) the target population with corresponding ethical issues, (4) specifications of assumptions for sample size calculations, and (5) impact of uncertainty in these assumptions on power/sample size determination. We have proposed a sample size of 30,000 (800 events), which provides 92.7% power. Alternatively, sample sizes of 20,228 (539 events), 23,138 (617 events), and 27,078 (722 events) provide 80%, 85%, and 90% power. We have also allowed for uncertainty in our assumptions by computing average power integrated over specified prior distributions. This relaxation of specificity indicates a reduction in power, dropping to 89.9% (95% confidence interval (CI): 89.8-89.9) for a sample size of 30,000. Sample sizes of 20,228, 23,138, and 27,078 provide power of 78.0% (77.9-78.0), 82.5% (82.5-82.6), and 87.2% (87.2-87.3), respectively. These power estimates depend on the form and parameters of the prior distributions. Despite the pressing need for a randomized trial to evaluate the utility of CAC testing, conduct of such a trial requires recruiting a large patient population, making efficiency of critical importance. The large sample size is primarily due to targeting a study population at relatively low risk of a CVD event. 
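Averaging power over a prior can be sketched with the Schoenfeld approximation for an event-driven comparison; the event count, hazard ratio and prior below are illustrative, not the VIEW assumptions:

```python
# Sketch: fixed-assumption power vs. power averaged over a prior on the
# effect size, using the Schoenfeld approximation for an event-driven
# trial with 1:1 allocation. Numbers are illustrative, not VIEW inputs.
import math
import random

def power(d_events, hazard_ratio, alpha=0.05):
    """Approximate power of a two-sided level-alpha log-rank test."""
    z = 1.959963984540054  # z_{0.975} for alpha = 0.05
    shift = math.sqrt(d_events) / 2 * abs(math.log(hazard_ratio))
    return 0.5 * math.erfc((z - shift) / math.sqrt(2))  # Phi(shift - z)

d = 800
point = power(d, 0.80)  # power at the point-estimate hazard ratio
print(round(point, 3))

# Average power over a lognormal prior on the hazard ratio
random.seed(0)
draws = [power(d, math.exp(random.gauss(math.log(0.80), 0.05)))
         for _ in range(50_000)]
avg = sum(draws) / len(draws)
print(round(avg, 3))
```

The prior-averaged power comes out below the fixed-assumption power, the same qualitative effect the abstract reports.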
Our calculations also illustrate the importance of formally considering uncertainty in power calculations of large trials, as standard power calculations may tend to overestimate power.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4475283','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4475283"><span>Design of the Value of Imaging in Enhancing the Wellness of Your Heart (VIEW) Trial and the Impact of Uncertainty on Power</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Ambrosius, Walter T.; Polonsky, Tamar S.; Greenland, Philip; Goff, David C.; Perdue, Letitia H.; Fortmann, Stephen P.; Margolis, Karen L.; Pajewski, Nicholas M.</p> <p>2014-01-01</p> <p>Background Although observational evidence has suggested that the measurement of coronary artery calcium (CAC) may improve risk stratification for cardiovascular events and thus help guide the use of lipid-lowering therapy, this contention has not been evaluated within the context of a randomized trial. The Value of Imaging in Enhancing the Wellness of Your Heart (VIEW) trial is proposed as a randomized study in participants at low intermediate risk of future coronary heart disease (CHD) events to evaluate whether CAC testing leads to improved patient outcomes. Purpose To describe the challenges encountered in designing a prototypical screening trial and to examine the impact of uncertainty on power. 
Methods The VIEW trial was designed as an effectiveness clinical trial to examine the benefit of CAC testing to guide therapy on a primary outcome consisting of a composite of non-fatal myocardial infarction, probable or definite angina with revascularization, resuscitated cardiac arrest, non-fatal stroke (not transient ischemic attack (TIA)), CHD death, stroke death, other atherosclerotic death, or other cardiovascular disease (CVD) death. Many critical choices were faced in designing the trial, including: (1) the choice of primary outcome, (2) the choice of therapy, (3) the target population with corresponding ethical issues, (4) specifications of assumptions for sample size calculations, and (5) impact of uncertainty in these assumptions on power/sample size determination. Results We have proposed a sample size of 30,000 (800 events) which provides 92.7% power. Alternatively, sample sizes of 20,228 (539 events), 23,138 (617 events) and 27,078 (722 events) provide 80%, 85%, and 90% power. We have also allowed for uncertainty in our assumptions by computing average power integrated over specified prior distributions. This relaxation of specificity indicates a reduction in power, dropping to 89.9% (95% confidence interval (CI): 89.8 to 89.9) for a sample size of 30,000. Sample sizes of 20,228, 23,138, and 27,078 provide power of 78.0% (77.9 to 78.0), 82.5% (82.5 to 82.6), and 87.2% (87.2 to 87.3), respectively. Limitations These power estimates are dependent on form and parameters of the prior distributions. Conclusions Despite the pressing need for a randomized trial to evaluate the utility of CAC testing, conduct of such a trial requires recruiting a large patient population, making efficiency of critical importance. The large sample size is primarily due to targeting a study population at relatively low risk of a CVD event. 
Our calculations also illustrate the importance of formally considering uncertainty in power calculations of large trials as standard power calculations may tend to overestimate power. PMID:22333998</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010EGUGA..12.8447B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010EGUGA..12.8447B"><span>Integration of expert knowledge and uncertainty in natural risk assessment</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Baruffini, Mirko; Jaboyedoff, Michel</p> <p>2010-05-01</p> <p>Natural hazards occurring in alpine regions during the last decades have clearly shown that interruptions of the Swiss railway power supply and closures of the Gotthard highway due to such events have increased awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge, experience, or engineering judgment can be exploited to estimate risk qualitatively.
To overcome this lack of statistics, we used models based on expert knowledge to make qualitative predictions from linguistic appraisals, which are more expressive and natural for risk assessment. Fuzzy reasoning (FR) provides a mechanism for computing with words (Zadeh, 1965), modelling the qualitative human thought processes involved in analyzing complex systems and decisions. Uncertainty in predicting risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probabilities based on a triangular probability density function (T-PDF), which can follow the same flow-chart as FR. We implemented the Swiss natural hazard recommendations with both FR and T-PDF-based probability in order to obtain hazard zoning and its uncertainties, and followed the same approach for each term of risk, i.e. hazard, vulnerability, elements at risk, and exposure. This risk approach can be achieved through a comprehensive use of several artificial intelligence (AI) technologies, for example: (1) GIS techniques; (2) FR or T-PDF for qualitatively predicting risks; and (3) a multi-criteria evaluation for analyzing weak points. The main advantages of FR and T-PDF include the ability to express not-fully-formalized knowledge, easy knowledge representation and acquisition, and self-updatability. The results show that such an approach points out a quite wide zone of uncertainty. REFERENCES Zadeh L.A. 1965: Fuzzy Sets.
Information and Control, 8:338-353.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_17 --> <div id="page_18" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="341"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/984645','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/984645"><span>Measurement of the $B^-$ lifetime using a simulation free approach for trigger bias correction</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Aaltonen, T.; /Helsinki Inst.
of Phys.; Adelman, J.</p> <p>2010-04-01</p> <p>The collection of a large number of B hadron decays to hadronic final states at the CDF II detector is possible due to the presence of a trigger that selects events based on track impact parameters. However, the nature of the selection requirements of the trigger introduces a large bias in the observed proper decay time distribution. A lifetime measurement must correct for this bias and the conventional approach has been to use a Monte Carlo simulation. The leading sources of systematic uncertainty in the conventional approach are due to differences between the data and the Monte Carlo simulation. In this paper they present an analytic method for bias correction without using simulation, thereby removing any uncertainty between data and simulation. This method is presented in the form of a measurement of the lifetime of the B{sup -} using the mode B{sup -} {yields} D{sup 0}{pi}{sup -}. The B{sup -} lifetime is measured as {tau}{sub B{sup -}} = 1.663 {+-} 0.023 {+-} 0.015 ps, where the first uncertainty is statistical and the second systematic. This new method results in a smaller systematic uncertainty in comparison to methods that use simulation to correct for the trigger bias.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFM.A21F0211S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFM.A21F0211S"><span>Exploring regional stakeholder needs and requirements in terms of Extreme Weather Event Attribution</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Schwab, M.; Meinke, I.; Vanderlinden, J. P.; Touili, N.; Von Storch, H.</p> <p>2015-12-01</p> <p>Extreme event attribution has increasingly received attention in the scientific community.
It may also serve decision-making at the regional level, where much of the climate change impact mitigation takes place. Nevertheless, little is known to date about the requirements of regional actors in terms of extreme event attribution. We have therefore analysed these using the example of regional decision-makers for climate change-related activities and/or concerned with storm surge risks at the German Baltic Sea and heat wave risks in the Greater Paris area. In order to explore whether stakeholders find scientific knowledge from extreme event attribution useful and how this information might be relevant to their decision-making, we consulted a diverse set of actors engaged in the assessment, mitigation and communication of storm surge, heat wave, and climate change-related risks. Extreme event attribution knowledge was perceived to be most useful for public and political awareness-raising, but was of little or no relevance to the consulted stakeholders themselves. It was not acknowledged that it would support adaptation planning, as sometimes argued in the literature. The consulted coastal protection, health, and urban adaptation planners needed reliable statements about possible future changes in extreme events rather than causal statements about past events. To enhance salience, a suitable product of event attribution should be linked to regional problems, vulnerabilities, and impacts of climate change. Given that the tolerance for uncertainty is rather low, most of the stakeholders also held that a suitable product of event attribution should come from a trusted "honest broker" and be published later with smaller uncertainties rather than sooner with larger ones. Institutional mechanisms, like regional climate services, that enable and foster communication, translation and mediation across the boundaries between knowledge and action can help fulfill such requirements.
This is of particular importance for extreme event attribution which is often understood as science producing complex and abstract information attached to large uncertainties. They can serve as an interface for creating the necessary mutual understanding by being in a continuous dialogue with both science and stakeholders.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.fs.usda.gov/treesearch/pubs/53301','TREESEARCH'); return false;" href="https://www.fs.usda.gov/treesearch/pubs/53301"><span>Uncertainty in natural hazards, modeling and decision support: An introduction to this volume [Chapter 1</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.fs.usda.gov/treesearch/">Treesearch</a></p> <p>Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde</p> <p>2017-01-01</p> <p>Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. 
Modeling efforts are challenged by the intrinsic...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=Panic+AND+Disorder&pg=6&id=EJ855514','ERIC'); return false;" href="https://eric.ed.gov/?q=Panic+AND+Disorder&pg=6&id=EJ855514"><span>Preliminary Investigation of Intolerance of Uncertainty Treatment for Anxiety Disorders</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Hewitt, Sarah N.; Egan, Sarah; Rees, Clare</p> <p>2009-01-01</p> <p>Intolerance of uncertainty (IU) is the tendency to react negatively to uncertain situations or events, and it has been found to be an important maintaining factor in a number of different anxiety disorders. It is often included as a part of cognitive behavioural interventions for anxiety disorders but its specific contribution to treatment outcome…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=338490&Lab=NHEERL&keyword=pesticides+AND+human+AND+health&actType=&TIMSType=+&TIMSSubTypeID=&DEID=&epaNumber=&ntisID=&archiveStatus=Both&ombCat=Any&dateBeginCreated=&dateEndCreated=&dateBeginPublishedPresented=&dateEndPublishedPresented=&dateBeginUpdated=&dateEndUpdated=&dateBeginCompleted=&dateEndCompleted=&personID=&role=Any&journalID=&publisherID=&sortBy=revisionDate&count=50','EPA-EIMS'); return false;" 
href="https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=338490&Lab=NHEERL&keyword=pesticides+AND+human+AND+health&actType=&TIMSType=+&TIMSSubTypeID=&DEID=&epaNumber=&ntisID=&archiveStatus=Both&ombCat=Any&dateBeginCreated=&dateEndCreated=&dateBeginPublishedPresented=&dateEndPublishedPresented=&dateBeginUpdated=&dateEndUpdated=&dateBeginCompleted=&dateEndCompleted=&personID=&role=Any&journalID=&publisherID=&sortBy=revisionDate&count=50"><span>Quantification of the uncertainties in extrapolating from in vitro androgen receptor (AR) antagonism to key events in in vivo screening assays and adverse reproductive outcomes in F1 male rats</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://oaspub.epa.gov/eims/query.page">EPA Science Inventory</a></p> <p></p> <p></p> <p>There are multiple molecular initiating events (MIEs) that can disrupt male sexual differentiation including AR antagonism and inhibition of synthesis, and metabolism of fetal testosterone. Disruption of this event by pesticides like vinclozolin that act as AR antagonists and ph...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFM.S11H..04M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFM.S11H..04M"><span>Assessing the Uncertainties on Seismic Source Parameters: Towards Realistic Estimates of Moment Tensor Determinations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Magnoni, F.; Scognamiglio, L.; Tinti, E.; Casarotti, E.</p> <p>2014-12-01</p> <p>Seismic moment tensor is one of the most important source parameters defining the earthquake dimension and style of the activated fault. 
Moment tensor catalogues are routinely used by geoscientists; however, few attempts have been made to assess the possible impact of moment-magnitude uncertainties on the resulting analyses. The 2012 May 20 Emilia mainshock is a representative event, since the literature reports moment magnitude (Mw) values for it spanning 5.63 to 6.12. An uncertainty of ~0.5 magnitude units leaves the real size of the event in question. The uncertainty associated with this estimate could be critical for the inference of other seismological parameters, suggesting caution in seismic hazard assessment, Coulomb stress transfer determination, and other analyses where self-consistency is important. In this work, we focus on the variability of the moment tensor solution, highlighting the effect of four different velocity models, different types and ranges of filtering, and two different methodologies. Using a larger dataset, to better quantify the source-parameter uncertainty, we also analyze the variability of the moment tensor solutions depending on the number, epicentral distance, and azimuth of the stations used.
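The quoted magnitude spread corresponds to a large factor in seismic moment, which a quick check makes concrete. This sketch uses the standard moment-magnitude relation Mw = (2/3)(log10 M0 − 9.1) with M0 in N·m; the conversion is an assumption of this illustration, not a detail taken from the paper.

```python
def moment_from_mw(mw):
    """Seismic moment M0 in N*m from moment magnitude Mw,
    inverting Mw = (2/3) * (log10(M0) - 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

# Literature Mw range for the 2012 Emilia mainshock quoted in the abstract.
m_low = moment_from_mw(5.63)
m_high = moment_from_mw(6.12)

print(f"M0 range: {m_low:.2e} to {m_high:.2e} N*m")
print(f"ratio: {m_high / m_low:.1f}x")
```

A 0.49-unit spread in Mw is thus roughly a factor of five in seismic moment, which is why downstream quantities such as Coulomb stress changes inherit a substantial uncertainty.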
We stress that the estimate of seismic moment from moment tensor solutions, like the estimates of the other kinematic source parameters, cannot be considered an absolute value and should be reported together with its uncertainties, within a reproducible framework characterized by disclosed assumptions and explicit processing workflows.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=1839568','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=1839568"><span>Situational Awareness During Mass-Casualty Events: Command and Control</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Demchak, Barry; Chan, Theordore C.; Griswold, William G.; Lenert, Leslie</p> <p>2006-01-01</p> <p>In existing Incident Command systems, situational awareness is achieved manually through paper tracking systems. Such systems often produce high latencies and incomplete data, resulting in inefficient and ineffective resource deployment. The WIISARD system collects much more data than a paper-based system, dramatically reducing latency while increasing the kinds and quality of information available to Incident Commanders. The WIISARD Command Center solves the problem of data overload and uncertainty through the careful use of limited screen area and novel visualization techniques.
PMID:17238524</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015SPIE.9815E..1BZ','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015SPIE.9815E..1BZ"><span>A risk-based coverage model for video surveillance camera control optimization</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zhang, Hongzhou; Du, Zhiguo; Zhao, Xingtao; Li, Peiyue; Li, Dehua</p> <p>2015-12-01</p> <p>Visual surveillance systems for law enforcement or police case investigation differ from traditional applications, as they are designed to monitor pedestrians, vehicles, or potential accidents. In the present work, visual surveillance risk is defined as the uncertainty of the visual information about the targets and events monitored, and risk entropy is introduced to model the requirements of police surveillance tasks on the quality and quantity of video information. The proposed coverage model is applied to calculate the preset field-of-view (FoV) positions of PTZ cameras.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5102323','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5102323"><span>Rapid classification of hippocampal replay content for real-time applications</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Liu, Daniel F.; Karlsson, Mattias P.; Frank, Loren M.; Eden, Uri T.</p> <p>2016-01-01</p> <p>Sharp-wave ripple (SWR) events in the hippocampus replay millisecond-timescale patterns of place cell activity related to the past experience of an animal.
Interrupting SWR events leads to learning and memory impairments, but how the specific patterns of place cell spiking seen during SWRs contribute to learning and memory remains unclear. A deeper understanding of this issue will require the ability to manipulate SWR events based on their content. Accurate real-time decoding of SWR replay events requires new algorithms that are able to estimate replay content and the associated uncertainty, along with software and hardware that can execute these algorithms for biological interventions on a millisecond timescale. Here we develop an efficient estimation algorithm to categorize the content of replay from multiunit spiking activity. Specifically, we apply real-time decoding methods to each SWR event and then compute the posterior probability of the replay feature. We illustrate this approach by classifying SWR events from data recorded in the hippocampus of a rat performing a spatial memory task into four categories: whether they represent outbound or inbound trajectories and whether the activity is replayed forward or backward in time. We show that our algorithm can classify the majority of SWR events in a recording epoch within 20 ms of the replay onset with high certainty, which makes the algorithm suitable for a real-time implementation with short latencies to incorporate into content-based feedback experiments. 
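The posterior-classification step can be illustrated with a toy model. This sketch assumes independent Poisson spike counts per unit and hypothetical rate templates for the four replay categories; it is a simplified stand-in for the authors' clusterless real-time decoder, not their actual algorithm.

```python
import numpy as np

def classify_replay(counts, rates, prior=None):
    """Posterior over replay categories from one SWR event's spike counts.

    counts : (n_units,) observed spike counts during the event
    rates  : (n_categories, n_units) expected counts per category (all > 0)
    """
    n_cat = rates.shape[0]
    prior = np.full(n_cat, 1.0 / n_cat) if prior is None else prior
    # Poisson log-likelihood per category (the log(counts!) term cancels
    # in the normalization, so it is omitted).
    loglik = (counts * np.log(rates) - rates).sum(axis=1)
    logpost = np.log(prior) + loglik
    logpost -= logpost.max()            # subtract max for numerical stability
    post = np.exp(logpost)
    return post / post.sum()

# Toy example: 4 categories (outbound/inbound x forward/backward),
# 6 units, with hypothetical firing-rate templates.
rates = np.array([[5, 1, 1, 4, 1, 1],
                  [1, 5, 4, 1, 1, 1],
                  [1, 1, 1, 1, 5, 4],
                  [4, 1, 1, 1, 1, 5]], dtype=float)
counts = np.array([6, 0, 1, 5, 0, 1], dtype=float)

post = classify_replay(counts, rates)
print(post.round(3), "-> best category:", int(post.argmax()))
```

In a real-time setting this computation is cheap enough to run within the ~20 ms budget the abstract describes, since it is a handful of vectorized operations per event.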
PMID:27535369</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/19113769','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/19113769"><span>Measurement of the branching fractions of B-->D**(l) nu(l) decays in events tagged by a fully reconstructed B meson.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Aubert, B; Bona, M; Karyotakis, Y; Lees, J P; Poireau, V; Prencipe, E; Prudent, X; Tisserand, V; Garra Tico, J; Grauges, E; Lopez, L; Palano, A; Pappagallo, M; Eigen, G; Stugu, B; Sun, L; Abrams, G S; Battaglia, M; Brown, D N; Cahn, R N; Jacobsen, R G; Kerth, L T; Kolomensky, Yu G; Lynch, G; Osipenkov, I L; Ronan, M T; Tackmann, K; Tanabe, T; Hawkes, C M; Soni, N; Watson, A T; Koch, H; Schroeder, T; Walker, D; Asgeirsson, D J; Fulsom, B G; Hearty, C; Mattison, T S; Mckenna, J A; Barrett, M; Khan, A; Blinov, V E; Bukin, A D; Buzykaev, A R; Druzhinin, V P; Golubev, V B; Onuchin, A P; Serednyakov, S I; Skovpen, Yu I; Solodov, E P; Todyshev, K Yu; Bondioli, M; Curry, S; Eschrich, I; Kirkby, D; Lankford, A J; Lund, P; Mandelkern, M; Martin, E C; Stoker, D P; Abachi, S; Buchanan, C; Gary, J W; Liu, F; Long, O; Shen, B C; Vitug, G M; Yasin, Z; Zhang, L; Sharma, V; Campagnari, C; Hong, T M; Kovalskyi, D; Mazur, M A; Richman, J D; Beck, T W; Eisner, A M; Flacco, C J; Heusch, C A; Kroseberg, J; Lockman, W S; Schalk, T; Schumm, B A; Seiden, A; Wang, L; Wilson, M G; Winstrom, L O; Cheng, C H; Doll, D A; Echenard, B; Fang, F; Hitlin, D G; Narsky, I; Piatenko, T; Porter, F C; Andreassen, R; Mancinelli, G; Meadows, B T; Mishra, K; Sokoloff, M D; Bloom, P C; Ford, W T; Gaz, A; Hirschauer, J F; Nagel, M; Nauenberg, U; Smith, J G; Ulmer, K A; Wagner, S R; Ayad, R; Soffer, A; Toki, W H; Wilson, R J; Altenburg, D D; Feltresi, E; Hauke, A; Jasper, H; Karbach, M; Merkel, J; Petzold, A; Spaan, B; 
Wacker, K; Kobel, M J; Mader, W F; Nogowski, R; Schubert, K R; Schwierz, R; Sundermann, J E; Volk, A; Bernard, D; Bonneaud, G R; Latour, E; Thiebaux, Ch; Verderi, M; Clark, P J; Gradl, W; Playfer, S; Watson, J E; Andreotti, M; Bettoni, D; Bozzi, C; Calabrese, R; Cecchi, A; Cibinetto, G; Franchini, P; Luppi, E; Negrini, M; Petrella, A; Piemontese, L; Santoro, V; Baldini-Ferroli, R; Calcaterra, A; de Sangro, R; Finocchiaro, G; Pacetti, S; Patteri, P; Peruzzi, I M; Piccolo, M; Rama, M; Zallo, A; Buzzo, A; Contri, R; Lo Vetere, M; Macri, M M; Monge, M R; Passaggio, S; Patrignani, C; Robutti, E; Santroni, A; Tosi, S; Chaisanguanthum, K S; Morii, M; Marks, J; Schenk, S; Uwer, U; Klose, V; Lacker, H M; Bard, D J; Dauncey, P D; Nash, J A; Panduro Vazquez, W; Tibbetts, M; Behera, P K; Chai, X; Charles, M J; Mallik, U; Cochran, J; Crawley, H B; Dong, L; Meyer, W T; Prell, S; Rosenberg, E I; Rubin, A E; Gao, Y Y; Gritsan, A V; Guo, Z J; Lae, C K; Denig, A G; Fritsch, M; Schott, G; Arnaud, N; Béquilleux, J; D'Orazio, A; Davier, M; Firmino da Costa, J; Grosdidier, G; Höcker, A; Lepeltier, V; Le Diberder, F; Lutz, A M; Pruvot, S; Roudeau, P; Schune, M H; Serrano, J; Sordini, V; Stocchi, A; Wormser, G; Lange, D J; Wright, D M; Bingham, I; Burke, J P; Chavez, C A; Fry, J R; Gabathuler, E; Gamet, R; Hutchcroft, D E; Payne, D J; Touramanis, C; Bevan, A J; Clarke, C K; George, K A; Di Lodovico, F; Sacco, R; Sigamani, M; Cowan, G; Flaecher, H U; Hopkins, D A; Paramesvaran, S; Salvatore, F; Wren, A C; Brown, D N; Davis, C L; Alwyn, K E; Bailey, D; Barlow, R J; Chia, Y M; Edgar, C L; Jackson, G; Lafferty, G D; West, T J; Yi, J I; Anderson, J; Chen, C; Jawahery, A; Roberts, D A; Simi, G; Tuggle, J M; Dallapiccola, C; Li, X; Salvati, E; Saremi, S; Cowan, R; Dujmic, D; Fisher, P H; Koeneke, K; Sciolla, G; Spitznagel, M; Taylor, F; Yamamoto, R K; Zhao, M; Patel, P M; Robertson, S H; Lazzaro, A; Lombardo, V; Palombo, E; Bauer, J M; Cremaldi, L; Eschenburg, V; Godang, R; Kroeger, R; Sanders, 
D A; Summers, D J; Zhao, H W; Simard, M; Taras, P; Viaud, F B; Nicholson, H; De Nardo, G; Lista, L; Monorchio, D; Onorato, G; Sciacca, C; Raven, G; Snoek, H L; Jessop, C P; Knoepfel, K J; Lo Secco, J M; Wang, W F; Benelli, G; Corwin, L A; Honscheid, K; Kagan, H; Kass, R; Morris, J P; Rahimi, A M; Regensburger, J J; Sekula, S J; Wong, Q K; Blount, N L; Brau, J; Frey, R; Igonkina, O; Kolb, J A; Lu, M; Rahmat, R; Sinev, N B; Strom, D; Strube, J; Torrence, E; Castelli, G; Gagliardi, N; Margoni, M; Morandin, M; Posocco, M; Rotondo, M; Simonetto, F; Stroili, R; Voci, C; del Amo Sanchez, P; Ben-Haim, E; Briand, H; Calderini, G; Chauveau, J; David, P; Del Buono, L; Hamon, O; Leruste, Ph; Ocariz, J; Perez, A; Prendki, J; Sitt, S; Gladney, L; Biasini, M; Covarelli, R; Manoni, E; Angelini, C; Batignani, G; Bettarini, S; Carpinelli, M; Cervelli, A; Forti, E; Giorgi, M A; Lusiani, A; Marchiori, G; Morganti, M; Neri, N; Paoloni, E; Rizzo, G; Walsh, J J; Lopes Pegna, D; Lu, C; Olsen, J; Smith, A J S; Telnov, A V; Anulli, F; Baracchini, E; Cavoto, G; del Re, D; Di Marco, E; Faccini, R; Ferrarotto, F; Ferroni, F; Gaspero, M; Jackson, P D; Li Gioi, L; Mazzoni, M A; Morganti, S; Piredda, G; Polci, F; Renga, F; Voena, C; Ebert, M; Hartmann, T; Schröder, H; Waldi, R; Adye, T; Franek, B; Olaiya, E O; Wilson, F F; Emery, S; Escalier, M; Esteve, L; Ganzhur, S F; Hamel de Monchenault, G; Kozanecki, W; Vasseur, G; Yèche, Ch; Zito, M; Chen, X R; Liu, H; Park, W; Purohit, M V; White, R M; Wilson, J R; Allen, M T; Aston, D; Bartoldus, R; Bechtle, P; Benitez, J F; Cenci, R; Coleman, J P; Convery, M R; Dingfelder, J C; Dorfan, J; Dubois-Felsmann, G P; Dunwoodie, W; Field, R C; Gabareen, A M; Gowdy, S J; Graham, M T; Grenier, P; Hast, C; Innes, W R; Kaminski, J; Kelsey, M H; Kim, H; Kim, P; Kocian, M L; Leith, D W G S; Li, S; Lindquist, B; Luitz, S; Luth, V; Lynch, H L; MacFarlane, D B; Marsiske, H; Messner, R; Muller, D R; Neal, H; Nelson, S; O'Grady, C P; Ofte, I; Perazzo, A; Perl, M; Ratcliff, 
B N; Roodman, A; Salnikov, A A; Schindler, R H; Schwiening, J; Snyder, A; Su, D; sullivan, M K; Suzuki, K; Swain, S K; Thompson, J M; Va'vra, J; Wagner, A P; Weaver, M; West, C A; Wisniewski, W J; Wittgen, M; Wright, D H; Wulsin, H W; Yarritu, A K; Yi, K; Young, C C; Ziegler, V; Burchat, P R; Edwards, A J; Majewski, S A; Miyashita, T S; Petersen, B A; Wilden, L; Ahmed, S; Alam, M S; Ernst, J A; Pan, B; Saeed, M A; Zain, S B; Spanier, S M; Wogsland, B J; Eckmann, R; Ritchie, J L; Ruland, A M; Schilling, C J; Schwitters, R F; Drummond, B W; Izen, J M; Lou, X C; Bianchi, F; Gamba, D; Pelliccioni, M; Bomben, M; Bosisio, L; Cartaro, C; Della Ricca, G; Lanceri, L; Vitale, L; Azzolini, V; Lopez-March, N; Martinez-Vidal, F; Milanes, D A; Oyanguren, A; Albert, J; Banerjee, Sw; Bhuyan, B; Choi, H H F; Hamano, K; Kowalewski, R; Lewczuk, M J; Nugent, I M; Roney, J M; Sobie, R J; Gershon, T J; Harrison, P F; Ilic, J; Latham, T E; Mohanty, G B; Band, H R; Chen, X; Dasu, S; Flood, K T; Pan, Y; Pierini, M; Prepost, R; Vuosalo, C O; Wu, S L</p> <p>2008-12-31</p> <p>We report a measurement of the branching fractions of B-->D**(l) nu(l), decays based on 417 fb(-1) of data collected at the Y(4S) resonance with the BABAR detector at the PEP-II e+e- storage rings. Events are selected by full reconstructing one of the B mesons in a hadronic decay mode. A fit to the invariant mass differences m(D(*) pi)- m(D(*)) is performed to extract the signal yields of the different D** states. 
We observe the B-->D**(l) nu(l) decay modes corresponding to the four D** states predicted by heavy quark symmetry with a significance greater than 5 standard deviations including systematic uncertainties.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24029703','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24029703"><span>Modelling the expected rate of laboratory biosafety breakdowns involving rinderpest virus in the post-eradication era.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Beauvais, W; Fournié, G; Jones, B A; Cameron, A; Njeumi, F; Lubroth, J; Pfeiffer, D U</p> <p>2013-11-01</p> <p>Now that we are in the rinderpest post-eradication era, attention is focused on the risk of re-introduction. A semi-quantitative risk assessment identified accidental use of rinderpest virus in laboratories as the most likely cause of re-introduction. However, there is little data available on the rates of laboratory biosafety breakdowns in general. In addition, any predictions based on past events are subject to various uncertainties. The aims of this study were therefore to investigate the potential usefulness of historical data for predicting the future risk of rinderpest release via laboratory biosafety breakdowns, and to investigate the impacts of the various uncertainties on these predictions. Data were collected using a worldwide online survey of laboratories, a structured search of ProMED reports and discussion with experts. A stochastic model was constructed to predict the number of laboratory biosafety breakdowns involving rinderpest that will occur over the next 10 years, based on: (1) the historical rate of biosafety breakdowns; and (2) the change in the number of laboratories that will have rinderpest virus in the next 10 years compared to historically.
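A stochastic projection of this kind can be sketched as a small Monte Carlo model. The Gamma-Poisson form, the vague prior, and the parameter values below are assumptions chosen for illustration (using the abstract's zero-breakdown 2000-2010 baseline), not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(1)

def breakdown_risk(k_found, years_hist, sensitivity, lab_reduction,
                   horizon=10, n_sim=200_000):
    """Monte Carlo sketch of a biosafety-breakdown projection.

    k_found       : breakdowns identified by the historical search
    sensitivity   : assumed probability the search found any given event
    lab_reduction : assumed fractional drop in labs holding the virus
    Returns P(at least one breakdown within `horizon` years).
    """
    # Correct the observed count for imperfect search sensitivity,
    # then draw the historical rate from a Gamma posterior under a
    # vague (Jeffreys-style) prior -- a deliberate simplification.
    k_true = k_found / sensitivity
    rate_hist = rng.gamma(k_true + 0.5, 1.0 / years_hist, n_sim)
    rate_future = rate_hist * (1.0 - lab_reduction)
    future_events = rng.poisson(rate_future * horizon)
    return float((future_events >= 1).mean())

# Zero breakdowns found during 2000-2010; 60% search sensitivity and a
# 40% reduction in laboratory activity, figures quoted in the abstract.
p = breakdown_risk(k_found=0, years_hist=10, sensitivity=0.6,
                   lab_reduction=0.4)
print(f"P(>=1 breakdown in 10 y) ~ {p:.2f}")
```

Even with a zero-event baseline, the posterior leaves a non-negligible probability of at least one future breakdown, consistent with the abstract's point that the most likely count is zero while the risk still exceeds 1 in 10,000.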
The search identified five breakdowns, all of which occurred during 1970-2000 and all of which were identified via discussions with experts. Assuming that our search for historical events had a sensitivity of over 60% and there has been at least a 40% reduction in the underlying risk (attributable to decreased laboratory activity post eradication) the most likely number of biosafety events worldwide was estimated to be zero over a 10 year period. However, the risk of at least one biosafety breakdown remains greater than 1 in 10,000 unless the sensitivity was at least 99% or the number of laboratories has decreased by at least 99% (based on 2000-2010 during which there were no biosafety breakdowns). Copyright © 2013 Elsevier B.V. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EPJC...73.2304A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EPJC...73.2304A"><span>Jet energy measurement with the ATLAS detector in proton-proton collisions at √{s}=7 TeV</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Aad, G.; Abbott, B.; Abdallah, J.; Abdelalim, A. A.; Abdesselam, A.; Abdinov, O.; Abi, B.; Abolins, M.; Abramowicz, H.; Abreu, H.; Acerbi, E.; Acharya, B. S.; Adams, D. L.; Addy, T. N.; Adelman, J.; Aderholz, M.; Adomeit, S.; Adragna, P.; Adye, T.; Aefsky, S.; Aguilar-Saavedra, J. A.; Aharrouche, M.; Ahlen, S. P.; Ahles, F.; Ahmad, A.; Ahsan, M.; Aielli, G.; Akdogan, T.; Åkesson, T. P. A.; Akimoto, G.; Akimov, A. V.; Akiyama, A.; Aktas, A.; Alam, M. S.; Alam, M. A.; Albert, J.; Albrand, S.; Aleksa, M.; Aleksandrov, I. N.; Alessandria, F.; Alexa, C.; Alexander, G.; Alexandre, G.; Alexopoulos, T.; Alhroob, M.; Aliev, M.; Alimonti, G.; Alison, J.; Aliyev, M.; Allport, P. P.; Allwood-Spiers, S. 
E.; Almond, J.; Aloisio, A.; Alon, R.; Alonso, A.; Alviggi, M. G.; Amako, K.; Amaral, P.; Amelung, C.; Ammosov, V. V.; Amorim, A.; Amorós, G.; Amram, N.; Anastopoulos, C.; Ancu, L. S.; Andari, N.; Andeen, T.; Anders, C. F.; Anders, G.; Anderson, K. J.; Andreazza, A.; Andrei, V.; Andrieux, M.-L.; Anduaga, X. S.; Angerami, A.; Anghinolfi, F.; Anjos, N.; Annovi, A.; Antonaki, A.; Antonelli, M.; Antonov, A.; Antos, J.; Anulli, F.; Aoun, S.; Aperio Bella, L.; Apolle, R.; Arabidze, G.; Aracena, I.; Arai, Y.; Arce, A. T. H.; Archambault, J. P.; Arfaoui, S.; Arguin, J.-F.; Arik, E.; Arik, M.; Armbruster, A. J.; Arnaez, O.; Arnault, C.; Artamonov, A.; Artoni, G.; Arutinov, D.; Asai, S.; Asfandiyarov, R.; Ask, S.; Åsman, B.; Asner, D.; Asquith, L.; Assamagan, K.; Astbury, A.; Astvatsatourov, A.; Atoian, G.; Aubert, B.; Auge, E.; Augsten, K.; Aurousseau, M.; Austin, N.; Avolio, G.; Avramidou, R.; Axen, D.; Ay, C.; Azuelos, G.; Azuma, Y.; Baak, M. A.; Baccaglioni, G.; Bacci, C.; Bach, A. M.; Bachacou, H.; Bachas, K.; Bachy, G.; Backes, M.; Backhaus, M.; Badescu, E.; Bagnaia, P.; Bahinipati, S.; Bai, Y.; Bailey, D. C.; Bain, T.; Baines, J. T.; Baker, O. K.; Baker, M. D.; Baker, S.; Banas, E.; Banerjee, P.; Banerjee, Sw.; Banfi, D.; Bangert, A.; Bansal, V.; Bansil, H. S.; Barak, L.; Baranov, S. P.; Barashkou, A.; Barbaro Galtieri, A.; Barber, T.; Barberio, E. L.; Barberis, D.; Barbero, M.; Bardin, D. Y.; Barillari, T.; Barisonzi, M.; Barklow, T.; Barlow, N.; Barnett, B. M.; Barnett, R. M.; Baroncelli, A.; Barone, G.; Barr, A. J.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Barrillon, P.; Bartoldus, R.; Barton, A. E.; Bartsch, D.; Bartsch, V.; Bates, R. L.; Batkova, L.; Batley, J. R.; Battaglia, A.; Battistin, M.; Battistoni, G.; Bauer, F.; Bawa, H. S.; Beare, B.; Beau, T.; Beauchemin, P. H.; Beccherle, R.; Bechtle, P.; Beck, H. P.; Beck, G. A.; Beckingham, M.; Becks, K. H.; Beddall, A. J.; Beddall, A.; Bedikian, S.; Bednyakov, V. A.; Bee, C. 
P.; Begel, M.; Behar Harpaz, S.; Behera, P. K.; Beimforde, M.; Belanger-Champagne, C.; Bell, P. J.; Bell, W. H.; Bella, G.; Bellagamba, L.; Bellina, F.; Bellomo, M.; Belloni, A.; Beloborodova, O.; Belotskiy, K.; Beltramello, O.; Ben Ami, S.; Benary, O.; Benchekroun, D.; Benchouk, C.; Bendel, M.; Benekos, N.; Benhammou, Y.; Benjamin, D. P.; Benoit, M.; Bensinger, J. R.; Benslama, K.; Bentvelsen, S.; Beretta, M.; Berge, D.; Bergeaas Kuutmann, E.; Berger, N.; Berghaus, F.; Berglund, E.; Beringer, J.; Bernardet, K.; Bernat, P.; Bernhard, R.; Bernius, C.; Berry, T.; Bertin, A.; Bertinelli, F.; Bertolucci, F.; Besana, M. I.; Besson, N.; Bethke, S.; Bhimji, W.; Bianchi, R. M.; Bianco, M.; Biebel, O.; Bieniek, S. P.; Bierwagen, K.; Biesiada, J.; Biglietti, M.; Bilokon, H.; Bindi, M.; Binet, S.; Bingul, A.; Bini, C.; Biscarat, C.; Bitenc, U.; Black, K. M.; Blair, R. E.; Blanchard, J.-B.; Blanchot, G.; Blazek, T.; Blocker, C.; Blocki, J.; Blondel, A.; Blum, W.; Blumenschein, U.; Bobbink, G. J.; Bobrovnikov, V. B.; Bocchetta, S. S.; Bocci, A.; Boddy, C. R.; Boehler, M.; Boek, J.; Boelaert, N.; Böser, S.; Bogaerts, J. A.; Bogdanchikov, A.; Bogouch, A.; Bohm, C.; Boisvert, V.; Bold, T.; Boldea, V.; Bolnet, N. M.; Bona, M.; Bondarenko, V. G.; Bondioli, M.; Boonekamp, M.; Boorman, G.; Booth, C. N.; Bordoni, S.; Borer, C.; Borisov, A.; Borissov, G.; Borjanovic, I.; Borroni, S.; Bos, K.; Boscherini, D.; Bosman, M.; Boterenbrood, H.; Botterill, D.; Bouchami, J.; Boudreau, J.; Bouhova-Thacker, E. V.; Bourdarios, C.; Bousson, N.; Boveia, A.; Boyd, J.; Boyko, I. R.; Bozhko, N. I.; Bozovic-Jelisavcic, I.; Bracinik, J.; Braem, A.; Branchini, P.; Brandenburg, G. W.; Brandt, A.; Brandt, G.; Brandt, O.; Bratzler, U.; Brau, B.; Brau, J. E.; Braun, H. M.; Brelier, B.; Bremer, J.; Brenner, R.; Bressler, S.; Breton, D.; Britton, D.; Brochu, F. M.; Brock, I.; Brock, R.; Brodbeck, T. J.; Brodet, E.; Broggi, F.; Bromberg, C.; Brooijmans, G.; Brooks, W. 
K.; Brown, G.; Brown, H.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruneliere, R.; Brunet, S.; Bruni, A.; Bruni, G.; Bruschi, M.; Buanes, T.; Bucci, F.; Buchanan, J.; Buchanan, N. J.; Buchholz, P.; Buckingham, R. M.; Buckley, A. G.; Buda, S. I.; Budagov, I. A.; Budick, B.; Büscher, V.; Bugge, L.; Buira-Clark, D.; Bulekov, O.; Bunse, M.; Buran, T.; Burckhart, H.; Burdin, S.; Burgess, T.; Burke, S.; Busato, E.; Bussey, P.; Buszello, C. P.; Butin, F.; Butler, B.; Butler, J. M.; Buttar, C. M.; Butterworth, J. M.; Buttinger, W.; Caballero, J.; Cabrera Urbán, S.; Caforio, D.; Cakir, O.; Calafiura, P.; Calderini, G.; Calfayan, P.; Calkins, R.; Caloba, L. P.; Caloi, R.; Calvet, D.; Calvet, S.; Camacho Toro, R.; Camarri, P.; Cambiaghi, M.; Cameron, D.; Campana, S.; Campanelli, M.; Canale, V.; Canelli, F.; Canepa, A.; Cantero, J.; Capasso, L.; Capeans Garrido, M. D. M.; Caprini, I.; Caprini, M.; Capriotti, D.; Capua, M.; Caputo, R.; Caramarcu, C.; Cardarelli, R.; Carli, T.; Carlino, G.; Carminati, L.; Caron, B.; Caron, S.; Carrillo Montoya, G. D.; Carter, A. A.; Carter, J. R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Cascella, M.; Caso, C.; Castaneda Hernandez, A. M.; Castaneda-Miranda, E.; Castillo Gimenez, V.; Castro, N. F.; Cataldi, G.; Cataneo, F.; Catinaccio, A.; Catmore, J. R.; Cattai, A.; Cattani, G.; Caughron, S.; Cauz, D.; Cavalleri, P.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerqueira, A. S.; Cerri, A.; Cerrito, L.; Cerutti, F.; Cetin, S. A.; Cevenini, F.; Chafaq, A.; Chakraborty, D.; Chan, K.; Chapleau, B.; Chapman, J. D.; Chapman, J. W.; Chareyre, E.; Charlton, D. G.; Chavda, V.; Chavez Barajas, C. A.; Cheatham, S.; Chekanov, S.; Chekulaev, S. V.; Chelkov, G. A.; Chelstowska, M. A.; Chen, C.; Chen, H.; Chen, S.; Chen, T.; Chen, X.; Cheng, S.; Cheplakov, A.; Chepurnov, V. F.; Cherkaoui El Moursli, R.; Chernyatin, V.; Cheu, E.; Cheung, S. L.; Chevalier, L.; Chiefari, G.; Chikovani, L.; Childers, J. 
T.; Chilingarov, A.; Chiodini, G.; Chizhov, M. V.; Choudalakis, G.; Chouridou, S.; Christidi, I. A.; Christov, A.; Chromek-Burckhart, D.; Chu, M. L.; Chudoba, J.; Ciapetti, G.; Ciba, K.; Ciftci, A. K.; Ciftci, R.; Cinca, D.; Cindro, V.; Ciobotaru, M. D.; Ciocca, C.; Ciocio, A.; Cirilli, M.; Citterio, M.; Ciubancan, M.; Clark, A.; Clark, P. J.; Cleland, W.; Clemens, J. C.; Clement, B.; Clement, C.; Clifft, R. W.; Coadou, Y.; Cobal, M.; Coccaro, A.; Cochran, J.; Coe, P.; Cogan, J. G.; Coggeshall, J.; Cogneras, E.; Cojocaru, C. D.; Colas, J.; Colijn, A. P.; Collard, C.; Collins, N. J.; Collins-Tooth, C.; Collot, J.; Colon, G.; Conde Muiño, P.; Coniavitis, E.; Conidi, M. C.; Consonni, M.; Consorti, V.; Constantinescu, S.; Conta, C.; Conventi, F.; Cook, J.; Cooke, M.; Cooper, B. D.; Cooper-Sarkar, A. M.; Copic, K.; Cornelissen, T.; Corradi, M.; Corriveau, F.; Corso-Radu, A.; Cortes-Gonzalez, A.; Cortiana, G.; Costa, G.; Costa, M. J.; Costanzo, D.; Costin, T.; Côté, D.; Coura Torres, R.; Courneyea, L.; Cowan, G.; Cowden, C.; Cox, B. E.; Cranmer, K.; Cranshaw, J.; Crescioli, F.; Cristinziani, M.; Crosetti, G.; Crupi, R.; Crépé-Renaudin, S.; Cuciuc, C.-M.; Cuenca Almenar, C.; Cuhadar Donszelmann, T.; Curatolo, M.; Curtis, C. J.; Cwetanski, P.; Czirr, H.; Czyczula, Z.; D'Auria, S.; D'Onofrio, M.; D'Orazio, A.; Da Silva, P. V. M.; Da Via, C.; Dabrowski, W.; Dai, T.; Dallapiccola, C.; Daly, C. H.; Dam, M.; Dameri, M.; Damiani, D. S.; Danielsson, H. O.; Dannheim, D.; Dao, V.; Darbo, G.; Darlea, G. L.; Daum, C.; Dauvergne, J. P.; Davey, W.; Davidek, T.; Davidson, N.; Davidson, R.; Davies, E.; Davies, M.; Davison, A. R.; Davygora, Y.; Dawe, E.; Dawson, I.; Dawson, J. W.; Daya-Ishmukhametova, R. K.; De, K.; de Asmundis, R.; De Castro, S.; De Castro Faria Salgado, P. E.; De Cecco, S.; de Graat, J.; De Groot, N.; de Jong, P.; De La Taille, C.; De la Torre, H.; De Lotto, B.; de Mora, L.; De Nooij, L.; De Pedis, D.; De Salvo, A.; De Sanctis, U.; De Santo, A.; De Vivie De Regie, J. 
B.; Dean, S.; Debbe, R.; Dedovich, D. V.; Degenhardt, J.; Dehchar, M.; Del Papa, C.; Del Peso, J.; Del Prete, T.; Deliyergiyev, M.; Dell'Acqua, A.; Dell'Asta, L.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delpierre, P.; Delruelle, N.; Delsart, P. A.; Deluca, C.; Demers, S.; Demichev, M.; Demirkoz, B.; Deng, J.; Deng, W.; Denisov, S. P.; Derendarz, D.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K.; Devetak, E.; Deviveiros, P. O.; Dewhurst, A.; DeWilde, B.; Dhaliwal, S.; Dhullipudi, R.; Di Ciaccio, A.; Di Ciaccio, L.; Di Girolamo, A.; Di Girolamo, B.; Di Luise, S.; Di Mattia, A.; Di Micco, B.; Di Nardo, R.; Di Simone, A.; Di Sipio, R.; Diaz, M. A.; Diblen, F.; Diehl, E. B.; Dietrich, J.; Dietzsch, T. A.; Diglio, S.; Dindar Yagci, K.; Dingfelder, J.; Dionisi, C.; Dita, P.; Dita, S.; Dittus, F.; Djama, F.; Djobava, T.; do Vale, M. A. B.; Do Valle Wemans, A.; Doan, T. K. O.; Dobbs, M.; Dobinson, R.; Dobos, D.; Dobson, E.; Dobson, M.; Dodd, J.; Doglioni, C.; Doherty, T.; Doi, Y.; Dolejsi, J.; Dolenc, I.; Dolezal, Z.; Dolgoshein, B. A.; Dohmae, T.; Donadelli, M.; Donega, M.; Donini, J.; Dopke, J.; Doria, A.; Dos Anjos, A.; Dosil, M.; Dotti, A.; Dova, M. T.; Dowell, J. D.; Doxiadis, A. D.; Doyle, A. T.; Drasal, Z.; Drees, J.; Dressnandt, N.; Drevermann, H.; Driouichi, C.; Dris, M.; Dubbert, J.; Dubbs, T.; Dube, S.; Duchovni, E.; Duckeck, G.; Dudarev, A.; Dudziak, F.; Dührssen, M.; Duerdoth, I. P.; Duflot, L.; Dufour, M.-A.; Dunford, M.; Duran Yildiz, H.; Duxfield, R.; Dwuznik, M.; Dydak, F.; Düren, M.; Ebenstein, W. L.; Ebke, J.; Eckert, S.; Eckweiler, S.; Edmonds, K.; Edwards, C. A.; Edwards, N. 
C.; Ehrenfeld, W.; Ehrich, T.; Eifert, T.; Eigen, G.; Einsweiler, K.; Eisenhandler, E.; Ekelof, T.; El Kacimi, M.; Ellert, M.; Elles, S.; Ellinghaus, F.; Ellis, K.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Emeliyanov, D.; Engelmann, R.; Engl, A.; Epp, B.; Eppig, A.; Erdmann, J.; Ereditato, A.; Eriksson, D.; Ernst, J.; Ernst, M.; Ernwein, J.; Errede, D.; Errede, S.; Ertel, E.; Escalier, M.; Escobar, C.; Espinal Curull, X.; Esposito, B.; Etienne, F.; Etienvre, A. I.; Etzion, E.; Evangelakou, D.; Evans, H.; Fabbri, L.; Fabre, C.; Fakhrutdinov, R. M.; Falciano, S.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farley, J.; Farooque, T.; Farrington, S. M.; Farthouat, P.; Fassnacht, P.; Fassouliotis, D.; Fatholahzadeh, B.; Favareto, A.; Fayard, L.; Fazio, S.; Febbraro, R.; Federic, P.; Fedin, O. L.; Fedorko, W.; Fehling-Kaschek, M.; Feligioni, L.; Fellmann, D.; Felzmann, C. U.; Feng, C.; Feng, E. J.; Fenyuk, A. B.; Ferencei, J.; Ferland, J.; Fernando, W.; Ferrag, S.; Ferrando, J.; Ferrara, V.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferrer, A.; Ferrer, M. L.; Ferrere, D.; Ferretti, C.; Ferretto Parodi, A.; Fiascaris, M.; Fiedler, F.; Filipčič, A.; Filippas, A.; Filthaut, F.; Fincke-Keeler, M.; Fiolhais, M. C. N.; Fiorini, L.; Firan, A.; Fischer, G.; Fischer, P.; Fisher, M. J.; Fisher, S. M.; Flechl, M.; Fleck, I.; Fleckner, J.; Fleischmann, P.; Fleischmann, S.; Flick, T.; Flores Castillo, L. R.; Flowerdew, M. J.; Fokitis, M.; Fonseca Martin, T.; Fopma, J.; Forbush, D. A.; Formica, A.; Forti, A.; Fortin, D.; Foster, J. M.; Fournier, D.; Foussat, A.; Fowler, A. J.; Fowler, K.; Fox, H.; Francavilla, P.; Franchino, S.; Francis, D.; Frank, T.; Franklin, M.; Franz, S.; Fraternali, M.; Fratina, S.; Freestone, J.; French, S. T.; Friedrich, F.; Froeschl, R.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fullana Torregrosa, E.; Fuster, J.; Gabaldon, C.; Gabizon, O.; Gadfort, T.; Gadomski, S.; Gagliardi, G.; Gagnon, P.; Galea, C.; Gallas, E. J.; Gallo, V.; Gallop, B. 
J.; Gallus, P.; Galyaev, E.; Gan, K. K.; Gao, Y. S.; Gapienko, V. A.; Gaponenko, A.; Garberson, F.; Garcia-Sciveres, M.; García, C.; García Navarro, J. E.; Gardner, R. W.; Garelli, N.; Garitaonandia, H.; Garonne, V.; Garvey, J.; Gatti, C.; Gaudio, G.; Gaumer, O.; Gaur, B.; Gauthier, L.; Gauzzi, P.; Gavrilenko, I. L.; Gay, C.; Gaycken, G.; Gayde, J.-C.; Gazis, E. N.; Ge, P.; Gee, C. N. P.; Geerts, D. A. A.; Geich-Gimbel, Ch.; Gellerstedt, K.; Gemme, C.; Gemmell, A.; Genest, M. H.; Gentile, S.; George, M.; George, S.; Gerlach, P.; Gershon, A.; Geweniger, C.; Ghazlane, H.; Ghodbane, N.; Giacobbe, B.; Giagu, S.; Giakoumopoulou, V.; Giangiobbe, V.; Gianotti, F.; Gibbard, B.; Gibson, A.; Gibson, S. M.; Gilbert, L. M.; Gilchriese, M.; Gilewsky, V.; Gillberg, D.; Gillman, A. R.; Gingrich, D. M.; Ginzburg, J.; Giokaris, N.; Giordani, M. P.; Giordano, R.; Giorgi, F. M.; Giovannini, P.; Giraud, P. F.; Giugni, D.; Giunta, M.; Giusti, P.; Gjelsten, B. K.; Gladilin, L. K.; Glasman, C.; Glatzer, J.; Glazov, A.; Glitza, K. W.; Glonti, G. L.; Godfrey, J.; Godlewski, J.; Goebel, M.; Göpfert, T.; Goeringer, C.; Gössling, C.; Göttfert, T.; Goldfarb, S.; Golling, T.; Golovnia, S. N.; Gomes, A.; Gomez Fajardo, L. S.; Gonçalo, R.; Goncalves Pinto Firmino Da Costa, J.; Gonella, L.; Gonidec, A.; Gonzalez, S.; González de la Hoz, S.; Gonzalez Silva, M. L.; Gonzalez-Sevilla, S.; Goodson, J. J.; Goossens, L.; Gorbounov, P. A.; Gordon, H. A.; Gorelov, I.; Gorfine, G.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Gorokhov, S. A.; Goryachev, V. N.; Gosdzik, B.; Gosselink, M.; Gostkin, M. I.; Gough Eschrich, I.; Gouighri, M.; Goujdami, D.; Goulette, M. P.; Goussiou, A. G.; Goy, C.; Grabowska-Bold, I.; Grafström, P.; Grah, C.; Grahn, K.-J.; Grancagnolo, F.; Grancagnolo, S.; Grassi, V.; Gratchev, V.; Grau, N.; Gray, H. M.; Gray, J. A.; Graziani, E.; Grebenyuk, O. G.; Green, B.; Greenfield, D.; Greenshaw, T.; Greenwood, Z. D.; Gregersen, K.; Gregor, I. 
M.; Grenier, P.; Griffiths, J.; Grigalashvili, N.; Grillo, A. A.; Grinstein, S.; Grishkevich, Y. V.; Grivaz, J.-F.; Groh, M.; Gross, E.; Grosse-Knetter, J.; Groth-Jensen, J.; Grybel, K.; Guarino, V. J.; Guest, D.; Guicheney, C.; Guida, A.; Guindon, S.; Guler, H.; Gunther, J.; Guo, B.; Guo, J.; Gupta, A.; Gusakov, Y.; Gushchin, V. N.; Gutierrez, A.; Gutierrez, P.; Guttman, N.; Gutzwiller, O.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haas, S.; Haber, C.; Hackenburg, R.; Hadavand, H. K.; Hadley, D. R.; Haefner, P.; Hahn, F.; Haider, S.; Hajduk, Z.; Hakobyan, H.; Haller, J.; Hamacher, K.; Hamal, P.; Hamilton, A.; Hamilton, S.; Han, H.; Han, L.; Hanagaki, K.; Hance, M.; Handel, C.; Hanke, P.; Hansen, J. R.; Hansen, J. B.; Hansen, J. D.; Hansen, P. H.; Hansson, P.; Hara, K.; Hare, G. A.; Harenberg, T.; Harkusha, S.; Harper, D.; Harrington, R. D.; Harris, O. M.; Harrison, K.; Hartert, J.; Hartjes, F.; Haruyama, T.; Harvey, A.; Hasegawa, S.; Hasegawa, Y.; Hassani, S.; Hatch, M.; Hauff, D.; Haug, S.; Hauschild, M.; Hauser, R.; Havranek, M.; Hawes, B. M.; Hawkes, C. M.; Hawkings, R. J.; Hawkins, D.; Hayakawa, T.; Hayashi, T.; Hayden, D.; Hayward, H. S.; Haywood, S. J.; Hazen, E.; He, M.; Head, S. J.; Hedberg, V.; Heelan, L.; Heim, S.; Heinemann, B.; Heisterkamp, S.; Helary, L.; Heller, M.; Hellman, S.; Hellmich, D.; Helsens, C.; Hemperek, T.; Henderson, R. C. W.; Henke, M.; Henrichs, A.; Henriques Correia, A. M.; Henrot-Versille, S.; Henry-Couannier, F.; Hensel, C.; Henß, T.; Hernandez, C. M.; Hernández Jiménez, Y.; Herrberg, R.; Hershenhorn, A. D.; Herten, G.; Hertenberger, R.; Hervas, L.; Hessey, N. P.; Hidvegi, A.; Higón-Rodriguez, E.; Hill, D.; Hill, J. C.; Hill, N.; Hiller, K. H.; Hillert, S.; Hillier, S. J.; Hinchliffe, I.; Hines, E.; Hirose, M.; Hirsch, F.; Hirschbuehl, D.; Hobbs, J.; Hod, N.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoffman, J.; Hoffmann, D.; Hohlfeld, M.; Holder, M.; Holmgren, S. O.; Holy, T.; Holzbauer, J. 
L.; Homma, Y.; Hong, T. M.; Hooft van Huysduynen, L.; Horazdovsky, T.; Horn, C.; Horner, S.; Horton, K.; Hostachy, J.-Y.; Hou, S.; Houlden, M. A.; Hoummada, A.; Howarth, J.; Howell, D. F.; Hristova, I.; Hrivnac, J.; Hruska, I.; Hryn'ova, T.; Hsu, P. J.; Hsu, S.-C.; Huang, G. S.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huffman, T. B.; Hughes, E. W.; Hughes, G.; Hughes-Jones, R. E.; Huhtinen, M.; Hurst, P.; Hurwitz, M.; Husemann, U.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Iakovidis, G.; Ibbotson, M.; Ibragimov, I.; Ichimiya, R.; Iconomidou-Fayard, L.; Idarraga, J.; Iengo, P.; Igonkina, O.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Iliadis, D.; Imbault, D.; Imori, M.; Ince, T.; Inigo-Golfin, J.; Ioannou, P.; Iodice, M.; Irles Quiles, A.; Ishikawa, A.; Ishino, M.; Ishmukhametov, R.; Issever, C.; Istin, S.; Ivashin, A. V.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jackson, B.; Jackson, J. N.; Jackson, P.; Jaekel, M. R.; Jain, V.; Jakobs, K.; Jakobsen, S.; Jakubek, J.; Jana, D. K.; Jankowski, E.; Jansen, E.; Jantsch, A.; Janus, M.; Jarlskog, G.; Jeanty, L.; Jelen, K.; Jen-La Plante, I.; Jenni, P.; Jeremie, A.; Jež, P.; Jézéquel, S.; Jha, M. K.; Ji, H.; Ji, W.; Jia, J.; Jiang, Y.; Jimenez Belenguer, M.; Jin, G.; Jin, S.; Jinnouchi, O.; Joergensen, M. D.; Joffe, D.; Johansen, L. G.; Johansen, M.; Johansson, K. E.; Johansson, P.; Johnert, S.; Johns, K. A.; Jon-And, K.; Jones, G.; Jones, R. W. L.; Jones, T. W.; Jones, T. J.; Jonsson, O.; Joram, C.; Jorge, P. M.; Joseph, J.; Jovin, T.; Ju, X.; Jung, C. A.; Juranek, V.; Jussel, P.; Juste Rozas, A.; Kabachenko, V. V.; Kabana, S.; Kaci, M.; Kaczmarska, A.; Kadlecik, P.; Kado, M.; Kagan, H.; Kagan, M.; Kaiser, S.; Kajomovitz, E.; Kalinin, S.; Kalinovskaya, L. V.; Kama, S.; Kanaya, N.; Kaneda, M.; Kanno, T.; Kantserov, V. A.; Kanzaki, J.; Kaplan, B.; Kapliy, A.; Kaplon, J.; Kar, D.; Karagounis, M.; Karagoz, M.; Karnevskiy, M.; Karr, K.; Kartvelishvili, V.; Karyukhin, A. N.; Kashif, L.; Kasmi, A.; Kass, R. 
D.; Kastanas, A.; Kataoka, M.; Kataoka, Y.; Katsoufis, E.; Katzy, J.; Kaushik, V.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kayl, M. S.; Kazanin, V. A.; Kazarinov, M. Y.; Keates, J. R.; Keeler, R.; Kehoe, R.; Keil, M.; Kekelidze, G. D.; Kelly, M.; Kennedy, J.; Kenney, C. J.; Kenyon, M.; Kepka, O.; Kerschen, N.; Kerševan, B. P.; Kersten, S.; Kessoku, K.; Ketterer, C.; Keung, J.; Khakzad, M.; Khalil-zada, F.; Khandanyan, H.; Khanov, A.; Kharchenko, D.; Khodinov, A.; Kholodenko, A. G.; Khomich, A.; Khoo, T. J.; Khoriauli, G.; Khoroshilov, A.; Khovanskiy, N.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kim, H.; Kim, M. S.; Kim, P. C.; Kim, S. H.; Kimura, N.; Kind, O.; King, B. T.; King, M.; King, R. S. B.; Kirk, J.; Kirsch, L. E.; Kiryunin, A. E.; Kishimoto, T.; Kisielewska, D.; Kittelmann, T.; Kiver, A. M.; Kladiva, E.; Klaiber-Lodewigs, J.; Klein, M.; Klein, U.; Kleinknecht, K.; Klemetti, M.; Klier, A.; Klimentov, A.; Klingenberg, R.; Klinkby, E. B.; Klioutchnikova, T.; Klok, P. F.; Klous, S.; Kluge, E.-E.; Kluge, T.; Kluit, P.; Kluth, S.; Knecht, N. S.; Kneringer, E.; Knobloch, J.; Knoops, E. B. F. G.; Knue, A.; Ko, B. R.; Kobayashi, T.; Kobel, M.; Kocian, M.; Kocnar, A.; Kodys, P.; Köneke, K.; König, A. C.; Koenig, S.; Köpke, L.; Koetsveld, F.; Koevesarki, P.; Koffas, T.; Koffeman, E.; Kohn, F.; Kohout, Z.; Kohriki, T.; Koi, T.; Kokott, T.; Kolachev, G. M.; Kolanoski, H.; Kolesnikov, V.; Koletsou, I.; Koll, J.; Kollar, D.; Kollefrath, M.; Kolya, S. D.; Komar, A. A.; Komori, Y.; Kondo, T.; Kono, T.; Kononov, A. I.; Konoplich, R.; Konstantinidis, N.; Kootz, A.; Koperny, S.; Kopikov, S. V.; Korcyl, K.; Kordas, K.; Koreshev, V.; Korn, A.; Korol, A.; Korolkov, I.; Korolkova, E. V.; Korotkov, V. A.; Kortner, O.; Kortner, S.; Kostyukhin, V. V.; Kotamäki, M. J.; Kotov, S.; Kotov, V. M.; Kotwal, A.; Kourkoumelis, C.; Kouskoura, V.; Koutsman, A.; Kowalewski, R.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kral, V.; Kramarenko, V. A.; Kramberger, G.; Krasny, M. 
W.; Krasznahorkay, A.; Kraus, J.; Kraus, J. K.; Kreisel, A.; Krejci, F.; Kretzschmar, J.; Krieger, N.; Krieger, P.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Krüger, H.; Kruker, T.; Krumshteyn, Z. V.; Kruth, A.; Kubota, T.; Kuehn, S.; Kugel, A.; Kuhl, T.; Kuhn, D.; Kukhtin, V.; Kulchitsky, Y.; Kuleshov, S.; Kummer, C.; Kuna, M.; Kundu, N.; Kunkle, J.; Kupco, A.; Kurashige, H.; Kurata, M.; Kurochkin, Y. A.; Kus, V.; Kuze, M.; Kuzhir, P.; Kvita, J.; Kwee, R.; La Rosa, A.; La Rotonda, L.; Labarga, L.; Labbe, J.; Lablak, S.; Lacasta, C.; Lacava, F.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Laisne, E.; Lamanna, M.; Lampen, C. L.; Lampl, W.; Lancon, E.; Landgraf, U.; Landon, M. P. J.; Landsman, H.; Lane, J. L.; Lange, C.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Laplace, S.; Lapoire, C.; Laporte, J. F.; Lari, T.; Larionov, A. V.; Larner, A.; Lasseur, C.; Lassnig, M.; Laurelli, P.; Lavrijsen, W.; Laycock, P.; Lazarev, A. B.; Le Dortz, O.; Le Guirriec, E.; Le Maner, C.; Le Menedeu, E.; Lebel, C.; LeCompte, T.; Ledroit-Guillon, F.; Lee, H.; Lee, J. S. H.; Lee, S. C.; Lee, L.; Lefebvre, M.; Legendre, M.; Leger, A.; LeGeyt, B. C.; Legger, F.; Leggett, C.; Lehmacher, M.; Lehmann Miotto, G.; Lei, X.; Leite, M. A. L.; Leitner, R.; Lellouch, D.; Leltchouk, M.; Lemmer, B.; Lendermann, V.; Leney, K. J. C.; Lenz, T.; Lenzen, G.; Lenzi, B.; Leonhardt, K.; Leontsinis, S.; Leroy, C.; Lessard, J.-R.; Lesser, J.; Lester, C. G.; Leung Fook Cheong, A.; Levêque, J.; Levin, D.; Levinson, L. J.; Levitski, M. S.; Lewandowska, M.; Lewis, A.; Lewis, G. H.; Leyko, A. M.; Leyton, M.; Li, B.; Li, H.; Li, S.; Li, X.; Liang, Z.; Liang, Z.; Liao, H.; Liberti, B.; Lichard, P.; Lichtnecker, M.; Lie, K.; Liebig, W.; Lifshitz, R.; Lilley, J. N.; Limbach, C.; Limosani, A.; Limper, M.; Lin, S. C.; Linde, F.; Linnemann, J. T.; Lipeles, E.; Lipinsky, L.; Lipniacka, A.; Liss, T. 
M.; Lissauer, D.; Lister, A.; Litke, A. M.; Liu, C.; Liu, D.; Liu, H.; Liu, J. B.; Liu, M.; Liu, S.; Liu, Y.; Livan, M.; Livermore, S. S. A.; Lleres, A.; Llorente Merino, J.; Lloyd, S. L.; Lobodzinska, E.; Loch, P.; Lockman, W. S.; Loddenkoetter, T.; Loebinger, F. K.; Loginov, A.; Loh, C. W.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Loken, J.; Lombardo, V. P.; Long, R. E.; Lopes, L.; Lopez Mateos, D.; Losada, M.; Loscutoff, P.; Lo Sterzo, F.; Losty, M. J.; Lou, X.; Lounis, A.; Loureiro, K. F.; Love, J.; Love, P. A.; Lowe, A. J.; Lu, F.; Lubatti, H. J.; Luci, C.; Lucotte, A.; Ludwig, A.; Ludwig, D.; Ludwig, I.; Ludwig, J.; Luehring, F.; Luijckx, G.; Lumb, D.; Luminari, L.; Lund, E.; Lund-Jensen, B.; Lundberg, B.; Lundberg, J.; Lundquist, J.; Lungwitz, M.; Lupi, A.; Lutz, G.; Lynn, D.; Lys, J.; Lytken, E.; Ma, H.; Ma, L. L.; Macana Goia, J. A.; Maccarrone, G.; Macchiolo, A.; Maček, B.; Machado Miguens, J.; Mackeprang, R.; Madaras, R. J.; Mader, W. F.; Maenner, R.; Maeno, T.; Mättig, P.; Mättig, S.; Magnoni, L.; Magradze, E.; Mahalalel, Y.; Mahboubi, K.; Mahout, G.; Maiani, C.; Maidantchik, C.; Maio, A.; Majewski, S.; Makida, Y.; Makovec, N.; Mal, P.; Malaescu, B.; Malecki, Pa.; Malecki, P.; Maleev, V. P.; Malek, F.; Mallik, U.; Malon, D.; Malone, C.; Maltezos, S.; Malyshev, V.; Malyukov, S.; Mameghani, R.; Mamuzic, J.; Manabe, A.; Mandelli, L.; Mandić, I.; Mandrysch, R.; Maneira, J.; Mangeard, P. S.; Manjavidze, I. D.; Mann, A.; Manning, P. M.; Manousakis-Katsikakis, A.; Mansoulie, B.; Manz, A.; Mapelli, A.; Mapelli, L.; March, L.; Marchand, J. F.; Marchese, F.; Marchiori, G.; Marcisovsky, M.; Marin, A.; Marino, C. P.; Marroquim, F.; Marshall, R.; Marshall, Z.; Martens, F. K.; Marti-Garcia, S.; Martin, A. J.; Martin, B.; Martin, B.; Martin, F. F.; Martin, J. P.; Martin, Ph.; Martin, T. A.; Martin, V. J.; Martin dit Latour, B.; Martin-Haugh, S.; Martinez, M.; Martinez Outschoorn, V.; Martyniuk, A. 
C.; Marx, M.; Marzano, F.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Massa, I.; Massaro, G.; Massol, N.; Mastrandrea, P.; Mastroberardino, A.; Masubuchi, T.; Mathes, M.; Matricon, P.; Matsumoto, H.; Matsunaga, H.; Matsushita, T.; Mattravers, C.; Maugain, J. M.; Maxfield, S. J.; Maximov, D. A.; May, E. N.; Mayne, A.; Mazini, R.; Mazur, M.; Mazzanti, M.; Mazzoni, E.; Mc Kee, S. P.; McCarn, A.; McCarthy, R. L.; McCarthy, T. G.; McCubbin, N. A.; McFarlane, K. W.; Mcfayden, J. A.; McGlone, H.; Mchedlidze, G.; McLaren, R. A.; Mclaughlan, T.; McMahon, S. J.; McPherson, R. A.; Meade, A.; Mechnich, J.; Mechtel, M.; Medinnis, M.; Meera-Lebbai, R.; Meguro, T.; Mehdiyev, R.; Mehlhase, S.; Mehta, A.; Meier, K.; Meinhardt, J.; Meirose, B.; Melachrinos, C.; Mellado Garcia, B. R.; Mendoza Navas, L.; Meng, Z.; Mengarelli, A.; Menke, S.; Menot, C.; Meoni, E.; Mercurio, K. M.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Meyer, J.; Meyer, T. C.; Meyer, W. T.; Miao, J.; Michal, S.; Micu, L.; Middleton, R. P.; Miele, P.; Migas, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikuž, M.; Miller, D. W.; Miller, R. J.; Mills, W. J.; Mills, C.; Milov, A.; Milstead, D. A.; Milstein, D.; Minaenko, A. A.; Miñano Moya, M.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mirabelli, G.; Miralles Verge, L.; Misawa, S.; Misiejuk, A.; Mitrevski, J.; Mitrofanov, G. Y.; Mitsou, V. A.; Mitsui, S.; Miyagawa, P. S.; Miyazaki, K.; Mjörnmark, J. U.; Moa, T.; Mockett, P.; Moed, S.; Moeller, V.; Mönig, K.; Möser, N.; Mohapatra, S.; Mohr, W.; Mohrdieck-Möck, S.; Moisseev, A. M.; Moles-Valls, R.; Molina-Perez, J.; Monk, J.; Monnier, E.; Montesano, S.; Monticelli, F.; Monzani, S.; Moore, R. W.; Moorhead, G. F.; Mora Herrera, C.; Moraes, A.; Morange, N.; Morel, J.; Morello, G.; Moreno, D.; Moreno Llácer, M.; Morettini, P.; Morii, M.; Morin, J.; Morley, A. 
K.; Mornacchi, G.; Morozov, S. V.; Morris, J. D.; Morvaj, L.; Moser, H. G.; Mosidze, M.; Moss, J.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. W.; Mudrinic, M.; Mueller, F.; Mueller, J.; Mueller, K.; Müller, T. A.; Muenstermann, D.; Muir, A.; Munwes, Y.; Murray, W. J.; Mussche, I.; Musto, E.; Myagkov, A. G.; Myska, M.; Nadal, J.; Nagai, K.; Nagano, K.; Nagasaka, Y.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakamura, T.; Nakano, I.; Nanava, G.; Napier, A.; Nash, M.; Nation, N. R.; Nattermann, T.; Naumann, T.; Navarro, G.; Neal, H. A.; Nebot, E.; Nechaeva, P. Yu.; Negri, A.; Negri, G.; Nektarijevic, S.; Nelson, A.; Nelson, S.; Nelson, T. K.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Nesterov, S. Y.; Neubauer, M. S.; Neusiedl, A.; Neves, R. M.; Nevski, P.; Newman, P. R.; Nguyen Thi Hong, V.; Nickerson, R. B.; Nicolaidou, R.; Nicolas, L.; Nicoletti, G.; Nicquevert, B.; Niedercorn, F.; Nielsen, J.; Niinikoski, T.; Nikiforou, N.; Nikiforov, A.; Nikolaenko, V.; Nikolaev, K.; Nikolic-Audit, I.; Nikolics, K.; Nikolopoulos, K.; Nilsen, H.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nishiyama, T.; Nisius, R.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Nordberg, M.; Nordkvist, B.; Norton, P. R.; Notz, D.; Novakova, J.; Nozaki, M.; Nozka, L.; Nugent, I. M.; Nuncio-Quiroz, A.-E.; Nunes Hanninger, G.; Nunnemann, T.; Nurse, E.; Nyman, T.; O'Brien, B. J.; O'Neale, S. W.; O'Neil, D. C.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Ocariz, J.; Ochi, A.; Oda, S.; Odaka, S.; Odier, J.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohshima, T.; Ohshita, H.; Ohsugi, T.; Okada, S.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olcese, M.; Olchevski, A. G.; Oliveira, M.; Oliveira Damazio, D.; Oliver Garcia, E.; Olivito, D.; Olszewski, A.; Olszowska, J.; Omachi, C.; Onofre, A.; Onyisi, P. U. E.; Oram, C. J.; Oreglia, M. J.; Oren, Y.; Orestano, D.; Orlov, I.; Oropeza Barrera, C.; Orr, R. S.; Osculati, B.; Ospanov, R.; Osuna, C.; Otero y Garzon, G.; Ottersbach, J. 
P.; Ouchrif, M.; Ould-Saada, F.; Ouraou, A.; Ouyang, Q.; Owen, M.; Owen, S.; Ozcan, V. E.; Ozturk, N.; Pacheco Pages, A.; Padilla Aranda, C.; Pagan Griso, S.; Paganis, E.; Paige, F.; Pajchel, K.; Palacino, G.; Paleari, C. P.; Palestini, S.; Pallin, D.; Palma, A.; Palmer, J. D.; Pan, Y. B.; Panagiotopoulou, E.; Panes, B.; Panikashvili, N.; Panitkin, S.; Pantea, D.; Panuskova, M.; Paolone, V.; Papadelis, A.; Papadopoulou, Th. D.; Paramonov, A.; Park, W.; Parker, M. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pasqualucci, E.; Passeri, A.; Pastore, F.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Patel, N.; Pater, J. R.; Patricelli, S.; Pauly, T.; Pecsy, M.; Pedraza Morales, M. I.; Peleganchuk, S. V.; Peng, H.; Pengo, R.; Penson, A.; Penwell, J.; Perantoni, M.; Perez, K.; Perez Cavalcanti, T.; Perez Codina, E.; Pérez García-Estañ, M. T.; Perez Reale, V.; Perini, L.; Pernegger, H.; Perrino, R.; Perrodo, P.; Persembe, S.; Perus, A.; Peshekhonov, V. D.; Petersen, B. A.; Petersen, J.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petrolo, E.; Petrucci, F.; Petschull, D.; Petteni, M.; Pezoa, R.; Pfeifer, B.; Phan, A.; Phillips, A. W.; Phillips, P. W.; Piacquadio, G.; Piccaro, E.; Piccinini, M.; Pickford, A.; Piec, S. M.; Piegaia, R.; Pilcher, J. E.; Pilkington, A. D.; Pina, J.; Pinamonti, M.; Pinder, A.; Pinfold, J. L.; Ping, J.; Pinto, B.; Pirotte, O.; Pizio, C.; Placakyte, R.; Plamondon, M.; Pleier, M.-A.; Pleskach, A. V.; Poblaguev, A.; Poddar, S.; Podlyski, F.; Poggioli, L.; Poghosyan, T.; Pohl, M.; Polci, F.; Polesello, G.; Policicchio, A.; Polini, A.; Poll, J.; Polychronakos, V.; Pomarede, D. M.; Pomeroy, D.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Popovic, D. S.; Poppleton, A.; Portell Bueso, X.; Posch, C.; Pospelov, G. E.; Pospisil, S.; Potekhin, M.; Potrap, I. N.; Potter, C. J.; Potter, K. P.; Potter, C. 
T.; Poulard, G.; Poveda, J.; Prabhu, R.; Pralavorio, P.; Prasad, S.; Pravahan, R.; Prell, S.; Pretzl, K.; Pribyl, L.; Price, D.; Price, L. E.; Price, M. J.; Prichard, P. M.; Prieur, D.; Primavera, M.; Prokofiev, K.; Prokoshin, F.; Protopopescu, S.; Proudfoot, J.; Prudent, X.; Przysiezniak, H.; Psoroulas, S.; Ptacek, E.; Pueschel, E.; Purdham, J.; Purohit, M.; Puzo, P.; Pylypchenko, Y.; Qian, J.; Qian, W.; Qian, Z.; Qin, Z.; Quadt, A.; Quarrie, D. R.; Quayle, W. B.; Quinonez, F.; Raas, M.; Radeka, V.; Radescu, V.; Radics, B.; Rador, T.; Ragusa, F.; Rahal, G.; Rahimi, A. M.; Rahm, D.; Rajagopalan, S.; Rammensee, M.; Rammes, M.; Ramstedt, M.; Randle-Conde, A. S.; Randrianarivony, K.; Ratoff, P. N.; Rauscher, F.; Rauter, E.; Raymond, M.; Read, A. L.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Reichold, A.; Reinherz-Aronis, E.; Reinsch, A.; Reisinger, I.; Reljic, D.; Rembser, C.; Ren, Z. L.; Renaud, A.; Renkel, P.; Rescia, S.; Rescigno, M.; Resconi, S.; Resende, B.; Reznicek, P.; Rezvani, R.; Richards, A.; Richter, R.; Richter-Was, E.; Ridel, M.; Rieke, S.; Rijpstra, M.; Rijssenbeek, M.; Rimoldi, A.; Rinaldi, L.; Rios, R. R.; Riu, I.; Rivoltella, G.; Rizatdinova, F.; Rizvi, E.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robinson, M.; Robson, A.; Rocha de Lima, J. G.; Roda, C.; Roda Dos Santos, D.; Rodier, S.; Rodriguez, D.; Rodriguez Garcia, Y.; Roe, A.; Roe, S.; Røhne, O.; Rojo, V.; Rolli, S.; Romaniouk, A.; Romano, M.; Romanov, V. M.; Romeo, G.; Roos, L.; Ros, E.; Rosati, S.; Rosbach, K.; Rose, A.; Rose, M.; Rosenbaum, G. A.; Rosenberg, E. I.; Rosendahl, P. L.; Rosenthal, O.; Rosselet, L.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rossi, L.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubinskiy, I.; Ruckert, B.; Ruckstuhl, N.; Rud, V. 
I.; Rudolph, C.; Rudolph, G.; Rühr, F.; Ruggieri, F.; Ruiz-Martinez, A.; Rulikowska-Zarebska, E.; Rumiantsev, V.; Rumyantsev, L.; Runge, K.; Runolfsson, O.; Rurikova, Z.; Rusakovich, N. A.; Rust, D. R.; Rutherfoord, J. P.; Ruwiedel, C.; Ruzicka, P.; Ryabov, Y. F.; Ryadovikov, V.; Ryan, P.; Rybar, M.; Rybkin, G.; Ryder, N. C.; Rzaeva, S.; Saavedra, A. F.; Sadeh, I.; Sadrozinski, H. F.-W.; Sadykov, R.; Safai Tehrani, F.; Sakamoto, H.; Salamanna, G.; Salamon, A.; Saleem, M.; Salihagic, D.; Salnikov, A.; Salt, J.; Salvachua Ferrando, B. M.; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sampsonidis, D.; Samset, B. H.; Sanchez, A.; Sandaker, H.; Sander, H. G.; Sanders, M. P.; Sandhoff, M.; Sandoval, T.; Sandoval, C.; Sandstroem, R.; Sandvoss, S.; Sankey, D. P. C.; Sansoni, A.; Santamarina Rios, C.; Santoni, C.; Santonico, R.; Santos, H.; Saraiva, J. G.; Sarangi, T.; Sarkisyan-Grinbaum, E.; Sarri, F.; Sartisohn, G.; Sasaki, O.; Sasaki, T.; Sasao, N.; Satsounkevitch, I.; Sauvage, G.; Sauvan, E.; Sauvan, J. B.; Savard, P.; Savine, A. Y.; Savinov, V.; Savu, D. O.; Savva, P.; Sawyer, L.; Saxon, D. H.; Says, L. P.; Sbarra, C.; Sbrizzi, A.; Scallon, O.; Scannicchio, D. A.; Schaarschmidt, J.; Schacht, P.; Schäfer, U.; Schaepe, S.; Schaetzel, S.; Schaffer, A. C.; Schaile, D.; Schamberger, R. D.; Schamov, A. G.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Scherzer, M. I.; Schiavi, C.; Schieck, J.; Schioppa, M.; Schlenker, S.; Schlereth, J. L.; Schmidt, E.; Schmieden, K.; Schmitt, C.; Schmitt, S.; Schmitz, M.; Schöning, A.; Schott, M.; Schouten, D.; Schovancova, J.; Schram, M.; Schroeder, C.; Schroer, N.; Schuh, S.; Schuler, G.; Schultes, J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, J. W.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwanenberger, C.; Schwartzman, A.; Schwemling, Ph.; Schwienhorst, R.; Schwierz, R.; Schwindling, J.; Schwindt, T.; Scott, W. G.; Searcy, J.; Sedov, G.; Sedykh, E.; Segura, E.; Seidel, S. 
C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Seliverstov, D. M.; Sellden, B.; Sellers, G.; Seman, M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Seuster, R.; Severini, H.; Sevior, M. E.; Sfyrla, A.; Shabalina, E.; Shamim, M.; Shan, L. Y.; Shank, J. T.; Shao, Q. T.; Shapiro, M.; Shatalov, P. B.; Shaver, L.; Shaw, K.; Sherman, D.; Sherwood, P.; Shibata, A.; Shichi, H.; Shimizu, S.; Shimojima, M.; Shin, T.; Shmeleva, A.; Shochet, M. J.; Short, D.; Shupe, M. A.; Sicho, P.; Sidoti, A.; Siebel, A.; Siegert, F.; Siegrist, J.; Sijacki, Dj.; Silbert, O.; Silva, J.; Silver, Y.; Silverstein, D.; Silverstein, S. B.; Simak, V.; Simard, O.; Simic, Lj.; Simion, S.; Simmons, B.; Simonyan, M.; Sinervo, P.; Sinev, N. B.; Sipica, V.; Siragusa, G.; Sircar, A.; Sisakyan, A. N.; Sivoklokov, S. Yu.; Sjölin, J.; Sjursen, T. B.; Skinnari, L. A.; Skottowe, H. P.; Skovpen, K.; Skubic, P.; Skvorodnev, N.; Slater, M.; Slavicek, T.; Sliwa, K.; Sloper, J.; Smakhtin, V.; Smirnov, S. Yu.; Smirnov, Y.; Smirnova, L. N.; Smirnova, O.; Smith, B. C.; Smith, D.; Smith, K. M.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snow, S. W.; Snow, J.; Snuverink, J.; Snyder, S.; Soares, M.; Sobie, R.; Sodomka, J.; Soffer, A.; Solans, C. A.; Solar, M.; Solc, J.; Soldatov, E.; Soldevila, U.; Solfaroli Camillocci, E.; Solodkov, A. A.; Solovyanov, O. V.; Sondericker, J.; Soni, N.; Sopko, V.; Sopko, B.; Sorbi, M.; Sosebee, M.; Soualah, R.; Soukharev, A.; Spagnolo, S.; Spanò, F.; Spighi, R.; Spigo, G.; Spila, F.; Spiriti, E.; Spiwoks, R.; Spousta, M.; Spreitzer, T.; Spurlock, B.; St. Denis, R. D.; Stahl, T.; Stahlman, J.; Stamen, R.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stapnes, S.; Starchenko, E. A.; Stark, J.; Staroba, P.; Starovoitov, P.; Staude, A.; Stavina, P.; Stavropoulos, G.; Steele, G.; Steinbach, P.; Steinberg, P.; Stekl, I.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stevenson, K.; Stewart, G. A.; Stillings, J. A.; Stockmanns, T.; Stockton, M. 
C.; Stoerig, K.; Stoicea, G.; Stonjek, S.; Strachota, P.; Stradling, A. R.; Straessner, A.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strang, M.; Strauss, E.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Strong, J. A.; Stroynowski, R.; Strube, J.; Stugu, B.; Stumer, I.; Stupak, J.; Sturm, P.; Soh, D. A.; Su, D.; Subramania, HS.; Succurro, A.; Sugaya, Y.; Sugimoto, T.; Suhr, C.; Suita, K.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Sushkov, S.; Susinno, G.; Sutton, M. R.; Suzuki, Y.; Suzuki, Y.; Svatos, M.; Sviridov, Yu. M.; Swedish, S.; Sykora, I.; Sykora, T.; Szeless, B.; Sánchez, J.; Ta, D.; Tackmann, K.; Taffard, A.; Tafirout, R.; Taiblum, N.; Takahashi, Y.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Talby, M.; Talyshev, A.; Tamsett, M. C.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tanaka, S.; Tanaka, Y.; Tani, K.; Tannoury, N.; Tappern, G. P.; Tapprogge, S.; Tardif, D.; Tarem, S.; Tarrade, F.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tassi, E.; Tatarkhanov, M.; Tayalati, Y.; Taylor, C.; Taylor, F. E.; Taylor, G. N.; Taylor, W.; Teinturier, M.; Teixeira Dias Castanheira, M.; Teixeira-Dias, P.; Temming, K. K.; Ten Kate, H.; Teng, P. K.; Terada, S.; Terashi, K.; Terron, J.; Terwort, M.; Testa, M.; Teuscher, R. J.; Thadome, J.; Therhaag, J.; Theveneaux-Pelzer, T.; Thioye, M.; Thoma, S.; Thomas, J. P.; Thompson, E. N.; Thompson, P. D.; Thompson, P. D.; Thompson, A. S.; Thomson, E.; Thomson, M.; Thompson, R. J.; Thun, R. P.; Tian, F.; Tic, T.; Tikhomirov, V. O.; Tikhonov, Y. A.; Timmermans, C. J. W. P.; Tipton, P.; Tique Aires Viegas, F. J.; Tisserant, S.; Tobias, J.; Toczek, B.; Todorov, T.; Todorova-Nova, S.; Toggerson, B.; Tojo, J.; Tokár, S.; Tokunaga, K.; Tokushuku, K.; Tollefson, K.; Tomoto, M.; Tompkins, L.; Toms, K.; Tong, G.; Tonoyan, A.; Topfel, C.; Topilin, N. D.; Torchiani, I.; Torrence, E.; Torres, H.; Torró Pastor, E.; Toth, J.; Touchard, F.; Tovey, D. 
R.; Traynor, D.; Trefzger, T.; Tremblet, L.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Trinh, T. N.; Tripiana, M. F.; Trischuk, W.; Trivedi, A.; Trocmé, B.; Troncon, C.; Trottier-McDonald, M.; Trzupek, A.; Tsarouchas, C.; Tseng, J. C.-L.; Tsiakiris, M.; Tsiareshka, P. V.; Tsionou, D.; Tsipolitis, G.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsukerman, I. I.; Tsulaia, V.; Tsung, J.-W.; Tsuno, S.; Tsybychev, D.; Tua, A.; Tudorache, A.; Tudorache, V.; Tuggle, J. M.; Turala, M.; Turecek, D.; Turk Cakir, I.; Turlay, E.; Turra, R.; Tuts, P. M.; Twomey, M. S.; Tykhonov, A.; Tylmad, M.; Tyndel, M.; Tyrvainen, H.; Tzanakos, G.; Uchida, K.; Ueda, I.; Ueno, R.; Ugland, M.; Uhlenbrock, M.; Uhrmacher, M.; Ukegawa, F.; Unal, G.; Underwood, D. G.; Undrus, A.; Unel, G.; Unno, Y.; Urbaniec, D.; Urkovsky, E.; Urrejola, P.; Usai, G.; Uslenghi, M.; Vacavant, L.; Vacek, V.; Vachon, B.; Vahsen, S.; Valenta, J.; Valente, P.; Valentinetti, S.; Valkar, S.; Valladolid Gallego, E.; Vallecorsa, S.; Valls Ferrer, J. A.; van der Graaf, H.; van der Kraaij, E.; Van Der Leeuw, R.; van der Poel, E.; van der Ster, D.; van Eldik, N.; van Gemmeren, P.; van Kesteren, Z.; van Vulpen, I.; Vanadia, M.; Vandelli, W.; Vandoni, G.; Vaniachine, A.; Vankov, P.; Vannucci, F.; Varela Rodriguez, F.; Vari, R.; Varnes, E. W.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vassilakopoulos, V. I.; Vazeille, F.; Vegni, G.; Veillet, J. J.; Vellidis, C.; Veloso, F.; Veness, R.; Veneziano, S.; Ventura, A.; Ventura, D.; Venturi, M.; Venturi, N.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. C.; Vichou, I.; Vickey, T.; Vickey Boeriu, O. E.; Viehhauser, G. H. A.; Viel, S.; Villa, M.; Villani, E. G.; Villaplana Perez, M.; Vilucchi, E.; Vincter, M. G.; Vinek, E.; Vinogradov, V. 
B.; Virchaux, M.; Virzi, J.; Vitells, O.; Viti, M.; Vivarelli, I.; Vives Vaque, F.; Vlachos, S.; Vladoiu, D.; Vlasak, M.; Vlasov, N.; Vogel, A.; Vokac, P.; Volpi, G.; Volpi, M.; Volpini, G.; von der Schmitt, H.; von Loeben, J.; von Radziewski, H.; von Toerne, E.; Vorobel, V.; Vorobiev, A. P.; Vorwerk, V.; Vos, M.; Voss, R.; Voss, T. T.; Vossebeld, J. H.; Vranjes, N.; Vranjes Milosavljevic, M.; Vrba, V.; Vreeswijk, M.; Vu Anh, T.; Vuillermet, R.; Vujicic, M.; Vukotic, I.; Wagner, W.; Wagner, P.; Wahlen, H.; Wakabayashi, J.; Walbersloh, J.; Walch, S.; Walder, J.; Walker, R.; Walkowiak, W.; Wall, R.; Waller, P.; Wang, C.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, J. C.; Wang, R.; Wang, S. M.; Warburton, A.; Ward, C. P.; Warsinsky, M.; Wastie, R.; Watkins, P. M.; Watson, A. T.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, A. T.; Waugh, B. M.; Weber, J.; Weber, M.; Weber, M. S.; Weber, P.; Weidberg, A. R.; Weigell, P.; Weingarten, J.; Weiser, C.; Wellenstein, H.; Wells, P. S.; Wen, M.; Wenaus, T.; Wendler, S.; Weng, Z.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, P.; Werth, M.; Wessels, M.; Weydert, C.; Whalen, K.; Wheeler-Ellis, S. J.; Whitaker, S. P.; White, A.; White, M. J.; White, S.; Whitehead, S. R.; Whiteson, D.; Whittington, D.; Wicek, F.; Wicke, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wijeratne, P. A.; Wildauer, A.; Wildt, M. A.; Wilhelm, I.; Wilkens, H. G.; Will, J. Z.; Williams, E.; Williams, H. H.; Willis, W.; Willocq, S.; Wilson, J. A.; Wilson, M. G.; Wilson, A.; Wingerter-Seez, I.; Winkelmann, S.; Winklmeier, F.; Wittgen, M.; Wolter, M. W.; Wolters, H.; Wong, W. C.; Wooden, G.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wraight, K.; Wright, C.; Wright, M.; Wright, D.; Wrona, B.; Wu, S. L.; Wu, X.; Wu, Y.; Wulf, E.; Wunstorf, R.; Wynne, B. 
M.; Xaplanteris, L.; Xella, S.; Xie, S.; Xie, Y.; Xu, C.; Xu, D.; Xu, G.; Yabsley, B.; Yacoob, S.; Yamada, M.; Yamaguchi, H.; Yamamoto, A.; Yamamoto, K.; Yamamoto, S.; Yamamura, T.; Yamanaka, T.; Yamaoka, J.; Yamazaki, T.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, U. K.; Yang, Y.; Yang, Y.; Yang, Z.; Yanush, S.; Yao, Y.; Yasu, Y.; Ybeles Smit, G. V.; Ye, J.; Ye, S.; Yilmaz, M.; Yoosoofmiya, R.; Yorita, K.; Yoshida, R.; Young, C.; Youssef, S.; Yu, D.; Yu, J.; Yu, J.; Yuan, L.; Yurkewicz, A.; Zaets, V. G.; Zaidan, R.; Zaitsev, A. M.; Zajacova, Z.; Zalite, Yo. K.; Zanello, L.; Zarzhitsky, P.; Zaytsev, A.; Zeitnitz, C.; Zeller, M.; Zeman, M.; Zemla, A.; Zendler, C.; Zenin, O.; Ženiš, T.; Zenonos, Z.; Zenz, S.; Zerwas, D.; Zevi della Porta, G.; Zhan, Z.; Zhang, D.; Zhang, H.; Zhang, J.; Zhang, X.; Zhang, Z.; Zhang, Q.; Zhao, L.; Zhao, T.; Zhao, Z.; Zhemchugov, A.; Zheng, S.; Zhong, J.; Zhou, B.; Zhou, N.; Zhou, Y.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zhuravlov, V.; Zieminska, D.; Zimmermann, R.; Zimmermann, S.; Zimmermann, S.; Zinonos, Z.; Ziolkowski, M.; Zitoun, R.; Živković, L.; Zmouchko, V. V.; Zobernig, G.; Zoccoli, A.; Zolnierowski, Y.; Zsenei, A.; zur Nedden, M.; Zutshi, V.; Zwalinski, L.</p> <p>2013-03-01</p> <p>The jet energy scale and its systematic uncertainty are determined for jets measured with the ATLAS detector at the LHC in proton-proton collision data at a centre-of-mass energy of sqrt{s}=7 TeV corresponding to an integrated luminosity of 38 pb-1. Jets are reconstructed with the anti- k t algorithm with distance parameters R=0.4 or R=0.6. Jet energy and angle corrections are determined from Monte Carlo simulations to calibrate jets with transverse momenta p T≥20 GeV and pseudorapidities | η|<4.5. 
The jet energy systematic uncertainty is estimated using the single isolated hadron response measured in situ and in test-beams, exploiting the transverse momentum balance between central and forward jets in events with dijet topologies and studying systematic variations in Monte Carlo simulations. The jet energy uncertainty is less than 2.5% in the central calorimeter region (|η| < 0.8) for jets with 60 ≤ p_T < 800 GeV, and is maximally 14% for p_T < 30 GeV in the most forward region 3.2 ≤ |η| < 4.5. The jet energy is validated for jet transverse momenta up to 1 TeV to the level of a few percent using several in situ techniques by comparing against a well-known reference such as the recoiling photon p_T, the sum of the transverse momenta of tracks associated with the jet, or a system of low-p_T jets recoiling against a high-p_T jet. More sophisticated jet calibration schemes are presented based on calorimeter cell energy density weighting or hadronic properties of jets, aiming for an improved jet energy resolution and a reduced flavour dependence of the jet response. The systematic uncertainty of the jet energy determined from a combination of in situ techniques is consistent with the one derived from single hadron response measurements over a wide kinematic range. The nominal corrections and uncertainties are derived for isolated jets in an inclusive sample of high-p_T jets. 
Special cases such as event topologies with close-by jets, or selections of samples with an enhanced content of jets originating from light quarks, heavy quarks or gluons are also discussed and the corresponding uncertainties are determined.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1407172-jet-energy-measurement-atlas-detector-proton-proton-collisions-sqrt-mathrm-mathrm-tev','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1407172-jet-energy-measurement-atlas-detector-proton-proton-collisions-sqrt-mathrm-mathrm-tev"><span>Jet energy measurement with the ATLAS detector in proton-proton collisions at $$\\sqrt{\\mathrm{s}}=7\\ \\mathrm{TeV}$$</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Aad, G.; Abbott, B.; Abdallah, J.; ...</p> <p>2013-03-02</p> <p>The jet energy scale and its systematic uncertainty are determined for jets measured with the ATLAS detector at the LHC in proton-proton collision data at a centre-of-mass energy of √s = 7 TeV corresponding to an integrated luminosity of 38 pb⁻¹. Jets are reconstructed with the anti-k_t algorithm with distance parameters R = 0.4 or R = 0.6. Jet energy and angle corrections are determined from Monte Carlo simulations to calibrate jets with transverse momenta p_T ≥ 20 GeV and pseudorapidities |η| < 4.5. The jet energy systematic uncertainty is estimated using the single isolated hadron response measured in situ and in test-beams, exploiting the transverse momentum balance between central and forward jets in events with dijet topologies and studying systematic variations in Monte Carlo simulations. The jet energy uncertainty is less than 2.5% in the central calorimeter region (|η| < 0.8) for jets with 60 ≤ p_T < 800 GeV, and is maximally 14% for p_T ≤ 30 GeV in the most forward region 3.2 ≤ |η| < 4.5. 
The jet energy is validated for jet transverse momenta up to 1 TeV to the level of a few percent using several in situ techniques by comparing against a well-known reference such as the recoiling photon p_T, the sum of the transverse momenta of tracks associated with the jet, or a system of low-p_T jets recoiling against a high-p_T jet. More sophisticated jet calibration schemes are presented based on calorimeter cell energy density weighting or hadronic properties of jets, aiming for an improved jet energy resolution and a reduced flavour dependence of the jet response. The systematic uncertainty of the jet energy determined from a combination of in situ techniques is consistent with the one derived from single hadron response measurements over a wide kinematic range. The nominal corrections and uncertainties are derived for isolated jets in an inclusive sample of high-p_T jets. Special cases such as event topologies with close-by jets, or selections of samples with an enhanced content of jets originating from light quarks, heavy quarks or gluons are also discussed and the corresponding uncertainties are determined.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFM.H53G0942E','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFM.H53G0942E"><span>The Use of Radar-Based Products for Deriving Extreme Rainfall Frequencies Using Regional Frequency Analysis with Application in South Louisiana</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Eldardiry, H. A.; Habib, E. H.</p> <p>2014-12-01</p> <p>Radar-based technologies have made spatially and temporally distributed quantitative precipitation estimates (QPE) operationally available, in contrast to point measurements from rain gauges. 
The floods identified through flash flood monitoring and prediction systems are subject to at least three sources of uncertainty: (a) those related to rainfall estimation errors, (b) those due to streamflow prediction errors arising from model structural issues, and (c) those due to errors in defining a flood event. The current study focuses on the first source of uncertainty and its effect on deriving important climatological characteristics of extreme rainfall statistics. Examples of such characteristics are rainfall amounts with certain Average Recurrence Intervals (ARI) or Annual Exceedance Probability (AEP), which are highly valuable for hydrologic and civil engineering design purposes. Gauge-based precipitation frequency estimates (PFE) have matured and been widely used over the last several decades. More recently, there has been growing interest in the research community in exploring the use of radar-based rainfall products for developing PFE and understanding the associated uncertainties. This study will use radar-based multi-sensor precipitation estimates (MPE) for 11 years to derive PFEs corresponding to various return periods over a spatial domain that covers the state of Louisiana in the southern USA. The PFE estimation approach used in this study is based on fitting a generalized extreme value (GEV) distribution to hydrologic extreme rainfall data based on annual maximum series (AMS). Among the estimation problems that may arise from fitting GEV distributions at each radar pixel are large variance and seriously biased quantile estimators. Hence, a regional frequency analysis approach (RFA) is applied. The RFA involves the use of data from different pixels surrounding each pixel within a defined homogeneous region. In this study, the region-of-influence approach, along with the index-flood technique, is used in the RFA. 
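The at-site portion of the workflow just described — fitting a GEV to an annual maximum series (AMS), reading off quantiles for chosen recurrence intervals, and bootstrapping a 90% confidence interval — can be sketched as follows. This is a minimal illustration on synthetic data, not the study's MPE radar product; the sample size of 11 only mirrors the 11-year record mentioned above.

```python
# Sketch: at-site precipitation frequency estimation from an AMS.
# Synthetic stand-in data; a real analysis would use the radar MPE product.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic 11-year AMS of rainfall maxima (mm), one value per year.
ams = rng.gumbel(loc=120.0, scale=30.0, size=11)

# Fit a GEV; scipy's genextreme uses a shape parameter c (c = 0 is the Gumbel limit).
c, loc, scale = stats.genextreme.fit(ams)

def pfe(T):
    # PFE for average recurrence interval T years:
    # the (1 - 1/T) quantile of the annual-maximum distribution.
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

pfe_100 = pfe(100.0)  # 100-year (1% AEP) rainfall estimate

# Parametric bootstrap for a 90% confidence interval (5%/95% limits)
# on the 100-year quantile, reflecting parameter-estimation uncertainty.
boot = []
for _ in range(200):
    resample = stats.genextreme.rvs(c, loc=loc, scale=scale,
                                    size=ams.size, random_state=rng)
    bc, bloc, bscale = stats.genextreme.fit(resample)
    boot.append(stats.genextreme.ppf(0.99, bc, loc=bloc, scale=bscale))
lo, hi = np.percentile(boot, [5, 95])
```

A production analysis would repeat this at every radar pixel and then pool neighbouring pixels through the region-of-influence and index-flood regionalization, rather than rely on a single short at-site record.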
A bootstrap procedure is carried out to account for uncertainty in the distribution parameters and to construct 90% confidence intervals (i.e., 5% and 95% confidence limits) on AMS-based precipitation frequency curves.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFM.H13A1026D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFM.H13A1026D"><span>Evaluating the Impacts of an Agricultural Water Market in the Guadalupe River Basin, Texas: An Agent-based Modeling Approach</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Du, E.; Cai, X.; Minsker, B. S.</p> <p>2014-12-01</p> <p>Agriculture accounts for about 80 percent of the total water consumption in the US. Under conditions of water shortage and fully committed water rights, market-based water allocations could be promising instruments for redistributing agricultural water from marginally profitable areas to more profitable ones. Previous studies on water markets have mainly focused on theoretical or statistical analysis; less attention has been paid to how water users' heterogeneous physical attributes and decision rules about water use and water-right trading affect water market efficiency. In this study, we developed an agent-based model to evaluate the benefits of an agricultural water market in the Guadalupe River Basin during drought events. Agricultural agents with different attributes (i.e., soil type for crops, annual water diversion permit and precipitation) are defined to simulate the dynamic feedback between water availability, irrigation demand and water trading activity. Diversified crop irrigation rules and water bidding rules are tested in terms of crop yield, agricultural profit, and water-use efficiency. 
The model was coupled with a real-time hydrologic model and run under different water scarcity scenarios. Preliminary results indicate that an agricultural water market is capable of increasing crop yield, agricultural profit, and water-use efficiency. This capability is more significant under moderate drought scenarios than in mild and severe drought scenarios. The water market mechanism also increases agricultural resilience to climate uncertainty by reducing crop yield variance in drought events. The challenges of implementing an agricultural water market under climate uncertainty are also discussed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.usgs.gov/sir/2017/5038/sir20175038.pdf','USGSPUBS'); return false;" href="https://pubs.usgs.gov/sir/2017/5038/sir20175038.pdf"><span>Application of at-site peak-streamflow frequency analyses for very low annual exceedance probabilities</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Asquith, William H.; Kiang, Julie E.; Cohn, Timothy A.</p> <p>2017-07-17</p> <p>The U.S. Geological Survey (USGS), in cooperation with the U.S. Nuclear Regulatory Commission, has investigated statistical methods for probabilistic flood hazard assessment to provide guidance on very low annual exceedance probability (AEP) estimation of peak-streamflow frequency and the quantification of corresponding uncertainties using streamgage-specific data. The term “very low AEP” implies exceptionally rare events defined as those having AEPs less than about 0.001 (or 1 × 10⁻³ in scientific notation, 10⁻³ for brevity). Such low AEPs are of great interest to those involved with peak-streamflow frequency analyses for critical infrastructure, such as nuclear power plants. 
Flood frequency analyses at streamgages are most commonly based on annual instantaneous peak streamflow data and a probability distribution fit to these data. The fitted distribution provides a means to extrapolate to very low AEPs. Within the United States, the Pearson type III probability distribution, when fit to the base-10 logarithms of streamflow, is widely used, but other distribution choices exist. The USGS-PeakFQ software, implementing the Pearson type III within the Federal agency guidelines of Bulletin 17B (method of moments) and updates to the expected moments algorithm (EMA), was specially adapted for an “Extended Output” user option to provide estimates at selected AEPs from 10⁻³ to 10⁻⁶. Parameter estimation methods, in addition to product moments and EMA, include L-moments, maximum likelihood, and maximum product of spacings (maximum spacing estimation). This study comprehensively investigates multiple distributions and parameter estimation methods for two USGS streamgages (01400500 Raritan River at Manville, New Jersey, and 01638500 Potomac River at Point of Rocks, Maryland). The results of this study specifically involve the four methods for parameter estimation and up to nine probability distributions, including the generalized extreme value, generalized log-normal, generalized Pareto, and Weibull. Uncertainties in streamflow estimates for corresponding AEP are depicted and quantified in two primary forms: quantile (aleatoric [random sampling] uncertainty) and distribution-choice (epistemic [model] uncertainty). Sampling uncertainties of a given distribution are relatively straightforward to compute from analytical or Monte Carlo-based approaches. Distribution-choice uncertainty stems from choices of potentially applicable probability distributions for which divergence among the choices increases as AEP decreases. 
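The Bulletin 17B-style calculation described above — a Pearson type III distribution fit to base-10 logarithms of annual peaks by the method of moments, then extrapolated to very low AEPs via a frequency factor — can be sketched as below. The peak-flow series is synthetic and the station skew is used without regional weighting; a real analysis would use USGS-PeakFQ with EMA on the gaged record.

```python
# Sketch: log-Pearson III quantiles at very low AEPs (method of moments).
# Synthetic annual peaks stand in for a real streamgage record.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
peaks = rng.lognormal(mean=6.0, sigma=0.5, size=60)  # synthetic annual peaks

logq = np.log10(peaks)
m, s = logq.mean(), logq.std(ddof=1)
g = stats.skew(logq, bias=False)  # station skew of the log-transformed peaks

def peak_at_aep(aep):
    # Frequency factor K from the standardized Pearson III with skew g,
    # then back-transform from log10 space: Q = 10**(mean + K * std).
    K = stats.pearson3.ppf(1.0 - aep, g)
    return 10.0 ** (m + K * s)

q_1e3 = peak_at_aep(1e-3)  # AEP 10^-3 quantile
q_1e6 = peak_at_aep(1e-6)  # AEP 10^-6 quantile (deep extrapolation)
```

The widening gap between such quantiles under different candidate distributions is exactly the distribution-choice (epistemic) uncertainty the study quantifies: it grows as the AEP shrinks, because the fits agree near the data and diverge in the far tail.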
Conventional goodness-of-fit statistics, such as Cramér-von Mises, and L-moment ratio diagrams are demonstrated in order to hone distribution choice. The results generally show that distribution choice uncertainty is larger than sampling uncertainty for very low AEP values.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70034278','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70034278"><span>The 1170 and 1202 CE Dead Sea Rift earthquakes and long-term magnitude distribution of the Dead Sea Fault zone</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Hough, S.E.; Avni, R.</p> <p>2009-01-01</p> <p>In combination with the historical record, paleoseismic investigations have provided a record of large earthquakes in the Dead Sea Rift that extends back over 1500 years. Analysis of macroseismic effects can help refine magnitude estimates for large historical events. In this study we consider the detailed intensity distributions for two large events, in 1170 CE and 1202 CE, as determined from careful reinterpretation of available historical accounts, using the 1927 Jericho earthquake as a guide in their interpretation. In the absence of an intensity attenuation relationship for the Dead Sea region, we use the 1927 Jericho earthquake to develop a preliminary relationship based on a modification of the relationships developed in other regions. Using this relation, we estimate M7.6 for the 1202 earthquake and M6.6 for the 1170 earthquake. The uncertainties for both estimates are large and difficult to quantify with precision. The large uncertainties illustrate the critical need to develop a regional intensity attenuation relation. 
We further consider the distribution of magnitudes in the historical record and show that it is consistent with a b-value distribution with a b-value of 1. Considering the entire Dead Sea Rift zone, we show that the seismic moment release rate over the past 1500 years is sufficient, within the uncertainties of the data, to account for the plate tectonic strain rate along the plate boundary. The results reveal that an earthquake of M7.8 is expected within the zone on average every 1000 years. © 2011 Science From Israel/LPP Ltd.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..19.5231B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..19.5231B"><span>Flash floods in small Alpine catchments in a changing climate</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Breinl, Korbinian; Di Baldassarre, Giuliano</p> <p>2017-04-01</p> <p>Climate change is expected to increase the frequency and intensity of hazardous meteorological and hydrological events in numerous mountainous areas. The mountain environment is becoming increasingly important for urbanization and the tourism-based economy. Here we show new and innovative methodologies for assessing intensity and frequency of flash floods in small Alpine catchments, in South Tyrol (Italy), under climate change. This research is done within the STEEP STREAMS project, whereby we work closely with decision makers in Italian authorities, and the final goal is to provide them with clear guidelines on how to adapt current structural solutions for mitigating hazardous events under future climate conditions. To this end, we develop a coupled framework of weather generation (i.e. 
extrapolation of observations and trained with climate projections), time series disaggregation and hydrological modelling using the conceptual HBV model. One of the key challenges is the transfer of comparatively coarse RCM projections to small catchments, whose sizes range from only about 10 km² to 100 km². We examine different strategies to downscale the RCM data from e.g. the EURO-CORDEX dataset using our weather generator. The selected projections represent combinations of warmer, milder, drier and wetter conditions. In general, our main focus is to develop an improved understanding of the impact of the multiple sources of uncertainty in this modelling framework, and make these uncertainties tangible. The output of this study (i.e. discharge with a return period and associated uncertainty) will allow hydraulic and sediment transport modelling of flash floods and debris flows.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/FR-2012-08-09/pdf/2012-19580.pdf','FEDREG'); return false;" href="https://www.gpo.gov/fdsys/pkg/FR-2012-08-09/pdf/2012-19580.pdf"><span>77 FR 47552 - Event Data Recorders</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.gpo.gov/fdsys/browse/collection.action?collectionCode=FR">Federal Register 2010, 2011, 2012, 2013, 2014</a></p> <p></p> <p>2012-08-09</p> <p>... uncertainties in multiple event crashes; Revised certain sensor ranges and accuracies to reflect current state... resolution specification of 5 degrees. In its petition the Alliance stated that steering wheel angle sensors... angle sensors. 
Both Nissan and GAM submitted comments in support of the Alliance and Honda petitions to...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20070005030','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20070005030"><span>Evaluating Shielding Effectiveness for Reducing Space Radiation Cancer Risks</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Cucinotta, Francis A.; Kim, Myung-Hee Y.; Ren, Lei</p> <p>2007-01-01</p> <p>We discuss calculations of probability distribution functions (PDF) representing uncertainties in projecting fatal cancer risk from galactic cosmic rays (GCR) and solar particle events (SPE). The PDFs are used in significance tests of the effectiveness of potential radiation shielding approaches. Uncertainties in risk coefficients determined from epidemiology data, dose and dose-rate reduction factors, quality factors, and physics models of radiation environments are considered in models of cancer risk PDFs. Competing mortality risks and functional correlations in radiation quality factor uncertainties are treated in the calculations. We show that the cancer risk uncertainty, defined as the ratio of the 95% confidence level (CL) to the point estimate, is about 4-fold for lunar and Mars mission risk projections. For short-stay lunar missions (<180 d), SPEs present the most significant risk, though one that is mitigated effectively by shielding, especially for carbon composite structures with high hydrogen content. In contrast, for long-duration lunar (>180 d) or Mars missions, GCR risks may exceed radiation risk limits, with 95% CLs exceeding 10% fatal risk for males and females on a Mars mission. 
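The "4-fold" uncertainty quoted above — the ratio of the 95% confidence level to the point estimate — can be illustrated with a toy Monte Carlo in which several multiplicative uncertainty factors are lognormally distributed. The factor names, the point risk, and the geometric standard deviations below are invented for illustration; they are not the values used in the NASA projection.

```python
# Toy illustration: fold-uncertainty of a multiplicative risk model
# under lognormal uncertainty factors. All inputs are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
point_risk = 0.032  # hypothetical point estimate of fatal cancer risk

# Hypothetical geometric standard deviations (median factor = 1) for the
# risk coefficient, dose-rate reduction factor, and quality factor.
gsd = {"risk_coeff": 1.4, "ddref": 1.3, "quality": 1.8}

fold = np.ones(n)
for sigma_g in gsd.values():
    # lognormal with log-mean 0 has median 1; spread set by ln(GSD)
    fold *= rng.lognormal(mean=0.0, sigma=np.log(sigma_g), size=n)

risk_samples = point_risk * fold
cl95 = np.quantile(risk_samples, 0.95)
uncertainty_fold = cl95 / point_risk  # analogue of the cited ~4-fold ratio
```

Because the factors multiply, their log-variances add, and the 95% CL pulls several-fold above the point estimate even when each individual factor looks modest.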
For reducing GCR cancer risks, shielding materials are marginally effective because of the penetrating nature of GCR and secondary radiation produced in tissue by relativistic particles. At the present time, polyethylene or carbon composite shielding cannot be shown to significantly reduce risk compared to aluminum shielding based on a significance test that accounts for radiobiology uncertainties in GCR risk projection.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_18 --> <div id="page_19" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="361"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28873257','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28873257"><span>Deep Uncertainties in Sea-Level Rise and Storm Surge Projections: Implications for Coastal Flood Risk Management.</span></a></p> 
<p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Oddo, Perry C; Lee, Ben S; Garner, Gregory G; Srikrishnan, Vivek; Reed, Patrick M; Forest, Chris E; Keller, Klaus</p> <p>2017-09-05</p> <p>Sea levels are rising in many areas around the world, posing risks to coastal communities and infrastructures. Strategies for managing these flood risks present decision challenges that require a combination of geophysical, economic, and infrastructure models. Previous studies have broken important new ground on the considerable tensions between the costs of upgrading infrastructure and the damages that could result from extreme flood events. However, many risk-based adaptation strategies remain silent on certain potentially important uncertainties, as well as the tradeoffs between competing objectives. Here, we implement and improve on a classic decision-analytical model (Van Dantzig 1956) to: (i) capture tradeoffs across conflicting stakeholder objectives, (ii) demonstrate the consequences of structural uncertainties in the sea-level rise and storm surge models, and (iii) identify the parametric uncertainties that most strongly influence each objective using global sensitivity analysis. We find that the flood adaptation model produces potentially myopic solutions when formulated using traditional mean-centric decision theory. Moving from a single-objective problem formulation to one with multiobjective tradeoffs dramatically expands the decision space, and highlights the need for compromise solutions to address stakeholder preferences. We find deep structural uncertainties that have large effects on the model outcome, with the storm surge parameters accounting for the greatest impacts. Global sensitivity analysis effectively identifies important parameter interactions that local methods overlook, and that could have critical implications for flood adaptation strategies. 
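The variance-based global sensitivity analysis referenced above can be sketched with a brute-force double-loop estimator of first-order Sobol indices. The damage function and input distributions below are stand-ins invented for illustration; they are not the Van Dantzig flood model or the parameter ranges used in the paper.

```python
# Sketch: first-order Sobol indices S_i = Var(E[Y|X_i]) / Var(Y)
# by double-loop Monte Carlo, on a hypothetical damage function.
import numpy as np

rng = np.random.default_rng(1)
n_outer, n_inner = 500, 500

def damages(slr, surge, discount):
    # Hypothetical damage model: sea-level rise and surge interact
    # multiplicatively; discounting scales the result. Illustrative only.
    return (1.0 + slr) * np.exp(surge) / (1.0 + discount)

def sample(n):
    return (rng.normal(0.5, 0.2, n),     # sea-level rise (m), assumed
            rng.normal(1.0, 0.5, n),     # storm surge parameter, assumed
            rng.uniform(0.02, 0.07, n))  # discount rate, assumed

def first_order(index):
    cond_means = np.empty(n_outer)
    for j in range(n_outer):
        x = list(sample(n_inner))
        fixed = sample(1)[index][0]          # freeze input `index` at a draw
        x[index] = np.full(n_inner, fixed)
        cond_means[j] = damages(*x).mean()   # inner mean over the others
    y = damages(*sample(50_000))
    return cond_means.var() / y.var()

s_surge = first_order(1)
s_discount = first_order(2)
```

As in the study's finding that the storm surge parameters dominate, the exponential surge term here carries most of the output variance, and the double-loop estimate ranks it far above the discount rate. (Production work would use a pick-freeze/Saltelli scheme or a library estimator, which needs far fewer model runs.)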
© 2017 Society for Risk Analysis.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018AtmRe.204..136F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018AtmRe.204..136F"><span>Estimating the snowfall limit in alpine and pre-alpine valleys: A local evaluation of operational approaches</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Fehlmann, Michael; Gascón, Estíbaliz; Rohrer, Mario; Schwarb, Manfred; Stoffel, Markus</p> <p>2018-05-01</p> <p>The snowfall limit has important implications for different hazardous processes associated with prolonged or heavy precipitation such as flash floods, rain-on-snow events and freezing precipitation. To increase preparedness and to reduce risk in such situations, early warning systems are frequently used to monitor and predict precipitation events at different temporal and spatial scales. However, in alpine and pre-alpine valleys, the estimation of the snowfall limit remains rather challenging. In this study, we characterize uncertainties related to snowfall limit for different lead times based on local measurements of a vertically pointing micro rain radar (MRR) and a disdrometer in the Zulg valley, Switzerland. Regarding the monitoring, we show that the interpolation of surface temperatures tends to overestimate the altitude of the snowfall limit and can thus lead to highly uncertain estimates of liquid precipitation in the catchment. This bias is much smaller in the Integrated Nowcasting through Comprehensive Analysis (INCA) system, which integrates surface station and remotely sensed data as well as outputs of a numerical weather prediction model. 
To reduce systematic error, we perform a bias correction based on local MRR measurements and thereby demonstrate the added value of such measurements for the estimation of liquid precipitation in the catchment. Regarding the nowcasting, we show that the INCA system provides good estimates up to 6 h ahead and is thus considered promising for operational hydrological applications. Finally, we explore the medium-range forecasting of precipitation type, especially with respect to rain-on-snow events. We show for a selected case study that the probability of a certain precipitation type in an ensemble-based forecast is more persistent than the respective type in the high-resolution forecast (HRES) of the European Centre for Medium-Range Weather Forecasts Integrated Forecasting System (ECMWF IFS). In this case study, the ensemble-based forecast could be used to anticipate such an event up to 7-8 days ahead, whereas the use of the HRES is limited to a lead time of 4-5 days. For the different lead times investigated, we point out possibilities of considering uncertainties in snowfall limit and precipitation type estimates so as to increase preparedness in risk situations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/17923973','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/17923973"><span>Illness uncertainty and treatment motivation in type 2 diabetes patients.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Apóstolo, João Luís Alves; Viveiros, Catarina Sofia Castro; Nunes, Helena Isabel Ribeiro; Domingues, Helena Raquel Faustino</p> <p>2007-01-01</p> <p>To characterize the uncertainty in illness and the motivation for treatment and to evaluate the existing relation between these variables in individuals with type 2 diabetes. 
This descriptive, correlational study used a sample of 62 individuals attending diabetes consultation sessions. The Uncertainty Stress Scale and the Treatment Self-Regulation Questionnaire were used. The individuals with type 2 diabetes present low levels of uncertainty in illness and a high motivation for treatment, with a stronger intrinsic than extrinsic motivation. A negative correlation was found between uncertainty about prognosis and treatment and intrinsic motivation. These individuals are already adapted, acting according to the meanings they attribute to illness. Uncertainty can function as a threat, intervening negatively in the attribution of meaning to the events related to illness and in the process of adaptation and motivation to adhere to treatment. Intrinsic motivation seems to be essential to adhere to treatment.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/16554172','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/16554172"><span>The evaluation of uncertainty in low-level LSC measurements of water samples.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Rusconi, R; Forte, M; Caresana, M; Bellinzona, S; Cazzaniga, M T; Sgorbati, G</p> <p>2006-01-01</p> <p>The uncertainty in measurements of gross alpha and beta activities in water samples by liquid scintillation counting with alpha/beta discrimination has been evaluated considering the problems typical of low-level measurements of environmental samples. The use of a pulse shape analysis device to discriminate alpha and beta events introduces a correlation between some of the input quantities, which has to be taken into account. Main contributors to total uncertainty have been assessed by specifically designed experimental tests. 
Results have been fully examined and discussed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1810805B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1810805B"><span>Automated rapid finite fault inversion for megathrust earthquakes: Application to the Maule (2010), Iquique (2014) and Illapel (2015) great earthquakes</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Benavente, Roberto; Cummins, Phil; Dettmer, Jan</p> <p>2016-04-01</p> <p>Rapid estimation of the spatial and temporal rupture characteristics of large megathrust earthquakes by finite fault inversion is important for disaster mitigation. For example, estimates of the spatio-temporal evolution of rupture can be used to evaluate population exposure to tsunami waves and ground shaking soon after the event by providing more accurate predictions than possible with point source approximations. In addition, rapid inversion results can reveal seismic source complexity to guide additional, more detailed subsequent studies. This work develops a method to rapidly estimate the slip distribution of megathrust events while reducing subjective parameter choices by automation. The method is simple yet robust and we show that it provides excellent preliminary rupture models in as little as 30 minutes for three great earthquakes in the South-American subduction zone. This may change somewhat for other regions depending on seismic station coverage, but the method can be applied to any subduction region. The inversion is based on W-phase data since it is rapidly and widely available and of low amplitude, which avoids clipping at close stations for large events. In addition, prior knowledge of the slab geometry (e.g. 
SLAB 1.0) is applied and rapid W-phase point source information (time delay and centroid location) is used to constrain the fault geometry and extent. Since the linearization by multiple time window (MTW) parametrization requires regularization, objective smoothing is achieved by the discrepancy principle in two fully automated steps. First, the residuals are estimated assuming unknown noise levels; second, a solution is sought that fits the data to the estimated noise level. The MTW scheme is applied with positivity constraints and a solution is obtained by an efficient non-negative least squares solver. Systematic application of the algorithm to the Maule (2010), Iquique (2014) and Illapel (2015) events illustrates that rapid finite fault inversion with teleseismic data is feasible and provides meaningful results. The results for the three events show excellent data fits and are consistent with other solutions, showing most of the slip occurring close to the trench for the Maule and Illapel events and some deeper slip for the Iquique event. Importantly, the Illapel source model predicts tsunami waveforms in close agreement with observed waveforms. Finally, we develop a new Bayesian approach to approximate uncertainties as part of the rapid inversion scheme with positivity constraints. Uncertainties are estimated by approximating the posterior distribution as a multivariate log-normal distribution. 
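The positivity-constrained, regularized least-squares step described above can be illustrated generically. The kernel `G`, data `d`, and projected-gradient solver below are synthetic stand-ins, not the authors' W-phase implementation:

```python
import numpy as np

# Minimal non-negative least squares with Tikhonov regularization via
# projected gradient descent: minimize ||G m - d||^2 + lam ||m||^2 s.t. m >= 0.
# G and d are synthetic stand-ins for a W-phase kernel and observed data.

def nnls_projected(G, d, lam=1e-2, iters=5000):
    GtG = G.T @ G + lam * np.eye(G.shape[1])
    Gtd = G.T @ d
    step = 1.0 / np.linalg.norm(GtG, 2)        # safe step size (1 / Lipschitz const.)
    m = np.zeros(G.shape[1])
    for _ in range(iters):
        m = np.maximum(0.0, m - step * (GtG @ m - Gtd))  # gradient step + projection
    return m

rng = np.random.default_rng(0)
G = rng.standard_normal((40, 10))
m_true = np.maximum(0.0, rng.standard_normal(10))        # non-negative "slip"
d = G @ m_true
m_est = nnls_projected(G, d, lam=1e-6)
print(np.all(m_est >= 0), np.allclose(m_est, m_true, atol=1e-3))
```

In the paper's scheme the smoothing weight would be chosen objectively by the discrepancy principle rather than fixed in advance as it is here.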
While solving for the posterior adds some additional computational cost, we illustrate that uncertainty estimation is important for meaningful interpretation of finite fault models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..17.3399G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..17.3399G"><span>Validation and evaluation of epistemic uncertainty in rainfall thresholds for regional scale landslide forecasting</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gariano, Stefano Luigi; Brunetti, Maria Teresa; Iovine, Giulio; Melillo, Massimo; Peruccacci, Silvia; Terranova, Oreste Giuseppe; Vennari, Carmela; Guzzetti, Fausto</p> <p>2015-04-01</p> <p>Prediction of rainfall-induced landslides can rely on empirical rainfall thresholds. These are obtained from the analysis of past rainfall events that have (or have not) resulted in slope failures. Accurate prediction requires reliable thresholds, which need to be validated before their use in operational landslide warning systems. Despite the clear relevance of validation, only a few studies have addressed the problem, and have proposed and tested robust validation procedures. We propose a validation procedure that allows for the definition of optimal thresholds for early warning purposes. The validation is based on contingency table, skill scores, and receiver operating characteristic (ROC) analysis. To establish the optimal threshold, which maximizes the correct landslide predictions and minimizes the incorrect predictions, we propose an index that results from the linear combination of three weighted skill scores. Selection of the optimal threshold depends on the scope and the operational characteristics of the early warning system. 
The choice is made by selecting the weights appropriately, and by searching for the optimal (maximum) value of the index. We discuss weaknesses in the validation procedure caused by the inherent lack of information (epistemic uncertainty) on landslide occurrence typical of large study areas. When working at the regional scale, landslides may have occurred but not been reported. This results in biases and variations in the contingencies and the skill scores. We introduce two parameters to represent the unknown proportion of rainfall events (above and below the threshold) for which landslides occurred and went unreported. We show that even a very small underestimation in the number of landslides can result in a significant decrease in the performance of a threshold measured by the skill scores. We show that the variations in the skill scores differ depending on whether the uncertainty concerns events above or below the threshold. This has consequences for the ROC analysis. We applied the proposed procedure to a catalogue of rainfall conditions that have resulted in landslides, and to a set of rainfall events that - presumably - have not resulted in landslides, in Sicily, in the period 2002-2012. First, we determined regional event duration-cumulated event (ED) rainfall thresholds for shallow landslide occurrence using 200 rainfall conditions that have resulted in 223 shallow landslides in Sicily in the period 2002-2011. Next, we validated the thresholds using 29 rainfall conditions that have triggered 42 shallow landslides in Sicily in 2012, and 1250 rainfall events that presumably have not resulted in landslides in the same year. 
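The effect of unreported landslides on measured threshold performance can be sketched with a contingency table and the true skill statistic (one common skill score; the counts below are hypothetical):

```python
# Contingency-table skill of a rainfall threshold, and its bias when a
# fraction of "no-landslide" events actually had unreported landslides.
# All counts are hypothetical.

def true_skill_statistic(tp, fn, fp, tn):
    """TSS = hit rate - false alarm rate."""
    return tp / (tp + fn) - fp / (fp + tn)

# Reported catalogue: 40 hits, 2 misses, 150 false alarms, 1100 correct negatives.
tss_reported = true_skill_statistic(40, 2, 150, 1100)

# Suppose 30 of the 150 "false alarms" were real but unreported landslides:
# they become hits, shrinking the false-alarm rate, so the same threshold
# scores higher -- i.e. the reported score was biased low by underreporting.
tss_corrected = true_skill_statistic(40 + 30, 2, 150 - 30, 1100)

print(round(tss_reported, 3), round(tss_corrected, 3))
```

Varying the assumed proportion of unreported events above and below the threshold, as the two parameters introduced above do, traces out how sensitive the score is to this epistemic uncertainty.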
We performed a back analysis simulating the use of the thresholds in a hypothetical landslide warning system operating in 2012.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018JHEP...01..144H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018JHEP...01..144H"><span>Model-independent determination of the strong phase difference between D0 and D̄0 → π+π-π+π- amplitudes</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Harnew, Samuel; Naik, Paras; Prouve, Claire; Rademacker, Jonas; Asner, David</p> <p>2018-01-01</p> <p>For the first time, the strong phase difference between D0 and D̄0 → π+π-π+π- amplitudes is determined in bins of the decay phase space. The measurement uses 818 pb-1 of e+e- collision data taken at the ψ(3770) resonance and collected by the CLEO-c experiment. The measurement is important for the determination of the CP-violating phase γ in B± → DK± (and similar) decays, where the D meson (which represents a superposition of D0 and D̄0) subsequently decays to π+π-π+π-. To obtain optimal sensitivity to γ, the phase space of the D → π+π-π+π- decay is divided into bins based on a recent amplitude model of the decay. Although an amplitude model is used to define the bins, the measurements obtained are model-independent. The CP-even fraction of the D → π+π-π+π- decay is determined to be F+4π = 0.769 ± 0.021 ± 0.010, where the uncertainties are statistical and systematic, respectively. 
Using simulated B± → DK±, D → π+π-π+π- decays, it is estimated that by the end of the current LHC run, the LHCb experiment could determine γ from this decay mode with an uncertainty of (±10 ± 7)°, where the first uncertainty is statistical based on estimated LHCb event yields, and the second is due to the uncertainties on the parameters determined in this paper.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26199367','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26199367"><span>Phylogenetic congruence of parasitic smut fungi (Anthracoidea, Anthracoideaceae) and their host plants (Carex, Cyperaceae): Cospeciation or host-shift speciation?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Escudero, Marcial</p> <p>2015-07-01</p> <p>• Fahrenholz's rule states that common ancestors of extant parasites were parasites of the common ancestors of extant hosts. Consequently, parasite phylogeny should mirror host phylogeny. The smut fungi genus Anthracoidea (Anthracoideaceae) is mainly hosted by species of the genus Carex (Cyperaceae). Whether smut fungi phylogeny mirrors sedge phylogeny is still under debate.• The nuclear large subunit DNA region (LSU; 57 accessions) from 31 Anthracoidea species and the ITS, ETS, and trnL-F spacer-trnL intron complex from 41 Carex species were used to infer the phylogenetic history of parasites and their hosts using a maximum likelihood approach. 
Event-based and distance-based cophylogenetic methods were used to test whether the phylogeny of smut fungi from the genus Anthracoidea matches the phylogeny of the Carex sedge species that host them.• Cophylogenetic reconstructions taking into account phylogenetic uncertainties based on event-based analyses demonstrated that the Anthracoidea phylogeny has significant topological congruence with the phylogeny of their Carex hosts. A distance-based test was also significant; therefore, the phylogenies of Anthracoidea and Carex are partially congruent.• The phylogenetic congruence of Anthracoidea and Carex is partially based on smut fungi species being preferentially hosted by closely related sedges (host conservatism). In addition, many different events rather than only codivergence events are inferred. All of this evidence suggests that host-shift speciation rather than cospeciation explains the cophylogenetic patterns of Anthracoidea and Carex. © 2015 Botanical Society of America, Inc.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018JSeis..22..439S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018JSeis..22..439S"><span>Two large earthquakes in western Switzerland in the sixteenth century: 1524 in Ardon (VS) and 1584 in Aigle (VD)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Schwarz-Zanetti, Gabriela; Fäh, Donat; Gache, Sylvain; Kästli, Philipp; Loizeau, Jeanluc; Masciadri, Virgilio; Zenhäusern, Gregor</p> <p>2018-03-01</p> <p>The Valais is the most seismically active region of Switzerland. Strong damaging events occurred in 1755, 1855, and 1946. Based on historical documents, we discuss two known damaging events in the sixteenth century: the 1524 Ardon and the 1584 Aigle earthquakes. 
For the 1524 event, a document describes damage in Ardon, Plan-Conthey, and Savièse, and a stone tablet at the new bell tower of the Ardon church confirms the reconstruction of the bell tower after the earthquake. Additionally, significant construction activity on Upper Valais churches during the second quarter of the sixteenth century is discussed, although it cannot be clearly related to this event. The assessed moment magnitude Mw of the 1524 event is 5.8, with an error of about 0.5 units corresponding to one standard deviation. The epicenter is at 46.27 N, 7.27 E with a high uncertainty of about 50 km corresponding to one standard deviation. The assessed moment magnitude Mw of the 1584 main shock is 5.9, with an error of about 0.25 units corresponding to one standard deviation. The epicenter is at 46.33 N and 6.97 E with an uncertainty of about 25 km corresponding to one standard deviation. Exceptional movements in Lake Geneva wreaked havoc along the shore of the Rhone delta. The large extent of the induced damage can be explained by an extensive subaquatic slide with a resultant tsunami and seiche in Lake Geneva. 
The strongest of the aftershocks occurred on March 14 with magnitude 5.4 and triggered a destructive landslide covering the villages Corbeyrier and Yvorne, VD.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=classical+AND+mechanics&pg=4&id=EJ956759','ERIC'); return false;" href="https://eric.ed.gov/?q=classical+AND+mechanics&pg=4&id=EJ956759"><span>Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie</p> <p>2011-01-01</p> <p>Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018GeoJI.213..940L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018GeoJI.213..940L"><span>Impact of magnitude uncertainties on seismic catalogue properties</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Leptokaropoulos, K. M.; Adamaki, A. K.; Roberts, R. G.; Gkarlaouni, C. G.; Paradisopoulou, P. M.</p> <p>2018-05-01</p> <p>Catalogue-based studies are of central importance in seismological research, to investigate the temporal, spatial and size distribution of earthquakes in specified study areas. 
Methods for estimating the fundamental catalogue parameters like the Gutenberg-Richter (G-R) b-value and the completeness magnitude (Mc) are well established and routinely applied. However, the magnitudes reported in seismicity catalogues contain measurement uncertainties which may significantly distort the estimation of the derived parameters. In this study, we use numerical simulations of synthetic data sets to assess the reliability of different methods for determining b-value and Mc, assuming the validity of the G-R law. After contaminating the synthetic catalogues with Gaussian noise (with selected standard deviations), the analysis is performed for numerous data sets of different sample size (N). The noise introduced to the data generally leads to a systematic overestimation of magnitudes close to and above Mc. This causes an increase in the average number of events above Mc, which in turn leads to an apparent decrease of the b-value. This may result in a significant overestimation of seismicity rate even well above the actual completeness level. The b-value can in general be reliably estimated even for relatively small data sets (N < 1000) when only magnitudes higher than the actual completeness level are used. Nevertheless, a correction of the total number of events belonging to each magnitude class (i.e., 0.1-unit bins) should be considered to deal with the magnitude-uncertainty effect. Because magnitude uncertainties (here in the form of Gaussian noise) are inevitable in all instrumental catalogues, this finding is fundamental for seismicity rate and seismic hazard assessment analyses. Also important is that, for some analyses, significant bias cannot necessarily be avoided by choosing a high Mc value. 
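The count inflation described above is easy to reproduce: simulate Gutenberg-Richter magnitudes, contaminate them with Gaussian noise, and count events above a completeness threshold. Here b = 1, sigma = 0.2 and the thresholds are illustrative choices (not the paper's exact setup), and the estimator is Aki's maximum-likelihood form:

```python
import numpy as np

# Synthetic Gutenberg-Richter catalogue contaminated with Gaussian magnitude
# noise: events cross the completeness threshold preferentially upward,
# because more events sit just below Mc than just above it.
rng = np.random.default_rng(42)

b_true, m_min, mc, sigma = 1.0, 1.0, 2.0, 0.2
beta = b_true * np.log(10.0)
mags = m_min + rng.exponential(1.0 / beta, size=200_000)   # "true" G-R magnitudes
noisy = mags + rng.normal(0.0, sigma, size=mags.size)      # measured magnitudes

n_true = np.count_nonzero(mags >= mc)
n_noisy = np.count_nonzero(noisy >= mc)

def b_aki(m, mc):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= mc."""
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - mc)

print(n_noisy > n_true)                    # apparent rate above Mc is inflated
print(b_aki(mags, mc), b_aki(noisy, mc))   # noise distorts the estimate
```

The inflation factor for the apparent rate grows with the noise level, which is why a correction to the event counts per magnitude class matters even when the b-value itself looks stable.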
In such cases, there may be a risk of severe miscalculation of the seismicity rate regardless of the selected magnitude threshold, unless possible bias is properly assessed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26436729','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26436729"><span>Experimental Research Examining How People Can Cope with Uncertainty Through Soft Haptic Sensations.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>van Horen, Femke; Mussweiler, Thomas</p> <p>2015-09-16</p> <p>Human beings are constantly surrounded by uncertainty and change. The question arises as to how people cope with such uncertainty. To date, most research has focused on the cognitive strategies people adopt to deal with uncertainty. However, especially when uncertainty is due to unpredictable societal events (e.g., economic crises, political revolutions, terrorism threats) whose impact on one's future life one is unable to judge, cognitive strategies (like seeking additional information) are likely to fail to combat uncertainty. Instead, the current paper discusses a method demonstrating that people might deal with uncertainty experientially through soft haptic sensations. More specifically, because touching something soft creates a feeling of comfort and security, people prefer objects with softer rather than harder properties when feeling uncertain. Seeking softness is a highly efficient and effective way to deal with uncertainty, as our hands are available at all times. 
This protocol describes a set of methods demonstrating 1) how environmental (un)certainty can be situationally activated with an experiential priming procedure, 2) that the quality of the softness experience (what type of softness and how it is experienced) matters, and 3) how uncertainty can be reduced using different methods.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AGUFM.B43C0503S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AGUFM.B43C0503S"><span>Modeling dynamics of western juniper under climate change in a semiarid ecosystem</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Shrestha, R.; Glenn, N. F.; Flores, A. N.</p> <p>2013-12-01</p> <p>Modeling future vegetation dynamics in response to climate change and disturbances such as fire relies heavily on model parameterization. Fine-scale field-based measurements can provide the necessary parameters for constraining models at a larger scale. However, the time- and labor-intensive nature of field-based data collection leads to sparse sampling and significant spatial uncertainties in retrieved parameters. In this study we quantify the fine-scale carbon dynamics and uncertainty of juniper woodland in the Reynolds Creek Experimental Watershed (RCEW) in southern Idaho, which is a proposed critical zone observatory (CZO) site for soil carbon processes. We leverage field-measured vegetation data along with airborne lidar and time-series Landsat imagery to initialize a state-and-transition model (VDDT) and a process-based fire model (FlamMap) to examine the vegetation dynamics in response to stochastic fire events and climate change. 
We utilize recently developed techniques to measure biomass and canopy characteristics of western juniper at the individual tree scale using terrestrial and airborne laser scanning in RCEW. These fine-scale data are upscaled across the watershed for the VDDT and FlamMap models. The results will immediately improve our understanding of fine-scale dynamics and carbon stocks and fluxes of woody vegetation in a semi-arid ecosystem. Moreover, quantification of uncertainty will also provide a basis for generating ensembles of spatially-explicit alternative scenarios to guide future land management decisions in the region.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1912797N','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1912797N"><span>Rainfall or parameter uncertainty? The power of sensitivity analysis on grouped factors</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Nossent, Jiri; Pereira, Fernando; Bauwens, Willy</p> <p>2017-04-01</p> <p>Hydrological models are typically used to study and represent (a part of) the hydrological cycle. In general, the output of these models mostly depends on their input rainfall and parameter values. Both model parameters and input precipitation, however, are characterized by uncertainties and, therefore, lead to uncertainty in the model output. Sensitivity analysis (SA) allows one to assess and compare the importance of the different factors for this output uncertainty. To this end, the rainfall uncertainty can be incorporated in the SA by representing it as a probabilistic multiplier. Such a multiplier can be defined for the entire time series, or several of these factors can be determined for every recorded rainfall pulse or for hydrologically independent storm events. 
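Treating rainfall multipliers and model parameters together in a variance-based SA can be sketched with a first-order Sobol' estimator applied to groups of inputs; the toy linear model, its coefficients, and the group assignments below are hypothetical, not the NAM or HyMod setup:

```python
import numpy as np

# First-order Sobol' index for a *group* of inputs, using the Saltelli-style
# estimator S_G = E[f(B) * (f(A_B^G) - f(A))] / Var[f], where A_B^G takes the
# columns in group G from matrix B. Toy linear model with U(0,1) inputs.
rng = np.random.default_rng(1)
N = 200_000
a = np.array([4.0, 2.0, 1.0])        # x1, x2 = "rainfall multipliers", x3 = "parameter"

def f(X):
    return X @ a                     # toy model

A = rng.uniform(size=(N, 3))
B = rng.uniform(size=(N, 3))

def sobol_group(group):
    AB = A.copy()
    AB[:, group] = B[:, group]       # swap in the group's columns from B
    fA, fB, fAB = f(A), f(B), f(AB)
    return np.mean(fB * (fAB - fA)) / np.var(np.concatenate([fA, fB]))

s_rain = sobol_group([0, 1])         # grouped "rainfall" factors
s_param = sobol_group([2])           # "model parameter" factor
# analytic values for this linear model: (16 + 4)/21 and 1/21
print(round(s_rain, 2), round(s_param, 2))
```

Collapsing each uncertainty type to a single grouped factor, as here, is what makes the two importances directly comparable regardless of how many multipliers the rainfall group contains.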
As a consequence, the number of parameters included in the SA related to the rainfall uncertainty can be (much) lower or (much) higher than the number of model parameters. Although such analyses can yield interesting results, it remains challenging to determine which type of uncertainty will affect the model output most due to the different weight both types will have within the SA. In this study, we apply the variance-based Sobol' sensitivity analysis method to two different hydrological simulators (NAM and HyMod) for four diverse watersheds. Besides the different number of model parameters (NAM: 11 parameters; HyMod: 5 parameters), the setup of our combined sensitivity and uncertainty analysis is also varied by defining a variety of scenarios with diverse numbers of rainfall multipliers. To overcome the issue of the different number of factors and, thus, the different weights of the two types of uncertainty, we build on one of the advantageous properties of the Sobol' SA, i.e., treating grouped parameters as a single parameter. The latter results in a setup with a single factor for each uncertainty type and allows for a straightforward comparison of their importance. In general, the results show a clear influence of the weights in the different SA scenarios. However, working with grouped factors resolves this issue and leads to clear importance rankings.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.C11A0893P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.C11A0893P"><span>Challenges in Understanding and Predicting Greenland Lake Drainage Events</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Poinar, K.; Andrews, L. C.; Moon, T. 
A.; Nowicki, S.</p> <p>2017-12-01</p> <p>To accurately predict ice flow, an ice-sheet model must resolve the complex spatio-temporal variability of the ice-sheet hydrologic system. For Greenland, this requires understanding rapid lake drainage events, by which moulins deliver water from supraglacial lakes to the ice-sheet base. Critical metrics include the drainage event location and its timing during the melt season. Here, we use multiple remote sensing datasets to investigate whether local principal strain rates control the dates of rapid supraglacial lake drainage events. We identify 359 rapid lake drainage events through a semi-automated analysis of MODIS and Landsat imagery, which we apply to Pâkitsoq, western Greenland, over nine summers (2006-2010 and 2013-2016). We compare these drainage dates to principal strain rates derived from InSAR (MEaSUREs and other products) and Landsat (GoLIVE and other products) satellite data over the same years. The InSAR-derived strain rates have lower uncertainties (~0.01 yr-1) but capture only a wintertime average; the Landsat-derived strain rates have larger uncertainties (~0.1 yr-1) but feature higher temporal resolution (≥16 days) and span the entire year, including the melt season. We find that locations with more-tensile wintertime strain rates are associated with earlier draining of supraglacial lakes in the subsequent summer. This is consistent with observations of lake drainage "clusters" or "cascades", where the perturbation from an initial lake drainage event is thought to trigger other lake drainages in the area. Our relation is not statistically significant, however, and any causality is complicated by a stronger correlation with more traditional metrics such as surface elevation and cumulative melt days. 
We also find that the Landsat-derived summertime strain rates, despite their higher temporal resolution, do not resolve the transient extensional strain rates known from GPS observations to accompany and/or incite rapid lake drainages. Our results highlight the current challenges in observing, at the regional scale, the causes of rapid lake drainage events, which must be better understood in order to parameterize surface-to-bed hydrological connections in ice-sheet models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70195523','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70195523"><span>Variability in soil-water retention properties and implications for physics-based simulation of landslide early warning criteria</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Thomas, Matthew A.; Mirus, Benjamin B.; Collins, Brian D.; Lu, Ning; Godt, Jonathan W.</p> <p>2018-01-01</p> <p>Rainfall-induced shallow landsliding is a persistent hazard to human life and property. Despite the observed connection between infiltration through the unsaturated zone and shallow landslide initiation, there is considerable uncertainty in how estimates of unsaturated soil-water retention properties affect slope stability assessment. This source of uncertainty is critical to evaluating the utility of physics-based hydrologic modeling as a tool for landslide early warning. We employ a numerical model of variably saturated groundwater flow parameterized with an ensemble of texture-, laboratory-, and field-based estimates of soil-water retention properties for an extensively monitored landslide-prone site in the San Francisco Bay Area, CA, USA. 
Simulations of soil-water content, pore-water pressure, and the resultant factor of safety show considerable variability across and within these different parameter estimation techniques. In particular, we demonstrate that with the same permeability structure imposed across all simulations, the variability in soil-water retention properties strongly influences predictions of positive pore-water pressure coincident with widespread shallow landsliding. We also find that the ensemble of soil-water retention properties imposes an order-of-magnitude and nearly two-fold variability in seasonal and event-scale landslide susceptibility, respectively. Despite the reduced factor of safety uncertainty during wet conditions, parameters that control the dry end of the soil-water retention function markedly impact the ability of a hydrologic model to capture soil-water content dynamics observed in the field. These results suggest that variability in soil-water retention properties should be considered for objective physics-based simulation of landslide early warning criteria.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.S23A0783C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.S23A0783C"><span>Broad-band Lg Attenuation Tomography in Eastern Eurasia and The Resolution, Uncertainty and Data Predication</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Chen, Y.; Xu, X.</p> <p>2017-12-01</p> <p>The broad band Lg 1/Q tomographic models in eastern Eurasia are inverted from source- and site-corrected path 1/Q data. The path 1/Q are measured between stations (or events) by the two-station (TS), reverse two-station (RTS) and reverse two-event (RTE) methods, respectively. 
Because path 1/Q are computed from the logarithm of the product of observed spectral ratios and a simplified 1D geometrical spreading correction, they are subject to "modeling errors" dominated by uncompensated 3D structural effects. We have found in Chen and Xie [2017] that these errors closely follow a normal distribution after the long-tailed outliers are screened out (similar to teleseismic travel time residuals). We thus rigorously analyze the statistics of these errors collected from repeated samplings of station (and event) pairs from 1.0 to 10.0 Hz and reject about 15% of outliers at each frequency band. The resultant variance of Δ/Q decreases with frequency as 1/f². The 1/Q tomography using screened data is now a stochastic inverse problem whose solutions approximate the means of Gaussian random variables, and the model covariance matrix is that of Gaussian variables with well-known statistical behavior. We adopt a new SVD-based tomographic method to solve for the 2D Q image together with its resolution and covariance matrices. The RTS and RTE yield the most reliable 1/Q data, free of source and site effects, but the path coverage is rather sparse due to the very strict recording geometry. The TS absorbs the effects of non-unit site response ratios into the 1/Q data. The RTS also yields site responses, which can then be corrected from the path 1/Q of TS to make them also free of site effects. The site-corrected TS data substantially improve path coverage, allowing 1/Q tomography to be solved up to 6.0 Hz. The model resolution and uncertainty are first quantitatively assessed using spread functions (computed from the resolution matrix) and the covariance matrix. The reliably retrieved Q models correlate well with the distinct tectonic blocks shaped by the most recent major deformations, and vary with frequency. With the 1/Q tomographic model and its covariance matrix, we can formally estimate the uncertainty of any path-specific Lg 1/Q prediction.
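The SVD-based stochastic inversion described above, which returns a model estimate together with its resolution and covariance matrices, can be sketched with a generic truncated-SVD example on synthetic data (this is a textbook construction, not the authors' implementation; the matrix sizes and noise level are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.random((40, 10))            # hypothetical path-length kernel: 40 paths, 10 cells
m_true = 0.002 + 0.001 * rng.random(10)   # "true" cell 1/Q values
sigma_d = 1e-4                      # data std dev (screened, approximately Gaussian)
d = G @ m_true + sigma_d * rng.normal(size=40)

# Truncated SVD generalized inverse
U, s, Vt = np.linalg.svd(G, full_matrices=False)
p = int(np.sum(s > 1e-3 * s[0]))    # keep well-constrained singular values
Up, sp, Vp = U[:, :p], s[:p], Vt[:p, :].T

m_est = Vp @ ((Up.T @ d) / sp)                       # model estimate
R = Vp @ Vp.T                                        # model resolution matrix
C = sigma_d**2 * Vp @ np.diag(1.0 / sp**2) @ Vp.T    # model covariance matrix

# Uncertainty of a path-specific 1/Q prediction g^T m (here: the first path):
g = G[0]
pred_var = g @ C @ g
```

The resolution matrix `R` shows how well each cell is independently recovered (its trace equals the number of retained singular values), while `C` propagates the data variance into model-parameter and path-prediction uncertainties.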
This new capability significantly benefits source estimation, for which reliable uncertainty estimates are especially important.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..17..434J','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..17..434J"><span>How does higher frequency monitoring data affect the calibration of a process-based water quality model?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Jackson-Blake, Leah; Helliwell, Rachel</p> <p>2015-04-01</p> <p>Process-based catchment water quality models are increasingly used as tools to inform land management. However, for such models to be reliable they need to be well calibrated and shown to reproduce key catchment processes. Calibration can be challenging for process-based models, which tend to be complex and highly parameterised. Calibrating a large number of parameters generally requires a large amount of monitoring data, spanning all hydrochemical conditions. However, regulatory agencies and research organisations generally only sample at a fortnightly or monthly frequency, even in well-studied catchments, often missing peak flow events. The primary aim of this study was therefore to investigate how the quality and uncertainty of model simulations produced by a process-based, semi-distributed catchment model, INCA-P (the INtegrated CAtchment model of Phosphorus dynamics), were improved by calibration to higher frequency water chemistry data. Two model calibrations were carried out for a small rural Scottish catchment: one using 18 months of daily total dissolved phosphorus (TDP) concentration data, another using a fortnightly dataset derived from the daily data.
To aid comparability, calibrations were carried out automatically using the Markov Chain Monte Carlo - DiffeRential Evolution Adaptive Metropolis (MCMC-DREAM) algorithm. Calibration to daily data resulted in improved simulation of peak TDP concentrations and improved model performance statistics. Parameter-related uncertainty in simulated TDP was large when fortnightly data was used for calibration, with a 95% credible interval of 26 μg/l. This uncertainty is comparable in size to the difference between Water Framework Directive (WFD) chemical status classes, and would therefore make it difficult to use this calibration to predict shifts in WFD status. The 95% credible interval reduced markedly with the higher frequency monitoring data, to 6 μg/l. The number of parameters that could be reliably auto-calibrated was lower for the fortnightly data, with a physically unrealistic TDP simulation being produced when too many parameters were allowed to vary during model calibration. Parameters should not therefore be varied spatially for models such as INCA-P unless there is solid evidence that this is appropriate, or there is a real need to do so for the model to fulfil its purpose. This study highlights the potential pitfalls of using low frequency timeseries of observed water quality to calibrate complex process-based models. For reliable model calibrations to be produced, monitoring programmes need to be designed which capture system variability, in particular nutrient dynamics during high flow events. 
In addition, there is a need for simpler models, so that all model parameters can be included in auto-calibration and uncertainty analysis, and to reduce the data needs during calibration.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013E%26PSL.366..151G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013E%26PSL.366..151G"><span>Coseismic slip variation assessed from terrestrial lidar scans of the El Mayor-Cucapah surface rupture</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gold, Peter O.; Oskin, Michael E.; Elliott, Austin J.; Hinojosa-Corona, Alejandro; Taylor, Michael H.; Kreylos, Oliver; Cowgill, Eric</p> <p>2013-03-01</p> <p>We analyze high-resolution (>10³ points/m²) terrestrial lidar surveys of the 4 April 2010 El Mayor-Cucapah earthquake rupture (Baja California, Mexico), collected at three sites 12-18 days after the event. Using point cloud-based tools in an immersive visualization environment, we quantify coseismic fault slip for hundreds of meters along strike and construct densely constrained along-strike slip distributions from measurements of offset landforms. Uncertainty bounds for each offset, determined empirically by repeatedly measuring offsets at each site sequentially, illuminate measurement uncertainties that are difficult to quantify in the field. These uncertainties are used to define length scales over which variability in slip distributions may be assumed to reflect either recognizable earthquake mechanisms or measurement noise. At two sites characterized by 2-3 m of concentrated right-oblique slip, repeat measurements yield 2σ uncertainties of ±11-12%. Each site encompasses ∼200 m along strike, and a smoothed linear slip gradient satisfies all measurement distributions, implying along-fault strains of ∼10⁻³.
Conversely, the common practice of defining the slip curve by the local slip maxima distorts the curve, overestimates along-fault strain, and may overestimate actual fault slip by favoring measurements with large, positive uncertainties. At a third site characterized by 1-2.5 m of diffuse normal slip, repeat measurements of fault throw summed along fault-perpendicular profiles yield 2σ uncertainties of ±17%. Here, a low-order polynomial fit through the measurement averages best approximates surface slip. However, independent measurements of off-fault strain accommodated by hanging wall flexure suggest that over the ∼200 m length of this site, a linear interpolation through the average values for the slip maxima at either end of this site most accurately represents subsurface displacement. In aggregate, these datasets show that given uncertainties of greater than ±11% (2σ), slip distributions over shorter scales are likely to be less uneven than those derived from a single set of field- or lidar-based measurements. This suggests that the relatively smooth slip curves we obtain over ∼10² m distances reflect real physical phenomena, whereas short-wavelength variability over ∼10⁰-10¹ m distances can be attributed to measurement uncertainty.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.fs.usda.gov/treesearch/pubs/43795','TREESEARCH'); return false;" href="https://www.fs.usda.gov/treesearch/pubs/43795"><span>Modeling wildfire incident complexity dynamics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.fs.usda.gov/treesearch/">Treesearch</a></p> <p>Matthew P. Thompson</p> <p>2013-01-01</p> <p>Wildfire management in the United States and elsewhere is challenged by substantial uncertainty regarding the location and timing of fire events, the socioeconomic and ecological consequences of these events, and the costs of suppression. Escalating U.S.
Forest Service suppression expenditures are of particular concern at a time of fiscal austerity as swelling fire...</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_19 --> <div id="page_20" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="381"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMPA53B0268S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMPA53B0268S"><span>Risk-based decision making to manage water quality failures caused by combined sewer overflows</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sriwastava, A. K.; Torres-Matallana, J.
A.; Tait, S.; Schellart, A.</p> <p>2017-12-01</p> <p>Regulatory authorities set environmental permits for water utilities such that the combined sewer overflows (CSOs) managed by these companies conform to the regulations. These utility companies face the risk of penalties or negative publicity if they breach the environmental permit. These risks can be addressed by designing appropriate solutions, such as investing in additional infrastructure that improves the system capacity and reduces the impact of CSO spills. The performance of these solutions is often estimated using urban drainage models. Hence, any uncertainty in these models can have a significant effect on the decision making process. This study outlines a risk-based decision making approach to address water quality failure caused by CSO spills. A calibrated lumped urban drainage model is used to simulate CSO spill quality in the Haute-Sûre catchment in Luxembourg. Uncertainty in rainfall and model parameters is propagated through Monte Carlo simulations to quantify uncertainty in the concentration of ammonia in the CSO spill. A combination of decision alternatives, such as the construction of a storage tank at the CSO and the reduction of the flow contribution of catchment surfaces, are selected as planning measures to avoid the water quality failure. Failure is defined as exceedance, with a certain frequency, of a concentration-duration threshold derived from Austrian emission standards for ammonia (De Toffol, 2006). For each decision alternative, uncertainty quantification results in a probability distribution of the number of annual CSO spill events which exceed the threshold. For each alternative, a buffered failure probability, as defined in Rockafellar & Royset (2010), is estimated. The buffered failure probability (pbf) is a conservative estimate of the failure probability (pf); however, unlike the failure probability, it includes information about the upper tail of the distribution.
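The buffered failure probability can be estimated directly from Monte Carlo samples. A minimal sketch, following the Rockafellar & Royset definition (the tail probability whose conditional tail expectation equals the failure threshold; function names are my own):

```python
import numpy as np

def buffered_failure_probability(samples, threshold):
    """Buffered probability of failure: P(X > a), where a is chosen so that
    the tail mean E[X | X > a] equals the failure threshold. Conservative:
    pbf >= pf."""
    x = np.sort(np.asarray(samples, dtype=float))[::-1]   # descending
    tail_means = np.cumsum(x) / np.arange(1, len(x) + 1)  # nonincreasing
    # number of tail prefixes whose mean still reaches the threshold
    k = np.searchsorted(-tail_means, -threshold, side="right")
    return k / len(x)

def failure_probability(samples, threshold):
    """Ordinary probability of failure P(X > threshold)."""
    return np.mean(np.asarray(samples) > threshold)
```

For example, for samples [1, 2, 3, 4] and threshold 3, pf = 0.25, while the mean of the top three samples {4, 3, 2} equals 3, so pbf = 0.75: the buffered version counts how heavy the upper tail is, not just how often the threshold is crossed.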
A Pareto-optimal set of solutions is obtained by performing mean-pbf optimization. The effectiveness of using the buffered failure probability instead of the failure probability is tested by comparing the solutions obtained from the mean-pbf and mean-pf optimizations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25340764','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25340764"><span>Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ligmann-Zielinska, Arika; Kramer, Daniel B; Spence Cheruvelil, Kendra; Soranno, Patricia A</p> <p>2014-01-01</p> <p>Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments.
First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4207681','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4207681"><span>Using Uncertainty and Sensitivity Analyses in Socioecological Agent-Based Models to Improve Their Analytical Performance and Policy Relevance</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Ligmann-Zielinska, Arika; Kramer, Daniel B.; Spence Cheruvelil, Kendra; Soranno, Patricia A.</p> <p>2014-01-01</p> <p>Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. 
However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system. 
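Output variance decomposition with quasi-random sampling, of the kind described above, can be sketched with a first-order Sobol' estimator. This is a generic Saltelli-type A/B/AB scheme on a toy model, not the authors' exact estimator:

```python
import numpy as np
from scipy.stats import qmc  # quasi-random (Sobol') sequences

def first_order_sobol(model, dim, n=1 << 12, seed=1):
    """First-order Sobol' indices S_i = V_i / V(Y) estimated with the
    Saltelli A/B/AB design on quasi-random input samples in [0, 1)^dim."""
    sob = qmc.Sobol(d=2 * dim, seed=seed)
    ab = sob.random(n)                    # n must be a power of two
    A, B = ab[:, :dim], ab[:, dim:]
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]               # replace column i of A with B's
        S[i] = np.mean(fB * (model(ABi) - fA)) / var
    return S

# Additive toy model Y = X0 + 2*X1 with X ~ U(0,1): analytically S = [0.2, 0.8]
S = first_order_sobol(lambda X: X[:, 0] + 2 * X[:, 1], dim=2)
```

Ranking inputs by such indices is what allows the kind of model simplification described above: inputs with negligible indices can be fixed without changing the output distribution appreciably.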
PMID:25340764</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015JHyd..529.1601D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015JHyd..529.1601D"><span>Multi-model approach to assess the impact of climate change on runoff</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Dams, J.; Nossent, J.; Senbeta, T. B.; Willems, P.; Batelaan, O.</p> <p>2015-10-01</p> <p>The assessment of climate change impacts on hydrology is subject to uncertainties related to the climate change scenarios, stochastic uncertainties of the hydrological model and structural uncertainties of the hydrological model. This paper focuses on the contribution of structural uncertainty of hydrological models to the overall uncertainty of the climate change impact assessment. To quantify the structural uncertainty of hydrological models, four physically based hydrological models (SWAT, PRMS and a semi- and fully distributed version of the WetSpa model) are set up for a catchment in Belgium. Each model is calibrated using four different objective functions. Three climate change scenarios with a high, mean and low hydrological impact are statistically perturbed from a large ensemble of climate change scenarios and are used to force the hydrological models. This methodology allows assessing and comparing the uncertainty introduced by the climate change scenarios with the uncertainty introduced by the hydrological model structure. Results show that the hydrological model structure introduces a large uncertainty on both the average monthly discharge and the extreme peak and low flow predictions under the climate change scenarios. For the low impact climate change scenario, the uncertainty range of the mean monthly runoff is comparable to the range of these runoff values in the reference period. 
However, for the mean and high impact scenarios, this range is significantly larger. The uncertainty introduced by the climate change scenarios is larger than the uncertainty due to the hydrological model structure for the low and mean hydrological impact scenarios, but the reverse is true for the high impact climate change scenario. The mean and high impact scenarios project increasing peak discharges, while the low impact scenario projects increasing peak discharges only for peak events with return periods larger than 1.6 years. For all scenarios, all models suggest a decrease in the lowest flows, except for the SWAT model with the mean hydrological impact climate change scenario. The results of this study indicate that, in addition to the uncertainty introduced by the climate change scenarios, hydrological model structural uncertainty should also be taken into account in the assessment of climate change impacts on hydrology. To make it more straightforward and transparent to include model structural uncertainty in hydrological impact studies, there is a need for hydrological modelling tools that allow flexible structures and methods to validate model structures in their ability to assess impacts under unobserved future climatic conditions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1408510','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1408510"><span>Multivariate analysis techniques</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Bendavid, Josh; Fisher, Wade C.; Junk, Thomas R.</p> <p>2016-01-01</p> <p>The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex.
Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/415014-deregulation-allows-new-opportunities-utilities','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/415014-deregulation-allows-new-opportunities-utilities"><span>Deregulation allows new opportunities for utilities</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Hansen, T.</p> <p>1996-10-01</p> <p>The changes electric utilities face today are both scary and exciting. In the past several years utilities have faced uncertainties that have caused major upheaval in their structures and business processes. There has been an increase in the number of mergers and acquisitions as utilities position themselves for competition. Many utility employees have faced layoffs, resulting from reengineering and downsizing. Similar events and uncertainties were faced by the airline and telecommunications industries during their transformations from monopolistic to competitive environments. Even though these events have been difficult and unpleasant, there is a bright side.
Today's electric utilities have the opportunities to cash in on some innovative new ideas and technologies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AGUFM.A13F0293M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AGUFM.A13F0293M"><span>Ensemble-based diagnosis of the large-scale processes associated with multiple high-impact weather events over North America during late October 2007</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Moore, B. J.; Bosart, L. F.; Keyser, D.</p> <p>2013-12-01</p> <p>During late October 2007, the interaction between a deep polar trough and Tropical Cyclone (TC) Kajiki off the eastern Asian coast perturbed the North Pacific jet stream and resulted in the development of a high-amplitude Rossby wave train extending into North America, contributing to three concurrent high-impact weather events in North America: wildfires in southern California associated with strong Santa Ana winds, a cold surge into eastern Mexico, and widespread heavy rainfall (~150 mm) in the south-central United States. Observational analysis indicates that these high-impact weather events were all dynamically linked with the development of a major high-latitude ridge over the eastern North Pacific and western North America and a deep trough over central North America.
In this study, global operational ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF) obtained from The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) archive are used to characterize the medium-range predictability of the large-scale flow pattern associated with the three events and to diagnose the large-scale atmospheric processes favorable, or unfavorable, for the occurrence of the three events. Examination of the ECMWF forecasts leading up to the time period of the three high-impact weather events (~23-25 October 2007) indicates that ensemble spread (i.e., uncertainty) in the 500-hPa geopotential height field develops in connection with downstream baroclinic development (DBD) across the North Pacific, associated with the interaction between TC Kajiki and the polar trough along the eastern Asian coast, and subsequently moves downstream into North America, yielding considerable uncertainty with respect to the structure, amplitude, and position of the ridge-trough pattern over North America. Ensemble sensitivity analysis conducted for key sensible weather parameters corresponding to the three high-impact weather events, including relative humidity, temperature, and precipitation, demonstrates quantitatively that all three high-impact weather events are closely linked with the development of the ridge-trough pattern over North America. Moreover, results of this analysis indicate that the development of the ridge-trough pattern is modulated by DBD and cyclogenesis upstream over the central and eastern North Pacific. 
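At its core, ensemble sensitivity analysis of this kind is a member-by-member linear regression of a forecast metric onto an earlier-time state variable. A toy sketch with synthetic ensemble members (variable names and numbers are illustrative, not from the study):

```python
import numpy as np

# Synthetic 50-member ensemble: x is an initial-state variable (e.g., a
# 500-hPa height anomaly at one grid point), J a forecast response metric
# (e.g., area-averaged precipitation) that partly depends on x.
rng = np.random.default_rng(2)
n_members = 50
x = rng.normal(size=n_members)                         # initial-state anomaly
J = 1.5 * x + rng.normal(scale=0.5, size=n_members)    # forecast metric

# Ensemble sensitivity dJ/dx: covariance across members divided by the
# variance of the state variable (the univariate regression slope).
sensitivity = np.cov(J, x)[0, 1] / np.var(x, ddof=1)
```

Mapping this slope over all grid points of the initial state is what yields the sensitivity fields used to link the downstream ridge-trough pattern to upstream cyclogenesis.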
Specifically, ensemble members exhibiting less intense cyclogenesis and a more poleward cyclone track over the central and eastern North Pacific feature the development of a poleward-displaced ridge over the eastern North Pacific and western North America and a cut-off low over the Intermountain West, an unfavorable scenario for the occurrence of the three high-impact weather events. Conversely, ensemble members exhibiting more intense cyclogenesis and a less poleward cyclone track feature persistent ridging along the western coast of North America and trough development over central North America, establishing a favorable flow pattern for the three high-impact weather events. Results demonstrate that relatively small initial differences in the large-scale flow pattern over the North Pacific among ensemble members can result in large uncertainty in the forecast downstream flow response over North America.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018PhRvD..97i6010A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018PhRvD..97i6010A"><span>Weak vector boson production with many jets at the LHC √s = 13 TeV</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Anger, F. R.; Febres Cordero, F.; Höche, S.; Maître, D.</p> <p>2018-05-01</p> <p>Signatures with an electroweak vector boson and many jets play a crucial role at the Large Hadron Collider, both in the measurement of Standard-Model parameters and in searches for new physics. Precise predictions for these multiscale processes are therefore indispensable. We present next-to-leading order QCD predictions for W±/Z+jets at √s = 13 TeV, including up to five/four jets in the final state. All production channels are included, and leptonic decays of the vector bosons are considered at the amplitude level.
We assess theoretical uncertainties arising from renormalization- and factorization-scale dependence by considering fixed-order dynamical scales based on the HT variable as well as on the MiNLO procedure. We also explore uncertainties associated with different choices of parton-distribution functions. We provide event samples that can be explored through publicly available n -tuple sets, generated with BlackHat in combination with Sherpa.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28515667','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28515667"><span>Measurement of azimuthal correlations of D mesons with charged particles in pp collisions at [Formula: see text] TeV and p-Pb collisions at [Formula: see text] TeV.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Adam, J; Adamová, D; Aggarwal, M M; Aglieri Rinella, G; Agnello, M; Agrawal, N; Ahammed, Z; Ahmad, S; Ahn, S U; Aiola, S; Akindinov, A; Alam, S N; Albuquerque, D S D; Aleksandrov, D; Alessandro, B; Alexandre, D; Alfaro Molina, R; Alici, A; Alkin, A; Almaraz, J R M; Alme, J; Alt, T; Altinpinar, S; Altsybeev, I; Alves Garcia Prado, C; Andrei, C; Andronic, A; Anguelov, V; Antičić, T; Antinori, F; Antonioli, P; Aphecetche, L; Appelshäuser, H; Arcelli, S; Arnaldi, R; Arnold, O W; Arsene, I C; Arslandok, M; Audurier, B; Augustinus, A; Averbeck, R; Azmi, M D; Badalà, A; Baek, Y W; Bagnasco, S; Bailhache, R; Bala, R; Balasubramanian, S; Baldisseri, A; Baral, R C; Barbano, A M; Barbera, R; Barile, F; Barnaföldi, G G; Barnby, L S; Barret, V; Bartalini, P; Barth, K; Bartke, J; Bartsch, E; Basile, M; Bastid, N; Basu, S; Bathen, B; Batigne, G; Batista Camejo, A; Batyunya, B; Batzing, P C; Bearden, I G; Beck, H; Bedda, C; Behera, N K; Belikov, I; Bellini, F; Bello Martinez, H; Bellwied, R; Belmont, R; Belmont-Moreno, E; 
Beltran, L G E; Belyaev, V; Bencedi, G; Beole, S; Berceanu, I; Bercuci, A; Berdnikov, Y; Berenyi, D; Bertens, R A; Berzano, D; Betev, L; Bhasin, A; Bhat, I R; Bhati, A K; Bhattacharjee, B; Bhom, J; Bianchi, L; Bianchi, N; Bianchin, C; Bielčík, J; Bielčíková, J; Bilandzic, A; Biro, G; Biswas, R; Biswas, S; Bjelogrlic, S; Blair, J T; Blau, D; Blume, C; Bock, F; Bogdanov, A; Bøggild, H; Boldizsár, L; Bombara, M; Bonora, M; Book, J; Borel, H; Borissov, A; Borri, M; Bossú, F; Botta, E; Bourjau, C; Braun-Munzinger, P; Bregant, M; Breitner, T; Broker, T A; Browning, T A; Broz, M; Brucken, E J; Bruna, E; Bruno, G E; Budnikov, D; Buesching, H; Bufalino, S; Buitron, S A I; Buncic, P; Busch, O; Buthelezi, Z; Butt, J B; Buxton, J T; Cabala, J; Caffarri, D; Cai, X; Caines, H; Diaz, L Calero; Caliva, A; Calvo Villar, E; Camerini, P; Carena, F; Carena, W; Carnesecchi, F; Castillo Castellanos, J; Castro, A J; Casula, E A R; Ceballos Sanchez, C; Cepila, J; Cerello, P; Cerkala, J; Chang, B; Chapeland, S; Chartier, M; Charvet, J L; Chattopadhyay, S; Chattopadhyay, S; Chauvin, A; Chelnokov, V; Cherney, M; Cheshkov, C; Cheynis, B; Chibante Barroso, V; Chinellato, D D; Cho, S; Chochula, P; Choi, K; Chojnacki, M; Choudhury, S; Christakoglou, P; Christensen, C H; Christiansen, P; Chujo, T; Chung, S U; Cicalo, C; Cifarelli, L; Cindolo, F; Cleymans, J; Colamaria, F; Colella, D; Collu, A; Colocci, M; Conesa Balbastre, G; Conesa Del Valle, Z; Connors, M E; Contreras, J G; Cormier, T M; Corrales Morales, Y; Cortés Maldonado, I; Cortese, P; Cosentino, M R; Costa, F; Crkovská, J; Crochet, P; Cruz Albino, R; Cuautle, E; Cunqueiro, L; Dahms, T; Dainese, A; Danisch, M C; Danu, A; Das, D; Das, I; Das, S; Dash, A; Dash, S; De, S; De Caro, A; de Cataldo, G; de Conti, C; de Cuveland, J; De Falco, A; De Gruttola, D; De Marco, N; De Pasquale, S; De Souza, R D; Deisting, A; Deloff, A; Dénes, E; Deplano, C; Dhankher, P; Di Bari, D; Di Mauro, A; Di Nezza, P; Di Ruzza, B; Diaz Corchero, M A; Dietel, T; 
Dillenseger, P; Divià, R; Djuvsland, Ø; Dobrin, A; Domenicis Gimenez, D; Dönigus, B; Dordic, O; Drozhzhova, T; Dubey, A K; Dubla, A; Ducroux, L; Dupieux, P; Ehlers, R J; Elia, D; Endress, E; Engel, H; Epple, E; Erazmus, B; Erdemir, I; Erhardt, F; Espagnon, B; Estienne, M; Esumi, S; Eum, J; Evans, D; Evdokimov, S; Eyyubova, G; Fabbietti, L; Fabris, D; Faivre, J; Fantoni, A; Fasel, M; Feldkamp, L; Feliciello, A; Feofilov, G; Ferencei, J; Fernández Téllez, A; Ferreiro, E G; Ferretti, A; Festanti, A; Feuillard, V J G; Figiel, J; Figueredo, M A S; Filchagin, S; Finogeev, D; Fionda, F M; Fiore, E M; Fleck, M G; Floris, M; Foertsch, S; Foka, P; Fokin, S; Fragiacomo, E; Francescon, A; Francisco, A; Frankenfeld, U; Fronze, G G; Fuchs, U; Furget, C; Furs, A; Fusco Girard, M; Gaardhøje, J J; Gagliardi, M; Gago, A M; Gajdosova, K; Gallio, M; Galvan, C D; Gangadharan, D R; Ganoti, P; Gao, C; Garabatos, C; Garcia-Solis, E; Gargiulo, C; Gasik, P; Gauger, E F; Germain, M; Gheata, M; Ghosh, P; Ghosh, S K; Gianotti, P; Giubellino, P; Giubilato, P; Gladysz-Dziadus, E; Glässel, P; Goméz Coral, D M; Gomez Ramirez, A; Gonzalez, A S; Gonzalez, V; González-Zamora, P; Gorbunov, S; Görlich, L; Gotovac, S; Grabski, V; Grachov, O A; Graczykowski, L K; Graham, K L; Grelli, A; Grigoras, A; Grigoras, C; Grigoriev, V; Grigoryan, A; Grigoryan, S; Grinyov, B; Grion, N; Gronefeld, J M; Grosse-Oetringhaus, J F; Grosso, R; Gruber, L; Guber, F; Guernane, R; Guerzoni, B; Gulbrandsen, K; Gunji, T; Gupta, A; Gupta, R; Haake, R; Hadjidakis, C; Haiduc, M; Hamagaki, H; Hamar, G; Hamon, J C; Harris, J W; Harton, A; Hatzifotiadou, D; Hayashi, S; Heckel, S T; Hellbär, E; Helstrup, H; Herghelegiu, A; Herrera Corral, G; Hess, B A; Hetland, K F; Hillemanns, H; Hippolyte, B; Horak, D; Hosokawa, R; Hristov, P; Hughes, C; Humanic, T J; Hussain, N; Hussain, T; Hutter, D; Hwang, D S; Ilkaev, R; Inaba, M; Incani, E; Ippolitov, M; Irfan, M; Isakov, V; Ivanov, M; Ivanov, V; Izucheev, V; Jacak, B; Jacazio, N; Jacobs, P M; 
Jadhav, M B; Jadlovska, S; Jadlovsky, J; Jahnke, C; Jakubowska, M J; Janik, M A; Jayarathna, P H S Y; Jena, C; Jena, S; Jimenez Bustamante, R T; Jones, P G; Jusko, A; Kalinak, P; Kalweit, A; Kang, J H; Kaplin, V; Kar, S; Karasu Uysal, A; Karavichev, O; Karavicheva, T; Karayan, L; Karpechev, E; Kebschull, U; Keidel, R; Keijdener, D L D; Keil, M; Khan, M Mohisin; Khan, P; Khan, S A; Khanzadeev, A; Kharlov, Y; Kileng, B; Kim, D W; Kim, D J; Kim, D; Kim, H; Kim, J S; Kim, J; Kim, M; Kim, M; Kim, S; Kim, T; Kirsch, S; Kisel, I; Kiselev, S; Kisiel, A; Kiss, G; Klay, J L; Klein, C; Klein, J; Klein-Bösing, C; Klewin, S; Kluge, A; Knichel, M L; Knospe, A G; Kobdaj, C; Kofarago, M; Kollegger, T; Kolojvari, A; Kondratiev, V; Kondratyeva, N; Kondratyuk, E; Konevskikh, A; Kopcik, M; Kour, M; Kouzinopoulos, C; Kovalenko, O; Kovalenko, V; Kowalski, M; Koyithatta Meethaleveedu, G; Králik, I; Kravčáková, A; Krivda, M; Krizek, F; Kryshen, E; Krzewicki, M; Kubera, A M; Kučera, V; Kuhn, C; Kuijer, P G; Kumar, A; Kumar, J; Kumar, L; Kumar, S; Kurashvili, P; Kurepin, A; Kurepin, A B; Kuryakin, A; Kweon, M J; Kwon, Y; La Pointe, S L; La Rocca, P; Ladron de Guevara, P; Lagana Fernandes, C; Lakomov, I; Langoy, R; Lapidus, K; Lara, C; Lardeux, A; Lattuca, A; Laudi, E; Lea, R; Leardini, L; Lee, S; Lehas, F; Lehner, S; Lemmon, R C; Lenti, V; Leogrande, E; León Monzón, I; León Vargas, H; Leoncino, M; Lévai, P; Li, S; Li, X; Lien, J; Lietava, R; Lindal, S; Lindenstruth, V; Lippmann, C; Lisa, M A; Ljunggren, H M; Lodato, D F; Loenne, P I; Loginov, V; Loizides, C; Lopez, X; López Torres, E; Lowe, A; Luettig, P; Lunardon, M; Luparello, G; Lupi, M; Lutz, T H; Maevskaya, A; Mager, M; Mahajan, S; Mahmood, S M; Maire, A; Majka, R D; Malaev, M; Maldonado Cervantes, I; Malinina, L; Mal'Kevich, D; Malzacher, P; Mamonov, A; Manko, V; Manso, F; Manzari, V; Mao, Y; Marchisone, M; Mareš, J; Margagliotti, G V; Margotti, A; Margutti, J; Marín, A; Markert, C; Marquard, M; Martin, N A; Martinengo, P; Martínez, M 
I; Martínez García, G; Martinez Pedreira, M; Mas, A; Masciocchi, S; Masera, M; Masoni, A; Mastroserio, A; Matyja, A; Mayer, C; Mazer, J; Mazzoni, M A; Mcdonald, D; Meddi, F; Melikyan, Y; Menchaca-Rocha, A; Meninno, E; Mercado Pérez, J; Meres, M; Mhlanga, S; Miake, Y; Mieskolainen, M M; Mikhaylov, K; Milano, L; Milosevic, J; Mischke, A; Mishra, A N; Miśkowiec, D; Mitra, J; Mitu, C M; Mohammadi, N; Mohanty, B; Molnar, L; Montaño Zetina, L; Montes, E; Moreira De Godoy, D A; Moreno, L A P; Moretto, S; Morreale, A; Morsch, A; Muccifora, V; Mudnic, E; Mühlheim, D; Muhuri, S; Mukherjee, M; Mulligan, J D; Munhoz, M G; Münning, K; Munzer, R H; Murakami, H; Murray, S; Musa, L; Musinsky, J; Naik, B; Nair, R; Nandi, B K; Nania, R; Nappi, E; Naru, M U; Natal da Luz, H; Nattrass, C; Navarro, S R; Nayak, K; Nayak, R; Nayak, T K; Nazarenko, S; Nedosekin, A; Negrao De Oliveira, R A; Nellen, L; Ng, F; Nicassio, M; Niculescu, M; Niedziela, J; Nielsen, B S; Nikolaev, S; Nikulin, S; Nikulin, V; Noferini, F; Nomokonov, P; Nooren, G; Noris, J C C; Norman, J; Nyanin, A; Nystrand, J; Oeschler, H; Oh, S; Oh, S K; Ohlson, A; Okatan, A; Okubo, T; Olah, L; Oleniacz, J; Oliveira Da Silva, A C; Oliver, M H; Onderwaater, J; Oppedisano, C; Orava, R; Oravec, M; Ortiz Velasquez, A; Oskarsson, A; Otwinowski, J; Oyama, K; Ozdemir, M; Pachmayer, Y; Pagano, D; Pagano, P; Paić, G; Pal, S K; Palni, P; Pan, J; Pandey, A K; Papikyan, V; Pappalardo, G S; Pareek, P; Park, J; Park, W J; Parmar, S; Passfeld, A; Paticchio, V; Patra, R N; Paul, B; Pei, H; Peitzmann, T; Peng, X; Pereira Da Costa, H; Peresunko, D; Perez Lezama, E; Peskov, V; Pestov, Y; Petráček, V; Petrov, V; Petrovici, M; Petta, C; Piano, S; Pikna, M; Pillot, P; Pimentel, L O D L; Pinazza, O; Pinsky, L; Piyarathna, D B; Płoskoń, M; Planinic, M; Pluta, J; Pochybova, S; Podesta-Lerma, P L M; Poghosyan, M G; Polichtchouk, B; Poljak, N; Poonsawat, W; Pop, A; Poppenborg, H; Porteboeuf-Houssais, S; Porter, J; Pospisil, J; Prasad, S K; Preghenella, R; 
Prino, F; Pruneau, C A; Pshenichnov, I; Puccio, M; Puddu, G; Pujahari, P; Punin, V; Putschke, J; Qvigstad, H; Rachevski, A; Raha, S; Rajput, S; Rak, J; Rakotozafindrabe, A; Ramello, L; Rami, F; Raniwala, R; Raniwala, S; Räsänen, S S; Rascanu, B T; Rathee, D; Read, K F; Redlich, K; Reed, R J; Rehman, A; Reichelt, P; Reidt, F; Ren, X; Renfordt, R; Reolon, A R; Reshetin, A; Reygers, K; Riabov, V; Ricci, R A; Richert, T; Richter, M; Riedler, P; Riegler, W; Riggi, F; Ristea, C; Rocco, E; Rodríguez Cahuantzi, M; Rodriguez Manso, A; Røed, K; Rogochaya, E; Rohr, D; Röhrich, D; Ronchetti, F; Ronflette, L; Rosnet, P; Rossi, A; Roukoutakis, F; Roy, A; Roy, C; Roy, P; Rubio Montero, A J; Rui, R; Russo, R; Ryabinkin, E; Ryabov, Y; Rybicki, A; Saarinen, S; Sadhu, S; Sadovsky, S; Šafařík, K; Sahlmuller, B; Sahoo, P; Sahoo, R; Sahoo, S; Sahu, P K; Saini, J; Sakai, S; Saleh, M A; Salzwedel, J; Sambyal, S; Samsonov, V; Šándor, L; Sandoval, A; Sano, M; Sarkar, D; Sarkar, N; Sarma, P; Scapparone, E; Scarlassara, F; Schiaua, C; Schicker, R; Schmidt, C; Schmidt, H R; Schmidt, M; Schuchmann, S; Schukraft, J; Schutz, Y; Schwarz, K; Schweda, K; Scioli, G; Scomparin, E; Scott, R; Šefčík, M; Seger, J E; Sekiguchi, Y; Sekihata, D; Selyuzhenkov, I; Senosi, K; Senyukov, S; Serradilla, E; Sevcenco, A; Shabanov, A; Shabetai, A; Shadura, O; Shahoyan, R; Shangaraev, A; Sharma, A; Sharma, M; Sharma, M; Sharma, N; Sheikh, A I; Shigaki, K; Shou, Q; Shtejer, K; Sibiriak, Y; Siddhanta, S; Sielewicz, K M; Siemiarczuk, T; Silvermyr, D; Silvestre, C; Simatovic, G; Simonetti, G; Singaraju, R; Singh, R; Singhal, V; Sinha, T; Sitar, B; Sitta, M; Skaali, T B; Slupecki, M; Smirnov, N; Snellings, R J M; Snellman, T W; Song, J; Song, M; Song, Z; Soramel, F; Sorensen, S; Sozzi, F; Spiriti, E; Sputowska, I; Spyropoulou-Stassinaki, M; Stachel, J; Stan, I; Stankus, P; Stenlund, E; Steyn, G; Stiller, J H; Stocco, D; Strmen, P; Suaide, A A P; Sugitate, T; Suire, C; Suleymanov, M; Suljic, M; Sultanov, R; Šumbera, M; 
Sumowidagdo, S; Szabo, A; Szarka, I; Szczepankiewicz, A; Szymanski, M; Tabassam, U; Takahashi, J; Tambave, G J; Tanaka, N; Tarhini, M; Tariq, M; Tarzila, M G; Tauro, A; Muñoz, G Tejeda; Telesca, A; Terasaki, K; Terrevoli, C; Teyssier, B; Thäder, J; Thakur, D; Thomas, D; Tieulent, R; Tikhonov, A; Timmins, A R; Toia, A; Trogolo, S; Trombetta, G; Trubnikov, V; Trzaska, W H; Tsuji, T; Tumkin, A; Turrisi, R; Tveter, T S; Ullaland, K; Uras, A; Usai, G L; Utrobicic, A; Vala, M; Valencia Palomo, L; Vallero, S; Van Der Maarel, J; Van Hoorne, J W; van Leeuwen, M; Vanat, T; Vande Vyvre, P; Varga, D; Vargas, A; Vargyas, M; Varma, R; Vasileiou, M; Vasiliev, A; Vauthier, A; Vázquez Doce, O; Vechernin, V; Veen, A M; Velure, A; Vercellin, E; Vergara Limón, S; Vernet, R; Verweij, M; Vickovic, L; Viinikainen, J; Vilakazi, Z; Villalobos Baillie, O; Villatoro Tello, A; Vinogradov, A; Vinogradov, L; Virgili, T; Vislavicius, V; Viyogi, Y P; Vodopyanov, A; Völkl, M A; Voloshin, K; Voloshin, S A; Volpe, G; von Haller, B; Vorobyev, I; Vranic, D; Vrláková, J; Vulpescu, B; Wagner, B; Wagner, J; Wang, H; Wang, M; Watanabe, D; Watanabe, Y; Weber, M; Weber, S G; Weiser, D F; Wessels, J P; Westerhoff, U; Whitehead, A M; Wiechula, J; Wikne, J; Wilk, G; Wilkinson, J; Willems, G A; Williams, M C S; Windelband, B; Winn, M; Yalcin, S; Yang, P; Yano, S; Yin, Z; Yokoyama, H; Yoo, I-K; Yoon, J H; Yurchenko, V; Zaborowska, A; Zaccolo, V; Zaman, A; Zampolli, C; Zanoli, H J C; Zaporozhets, S; Zardoshti, N; Zarochentsev, A; Závada, P; Zaviyalov, N; Zbroszczyk, H; Zgura, I S; Zhalov, M; Zhang, H; Zhang, X; Zhang, Y; Zhang, C; Zhang, Z; Zhao, C; Zhigareva, N; Zhou, D; Zhou, Y; Zhou, Z; Zhu, H; Zhu, J; Zichichi, A; Zimmermann, A; Zimmermann, M B; Zinovjev, G; Zyzak, M</p> <p>2017-01-01</p> <p>The azimuthal correlations of D mesons with charged particles were measured with the ALICE apparatus in pp collisions at [Formula: see text] and p-Pb collisions at [Formula: see text] at the Large Hadron Collider. 
[Formula: see text], [Formula: see text], and [Formula: see text] mesons and their charge conjugates with transverse momentum [Formula: see text] and rapidity in the nucleon-nucleon centre-of-mass system [Formula: see text] (pp collisions) and [Formula: see text] (p-Pb collisions) were correlated to charged particles with [Formula: see text]. The yield of charged particles in the correlation peak induced by the jet containing the D meson and the peak width are compatible within uncertainties in the two collision systems. The data are described within uncertainties by Monte-Carlo simulations based on PYTHIA, POWHEG, and EPOS 3 event generators.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010GMS...191.....T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010GMS...191.....T"><span>Rainfall: State of the Science</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Testik, Firat Y.; Gebremichael, Mekonnen</p> <p></p> <p>Rainfall: State of the Science offers the most up-to-date knowledge on the fundamental and practical aspects of rainfall. Each chapter, self-contained and written by prominent scientists in their respective fields, provides three forms of information: fundamental principles, detailed overview of current knowledge and description of existing methods, and emerging techniques and future research directions. 
The book discusses • Rainfall microphysics: raindrop morphodynamics, interactions, size distribution, and evolution • Rainfall measurement and estimation: ground-based direct measurement (disdrometer and rain gauge), weather radar rainfall estimation, polarimetric radar rainfall estimation, and satellite rainfall estimation • Statistical analyses: intensity-duration-frequency curves, frequency analysis of extreme events, spatial analyses, simulation and disaggregation, ensemble approach for radar rainfall uncertainty, and uncertainty analysis of satellite rainfall products The book is tailored to be an indispensable reference for researchers, practitioners, and graduate students who study any aspect of rainfall or utilize rainfall information in various science and engineering disciplines.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1379638','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1379638"><span>An Independent Assessment of Anthropogenic Attribution Statements for Recent Extreme Temperature and Rainfall Events</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Angélil, Oliver; Stone, Dáithí; Wehner, Michael</p> <p></p> <p>The annual "State of the Climate" report, published in the Bulletin of the American Meteorological Society (BAMS), has included a supplement since 2011 composed of brief analyses of the human influence on recent major extreme weather events. There are now several dozen extreme weather events examined in these supplements, but these studies have all differed in their data sources as well as their approaches to defining the events, analyzing the events, and the consideration of the role of anthropogenic emissions. 
This study reexamines most of these events using a single analytical approach and a single set of climate model and observational data sources. In response to recent studies recommending the importance of using multiple methods for extreme weather event attribution, results are compared from these analyses to those reported in the BAMS supplements collectively, with the aim of characterizing the degree to which the lack of a common methodological framework may or may not influence overall conclusions. Results are broadly similar to those reported earlier for extreme temperature events but disagree for a number of extreme precipitation events. Based on this, it is advised that the lack of comprehensive uncertainty analysis in recent extreme weather attribution studies is important and should be considered when interpreting results, but as yet it has not introduced a systematic bias across these studies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1379638-independent-assessment-anthropogenic-attribution-statements-recent-extreme-temperature-rainfall-events','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1379638-independent-assessment-anthropogenic-attribution-statements-recent-extreme-temperature-rainfall-events"><span>An Independent Assessment of Anthropogenic Attribution Statements for Recent Extreme Temperature and Rainfall Events</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Angélil, Oliver; Stone, Dáithí; Wehner, Michael; ...</p> <p>2016-12-16</p> <p>The annual "State of the Climate" report, published in the Bulletin of the American Meteorological Society (BAMS), has included a supplement since 2011 composed of brief analyses of the human influence on recent major extreme weather events. 
There are now several dozen extreme weather events examined in these supplements, but these studies have all differed in their data sources as well as their approaches to defining the events, analyzing the events, and the consideration of the role of anthropogenic emissions. This study reexamines most of these events using a single analytical approach and a single set of climate model and observational data sources. In response to recent studies recommending the importance of using multiple methods for extreme weather event attribution, results are compared from these analyses to those reported in the BAMS supplements collectively, with the aim of characterizing the degree to which the lack of a common methodological framework may or may not influence overall conclusions. Results are broadly similar to those reported earlier for extreme temperature events but disagree for a number of extreme precipitation events. Based on this, it is advised that the lack of comprehensive uncertainty analysis in recent extreme weather attribution studies is important and should be considered when interpreting results, but as yet it has not introduced a systematic bias across these studies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016NHESS..16.2009L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016NHESS..16.2009L"><span>Decision support system for emergency management of oil spill accidents in the Mediterranean Sea</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Liubartseva, Svitlana; Coppini, Giovanni; Pinardi, Nadia; De Dominicis, Michela; Lecci, Rita; Turrisi, Giuseppe; Cretì, Sergio; Martinelli, Sara; Agostini, Paola; Marra, Palmalisa; Palermo, Francesco</p> <p>2016-08-01</p> <p>This paper presents an innovative web-based decision support system to 
facilitate emergency management in the case of oil spill accidents, called WITOIL (Where Is The Oil). The system can be applied to create a forecast of oil spill events, evaluate uncertainty of the predictions, and calculate hazards based on historical meteo-oceanographic datasets. To compute the oil transport and transformation, WITOIL uses the MEDSLIK-II oil spill model forced by operational meteo-oceanographic services. Results of the modeling are visualized through Google Maps. A special application for Android is designed to provide mobile access for competent authorities, technical and scientific institutions, and citizens.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018EPJWC.17604006S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018EPJWC.17604006S"><span>Optical properties of an industrial fire observed with a ground based N2-Raman lidar over the Paris area</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Shang, Xiaoxia; Chazette, Patrick; Totems, Julien</p> <p>2018-04-01</p> <p>This paper presents the first, to our knowledge, lidar measurement of an industrial fire plume, which covered the north of the Paris area on 17th April 2015. The fire started in a textile warehouse and rapidly spread by emitting large quantities of aerosols into the low troposphere. A ground based N2-Raman lidar performed continuous measurements during this event. Vertical profiles of the aerosol extinction coefficient, depolarization and lidar ratio are derived. 
A Monte Carlo algorithm was used to assess the uncertainties on the optical parameters, and to evaluate lidar inversion methods.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25324216','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25324216"><span>Uncertainty during pain anticipation: the adaptive value of preparatory processes.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Seidel, Eva-Maria; Pfabigan, Daniela M; Hahn, Andreas; Sladky, Ronald; Grahl, Arvina; Paul, Katharina; Kraus, Christoph; Küblböck, Martin; Kranz, Georg S; Hummer, Allan; Lanzenberger, Rupert; Windischberger, Christian; Lamm, Claus</p> <p>2015-02-01</p> <p>Anticipatory processes prepare the organism for upcoming experiences. The aim of this study was to investigate neural responses related to anticipation and processing of painful stimuli occurring with different levels of uncertainty. Twenty-five participants (13 females) took part in an electroencephalography and functional magnetic resonance imaging (fMRI) experiment at separate times. A visual cue announced the occurrence of an electrical painful or nonpainful stimulus, delivered with certainty or uncertainty (50% chance), at some point during the following 15 s. During the first 2 s of the anticipation phase, a strong effect of uncertainty was reflected in a pronounced frontal stimulus-preceding negativity (SPN) and increased fMRI activation in higher visual processing areas. In the last 2 s before stimulus delivery, we observed stimulus-specific preparatory processes indicated by a centroparietal SPN and posterior insula activation that was most pronounced for the certain pain condition. Uncertain anticipation was associated with attentional control processes. 
During stimulation, the results revealed that unexpected painful stimuli produced the strongest activation in the affective pain processing network and a more pronounced offset-P2. Our results reflect that during early anticipation uncertainty is strongly associated with affective mechanisms and seems to be a more salient event compared to certain anticipation. During the last 2 s before stimulation, attentional control mechanisms are initiated related to the increased salience of uncertainty. Furthermore, stimulus-specific preparatory mechanisms during certain anticipation also shaped the response to stimulation, underlining the adaptive value of stimulus-targeted preparatory activity which is less likely when facing an uncertain event. © 2014 Wiley Periodicals, Inc.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..19.4667Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..19.4667Z"><span>Rainfall extremes from TRMM data and the Metastatistical Extreme Value Distribution</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zorzetto, Enrico; Marani, Marco</p> <p>2017-04-01</p> <p>A reliable quantification of the probability of weather extremes occurrence is essential for designing resilient water infrastructures and hazard mitigation measures. However, it is increasingly clear that the presence of inter-annual climatic fluctuations determines a substantial long-term variability in the frequency of occurrence of extreme events. This circumstance questions the foundation of the traditional extreme value theory, hinged on stationary Poisson processes or on asymptotic assumptions to derive the Generalized Extreme Value (GEV) distribution. 
We illustrate here, with application to daily rainfall, a new approach to extreme value analysis, the Metastatistical Extreme Value Distribution (MEVD). The MEVD relaxes the above assumptions and is based on the whole distribution of daily rainfall events, thus allowing optimal use of all available observations. Using a global dataset of rain gauge observations, we show that the MEVD significantly outperforms the Generalized Extreme Value distribution, particularly for long average recurrence intervals and when small samples are available. The latter property suggests MEVD to be particularly suited for applications to satellite rainfall estimates, which only cover two decades, thus making extreme value estimation extremely challenging. Here we apply MEVD to the TRMM TMPA 3B42 product, an 18-year dataset of remotely-sensed daily rainfall providing a quasi-global coverage. Our analyses yield a global scale mapping of daily rainfall extremes and of their distributional tail properties, bridging the existing large gaps in ground-based networks. Finally, we illustrate how our global-scale analysis can provide insight into how properties of local rainfall regimes affect tail estimation uncertainty when using the GEV or MEVD approach. We find a dependence of the estimation uncertainty, for both the GEV- and MEV-based approaches, on the average annual number and on the inter-annual variability of rainy days. In particular, estimation uncertainty decreases 1) as the mean annual number of wet days increases, and 2) as the variability in the number of rainy days, expressed by its coefficient of variation, decreases. 
We tentatively explain this behavior in terms of the assumptions underlying the two approaches.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A31A2157S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A31A2157S"><span>Disdrometer-based C-Band Radar Quantitative Precipitation Estimation (QPE) in a highly complex terrain region in tropical Colombia.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sepúlveda, J.; Hoyos Ortiz, C. D.</p> <p>2017-12-01</p> <p>An adequate quantification of precipitation over land is critical for many societal applications including agriculture, hydroelectricity generation, water supply, and risk management associated with extreme events. The use of rain gauges, a traditional method for precipitation estimation, and an excellent one for estimating the volume of liquid water during a particular precipitation event, does not fully capture the high spatial variability of the phenomenon, which is a requirement for almost all practical applications. On the other hand, the weather radar, an active remote sensing sensor, provides a proxy for rainfall with fine spatial resolution and adequate temporal sampling; however, it does not measure surface precipitation. In order to fully exploit the capabilities of the weather radar, it is necessary to develop quantitative precipitation estimation (QPE) techniques combining radar information with in-situ measurements. Different QPE methodologies are explored and adapted to local observations in a highly complex terrain region in tropical Colombia using a C-Band radar and a relatively dense network of rain gauges and disdrometers. 
One important result is that the expressions reported in the literature for extratropical locations are not representative of the conditions found in the tropical region studied. In addition to reproducing the state-of-the-art techniques, a new multi-stage methodology based on radar-derived variables and disdrometer data is proposed in order to achieve the best QPE possible. The main motivation for this new methodology is based on the fact that most traditional QPE methods do not directly take into account the different uncertainty sources involved in the process. The main advantage of the multi-stage model compared to traditional models is that it allows assessing and quantifying the uncertainty in the surface rain rate estimation. The sub-hourly rainfall estimations using the multi-stage methodology are realistic compared to observed data in spite of the many sources of uncertainty including the sampling volume, the different physical principles of the sensors, the incomplete understanding of the microphysics of precipitation and, the most important, the rapidly varying droplet size distribution.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://files.eric.ed.gov/fulltext/ED261318.pdf','ERIC'); return false;" href="http://files.eric.ed.gov/fulltext/ED261318.pdf"><span>Uncertainty and Clinical Psychology: Therapists' Responses.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Bienenfeld, Sheila</p> <p></p> <p>Three sources of professional uncertainty have been described: uncertainty about the practitioner's mastery of knowledge; uncertainty due to gaps in the knowledge base itself; and uncertainty about the source of the uncertainty, i.e., the practitioner does not know whether his uncertainty is due to gaps in the knowledge base or to personal…</p> </li> <li> 
<p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2008IJMPA..23.3155Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2008IJMPA..23.3155Z"><span>Computing the Entropy of Kerr-Newman Black Hole Without Brick Walls Method</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zhang, Li-Chun; Wu, Yue-Qin; Li, Huai-Fan; Ren, Zhao</p> <p></p> <p>By using the entanglement entropy method, the statistical entropy of the Bose and Fermi fields in a thin film is calculated and the Bekenstein-Hawking entropy of the Kerr-Newman black hole is obtained. Here, the Bose and Fermi fields are entangled with the quantum states in the Kerr-Newman black hole and are outside of the horizon. The divergence of the brick-wall model is avoided without any cutoff by the new equation of state density obtained with the generalized uncertainty principle. The calculation implies that the high-density quantum states near the event horizon are strongly correlated with the quantum states in the black hole. The black hole entropy is a quantum effect. It is an intrinsic characteristic of space-time. The ultraviolet cutoff in the brick-wall model is unreasonable. The generalized uncertainty principle should be considered in the high-energy quantum field near the event horizon. From the calculation, the constant λ introduced in the generalized uncertainty principle is related to the polar angle θ in an axisymmetric space-time.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19890012014','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19890012014"><span>Periodicity of extinction: A 1988 update</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Sepkoski, J. 
John, Jr.</p> <p>1988-01-01</p> <p>The hypothesis that events of mass extinction recur periodically at approximately 26 my intervals is an empirical claim based on analysis of data from the fossil record. The hypothesis has become closely linked with catastrophism because several events in the periodic series are associated with evidence of extraterrestrial impacts, and terrestrial forcing mechanisms with long, periodic recurrences are not easily conceived. Astronomical mechanisms that have been hypothesized include undetected solar companions and solar oscillation about the galactic plane, which induce comet showers and result in impacts on Earth at regular intervals. Because these mechanisms are speculative, they have been the subject of considerable controversy, as has the hypothesis of periodicity of extinction. In response to criticisms and uncertainties, a data base was developed on times of extinction of marine animal genera. A time series is given and analyzed with 49 sample points for the per-genus extinction rate from the Late Permian to the Recent. An unexpected pattern in the data is the uniformity of magnitude of many of the periodic extinction events. 
Observations suggest that the sequence of extinction events might be the result of two sets of mechanisms: a periodic forcing that normally induces only moderate amounts of extinction, and independent incidents or catastrophes that, when coincident with the periodic forcing, amplify its signal and produce major mass extinctions.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_20 --> <div id="page_21" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="401"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AGUFMPA21B1869C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AGUFMPA21B1869C"><span>Communicating Climate Uncertainties: Challenges and Opportunities Related to Spatial Scales, Extreme Events, and the Warming 'Hiatus'</span></a></p> <p><a target="_blank" rel="noopener noreferrer" 
href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Casola, J. H.; Huber, D.</p> <p>2013-12-01</p> <p>Many media, academic, government, and advocacy organizations have achieved sophistication in developing effective messages based on scientific information, and can quickly translate salient aspects of emerging climate research and evolving observations. However, there are several ways in which valid messages can be misconstrued by decision makers, leading them to inaccurate conclusions about the risks associated with climate impacts. Three cases will be discussed: 1) Issues of spatial scale in interpreting climate observations: Local climate observations may contradict summary statements about the effects of climate change on larger regional or global spatial scales. Effectively addressing these differences often requires communicators to understand local and regional climate drivers, and the distinction between a 'signal' associated with climate change and local climate 'noise.' Hydrological statistics in Missouri and California are shown to illustrate this case. 2) Issues of complexity related to extreme events: Climate change is typically invoked following a wide range of damaging meteorological events (e.g., heat waves, landfalling hurricanes, tornadoes), regardless of the strength of the relationship between anthropogenic climate change and the frequency or severity of that type of event. Examples are drawn from media coverage of several recent events, contrasting useful and potentially confusing word choices and frames. 3) Issues revolving around climate sensitivity: The so-called 'pause' or 'hiatus' in global warming has reverberated strongly through political and business discussions of climate change. Addressing the recent slowdown in warming yields an important opportunity to raise climate literacy in these communities. 
Attempts to use recent observations as a wedge between climate 'believers' and 'deniers' are likely to be counterproductive. Examples are drawn from Congressional testimony and media stories. All three cases illustrate ways that decision makers can arrive at invalid conclusions from seemingly valid scientific messages. Honest discussion of uncertainties, and recognition of the spatial and time scales associated with decision making, can work to combat this potential confusion.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFMSM32A..03F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFMSM32A..03F"><span>How Might the Thermosphere and Ionosphere React to an Extreme Space Weather Event?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Fuller-Rowell, T. J.; Fedrizzi, M.; Codrescu, M.; Maruyama, N.; Raeder, J.</p> <p>2015-12-01</p> <p>If a Carrington-type CME event like that of 1859 hit Earth, how might the thermosphere, ionosphere, and plasmasphere respond? To start with, the response would depend on how the magnetosphere reacts and channels the energy into the upper atmosphere. For now we can assume the magnetospheric convection and auroral precipitation inputs would look similar to those of the 2003 Halloween storm but stronger and expanded further to mid-latitudes, much like what the Weimer empirical model predicts if the solar wind Bz and velocity were -60 nT and 1500 km/s, respectively. For a Halloween-level geomagnetic storm event, the sequence of physical processes in the thermosphere and ionosphere is thought to be well understood. The physics-based coupled models, however, have been designed and somewhat tuned to simulate responses at the levels observed over the last two solar cycles. 
For an extreme solar storm, it is unclear whether the response would be a natural linear extrapolation or whether non-linear processes would begin to dominate. A numerical simulation has been performed with a coupled thermosphere-ionosphere model to quantify the likely response to an extreme space weather event. The simulation predicts that the neutral atmosphere would experience horizontal winds of 1500 m/s, vertical winds exceeding 150 m/s, and the "top" of the thermosphere well above 1000 km. Predicting the ionosphere response is somewhat more challenging because there is significant uncertainty in quantifying some of the other driver-response relationships, such as the magnitude and shielding time-scale of the penetration electric field, the possible feedback to the magnetosphere, and the amount of nitric oxide production. Within the limits of uncertainty of the drivers, the magnitude of the response can be quantified, and both linear and non-linear responses are predicted.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/1990WRR....26.2275M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/1990WRR....26.2275M"><span>A Reliability Estimation in Modeling Watershed Runoff With Uncertainties</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.</p> <p>1990-10-01</p> <p>The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. 
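The reliability framework just described propagates input uncertainties through a runoff model to obtain a probability distribution of the output. A minimal Monte Carlo sketch, using a rational-method peak-discharge formula rather than HEC-1, with assumed (invented) parameter distributions:

```python
import numpy as np

# Propagate parameter uncertainty through Q = 0.278 * C * i * A (SI rational
# method) and express prediction reliability as a distribution of peak discharge.
rng = np.random.default_rng(42)
n = 100_000
C = rng.normal(0.6, 0.05, n)                # runoff coefficient (assumed distribution)
i_rain = rng.lognormal(np.log(50), 0.2, n)  # rainfall intensity, mm/h (assumed)
A = 2.5                                     # catchment area, km^2 (assumed known)
Q = 0.278 * C * i_rain * A                  # peak discharge, m^3/s

q50, q95 = np.percentile(Q, [50, 95])
p_exceed = (Q > 30.0).mean()                # w.r.t. a hypothetical 30 m^3/s capacity
print(f"median Q = {q50:.1f} m^3/s, 95th percentile = {q95:.1f} m^3/s")
print(f"P(Q > 30 m^3/s) = {p_exceed:.3f}")
```

The exceedance probability against a design capacity is one concrete "reliability" statement this kind of analysis delivers; a first-order second-moment calculation would approximate the same quantities from means and variances alone.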
For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AGUFMEP33E..03B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AGUFMEP33E..03B"><span>Quantifying uncertainty in morphologically-derived bedload transport rates for large braided rivers: insights from high-resolution, high-frequency digital elevation model differencing</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Brasington, J.; Hicks, M.; Wheaton, J. M.; Williams, R. D.; Vericat, D.</p> <p>2013-12-01</p> <p>Repeat surveys of channel morphology provide a means to quantify fluvial sediment storage and enable inferences about changes in long-term sediment supply, watershed delivery and bed level adjustment; information vital to support effective river and land management. Over shorter time-scales, direct differencing of fluvial terrain models may also offer a route to predict reach-averaged sediment transport rates and quantify the patterns of channel morphodynamics and the processes that force them. Recent and rapid advances in geomatics have facilitated these goals by enabling the acquisition of topographic data at spatial resolutions and precisions suitable for characterising river morphology at the scale of individual grains over multi-kilometre reaches. 
Despite improvements in topographic surveying, inverting the terms of the sediment budget to derive estimates of sediment transport and link these to morphodynamic processes is, nonetheless, often confounded by limited knowledge of either the sediment supply or efflux across a boundary of the control volume, or unobserved cut-and-fill taking place between surveys. This latter problem is particularly poorly constrained, as field logistics frequently preclude surveys at a temporal frequency sufficient to capture changes in sediment storage associated with each competent event, let alone changes during individual floods. In this paper, we attempt to quantify the principal sources of uncertainty in morphologically-derived bedload transport rates for the large, labile, gravel-bed braided Rees River, which drains the Southern Alps of NZ. During the austral summer of 2009-10, a unique time series of 10 high-quality DEMs was derived for a 3 x 0.7 km reach of the Rees, using a combination of mobile terrestrial laser scanning, aDcp soundings and aerial image analysis. Complementary measurements of the forcing flood discharges and estimates of event-based particle step lengths were also acquired during the field campaign. Together, the resulting dataset quantifies the evolution of the study reach over an annual flood season and provides an unprecedented insight into the patterns and processes of braiding. Uncertainties in the inferred rates of bedload transport are associated with the temporal and spatial frequency of measurements used to estimate the storage term of the sediment budget, and with the methods used to derive the boundary sediment flux. Results reveal that over the annual flood season more than 80% of the braidplain was mobilised and that more than 50% of the bed experienced multiple cycles of cut and fill. Integration of cut and fill volumes event-by-event was found to be approximately 300% of the net change between October and May. 
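A standard way to handle the survey-uncertainty part of this budget is DEM-of-Difference (DoD) accounting with a minimum level of detection: elevation changes smaller than the propagated survey error are discarded before cut and fill volumes are integrated. A sketch on synthetic grids (the error values and patch geometry are invented, not the paper's):

```python
import numpy as np

# DoD budgeting with a 95% minimum level of detection (LoD).
rng = np.random.default_rng(1)
cell = 1.0                                   # cell size, m
dem_old = rng.normal(100.0, 0.5, (200, 200))
dem_new = dem_old + rng.normal(0.0, 0.05, dem_old.shape)
dem_new[50:100, 50:100] += 0.30              # a patch of fill (deposition)
dem_new[120:160, 30:90] -= 0.25              # a patch of cut (erosion)

dod = dem_new - dem_old
sigma_old, sigma_new = 0.05, 0.05            # per-survey elevation errors (assumed)
lod = 1.96 * np.hypot(sigma_old, sigma_new)  # propagated 95% detection threshold

fill = dod[dod > lod].sum() * cell**2        # m^3 of deposition above LoD
cut = -dod[dod < -lod].sum() * cell**2       # m^3 of erosion above LoD
net = fill - cut                             # net storage change
gross = fill + cut                           # total turnover ("cut and fill")
print(f"fill={fill:.0f} m^3, cut={cut:.0f} m^3, net={net:.0f} m^3, gross={gross:.0f} m^3")
```

The gross-versus-net distinction is exactly the compensating cut-and-fill effect the abstract quantifies: turnover integrated event-by-event can far exceed the net seasonal change.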
While significant uncertainties reside in estimates of the boundary flux, rates of bedload transport derived for individual events are shown to correlate well with total energy expenditure and suggest that a relatively simple relationship may exist between the driving hydraulic forces at the reach scale and the geomorphic work performed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AdG....44...23P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AdG....44...23P"><span>Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.</p> <p>2017-04-01</p> <p>Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by roughness coefficient values in hydraulic models for flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high-quality DEM, minimising input data uncertainty and improving the accuracy of the stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated on how accurately they represent the estimated roughness values. 
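One way to draw stratified samples from such fitted roughness distributions is Latin Hypercube Sampling, where each equal-probability stratum is sampled exactly once. A minimal sketch with assumed uniform ranges (not the paper's fitted distributions); each sample pair would parameterise one hydraulic model run:

```python
import numpy as np

rng = np.random.default_rng(7)

def lhs_uniform(n_samples, low, high):
    """Latin hypercube draws from U(low, high): one point per equal stratum,
    in shuffled order so paired variables are not correlated by construction."""
    strata = (np.arange(n_samples) + rng.uniform(size=n_samples)) / n_samples
    return low + rng.permutation(strata) * (high - low)

n_runs = 10
n_channel = lhs_uniform(n_runs, 0.030, 0.050)     # main-channel Manning's n (assumed range)
n_floodplain = lhs_uniform(n_runs, 0.060, 0.120)  # floodplain Manning's n (assumed range)

for k, (nc, nf) in enumerate(zip(n_channel, n_floodplain)):
    # each pair would drive one HEC-RAS-style simulation in the Monte Carlo loop
    print(f"run {k}: n_channel={nc:.4f}, n_floodplain={nf:.4f}")
```

Compared with plain random sampling, the stratification guarantees the full roughness range is covered even with a small number of model runs.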
Finally, Latin Hypercube Sampling has been used for generation of different sets of Manning roughness values and flood inundation probability maps have been created with the use of Monte Carlo simulations. Historical flood extent data, from an extreme historical flash flood event, are used for validation of the method. The calibration process is based on a binary wet-dry reasoning with the use of Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010AAS...21523101M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010AAS...21523101M"><span>A Statistician's View of Upcoming Grand Challenges</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Meng, Xiao Li</p> <p>2010-01-01</p> <p>In this session we have seen some snapshots of the broad spectrum of challenges, in this age of huge, complex, computer-intensive models, data, instruments,and questions. These challenges bridge astronomy at many wavelengths; basic physics; machine learning; -- and statistics. At one end of our spectrum, we think of 'compressing' the data with non-parametric methods. This raises the question of creating 'pseudo-replicas' of the data for uncertainty estimates. What would be involved in, e.g. boot-strap and related methods? Somewhere in the middle are these non-parametric methods for encapsulating the uncertainty information. At the far end, we find more model-based approaches, with the physics model embedded in the likelihood and analysis. 
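The bootstrap "pseudo-replica" idea raised in the preceding talk abstract can be stated in a few lines: resample the data with replacement and read the uncertainty of a statistic off the resampled distribution. A sketch on invented measurements:

```python
import numpy as np

# Bootstrap uncertainty for the mean of a hypothetical measurement set.
rng = np.random.default_rng(3)
data = rng.normal(10.0, 2.0, 100)          # hypothetical measurements

boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()  # one pseudo-replica
    for _ in range(5000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {data.mean():.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```

The open question flagged in the abstract is precisely when such resampling is valid, e.g. for correlated data or non-smooth statistics, where naive replicas understate the true uncertainty.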
The other distinctive problem is really the 'black-box' problem, where one has a complicated, e.g. fundamental physics-based, computer code, or 'black box', and one needs to know how changing the parameters at input -- due to uncertainties of any kind -- will map to changing the output. All of these connect to challenges in complexity of data and computation speed. Dr. Meng will highlight ways to 'cut corners' with advanced computational techniques, such as Parallel Tempering and Equal Energy methods. As well, there are cautionary tales of running automated analysis with real data -- where "30 sigma" outliers due to data artifacts can be more common than the astrophysical event of interest.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/19208234','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/19208234"><span>A semiparametric Bayesian proportional hazards model for interval censored data with frailty effects.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Henschel, Volkmar; Engel, Jutta; Hölzel, Dieter; Mansmann, Ulrich</p> <p>2009-02-10</p> <p>Multivariate analysis of interval censored event data based on classical likelihood methods is notoriously cumbersome. Likelihood inference for models which additionally include random effects is not available at all. Existing algorithms pose practical problems for users: matrix inversion, slow convergence, and no assessment of statistical uncertainty. MCMC procedures combined with imputation are used to implement hierarchical models for interval censored data within a Bayesian framework. Two examples from clinical practice demonstrate the handling of clustered interval censored event times as well as multilayer random effects for inter-institutional quality assessment. 
The software developed is called survBayes and is freely available at CRAN. The proposed software supports the solution of complex analyses in many fields of clinical epidemiology as well as health services research.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1911236M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1911236M"><span>Drought Persistence in Models and Observations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Moon, Heewon; Gudmundsson, Lukas; Seneviratne, Sonia</p> <p>2017-04-01</p> <p>Many regions of the world have experienced drought events that persisted several years and caused substantial economic and ecological impacts in the 20th century. However, it remains unclear whether there are significant trends in the frequency or severity of these prolonged drought events. In particular, an important issue is linked to systematic biases in the representation of persistent drought events in climate models, which impedes analysis related to the detection and attribution of drought trends. This study assesses drought persistence errors in global climate model (GCM) simulations from the 5th phase of Coupled Model Intercomparison Project (CMIP5), in the period of 1901-2010. The model simulations are compared with five gridded observational data products. The analysis focuses on two aspects: the identification of systematic biases in the models and the partitioning of the spread of drought-persistence-error into four possible sources of uncertainty: model uncertainty, observation uncertainty, internal climate variability and the estimation error of drought persistence. 
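The persistence quantity whose estimation error is partitioned above can be computed very simply: a common drought-persistence statistic is the dry-to-dry transition probability of a binary precipitation-anomaly indicator. A toy sketch on synthetic (white-noise) anomalies:

```python
import numpy as np

# Dry-to-dry transition probability: P(dry at t+1 | dry at t), with "dry"
# defined as a negative precipitation anomaly. Series is synthetic.
rng = np.random.default_rng(5)
precip_anom = rng.standard_normal(1200)     # e.g., 100 years of monthly anomalies
dry = (precip_anom < 0).astype(int)

was_dry = dry[:-1] == 1                     # months where drought conditions held
p_dd = dry[1:][was_dry].mean()              # fraction of those followed by another dry month
print(f"dry-to-dry transition probability: {p_dd:.2f}")  # ~0.5 for white noise
```

For uncorrelated anomalies this probability sits near 0.5; the systematic model biases described in the abstract correspond to simulated p_dd falling below the observed value.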
We use monthly and yearly dry-to-dry transition probabilities as estimates for drought persistence with drought conditions defined as negative precipitation anomalies. For both time scales we find that most model simulations consistently underestimated drought persistence except in a few regions such as India and Eastern South America. Partitioning the spread of the drought-persistence-error shows that at the monthly time scale model uncertainty and observation uncertainty are dominant, while the contribution from internal variability does play a minor role in most cases. At the yearly scale, the spread of the drought-persistence-error is dominated by the estimation error, indicating that the partitioning is not statistically significant, due to a limited number of considered time steps. These findings reveal systematic errors in the representation of drought persistence in current climate models and highlight the main contributors of uncertainty of drought-persistence-error. Future analyses will focus on investigating the temporal propagation of drought persistence to better understand the causes for the identified errors in the representation of drought persistence in state-of-the-art climate models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/20044342','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/20044342"><span>Uncertainty during breast diagnostic evaluation: state of the science.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Montgomery, Mariann</p> <p>2010-01-01</p> <p>To present the state of the science on uncertainty in relationship to the experiences of women undergoing diagnostic evaluation for suspected breast cancer. 
Published articles from Medline, CINAHL, PubMed, and PsycINFO from 1983-2008 using the following key words: breast biopsy, mammography, uncertainty, reframing, inner strength, and disruption. Fifty research studies were examined, all reporting the presence of anxiety persisting throughout the diagnostic evaluation until certitude is achieved through the establishment of a definitive diagnosis. Indirect determinants of uncertainty for women undergoing breast diagnostic evaluation include measures of anxiety, depression, social support, emotional responses, defense mechanisms, and the psychological impact of events. Understanding and influencing the uncertainty experience have been suggested to be key in relieving psychosocial distress and positively influencing future screening behaviors. Several studies examine correlational relationships among anxiety, selection of coping methods, and demographic factors that influence uncertainty. A gap exists in the literature with regard to the relationship between inner strength and uncertainty. Nurses can be invaluable in assisting women in coping with the uncertainty experience by providing positive communication and support. 
Nursing interventions should be designed and tested for their effects on uncertainty experienced by women undergoing a breast diagnostic evaluation.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/13966','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/13966"><span>Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>BABA,T.; ISHIGURO,K.; ISHIHARA,Y.</p> <p>1999-08-30</p> <p>Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. 
This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018JGE....15..539Y','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018JGE....15..539Y"><span>Accuracy and sensitivity analysis on seismic anisotropy parameter estimation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Yan, Fuyong; Han, De-Hua</p> <p>2018-04-01</p> <p>There is significant uncertainty in measuring Thomsen’s parameter δ in the laboratory even though the dimensions and orientations of the rock samples are known. Even more challenges can be expected when estimating seismic anisotropy parameters from field seismic data. Based on Monte Carlo simulation of a vertical transversely isotropic layer-cake model using the database of laboratory anisotropy measurements from the literature, we apply the commonly used quartic non-hyperbolic reflection moveout equation to estimate the seismic anisotropy parameters and test its accuracy and sensitivity to the source-receiver offset, vertical interval velocity error and time-picking error. The testing results show that the methodology works perfectly for noise-free synthetic data with short spread length. However, the method is extremely sensitive to time-picking errors caused by mild random noise, and it requires the spread length to be greater than the depth of the reflection event. The uncertainties increase rapidly for deeper layers, and the estimated anisotropy parameters can be very unreliable for a layer with more than five overlying layers. It is possible that an isotropic formation can be misinterpreted as a strongly anisotropic formation. 
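The sensitivity to time-picking error can be illustrated with a much simpler two-parameter fit than the paper's quartic non-hyperbolic equation: fit hyperbolic moveout t² = t0² + x²/v² to noisy travel-time picks and watch the spread of the recovered velocity. All model values here are invented:

```python
import numpy as np

# Effect of picking noise on moveout-based parameter estimation (toy model).
rng = np.random.default_rng(11)
t0, v = 1.0, 2000.0                       # s, m/s (assumed single layer)
x = np.linspace(0, 2000.0, 21)            # source-receiver offsets, m
t_true = np.sqrt(t0**2 + (x / v)**2)

v_est = []
for _ in range(500):
    t_noisy = t_true + rng.normal(0, 0.002, x.size)   # 2 ms picking error
    # linear regression of t^2 on x^2: slope = 1/v^2, intercept = t0^2
    slope, _ = np.polyfit(x**2, t_noisy**2, 1)
    v_est.append(1.0 / np.sqrt(slope))
v_est = np.array(v_est)
print(f"v recovered: {v_est.mean():.0f} +/- {v_est.std():.0f} m/s")
```

Even this robust two-parameter problem shows visible scatter from millisecond-level picking noise; the quartic term needed for δ estimation is far more weakly constrained, which is consistent with the abstract's sensitivity findings.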
The sensitivity analysis should provide useful guidance on how to group the reflection events and build a suitable geological model for anisotropy parameter inversion.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018JInst..13P2001A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018JInst..13P2001A"><span>Position reconstruction in LUX</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Akerib, D. S.; Alsum, S.; Araújo, H. M.; Bai, X.; Bailey, A. J.; Balajthy, J.; Beltrame, P.; Bernard, E. P.; Bernstein, A.; Biesiadzinski, T. P.; Boulton, E. M.; Brás, P.; Byram, D.; Cahn, S. B.; Carmona-Benitez, M. C.; Chan, C.; Currie, A.; Cutter, J. E.; Davison, T. J. R.; Dobi, A.; Druszkiewicz, E.; Edwards, B. N.; Fallon, S. R.; Fan, A.; Fiorucci, S.; Gaitskell, R. J.; Genovesi, J.; Ghag, C.; Gilchriese, M. G. D.; Hall, C. R.; Hanhardt, M.; Haselschwardt, S. J.; Hertel, S. A.; Hogan, D. P.; Horn, M.; Huang, D. Q.; Ignarra, C. M.; Jacobsen, R. G.; Ji, W.; Kamdin, K.; Kazkaz, K.; Khaitan, D.; Knoche, R.; Larsen, N. A.; Lenardo, B. G.; Lesko, K. T.; Lindote, A.; Lopes, M. I.; Manalaysay, A.; Mannino, R. L.; Marzioni, M. F.; McKinsey, D. N.; Mei, D.-M.; Mock, J.; Moongweluwan, M.; Morad, J. A.; Murphy, A. St. J.; Nehrkorn, C.; Nelson, H. N.; Neves, F.; O'Sullivan, K.; Oliver-Mallory, K. C.; Palladino, K. J.; Pease, E. K.; Rhyne, C.; Shaw, S.; Shutt, T. A.; Silva, C.; Solmaz, M.; Solovov, V. N.; Sorensen, P.; Sumner, T. J.; Szydagis, M.; Taylor, D. J.; Taylor, W. C.; Tennyson, B. P.; Terman, P. A.; Tiedt, D. R.; To, W. H.; Tripathi, M.; Tvrznikova, L.; Uvarov, S.; Velan, V.; Verbus, J. R.; Webb, R. C.; White, J. T.; Whitis, T. J.; Witherell, M. S.; Wolfs, F. L. H.; Xu, J.; Yazdani, K.; Young, S. 
K.; Zhang, C.</p> <p>2018-02-01</p> <p>The (x, y) position reconstruction method used in the analysis of the complete exposure of the Large Underground Xenon (LUX) experiment is presented. The algorithm is based on a statistical test that makes use of an iterative method to recover the photomultiplier tube (PMT) light response directly from the calibration data. The light response functions make use of a two-dimensional functional form to account for the photons reflected on the inner walls of the detector. To increase the resolution for small pulses, a photon counting technique was employed to describe the response of the PMTs. The reconstruction was assessed with calibration data including 83mKr (releasing a total energy of 41.5 keV) and 3H (β- with Q = 18.6 keV) decays, and a deuterium-deuterium (D-D) neutron beam (2.45 MeV). Within the detector's fiducial volume, the reconstruction has achieved an (x, y) position uncertainty of σ = 0.82 cm and σ = 0.17 cm for events of only 200 and 4,000 detected electroluminescence photons, respectively. Such signals are associated with electron recoils of energies ~0.25 keV and ~10 keV, respectively. 
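The statistical-test idea behind this kind of reconstruction can be shown in a toy 1-D version: given light-response functions giving the expected photon count in each PMT as a function of event position, scan positions for the maximum Poisson likelihood of the observed counts. The geometry and response function below are invented, not LUX's measured responses:

```python
import numpy as np

def eta(x, pmt_x, total=200.0):
    """Toy light-response: expected photons in a PMT at pmt_x for an event at x."""
    return total / (1.0 + (x - pmt_x)**2)

pmt_positions = np.array([-5.0, 5.0])     # cm (invented two-PMT geometry)
true_x = 1.5
rng = np.random.default_rng(2)
counts = rng.poisson([eta(true_x, p) for p in pmt_positions])

# Grid scan of the Poisson log-likelihood (the constant n! term is dropped)
xs = np.linspace(-5, 5, 1001)
loglik = np.zeros_like(xs)
for p, n in zip(pmt_positions, counts):
    mu = eta(xs, p)
    loglik += n * np.log(mu) - mu
x_hat = xs[np.argmax(loglik)]
print(f"reconstructed x = {x_hat:.2f} cm (true {true_x} cm)")
```

With only tens of detected photons, the likelihood is broad, which is the same reason the single-electron events quoted below carry centimetre-scale position uncertainty.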
The reconstructed position of the smallest events with a single electron emitted from the liquid surface (22 detected photons) has a horizontal (x, y) uncertainty of 2.13 cm.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70029508','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70029508"><span>Loss estimates for a Puente Hills blind-thrust earthquake in Los Angeles, California</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Field, E.H.; Seligson, H.A.; Gupta, N.; Gupta, V.; Jordan, T.H.; Campbell, K.W.</p> <p>2005-01-01</p> <p>Based on OpenSHA and HAZUS-MH, we present loss estimates for an earthquake rupture on the recently identified Puente Hills blind-thrust fault beneath Los Angeles. Given a range of possible magnitudes and ground motion models, and presuming a full fault rupture, we estimate the total economic loss to be between $82 and $252 billion. This range is not only considerably higher than a previous estimate of $69 billion, but also implies the event would be the costliest disaster in U.S. history. The analysis has also provided the following predictions: 3,000-18,000 fatalities, 142,000-735,000 displaced households, 42,000-211,000 in need of short-term public shelter, and 30,000-99,000 tons of debris generated. Finally, we show that the choice of ground motion model can be more influential than the earthquake magnitude, and that reducing this epistemic uncertainty (e.g., via model improvement and/or rejection) could reduce the uncertainty of the loss estimates by up to a factor of two. We note that a full Puente Hills fault rupture is a rare event (once every ~3,000 years), and that other seismic sources pose significant risk as well. © 
2005, Earthquake Engineering Research Institute.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.S41G..01K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.S41G..01K"><span>Probing the DPRK nuclear test-site to low magnitude using seismic pattern detectors</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kvaerna, T.; Gibbons, S. J.; Mykkeltveit, S.</p> <p>2017-12-01</p> <p>Six declared nuclear explosions at North Korea's Punggye-ri test-site between October 2006 and September 2017 were detected seismically both at regional and teleseismic distances. The similarity of body-wave signals from explosion to explosion allows us to locate these events relative to each other with high accuracy. Greater uncertainty in the relative time measurements for the most recent test on 3 September 2017 results in a greater uncertainty in the relative location estimate for this event, although it appears to have taken place below optimal overburden close to the peak of Mount Mantap. A number of smaller events, detected mainly at regional distances, have been identified as being at, or very close to, the test site. Due to waveform differences and available station coverage, a simple double-difference relative location is often not possible. In addition to the apparent collapse event some 8 minutes after the declared nuclear test, small seismic events have been detected on 25 May 2014, 11 September 2016, 23 September 2017, and 12 October 2017. The signals from these events differ significantly from those from the declared nuclear tests with far weaker Pn and far stronger Lg phases. 
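The relative time measurements that drive this kind of high-accuracy relative location are typically obtained from the peak of the cross-correlation between similar waveforms. A synthetic sketch (invented Gabor-style wavelets, not real seismograms):

```python
import numpy as np

# Measure the delay between two similar waveforms via cross-correlation.
rng = np.random.default_rng(9)
dt = 0.01                                  # sample interval, s
t = np.arange(0, 10, dt)
wavelet = lambda tau: np.exp(-((t - tau) / 0.3)**2) * np.sin(2*np.pi*5*(t - tau))

sig_a = wavelet(4.0) + 0.05 * rng.standard_normal(t.size)
sig_b = wavelet(4.23) + 0.05 * rng.standard_normal(t.size)  # arrives 0.23 s later

xc = np.correlate(sig_b, sig_a, mode="full")
lag = (np.argmax(xc) - (t.size - 1)) * dt  # zero lag sits at index len-1
print(f"measured delay = {lag:.2f} s")     # close to the true 0.23 s shift
```

When waveforms differ substantially, as for the small collapse-like events described above, this correlation peak degrades, which is why a simple double-difference relative location is often not possible for them.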
Multi-channel correlation analysis and empirical matched field processing allow us to categorize these weaker seismic events with far greater confidence than classical waveform analysis allows.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AGUSM.U33A..03L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AGUSM.U33A..03L"><span>Recent Mega-Thrust Tsunamigenic Earthquakes and PTHA</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Lorito, S.</p> <p>2013-05-01</p> <p>The occurrence of several mega-thrust tsunamigenic earthquakes in the last decade, including but not limited to the 2004 Sumatra-Andaman, the 2010 Maule, and the 2011 Tohoku earthquakes, has been a dramatic reminder of the limitations in our capability to assess earthquake and tsunami hazard and risk. However, increasingly high-quality geophysical observational networks have allowed the retrieval of more accurate models of the rupture process of mega-thrust earthquakes than ever before, thus paving the way for future improved hazard assessments. Probabilistic Tsunami Hazard Analysis (PTHA) methodology, in particular, is less mature than its seismic counterpart, PSHA. Recent research efforts of the worldwide tsunami science community have started to fill this gap and to define best practices that are progressively being employed in PTHA for different regions and coasts at threat. In the first part of my talk, I will briefly review some rupture models of recent mega-thrust earthquakes, and highlight some of their surprising features that likely result in larger error bars associated with PTHA results. 
More specifically, recent events of unexpected size at a given location, and with unexpected rupture process features, posed first-order open questions that prevent the definition of a heterogeneous rupture probability along a subduction zone, despite several recent promising results on the subduction zone seismic cycle. In the second part of the talk, I will dig a bit more into a specific ongoing effort to improve PTHA methods, in particular regarding the determination of epistemic and aleatory uncertainties, and the computational feasibility of PTHA when considering the full assumed source variability. Usually, only logic trees are made explicit in PTHA studies, accounting for different possible assumptions on the source zone properties and behavior. The selection of the earthquakes to be actually modelled is then in general made on a qualitative basis or remains implicit, although different methods like event trees have been used for different applications. I will define a quite general PTHA framework, based on the mixed use of logic and event trees. I will first discuss a particular class of epistemic uncertainties, i.e. those related to the parametric fault characterization in terms of geometry, kinematics, and assessment of activity rates. A systematic classification into six justification levels of the epistemic uncertainty related to the existence and behaviour of fault sources will be presented. Then, a particular branch of the logic tree is chosen in order to discuss just the aleatory variability of earthquake parameters, represented with an event tree. Even so, PTHA based on numerical scenarios is too demanding a computational task, particularly when probabilistic inundation maps are needed. 
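The bookkeeping behind such an event-tree representation of aleatory variability can be illustrated with a toy hazard-curve aggregation. This is a generic PTHA sketch with hypothetical scenario rates and modelled wave heights, not the author's specific framework; in a real analysis each scenario requires a costly numerical tsunami simulation, which is the computational burden at issue:

```python
import numpy as np

# Toy hazard-curve aggregation over event-tree leaves. Each leaf is a source
# scenario with an annual rate and a modelled wave height at a site; all
# rates and heights here are hypothetical.
scenarios = [
    (1e-3, 0.5), (5e-4, 1.2), (2e-4, 2.8), (8e-5, 4.1), (2e-5, 6.3),
]

thresholds = np.array([0.5, 1.0, 2.0, 4.0])   # wave heights of interest, m
# Annual exceedance rate: total rate of scenarios whose modelled height
# exceeds the threshold (aleatory scatter collapsed for brevity).
rates = np.array([sum(r for r, h in scenarios if h > thr) for thr in thresholds])
# Poisson probability of at least one exceedance in a 50-year window.
prob_50yr = 1.0 - np.exp(-rates * 50.0)
print(prob_50yr)
```

Logic-tree branches enter as alternative scenario sets or weights over such lists, which is why the number of leaves drives the overall computational cost.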
To reduce the computational burden without under-representing the source variability, the event tree is first constructed by densely (over-)sampling the earthquake parameter space; the earthquakes are then filtered based on their associated offshore tsunami impact, before calculating inundation maps. I'll describe this approach by means of a case study in the Mediterranean Sea, namely the PTHA for some locations on the coasts of Eastern Sicily and Southern Crete due to potential subduction earthquakes occurring on the Hellenic Arc.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29760103','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29760103"><span>Comprehensive phylogeny of ray-finned fishes (Actinopterygii) based on transcriptomic and genomic data.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Hughes, Lily C; Ortí, Guillermo; Huang, Yu; Sun, Ying; Baldwin, Carole C; Thompson, Andrew W; Arcila, Dahiana; Betancur-R, Ricardo; Li, Chenhong; Becker, Leandro; Bellora, Nicolás; Zhao, Xiaomeng; Li, Xiaofeng; Wang, Min; Fang, Chao; Xie, Bing; Zhou, Zhuocheng; Huang, Hai; Chen, Songlin; Venkatesh, Byrappa; Shi, Qiong</p> <p>2018-05-14</p> <p>Our understanding of phylogenetic relationships among bony fishes has been transformed by analysis of a small number of genes, but uncertainty remains around critical nodes. Genome-scale inferences so far have sampled a limited number of taxa and genes. Here we leveraged 144 genomes and 159 transcriptomes to investigate fish evolution with an unparalleled scale of data: >0.5 Mb from 1,105 orthologous exon sequences from 303 species, representing 66 out of 72 ray-finned fish orders. 
We apply phylogenetic tests designed to trace the effect of whole-genome duplication events on gene trees and find paralogy-free loci using a bioinformatics approach. Genome-wide data support the structure of the fish phylogeny, and hypothesis-testing procedures appropriate for phylogenomic datasets using explicit gene genealogy interrogation settle some long-standing uncertainties, such as the branching order at the base of the teleosts and among early euteleosts, and the sister lineage to the acanthomorph and percomorph radiations. Comprehensive fossil calibrations date the origin of all major fish lineages before the end of the Cretaceous.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3785948','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3785948"><span>Risk-based principles for defining and managing water security</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Hall, Jim; Borgomeo, Edoardo</p> <p>2013-01-01</p> <p>The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. 
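The risk-based framing described here typically quantifies risk as probability times consequence, summed over harmful events and evaluated separately for different actors to expose trade-offs. A toy sketch with entirely hypothetical numbers:

```python
# Toy risk ledger for water security planning: annual risk as probability
# times consequence, summed over shortage events and split by actor to
# expose trade-offs. All probabilities and damages are hypothetical.
events = [
    # (annual probability, consequence to households, consequence to agriculture)
    (0.20, 1.0, 5.0),
    (0.05, 10.0, 20.0),
    (0.01, 80.0, 40.0),
]

risk_households = sum(p * c_h for p, c_h, _ in events)
risk_agriculture = sum(p * c_a for p, _, c_a in events)
print(risk_households, risk_agriculture)  # roughly 1.5 and 2.4
```

A planner can then compare interventions by how they shift each actor's risk and weigh the result against a tolerability threshold, rather than planning for a single worst case.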
Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/12069679','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/12069679"><span>News media coverage of screening mammography for women in their 40s and tamoxifen for primary prevention of breast cancer.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Schwartz, Lisa M; Woloshin, Steven</p> <p>2002-06-19</p> <p>In the late 1990s, 3 events pertaining to breast cancer prevention received considerable attention in the US news media: a National Institutes of Health (NIH) consensus panel recommended against routine screening mammography for women in their 40s (January 1997), the National Cancer Institute (NCI) subsequently reversed the recommendation (March 1997), and an NCI-sponsored study demonstrated the efficacy of tamoxifen in the primary prevention of breast cancer (April 1998). To examine how the major US news media covered the potential benefits and harms of 2 breast cancer preventive strategies. Content analysis of US news stories reporting on the breast cancer prevention events. We used Lexis-Nexis to search for print news stories in the 10 highest-circulation US newspapers and requested transcripts from 3 major television networks to obtain all relevant news coverage in the 2 weeks following each event. 
The outcomes measured were attitude toward the preventive strategy (encourage, neutral, or discourage), the level of uncertainty about benefit, and how benefits and harms were presented. Twenty-seven stories about the NIH consensus panel, 24 about the NCI reversal, and 34 about tamoxifen appeared in high-profile news media within 2 weeks of each event. Sixty-seven percent of NIH consensus panel stories left the impression that there was a lot of uncertainty about whether women aged 40 to 49 years should undergo screening, but 59% suggested that women should probably or definitely be screened. Only 4 stories suggested that women faced a genuine decision about what to do. The level of uncertainty reported was substantially lower following the NCI reversal (21% suggested a lot of uncertainty), and most stories (96%) suggested that women should be screened. In contrast, tamoxifen stories highlighted uncertainty about what women at high risk should do (62% suggested there was a lot of uncertainty), and none left the impression that women should definitely take the drug (24% suggested they probably should). Sixty-five percent of these stories suggested that women faced a genuine choice and would have to weigh the risks and benefits themselves. Most news stories favored routine use of screening mammography and urged caution about using tamoxifen. Almost all noted the potential harms of each preventive strategy; however, the negative aspects of tamoxifen received greater emphasis. 
Whereas taking tamoxifen was presented as a difficult decision, having a mammogram was presented as something women ought to do.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/985583','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/985583"><span>Wind Energy Management System Integration Project Incorporating Wind Generation and Load Forecast Uncertainties into Power Grid Operations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.</p> <p>2010-09-01</p> <p>The power system balancing process, which includes the scheduling, real time dispatch (load following) and regulation processes, is traditionally based on deterministic models. Since the conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load following processes use load and wind power production forecasts to achieve future balance between the conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation) and scheduled interchange on the other side. Although in real life the forecasting procedures imply some uncertainty around the load and wind forecasts (caused by forecast errors), only their mean values are actually used in the generation dispatch and commitment procedures. 
Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially with the increasing penetration of renewable resources) whether the system would actually be able to meet the conventional generation requirements within the look-ahead horizon, what additional balancing efforts would be needed as we get closer to real time, and what additional costs would be incurred by those needs. In order to improve the system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling, load following, and, to some extent, the regulation processes. It is also important to address the uncertainty problem comprehensively, by taking all sources of uncertainty (load, intermittent generation, generators’ forced outages, etc.) into consideration. All aspects of uncertainty, such as the imbalance size (which is the same as the capacity needed to mitigate the imbalance) and the generation ramping requirement, must be taken into account. These features make this work a significant step forward toward the objective of incorporating wind, solar, load, and other uncertainties into power system operations. In this report, a new methodology to predict the uncertainty ranges for the required balancing capacity, ramping capability and ramp duration is presented. Uncertainties created by system load forecast errors, wind and solar forecast errors, and generation forced outages are taken into account. The uncertainty ranges are evaluated for different confidence levels of having the actual generation requirements within the corresponding limits. 
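The evaluation of uncertainty ranges at chosen confidence levels can be sketched by sampling the separate error sources, summing them into a net imbalance, and reading off percentile bounds. The distributions and magnitudes below are illustrative assumptions, not the report's data:

```python
import numpy as np

# Percentile sketch of balancing-requirement uncertainty ranges.
rng = np.random.default_rng(42)
n = 100_000
load_err = rng.normal(0.0, 300.0, n)        # MW, load forecast error
wind_err = rng.normal(0.0, 150.0, n)        # MW, wind forecast error
outage = rng.binomial(1, 0.01, n) * 400.0   # MW, rare forced outage of one unit

imbalance = load_err + wind_err + outage
lo, hi = np.percentile(imbalance, [2.5, 97.5])  # 95% confidence range
print(round(lo), round(hi))
```

Ramping requirements can be handled the same way by differencing consecutive imbalance samples before taking percentiles, and the rare discrete outage term is what skews the upper bound relative to the lower one.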
The methodology helps to identify the system balancing reserve requirement based on desired system performance levels, identify system “breaking points”, where the generation system becomes unable to follow the generation requirement curve with the user-specified probability level, and determine the time remaining to these potential events. The approach includes three stages: statistical and actual data acquisition, statistical analysis of retrospective information, and prediction of future grid balancing requirements for specified time horizons and confidence intervals. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on a histogram analysis incorporating all sources of uncertainty and parameters of a continuous (wind forecast and load forecast errors) and discrete (forced generator outages and failures to start up) nature. Preliminary simulations using California Independent System Operator (California ISO) real life data have shown the effectiveness of the proposed approach. A tool developed based on the new methodology described in this report will be integrated with the California ISO systems. Contractual work is currently in place to integrate the tool with the AREVA EMS system.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2602819','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2602819"><span>Perceptual uncertainty and line-call challenges in professional tennis</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Mather, George</p> <p>2008-01-01</p> <p>Fast-moving sports such as tennis require both players and match officials to make rapid accurate perceptual decisions about dynamic events in the visual world. 
Disagreements arise regularly, leading to disputes about decisions such as line calls. A number of factors must contribute to these disputes, including lapses in concentration, bias and gamesmanship. Fundamental uncertainty or variability in the sensory information supporting decisions must also play a role. Modern technological innovations now provide detailed and accurate physical information that can be compared against the decisions of players and officials. The present paper uses this psychophysical data to assess the significance of perceptual limitations as a contributor to real-world decisions in professional tennis. A detailed analysis is presented of a large body of data on line-call challenges in professional tennis tournaments over the last 2 years. Results reveal that the vast majority of challenges can be explained in a direct, highly predictable manner by a simple model of uncertainty in perceptual information processing. Both players and line judges are remarkably accurate at judging ball bounce position, with a positional uncertainty of less than 40 mm. Line judges are more reliable than players. Judgements are more difficult for balls bouncing near base and service lines than those bouncing near side and centre lines. There is no evidence for significant errors in localization due to image motion. 
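A simple perceptual model of this kind treats the perceived bounce position as the true position plus Gaussian noise. With the roughly 40 mm positional uncertainty quoted above, the probability of perceiving a ball on the wrong side of the line falls off with distance from the line edge; a sketch (the sigma value is taken from the abstract, the rest is a standard Gaussian calculation):

```python
from math import erf, sqrt

def p_wrong_call(distance_mm, sigma_mm=40.0):
    """Probability that an observer with Gaussian positional uncertainty
    sigma perceives a ball on the wrong side of the line, for a bounce
    distance_mm from the line edge: P(N(0, sigma) > d) = 1 - Phi(d/sigma)."""
    return 0.5 * (1.0 - erf(distance_mm / (sigma_mm * sqrt(2.0))))

for d in (5, 20, 40, 100):
    print(d, round(p_wrong_call(d), 3))  # 0.45, 0.309, 0.159, 0.006
```

Balls landing within a few millimetres of the line are thus near coin flips for any observer, which is consistent with most challenges clustering at small margins.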
PMID:18426755</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_21 --> <div id="page_22" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="421"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/18426755','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/18426755"><span>Perceptual uncertainty and line-call challenges in professional tennis.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Mather, George</p> <p>2008-07-22</p> <p>Fast-moving sports such as tennis require both players and match officials to make rapid accurate perceptual decisions about dynamic events in the visual world. Disagreements arise regularly, leading to disputes about decisions such as line calls. 
A number of factors must contribute to these disputes, including lapses in concentration, bias and gamesmanship. Fundamental uncertainty or variability in the sensory information supporting decisions must also play a role. Modern technological innovations now provide detailed and accurate physical information that can be compared against the decisions of players and officials. The present paper uses this psychophysical data to assess the significance of perceptual limitations as a contributor to real-world decisions in professional tennis. A detailed analysis is presented of a large body of data on line-call challenges in professional tennis tournaments over the last 2 years. Results reveal that the vast majority of challenges can be explained in a direct, highly predictable manner by a simple model of uncertainty in perceptual information processing. Both players and line judges are remarkably accurate at judging ball bounce position, with a positional uncertainty of less than 40 mm. Line judges are more reliable than players. Judgements are more difficult for balls bouncing near base and service lines than those bouncing near side and centre lines. 
There is no evidence for significant errors in localization due to image motion.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=338482&Lab=NHEERL&keyword=life+AND+science&actType=&TIMSType=+&TIMSSubTypeID=&DEID=&epaNumber=&ntisID=&archiveStatus=Both&ombCat=Any&dateBeginCreated=&dateEndCreated=&dateBeginPublishedPresented=&dateEndPublishedPresented=&dateBeginUpdated=&dateEndUpdated=&dateBeginCompleted=&dateEndCompleted=&personID=&role=Any&journalID=&publisherID=&sortBy=revisionDate&count=50','EPA-EIMS'); return false;" href="https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=338482&Lab=NHEERL&keyword=life+AND+science&actType=&TIMSType=+&TIMSSubTypeID=&DEID=&epaNumber=&ntisID=&archiveStatus=Both&ombCat=Any&dateBeginCreated=&dateEndCreated=&dateBeginPublishedPresented=&dateEndPublishedPresented=&dateBeginUpdated=&dateEndUpdated=&dateBeginCompleted=&dateEndCompleted=&personID=&role=Any&journalID=&publisherID=&sortBy=revisionDate&count=50"><span>Twenty-five years after "Wingspread"- Environmental endocrine disruptors (EDCs) and human health</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://oaspub.epa.gov/eims/query.page">EPA Science Inventory</a></p> <p></p> <p></p> <p>The development of life-stage and tissue-specific AOPs for EDCs can reduce the uncertainty in extrapolating of the effects of EDCs from in vitro and in vivo studies in laboratory animals to humans. 
When the key events (KEs) and molecular initiating events (MIEs) in a pathway are...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014APS..APR.L1050I','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014APS..APR.L1050I"><span>Self-completeness and the generalized uncertainty principle</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Isi, Maximiliano; Mureika, Jonas; Nicolini, Piero</p> <p>2014-03-01</p> <p>The generalized uncertainty principle discloses a self-complete characteristic of gravity, namely the possibility of masking any curvature singularity behind an event horizon as a result of matter compression at the Planck scale. In this paper we extend the above reasoning in order to overcome some current limitations to the framework, including the absence of a consistent metric describing such Planck-scale black holes. We implement a minimum-size black hole in terms of the extremal configuration of a neutral non-rotating metric, which we derived by mimicking the effects of the generalized uncertainty principle via a short scale modified version of Einstein gravity. 
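For reference, a commonly used form of the generalized uncertainty principle (the precise variant adopted in this paper may differ), with \(L_p\) the Planck length and \(\beta\) a dimensionless deformation parameter, together with the minimum length it implies:

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,\frac{L_p^2}{\hbar^2}\,\Delta p^2\right)
\quad\Longrightarrow\quad
\Delta x \;\ge\; \frac{\hbar}{2\,\Delta p} + \frac{\beta L_p^2}{2\hbar}\,\Delta p
\;\ge\; \sqrt{\beta}\,L_p .
```

The right-hand side is minimized at \(\Delta p = \hbar/(\sqrt{\beta}\,L_p)\), giving the minimum resolvable length \(\sqrt{\beta}\,L_p\) that motivates Planck-scale extremal black hole configurations.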
In such a way, we find a self-consistent scenario that reconciles the self-complete character of gravity and the generalized uncertainty principle.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013JHEP...11..139I','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013JHEP...11..139I"><span>Self-completeness and the generalized uncertainty principle</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Isi, Maximiliano; Mureika, Jonas; Nicolini, Piero</p> <p>2013-11-01</p> <p>The generalized uncertainty principle discloses a self-complete characteristic of gravity, namely the possibility of masking any curvature singularity behind an event horizon as a result of matter compression at the Planck scale. In this paper we extend the above reasoning in order to overcome some current limitations to the framework, including the absence of a consistent metric describing such Planck-scale black holes. We implement a minimum-size black hole in terms of the extremal configuration of a neutral non-rotating metric, which we derived by mimicking the effects of the generalized uncertainty principle via a short scale modified version of Einstein gravity. 
In such a way, we find a self-consistent scenario that reconciles the self-complete character of gravity and the generalized uncertainty principle.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/11540040','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/11540040"><span>Uncertainties in radiation effect predictions for the natural radiation environments of space.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>McNulty, P J; Stassinopoulos, E G</p> <p>1994-10-01</p> <p>Future manned missions beyond low earth orbit require accurate predictions of the risk to astronauts and to critical systems from exposure to ionizing radiation. For low-level exposures, the hazards are dominated by rare single-event phenomena where individual cosmic-ray particles or spallation reactions result in potentially catastrophic changes in critical components. Examples might be a biological lesion leading to cancer in an astronaut or a memory upset leading to an undesired rocket firing. The risks of such events appear to depend on the amount of energy deposited within critical sensitive volumes of biological cells and microelectronic components. The critical environmental information needed to estimate the risks posed by the natural space environments, including solar flares, is the number of times more than a threshold amount of energy will be deposited in the critical microvolumes in a single event. These predictions are complicated by uncertainties in the natural environments, particularly the composition of flares, and by the effects of shielding. 
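Once the rate of threshold-exceeding energy depositions is estimated, mission risk follows from Poisson statistics for rare independent events; a sketch with illustrative numbers (not values from the paper):

```python
from math import exp

def p_at_least_one(rate_per_day, mission_days):
    """Poisson probability of one or more threshold-exceeding energy
    depositions over a mission, given a constant mean event rate."""
    return 1.0 - exp(-rate_per_day * mission_days)

# Illustrative: a device with 1e-4 threshold-exceeding events/day, 500-day mission.
print(round(p_at_least_one(1e-4, 500), 4))  # 0.0488
```

Uncertainty in the environment or in shielding effectiveness propagates directly into the rate, so bounding the rate bounds the mission risk.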
Microdosimetric data for large numbers of orbits are needed to improve the environmental models and to test the transport codes used to predict event rates.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19940039131&hterms=947&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D80%26Ntt%3D%2526%2523947','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19940039131&hterms=947&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D80%26Ntt%3D%2526%2523947"><span>Uncertainties in radiation effect predictions for the natural radiation environments of space</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Mcnulty, P. J.; Stassinopoulos, E. G.</p> <p>1994-01-01</p> <p>Future manned missions beyond low earth orbit require accurate predictions of the risk to astronauts and to critical systems from exposure to ionizing radiation. For low-level exposures, the hazards are dominated by rare single-event phenomena where individual cosmic-ray particles or spallation reactions result in potentially catastrophic changes in critical components. Examples might be a biological lesion leading to cancer in an astronaut or a memory upset leading to an undesired rocket firing. The risks of such events appear to depend on the amount of energy deposited within critical sensitive volumes of biological cells and microelectronic components. The critical environmental information needed to estimate the risks posed by the natural space environments, including solar flares, is the number of times more than a threshold amount of energy will be deposited in the critical microvolumes in a single event. These predictions are complicated by uncertainties in the natural environments, particularly the composition of flares, and by the effects of shielding. 
Microdosimetric data for large numbers of orbits are needed to improve the environmental models and to test the transport codes used to predict event rates.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3681675','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3681675"><span>Sensitizing events as trigger for discursive renewal and institutional change in Flanders’ environmental health approach, 1970s-1990s</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p></p> <p>2013-01-01</p> <p>Background Sensitizing events may trigger and stimulate discursive renewal. From a discursive institutional perspective, changing discourses are the driving force behind the institutional dynamics of policy domains. Theoretically informed by discursive institutionalism, this article assesses the impact of a series of four sensitizing events that triggered serious environmental health concerns in Flanders between the 1970s and the 1990s, and led to the gradual institutionalization of a Flemish environmental health arrangement. Methods The Policy Arrangement Approach is used as the analytical framework to structure the empirical results of the historical analysis based on document analysis and in-depth interviews. Results Until the 1990s, environmental health was characterized as an ad hoc policy field in Flanders, where agenda setting was based on sensitizing events – also referred to as incident-driven. Each of these events contributed to a gradual rethinking of the epistemological discourses about environmental health risks and uncertainties. 
These new discourses were the driving forces behind institutional dynamics as they gradually resulted in an increased need for: 1) long-term, policy-oriented, interdisciplinary environmental health research; 2) policy coordination and integration between the environmental and public health policy fields; and 3) new forms of science-policy interactions based on mutual learning. These changes are desirable in order to detect environmental health problems as fast as possible, to react immediately and communicate appropriately. Conclusions The series of four events that triggered serious environmental health concerns in Flanders provided the opportunity to rethink and re-organize the current affairs concerning environmental health and gradually resulted in the institutionalization of a Flemish environmental health arrangement. PMID:23758822</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..15.6342A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..15.6342A"><span>Uncertainties in hydrological extremes projections and its effects on decision-making processes in an Amazonian sub-basin.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Andres Rodriguez, Daniel; Garofolo, Lucas; Lazaro Siqueira Junior, Jose</p> <p>2013-04-01</p> <p>Uncertainties in Climate Change projections are affected by irreducible uncertainties due to limitations of knowledge, the chaotic nature of the climate system and human decision-making processes. Such uncertainties affect impact studies, complicating the decision-making process aimed at mitigation and adaptation. However, these uncertainties allow the possibility of developing exploratory analyses of a system's vulnerability to different scenarios. 
Through these kinds of analyses it is possible to identify critical issues that must be studied in greater depth. For this study we used several future projections from General Circulation Models to feed a Hydrological Model applied to the Amazonian sub-basin of Ji-Paraná. Hydrological Model integrations are performed for the historical period (1970-1990) and for the future period (2010-2100). Extreme value analyses are performed on each simulated time series and the results are compared with extreme events in the present period. A simple approach to identifying potential vulnerabilities consists of evaluating the hydrologic system's response to the climate variability and extreme events observed in the past, and comparing them with the conditions projected for the future. Thus it is possible to identify critical issues that need attention and more detailed study. For this work, we used socio-economic data from the Brazilian Institute of Geography and Statistics, the Operator of the National Electric System and the Brazilian National Water Agency, together with published scientific and press information. This information is used to characterize impacts associated with extreme hydrological events in the basin during the historical period and to evaluate potential impacts in the future under the different hydrological projections. Results show that inter-model variability produces a broad dispersion in projected extreme values. 
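The inter-model dispersion of projected extremes can be made concrete by comparing return levels of annual maxima across model-driven series. A minimal empirical sketch, with synthetic series standing in for the GCM-driven runs (all distributions and parameters hypothetical):

```python
import numpy as np

def return_level(annual_maxima, return_period_years):
    """Empirical T-year return level: the quantile of annual maxima with
    exceedance probability 1/T (a fitted GEV distribution would refine this)."""
    return np.quantile(annual_maxima, 1.0 - 1.0 / return_period_years)

# Synthetic discharge annual maxima standing in for three GCM-driven runs.
rng = np.random.default_rng(7)
params = {"gcm_a": (1000.0, 150.0), "gcm_b": (1100.0, 250.0), "gcm_c": (950.0, 120.0)}
models = {name: loc + scale * rng.gumbel(size=90) for name, (loc, scale) in params.items()}

levels = {name: return_level(series, 20) for name, series in models.items()}
print({name: round(level) for name, level in levels.items()})  # broad spread across models
```

The spread of the 20-year return level across models is a direct, decision-relevant measure of the projection uncertainty the abstract describes.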
The impact of such dispersion differs across aspects of the socio-economic and natural systems and must be carefully addressed to support decision-making processes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3646035','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3646035"><span>A Systematic Review and Methodological Evaluation of Published Cost-Effectiveness Analyses of Aromatase Inhibitors versus Tamoxifen in Early Stage Breast Cancer</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>John-Baptiste, Ava A.; Wu, Wei; Rochon, Paula; Anderson, Geoffrey M.; Bell, Chaim M.</p> <p>2013-01-01</p> <p>Background A key priority in developing policies for providing affordable cancer care is measuring the value for money of new therapies using cost-effectiveness analyses (CEAs). For CEA to be useful it should focus on relevant outcomes and include thorough investigation of uncertainty. Randomized controlled trials (RCTs) of five years of aromatase inhibitors (AI) versus five years of tamoxifen in the treatment of post-menopausal women with early stage breast cancer show benefit of AI in terms of disease free survival (DFS) but not overall survival (OS) and indicate higher risk of fracture with AI. Policy-relevant CEA of AI versus tamoxifen should focus on OS and include analysis of uncertainty over key assumptions. Methods We conducted a systematic review of published CEAs comparing an AI to tamoxifen. We searched Ovid MEDLINE, EMBASE, PsychINFO, and the Cochrane Database of Systematic Reviews without language restrictions. We selected CEAs with outcomes expressed as cost per life year or cost per quality adjusted life year (QALY). We assessed quality using the Neumann checklist. 
Using structured forms two abstractors collected descriptive information, sources of data, baseline assumptions on effectiveness and adverse events, and recorded approaches to assessing parameter uncertainty, methodological uncertainty, and structural uncertainty. Results We identified 1,622 citations and 18 studies met inclusion criteria. All CE estimates assumed a survival benefit for aromatase inhibitors. Twelve studies performed sensitivity analysis on the risk of adverse events and 7 assumed no additional mortality risk with any adverse event. Sub-group analysis was limited; 6 studies examined older women, 2 examined women with low recurrence risk, and 1 examined women with multiple comorbidities. Conclusion Published CEAs comparing AIs to tamoxifen assumed an OS benefit though none has been shown in RCTs, leading to an overestimate of the cost-effectiveness of AIs. Results of these CEA analyses may be suboptimal for guiding policy. PMID:23671612</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/23671612','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/23671612"><span>A systematic review and methodological evaluation of published cost-effectiveness analyses of aromatase inhibitors versus tamoxifen in early stage breast cancer.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>John-Baptiste, Ava A; Wu, Wei; Rochon, Paula; Anderson, Geoffrey M; Bell, Chaim M</p> <p>2013-01-01</p> <p>A key priority in developing policies for providing affordable cancer care is measuring the value for money of new therapies using cost-effectiveness analyses (CEAs). For CEA to be useful it should focus on relevant outcomes and include thorough investigation of uncertainty. 
Randomized controlled trials (RCTs) of five years of aromatase inhibitors (AI) versus five years of tamoxifen in the treatment of post-menopausal women with early stage breast cancer show benefit of AI in terms of disease free survival (DFS) but not overall survival (OS), and indicate higher risk of fracture with AI. Policy-relevant CEA of AI versus tamoxifen should focus on OS and include analysis of uncertainty over key assumptions. We conducted a systematic review of published CEAs comparing an AI to tamoxifen. We searched Ovid MEDLINE, EMBASE, PsychINFO, and the Cochrane Database of Systematic Reviews without language restrictions. We selected CEAs with outcomes expressed as cost per life year or cost per quality adjusted life year (QALY). We assessed quality using the Neumann checklist. Using structured forms, two abstractors collected descriptive information, sources of data, baseline assumptions on effectiveness and adverse events, and recorded approaches to assessing parameter uncertainty, methodological uncertainty, and structural uncertainty. We identified 1,622 citations, and 18 studies met inclusion criteria. All CE estimates assumed a survival benefit for aromatase inhibitors. Twelve studies performed sensitivity analysis on the risk of adverse events and seven assumed no additional mortality risk with any adverse event. Sub-group analysis was limited; 6 studies examined older women, 2 examined women with low recurrence risk, and 1 examined women with multiple comorbidities. Published CEAs comparing AIs to tamoxifen assumed an OS benefit though none has been shown in RCTs, leading to an overestimate of the cost-effectiveness of AIs.
Results of these CEA analyses may be suboptimal for guiding policy.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010CoPhC.181..813C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010CoPhC.181..813C"><span>Monte Carlo event generators in atomic collisions: A new tool to tackle the few-body dynamics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ciappina, M. F.; Kirchner, T.; Schulz, M.</p> <p>2010-04-01</p> <p>We present a set of routines to produce theoretical event files, for both single and double ionization of atoms by ion impact, based on a Monte Carlo event generator (MCEG) scheme. Such event files are the theoretical counterpart of the data obtained from a kinematically complete experiment; i.e. they contain the momentum components of all collision fragments for a large number of ionization events. Among the advantages of working with theoretical event files is the possibility to incorporate the conditions present in a real experiment, such as the uncertainties in the measured quantities. Additionally, by manipulating them it is possible to generate any type of cross sections, especially those that are usually too complicated to compute with conventional methods due to a lack of symmetry. Consequently, the numerical effort of such calculations is dramatically reduced. We show examples for both single and double ionization, with special emphasis on a new data analysis tool, called four-body Dalitz plots, developed very recently. Program summary: Program title: MCEG Catalogue identifier: AEFV_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFV_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No.
of lines in distributed program, including test data, etc.: 2695 No. of bytes in distributed program, including test data, etc.: 18 501 Distribution format: tar.gz Programming language: FORTRAN 77 with parallelization directives using scripting Computer: Single machines using Linux and Linux servers/clusters (with cores with any clock speed, cache memory and bits in a word) Operating system: Linux (any version and flavor) and FORTRAN 77 compilers Has the code been vectorised or parallelized?: Yes RAM: 64-128 kBytes (the codes are very cpu intensive) Classification: 2.6 Nature of problem: The code deals with single and double ionization of atoms by ion impact. Conventional theoretical approaches aim at a direct calculation of the corresponding cross sections. This has the important shortcoming that it is difficult to account for the experimental conditions when comparing results to measured data. In contrast, the present code generates theoretical event files of the same type as are obtained in a real experiment. From these event files any type of cross sections can be easily extracted. The theoretical schemes are based on distorted wave formalisms for both processes of interest. Solution method: The codes employ a Monte Carlo Event Generator based on theoretical formalisms to generate event files for both single and double ionization. One of the main advantages of having access to theoretical event files is the possibility of adding the conditions present in real experiments (parameter uncertainties, environmental conditions, etc.) and to incorporate additional physics in the resulting event files (e.g. elastic scattering or other interactions absent in the underlying calculations). Additional comments: The computational time can be dramatically reduced if a large number of processors is used. 
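The core idea of theoretical event files (generating momentum-space events and then folding in the conditions of a real experiment) can be illustrated with a toy smearing step. This is not the MCEG code itself; the distributions and resolution values below are invented for illustration.

```python
# Sketch of the event-file idea: generate "theoretical" ionization events,
# then smear them with an assumed experimental momentum resolution, mimicking
# what a kinematically complete experiment would record. Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
N_EVENTS = 10_000

# Toy theoretical event file: electron momentum components (arbitrary units)
# drawn from an isotropic Gaussian stand-in for the theory.
theory = rng.normal(loc=0.0, scale=1.0, size=(N_EVENTS, 3))

# Assumed experimental conditions: per-component momentum uncertainty,
# poorer along the beam (z) axis.
resolution = np.array([0.05, 0.05, 0.15])

# Smeared event file: the "observed" counterpart of the theoretical events.
observed = theory + rng.normal(scale=resolution, size=theory.shape)

# Any cross section can now be histogrammed directly from the event file,
# e.g. the longitudinal momentum distribution with broadening included.
counts, edges = np.histogram(observed[:, 2], bins=50, range=(-4, 4))
print(counts.sum())
```

Because each event is independent, this kind of generation parallelizes trivially, which is why the distributed code scales so well across processors.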
Since the codes have no communication between processes, it is possible to achieve an efficiency of 100% (although in practice this will be reduced by queuing wait times). Running time: Times vary according to the process to be simulated (single or double ionization), the number of processors and the type of theoretical model. The typical running time is between several hours and a few weeks.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29284369','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29284369"><span>Bayesian Phase II optimization for time-to-event data based on historical information.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Bertsche, Anja; Fleischer, Frank; Beyersmann, Jan; Nehmiz, Gerhard</p> <p>2017-01-01</p> <p>After exploratory drug development, companies face the decision whether to initiate confirmatory trials based on limited efficacy information. This proof-of-concept decision is typically made after a Phase II trial studying a novel treatment versus either placebo or an active comparator. The article aims to optimize the design of such a proof-of-concept trial with respect to decision making. We incorporate historical information and develop pre-specified decision criteria accounting for the uncertainty of the observed treatment effect. We optimize these criteria based on sensitivity and specificity, given the historical information. Specifically, time-to-event data are considered in a randomized 2-arm trial with additional prior information on the control treatment. The proof-of-concept criterion uses treatment effect size, rather than significance. Criteria are defined on the posterior distribution of the hazard ratio given the Phase II data and the historical control information.
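The conjugate machinery behind such a posterior hazard-ratio criterion can be sketched briefly: with exponential event times, a Gamma prior on each arm's hazard rate gives a closed-form posterior, and the posterior of the hazard ratio can be sampled by Monte Carlo. All prior parameters and trial numbers below are invented; this is not the paper's meta-analytic-predictive prior.

```python
# Sketch: posterior for an exponential hazard ratio with Gamma priors.
# With exponential event times, a Gamma(a, b) prior on the rate is conjugate:
# posterior = Gamma(a + events, b + total exposure time).
# All numbers (priors, trial data) are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)

def posterior_rate(a, b, n_events, exposure):
    """Conjugate Gamma posterior parameters for an exponential event rate."""
    return a + n_events, b + exposure

# Control arm: informative prior standing in for historical information.
a_c, b_c = posterior_rate(a=30.0, b=300.0, n_events=40, exposure=420.0)
# Treatment arm: weak prior, so the trial data dominate.
a_t, b_t = posterior_rate(a=0.5, b=0.5, n_events=28, exposure=400.0)

# Monte Carlo draws from the posterior of the hazard ratio (treatment/control).
hr = rng.gamma(a_t, 1.0 / b_t, 200_000) / rng.gamma(a_c, 1.0 / b_c, 200_000)

# Effect-size-based go criterion, e.g. require P(HR < 0.85) to exceed 0.7.
p_go = np.mean(hr < 0.85)
print(round(p_go, 3))
```

Optimizing design parameters such as sample size and allocation ratio then amounts to maximizing the probability that this criterion gives the right answer under assumed truths.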
Event times are exponentially modeled within groups, allowing for group-specific conjugate prior-to-posterior calculation. While a non-informative prior is placed on the investigational treatment, the control prior is constructed via the meta-analytic-predictive approach. The design parameters including sample size and allocation ratio are then optimized, maximizing the probability of taking the right decision. The approach is illustrated with an example in lung cancer.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/23888847','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/23888847"><span>Evolutionary origin and early biogeography of otophysan fishes (Ostariophysi: Teleostei).</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Chen, Wei-Jen; Lavoué, Sébastien; Mayden, Richard L</p> <p>2013-08-01</p> <p>The biogeography of the mega-diverse, freshwater, and globally distributed Otophysi has received considerable attention. This attraction largely stems from assumptions as to their ancient origin, the clade being almost exclusively freshwater, and their suitability as to explanations of trans-oceanic distributions. Despite multiple hypotheses explaining present-day distributions, problems remain, precluding more parsimonious explanations. Underlying previous hypotheses are alternative phylogenies for Otophysi, uncertainties as to temporal diversification and assumptions integral to various explanations. We reexamine the origin and early diversification of this clade based on a comprehensive time-calibrated, molecular-based phylogenetic analysis and event-based approaches for ancestral range inference of lineages. Our results do not corroborate current phylogenetic classifications of otophysans. 
We demonstrate Siluriformes are never sister to Gymnotiformes and Characiformes are most likely nonmonophyletic. Divergence time estimates specify a split between Cypriniformes and Characiphysi with the fragmentation of Pangea. The early diversification of characiphysans either predated, or was contemporary with, the separation of Africa and South America, and involved a combination of within- and between-continental divergence events for these lineages. The intercontinental diversification of siluroids and characoids postdated major intercontinental tectonic fragmentations (<90 Mya). Post-tectonic drift dispersal events are hypothesized to account for their current distribution patterns. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AGUFMNH43B1759D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AGUFMNH43B1759D"><span>Destructive Interactions Between Mitigation Strategies and the Causes of Unexpected Failures in Natural Hazard Mitigation Systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Day, S. J.; Fearnley, C. J.</p> <p>2013-12-01</p> <p>Large investments in the mitigation of natural hazards, using a variety of technology-based mitigation strategies, have proven to be surprisingly ineffective in some recent natural disasters. These failures reveal a need for a systematic classification of mitigation strategies; an understanding of the scientific uncertainties that affect the effectiveness of such strategies; and an understanding of how the different types of strategy within an overall mitigation system interact destructively to reduce the effectiveness of the overall mitigation system. We classify mitigation strategies into permanent, responsive and anticipatory. 
Permanent mitigation strategies, such as flood and tsunami defenses or land use restrictions, are both costly and 'brittle': when they malfunction they can increase mortality. Such strategies critically depend on the accuracy of the estimates of expected hazard intensity in the hazard assessments that underpin their design. Responsive mitigation strategies such as tsunami and lahar warning systems rely on capacities to detect and quantify the hazard source events and to transmit warnings fast enough to enable at-risk populations to decide and act effectively. Self-warning and voluntary evacuation is also usually a responsive mitigation strategy. Uncertainty in the nature and magnitude of the detected hazard source event is often the key scientific obstacle to responsive mitigation; public understanding of both the hazard and the warnings, to enable decision making, can also be a critical obstacle. Anticipatory mitigation strategies use interpretation of precursors to hazard source events and are used widely in mitigation of volcanic hazards. Their critical limitations are due to uncertainties in the time, space and magnitude relationships between precursors and hazard events. Examples of destructive interaction between different mitigation strategies are provided by the Tohoku 2011 earthquake and tsunami; recent earthquakes that have impacted population centers with poor enforcement of building codes, unrealistic expectations of warning systems or failures to understand local seismic damage mechanisms; and the interaction of land use restriction strategies and responsive warning strategies around lahar-prone volcanoes. A more complete understanding of the interactions between these different types of mitigation strategy, especially the consequences for the expectations and behaviors of the populations at risk, requires models of decision-making under high levels of both uncertainty and danger.
The Observation-Orientation-Decision-Action (OODA) loop model (Boyd, 1987) may be a particularly useful model. It emphasizes the importance of 'orientation' (the interpretation of observations and assessment of their significance for the observer and decision-maker), the feedback between decisions and subsequent observations and orientations, and the importance of developing mitigation strategies that are flexible and so able to respond to the occurrence of the unexpected. REFERENCE: Boyd, J.R. A Discourse on Winning and Losing [http://dnipogo.org/john-r-boyd/]</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25500470','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25500470"><span>Simplifying impact of urban development on sewer systems.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kleidorfer, Manfred; Sitzenfrei, Robert; Rauch, Wolfgang</p> <p>2014-01-01</p> <p>Linking urban development and urban drainage models is an increasingly popular approach when the impacts of paving urban areas on sewer system performance are evaluated. Because such an approach is a difficult task, it is not a feasible procedure for everyday engineering practice. We propose an alternative method, based on a simple near-quadratic relationship, which directly translates a change (increase or decrease) of paved area into a change in the return period (RP) of the design rainfall event or design rainfall intensity. This formula is simple to use and compatible with existing design guidelines. A further advantage is that the calculated design RP can also be used to communicate the impact of a change in impervious areas to stakeholders or the public. The method is developed using a set of 250 virtual and two real-world case studies and hydrodynamic simulations.
It is validated on a small catchment for which we compare system performance and redesigned pipe diameters. Such a simplification naturally involves uncertainties, but these have to be seen in the context of the overall uncertainty in predicting city development into the future. Hence the method still offers a significant advantage compared to today's engineering practice.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/6993388','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/6993388"><span></span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.</p> <p></p> <p>This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and, to some extent, human error.
Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AtmEn.115..361S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AtmEn.115..361S"><span>Uncertainties of wild-land fires emission in AQMEII phase 2 case study</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Soares, J.; Sofiev, M.; Hakkarainen, J.</p> <p>2015-08-01</p> <p>The paper discusses the main uncertainties of wild-land fire emission estimates used in the AQMEII-II case study. The wild-land fire emission of particulate matter for the summer fire season of 2010 in Eurasia was generated by the Integrated System for wild-land Fires (IS4FIRES). The emission calculation procedure included two steps: bottom-up emission compilation from radiative energy of individual fires observed by the MODIS instrument on board the Terra and Aqua satellites; and top-down calibration of emission factors based on the comparison between observations and modelled results. The approach inherits various uncertainties originating from imperfect information on fires, inaccuracies of the inverse problem solution, and simplifications in the fire description. These are analysed in regard to the Eurasian fires in 2010. It is concluded that the total emission is likely to be over-estimated by up to 50%, with individual-fire emission accuracy likely to vary over a wide range. The first results of the new IS4FIRESv2 products and fire-resolving modelling are discussed in application to the 2010 events.
It is shown that the new emission estimates have similar patterns but are lower than the IS4FIRESv1 values.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.H31A1467K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.H31A1467K"><span>Quantification of Uncertainty in the Flood Frequency Analysis</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.</p> <p>2017-12-01</p> <p>Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, selection of distribution and estimation of distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of prediction intervals as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and Banff, Canada) are used. The major focus of the present study was to evaluate the changes in magnitude of flood quantiles due to the recent extreme flood event that occurred during 2013.
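A nonparametric bootstrap is one of the standard ways to put an interval around a flood quantile of the kind discussed above, and can be sketched as follows. The annual-maximum series, the Gumbel distribution choice, and all sizes here are illustrative placeholders, not the Bow River data or the study's multi-objective method.

```python
# Sketch: bootstrap confidence interval for the 100-year flood quantile
# estimated from an annual-maximum series via a Gumbel fit.
# Synthetic data; distribution and sample sizes are illustrative.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(7)
annual_max = gumbel_r.rvs(loc=800.0, scale=200.0, size=60, random_state=rng)

def q100(sample):
    """100-year return level from a Gumbel fit."""
    loc, scale = gumbel_r.fit(sample)
    return gumbel_r.ppf(1.0 - 1.0 / 100.0, loc=loc, scale=scale)

# Resample the observed series with replacement and refit each time.
boot = np.array([
    q100(rng.choice(annual_max, size=annual_max.size, replace=True))
    for _ in range(500)
])

point = q100(annual_max)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(round(point, 1))
```

The spread of `boot` (here summarized by `lo` and `hi`) is the prediction-interval-style uncertainty that deterministic FFA omits.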
In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches, and the proposed method was found to be reliable in modeling extreme floods compared with the bootstrap methods.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017TCry...11.2727C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017TCry...11.2727C"><span>Comparison of different methods to retrieve optical-equivalent snow grain size in central Antarctica</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Carlsen, Tim; Birnbaum, Gerit; Ehrlich, André; Freitag, Johannes; Heygster, Georg; Istomina, Larysa; Kipfstuhl, Sepp; Orsi, Anaïs; Schäfer, Michael; Wendisch, Manfred</p> <p>2017-11-01</p> <p>The optical-equivalent snow grain size affects the reflectivity of snow surfaces and, thus, the local surface energy budget in particular in polar regions. Therefore, the specific surface area (SSA), from which the optical snow grain size is derived, was observed for a 2-month period in central Antarctica (Kohnen research station) during austral summer 2013/14. The data were retrieved on the basis of ground-based spectral surface albedo measurements collected by the COmpact RAdiation measurement System (CORAS) and airborne observations with the Spectral Modular Airborne Radiation measurement sysTem (SMART). The snow grain size and pollution amount (SGSP) algorithm, originally developed to analyze spaceborne reflectance measurements by the MODerate Resolution Imaging Spectroradiometer (MODIS), was modified in order to reduce the impact of the solar zenith angle on the retrieval results and to cover measurements in overcast conditions. Spectral ratios of surface albedo at 1280 and 1100 nm wavelength were used to reduce the retrieval uncertainty.
The retrieval was applied to the ground-based and airborne observations and validated against optical in situ observations of SSA utilizing an IceCube device. The SSA retrieved from CORAS observations varied between 27 and 89 m2 kg-1. Snowfall events caused distinct relative maxima of the SSA which were followed by a gradual decrease in SSA due to snow metamorphism and wind-induced transport of freshly fallen ice crystals. The ability of the modified algorithm to include measurements in overcast conditions improved the data coverage, in particular at times when precipitation events occurred and the SSA changed quickly. SSA retrieved from measurements with CORAS and MODIS agree with the in situ observations within the ranges given by the measurement uncertainties. However, SSA retrieved from the airborne SMART data slightly underestimated the ground-based results.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFM.H13C1556H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFM.H13C1556H"><span>Inter-event variability in urban stormwater runoff response associated with hydrologic connectivity</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hondula, K. L.</p> <p>2015-12-01</p> <p>Urbanization alters the magnitude and composition of hydrologic and biogeochemical fluxes from watersheds, with subsequent deleterious consequences for receiving waters. Projected changes in storm characteristics such as rainfall intensity and event size are predicted to amplify these impacts and render current regulations inadequate for protecting surface water quality. 
As stormwater management practices (BMPs) are increasingly being relied upon to reduce excess nutrient pollution in runoff from residential development, empirical investigation of their performance across a range of conditions is warranted. Despite substantial investment in urban and suburban BMPs, significant knowledge gaps exist in understanding how landscape structure and precipitation event characteristics influence the amount of stormwater runoff and associated nutrient loads from these complex catchments. Increasing infiltration of stormwater before it enters the sewer network (source control) is hypothesized to better mimic natural hydrologic and biogeochemical fluxes compared to more centralized BMPs at sewer outlets such as wet and dry ponds. Rainfall and runoff quality and quantity were monitored in four small (1-5 ha) residential catchments in Maryland to test the efficacy of infiltration-based stormwater management practices in comparison to end-of-pipe BMPs. Results indicated that reduced hydrologic connectivity associated with infiltration-based practices affected the relationship between the magnitude of rainfall events and water yield, but only for small precipitation events: compared to end-of-pipe BMPs, source control was associated with both lower runoff ratios and lower nutrient export per area for a given rainfall event size. We found that variability in stormwater runoff responses (water yield, quality, and nutrient loads) was associated with precipitation event size, antecedent rainfall, and hydrologic connectivity as quantified by a modified directional connectivity index.
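The event runoff ratio compared above is simply runoff depth divided by rainfall depth over a storm event. A minimal sketch, with an invented catchment area and invented 5-minute rainfall and flow records:

```python
# Sketch: event runoff ratio = runoff depth / rainfall depth for one storm.
# All numbers (catchment area, records) are invented for illustration.
import numpy as np

CATCHMENT_AREA_M2 = 3.0 * 10_000  # a 3 ha catchment, within the 1-5 ha range

# 5-minute records for a single event.
rain_mm = np.array([0.0, 1.2, 3.5, 4.1, 2.0, 0.6, 0.0])            # depth per step
flow_m3s = np.array([0.000, 0.004, 0.018, 0.030, 0.021, 0.008, 0.002])

rain_depth_mm = rain_mm.sum()
runoff_volume_m3 = (flow_m3s * 300.0).sum()      # each step is 300 s
runoff_depth_mm = runoff_volume_m3 / CATCHMENT_AREA_M2 * 1000.0

runoff_ratio = runoff_depth_mm / rain_depth_mm
print(round(runoff_ratio, 3))
```

Multiplying the flow record by measured concentrations in the same way would give the event nutrient load per unit area.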
Accounting for the interactive effects of landscape structure and precipitation event characteristics can reduce the uncertainty surrounding stormwater runoff responses in complex urban watersheds.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_22 --> <div id="page_23" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="441"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFM.H33E1577R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFM.H33E1577R"><span>Integrating Local Experiential and Hydrometeorological Data to Understand Knowledge Uncertainties and to Build Resilience to Flooding in Two Puerto Rican Communities.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ramsey, M.;
Nytch, C. J.; Branoff, B.</p> <p>2016-12-01</p> <p>Socio-hydrological studies that explore feedbacks between social and biophysical processes related to flood risk can help managers identify strategies that increase a community's freshwater security. However, knowledge uncertainty due to coarse spatio-temporal coverage of hydrological monitoring data, missing riverine discharge and precipitation records, assumptions of flood risk models, and effects of urbanization, can limit the ability of these studies to isolate hydrological responses to social drivers of flooding and a changing climate. Local experiential knowledge can provide much needed information about 1) actual flood spatio-temporal patterns, 2) human impacts and perceptions of flood events, and 3) mechanisms to validate flood risk studies and understand key social elements of the system. We addressed these knowledge gaps by comparing the location and timing of flood events described in resident interviews and resident drawn maps (total = 97) from two San Juan communities with NOAA and USGS precipitation and riverine discharge data archives, and FEMA flood maps. Analyses of five focal flood events revealed 1) riverine monitoring data failed to record a major flood event caused by localized blockage of the river, 2) residents did not mention multiple extreme riverine discharge events, 3) resident and FEMA flood maps matched closely but resident maps provided finer spatial information about frequency of flooding, and 4) only a small percentage of residents remembered the dates of flood events. Local knowledge provided valuable social data about flood impacts on human economic and physical/psychological wellbeing, perceptions about factors causing flooding, and what residents use as sources of flood information. A simple mechanism or tool for residents to record their flood experiences in real-time will address the uncertainties in local knowledge and improve social memory. 
The integration of local experiential knowledge with simulated and empirical hydro-meteorological data can be a powerful approach to increase the quality of socio-hydrological studies about flooding and freshwater security.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19800014518','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19800014518"><span>Uncertainty in estimates of the number of extraterrestrial civilizations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Sturrock, P. A.</p> <p>1980-01-01</p> <p>An estimation of the number N of communicative civilizations is made by means of Drake's formula which involves the combination of several quantities, each of which is to some extent uncertain. It is shown that the uncertainty in any quantity may be represented by a probability distribution function, even if that quantity is itself a probability. The uncertainty of current estimates of N is derived principally from uncertainty in estimates of the lifetime of advanced civilizations. It is argued that this is due primarily to uncertainty concerning the existence of a Galactic Federation which is in turn contingent upon uncertainty about whether the limitations of present-day physics are absolute or (in the event that there exists a yet undiscovered hyperphysics) transient. 
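    Sturrock's point that each uncertain factor can carry its own probability distribution lends itself to a small Monte Carlo sketch. The factor medians, spreads, and the lognormal-style form below are invented for illustration; they are not values from the paper.

    ```python
    import math
    import random

    random.seed(42)

    def sample_factor(median, spread):
        """Draw a positive factor with multiplicative uncertainty:
        roughly 68% of draws fall within [median/spread, median*spread]."""
        return median * math.exp(random.gauss(0.0, math.log(spread)))

    def drake_sample():
        # Illustrative medians and spreads only (assumptions, not the paper's).
        R = sample_factor(2.0, 2.0)      # star formation rate, stars/yr
        fp = sample_factor(0.5, 1.5)     # fraction of stars with planets
        ne = sample_factor(1.0, 2.0)     # habitable planets per planetary system
        fl = sample_factor(0.3, 3.0)     # fraction on which life arises
        fi = sample_factor(0.1, 3.0)     # fraction developing intelligence
        fc = sample_factor(0.1, 3.0)     # fraction that communicate
        L = sample_factor(1.0e3, 10.0)   # civilization lifetime, yr (dominant uncertainty)
        return R * fp * ne * fl * fi * fc * L

    samples = sorted(drake_sample() for _ in range(20000))
    median = samples[len(samples) // 2]
    p5, p95 = samples[len(samples) // 20], samples[-(len(samples) // 20)]
    print(f"N median ~ {median:.2f}, 5th-95th percentile ~ [{p5:.3f}, {p95:.1f}]")
    ```

    Because the lifetime L carries by far the widest spread, it dominates the width of the resulting distribution for N, which is the qualitative point of the abstract.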
    It is further argued that it is advantageous to consider these underlying assumptions explicitly in order to compare the probable numbers of civilizations operating radio beacons, permitting radio leakage, dispatching probes for radio surveillance, or dispatching vehicles for manned surveillance.

443. How much do different global GPP products agree in distribution and magnitude of GPP extremes?

    NASA Astrophysics Data System (ADS)

    Kim, S.; Ryu, Y.; Jiang, C.

    2016-12-01

    To evaluate uncertainty in global Gross Primary Productivity (GPP) extremes, we compare three global GPP datasets derived from different data processing methods (MPI-BGC: machine learning; MODIS GPP (MOD17): semi-empirical; Breathing Earth System Simulator (BESS): process based). We preprocess the datasets following the method of Zscheischler et al. (2012) to detect GPP extremes, which occur in less than 1% of all pixels, and to identify 3D-connected spatiotemporal extreme events. We first analyze the global patterns and magnitude of GPP extremes in MPI-BGC, MOD17, and BESS over 2001-2011. For consistency across the three products, the spatial and temporal resolutions were set to 50 km and monthly, respectively. Our results indicate that the global patterns of GPP extremes derived from MPI-BGC and BESS agree with each other, showing hotspots in northeastern Brazil and eastern Texas, whereas the extreme events detected in MOD17 are concentrated in tropical forests (e.g., Southeast Asia and South America). The amount of GPP reduction caused by climate extremes differed considerably across the products.
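    The detection step just described (flagging the lowest ~1% of values and grouping them into 3D-connected spatiotemporal events) can be sketched roughly as follows. This is a simplified stand-in for the Zscheischler et al. (2012) procedure, run on a toy random anomaly cube rather than real GPP fields.

    ```python
    from collections import deque

    import numpy as np

    rng = np.random.default_rng(0)
    # Toy GPP anomaly cube (time, lat, lon); the real inputs are monthly 50-km grids.
    anom = rng.normal(size=(12, 20, 40))

    # Threshold at the 1st percentile: the most negative ~1% of cells are "extreme".
    thresh = np.percentile(anom, 1.0)
    extreme = anom < thresh

    def label_events(mask):
        """Group extreme cells into 3D-connected (6-neighbour) spatiotemporal events."""
        labels = np.zeros(mask.shape, dtype=int)
        current = 0
        for idx in zip(*np.nonzero(mask)):
            if labels[idx]:
                continue  # already assigned to an event
            current += 1
            labels[idx] = current
            queue = deque([idx])
            while queue:  # breadth-first flood fill over the extreme mask
                t, i, j = queue.popleft()
                for dt, di, dj in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                   (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                    n = (t + dt, i + di, j + dj)
                    if all(0 <= a < s for a, s in zip(n, mask.shape)) \
                            and mask[n] and not labels[n]:
                        labels[n] = current
                        queue.append(n)
        return labels, current

    labels, n_events = label_events(extreme)
    print(f"{int(extreme.sum())} extreme cells grouped into {n_events} events")
    ```

    Summing the anomaly values within each labelled event would then give the per-event GPP reduction that the abstract compares across products.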
    For example, the 2010 Russian heatwave led to an uncertainty of roughly 100 Tg C across products (198.7 Tg C in MPI-BGC, 305.6 Tg C in MOD17, and 237.8 Tg C in BESS). Moreover, the duration of the extreme event differs among the three GPP datasets for the Russian heatwave (MPI-BGC: May-Sep; MOD17: Jun-Aug; BESS: May-Aug). To test whether sun-induced fluorescence (SiF), a proxy of GPP, can capture GPP extremes, we investigate the global distribution of GPP extreme events in BESS, MOD17, and GOME-2 SiF between 2008 and 2014, when SiF data are available. We find that extreme GPP events in GOME-2 SiF and MOD17 appear in tropical forests, whereas those in BESS emerge in northeastern Brazil and eastern Texas. The GPP extremes caused by the severe 2011 US drought were detected by BESS and MODIS, but not by SiF. Our findings highlight that different GPP datasets can yield differing durations, intensities, and hotspot distributions of GPP extremes, and this study can contribute to quantifying uncertainties in GPP extremes.

444. Predicting Response to Reassurances and Uncertainties in Bioterrorism Communications for Urban Populations in New York and California

    PubMed Central

    Vaughan, Elaine; Tinker, Tim L.; Truman, Benedict I.; Edelson, Paul; Morse, Stephen S.

    2015-01-01

    Recent national plans for recovery from bioterrorism acts perpetrated in densely populated urban areas acknowledge the formidable technical and social challenges of consequence management. Effective risk and crisis communication is one priority for strengthening the U.S.'s response and resilience. However, several notable risk events since September 11, 2001, have revealed vulnerabilities in the risk/crisis communication strategies and infrastructure of agencies responsible for protecting civilian populations. During recovery from a significant biocontamination event, two goals are essential: (1) effective communication of changing risk circumstances and uncertainties related to cleanup, restoration, and reoccupancy; and (2) adequate responsiveness to emerging information needs and priorities of diverse populations in high-threat, vulnerable locations. This telephone survey study explored predictors of public reactions to uncertainty communications and reassurances from leaders related to the remediation stage of an urban-based bioterrorism incident. African American and Hispanic adults (N = 320) were randomly sampled from two ethnically and socioeconomically diverse geographic areas in New York and California assessed as high threat, high vulnerability for terrorism and other public health emergencies. Results suggest that considerable heterogeneity exists in risk perspectives and information needs within certain sociodemographic groups; that the success of risk/crisis communication during recovery is likely to be uneven; that common assumptions about public responsiveness to particular risk communications need further consideration; and that communication effectiveness depends partly on preexisting values, risk perceptions, and prior trust in leaders. Improvements in communication strategies are possible with recognition of where individuals start as a reference point for reasoning about risk information, and comprehension of how this influences subsequent interpretation of agencies' actions and communications. PMID:22582813

445. Improvements to Earthquake Location with a Fuzzy Logic Approach

    NASA Astrophysics Data System (ADS)

    Gökalp, Hüseyin

    2018-01-01

    In this study, improvements to the earthquake location method were investigated using a fuzzy logic approach proposed by Lin and Sanford (Bull Seismol Soc Am 91:82-93, 2001). The method has certain advantages over inverse methods in its handling of arrival-time uncertainties and reading errors. Adopting this approach, epicentral locations were determined from the results in a fuzzy logic space that accounts for uncertainties in the velocity models. To map the arrival-time uncertainties into the fuzzy logic space, a trapezoidal membership function was constructed directly from the travel-time difference between pairs of stations for the P- and S-arrival times, rather than from P- and S-wave velocity models, eliminating the need for information on the velocity structure of the study area. The results showed that this method works most effectively when earthquakes occur away from a network or when the arrival-time data contain phase reading errors. To resolve the problems related to determining epicentral locations, a forward-modeling method such as the grid search technique was used, applying different logical operations (intersection, union, and their combination) in the fuzzy logic approach. Event locations were then determined from the fuzzy logic outputs by searching over a gridded region.
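    A minimal sketch of such a grid search, with trapezoidal memberships on station-pair travel-time differences combined by intersection (min) and defuzzified by a centre-of-gravity step, might look like this. The constant-velocity medium, station geometry, and membership breakpoints are assumptions for illustration, not the study's configuration.

    ```python
    import numpy as np

    v = 6.0  # km/s, assumed uniform P velocity
    stations = np.array([[0.0, 0.0], [40.0, 5.0], [10.0, 35.0]])
    true_epi = np.array([22.0, 14.0])  # known "true" epicentre for the toy test

    def travel_time(src, sta):
        return np.linalg.norm(src - sta, axis=-1) / v

    # Observed arrival-time differences between station pairs, with small errors.
    rng = np.random.default_rng(1)
    obs_dt = {}
    for a in range(len(stations)):
        for b in range(a + 1, len(stations)):
            dt = travel_time(true_epi, stations[a]) - travel_time(true_epi, stations[b])
            obs_dt[(a, b)] = dt + rng.normal(0.0, 0.05)

    def trapezoid(x, full=0.2, zero=0.8):
        """Trapezoidal membership on the misfit |x| (s): 1 inside `full`, 0 beyond `zero`."""
        return np.clip((zero - np.abs(x)) / (zero - full), 0.0, 1.0)

    # Grid search: fuzzy membership at every candidate epicentre, combined by intersection.
    xs, ys = np.meshgrid(np.linspace(0, 50, 201), np.linspace(0, 50, 201))
    grid = np.stack([xs, ys], axis=-1)
    membership = np.ones(xs.shape)
    for (a, b), dt in obs_dt.items():
        pred = travel_time(grid, stations[a]) - travel_time(grid, stations[b])
        membership = np.minimum(membership, trapezoid(pred - dt))

    # Defuzzify: centre of gravity of the highest-membership region.
    best = membership >= membership.max()
    epi = np.array([xs[best].mean(), ys[best].mean()])
    print("estimated epicentre:", epi)
    ```

    Replacing `np.minimum` with `np.maximum` would give the union operation mentioned in the abstract; the intersection is the stricter combination.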
    Determining locations by defuzzifying only the grid points with a membership value of 1, obtained by normalizing the maximum fuzzy output values, resulted in more reliable epicentral locations than the other approaches. Throughout this process, the center-of-gravity method was used as the defuzzification operation.

446. Predicting response to reassurances and uncertainties in bioterrorism communications for urban populations in New York and California

    PubMed

    Vaughan, Elaine; Tinker, Tim L; Truman, Benedict I; Edelson, Paul; Morse, Stephen S

    2012-06-01

    Recent national plans for recovery from bioterrorism acts perpetrated in densely populated urban areas acknowledge the formidable technical and social challenges of consequence management. Effective risk and crisis communication is one priority for strengthening the U.S.'s response and resilience. However, several notable risk events since September 11, 2001, have revealed vulnerabilities in the risk/crisis communication strategies and infrastructure of agencies responsible for protecting civilian populations. During recovery from a significant biocontamination event, two goals are essential: (1) effective communication of changing risk circumstances and uncertainties related to cleanup, restoration, and reoccupancy; and (2) adequate responsiveness to emerging information needs and priorities of diverse populations in high-threat, vulnerable locations. This telephone survey study explored predictors of public reactions to uncertainty communications and reassurances from leaders related to the remediation stage of an urban-based bioterrorism incident. African American and Hispanic adults (N = 320) were randomly sampled from two ethnically and socioeconomically diverse geographic areas in New York and California assessed as high threat, high vulnerability for terrorism and other public health emergencies. Results suggest that considerable heterogeneity exists in risk perspectives and information needs within certain sociodemographic groups; that the success of risk/crisis communication during recovery is likely to be uneven; that common assumptions about public responsiveness to particular risk communications need further consideration; and that communication effectiveness depends partly on preexisting values, risk perceptions, and prior trust in leaders. Improvements in communication strategies are possible with recognition of where individuals start as a reference point for reasoning about risk information, and comprehension of how this influences subsequent interpretation of agencies' actions and communications.

447. Rigorous Approach in Investigation of Seismic Structure and Source Characteristics in Northeast Asia: Hierarchical and Trans-dimensional Bayesian Inversion

    NASA Astrophysics Data System (ADS)

    Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.

    2015-12-01

    Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by the regularizations and additional constraints commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimates of models and their uncertainties constrained by the data. In addition, hierarchical and trans-dimensional (trans-D) techniques can be implemented in the Bayesian framework to account for error statistics and model parameterizations, in turn allowing more rigorous estimation of both. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia, including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. For the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversion of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The resulting model is subsequently used to prepare 3-D structural Green's functions for source characterization. A hierarchical Bayesian method for point-source inversion using regional complete waveform data is applied to selected events from the region.
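    The hierarchical idea, sampling the noise level alongside the model parameters rather than fixing it, can be illustrated with a toy one-parameter Metropolis sampler. The real inversions here are trans-dimensional and operate on velocity models, so this is only a sketch of the principle.

    ```python
    import math
    import random

    random.seed(0)

    # Toy data: noisy observations of a single "model parameter" m = 3.0
    # with true noise standard deviation 0.5.
    data = [random.gauss(3.0, 0.5) for _ in range(50)]

    def log_posterior(m, sigma):
        """Gaussian likelihood with a weak 1/sigma prior; sigma is a sampled
        hyperparameter (the hierarchical part), not a fixed constant."""
        if sigma <= 0:
            return -math.inf
        ll = -len(data) * math.log(sigma)
        ll -= sum((d - m) ** 2 for d in data) / (2.0 * sigma ** 2)
        return ll - math.log(sigma)

    # Metropolis sampling over (m, sigma).
    m, sigma = 0.0, 1.0
    lp = log_posterior(m, sigma)
    samples = []
    for step in range(20000):
        m_new = m + random.gauss(0.0, 0.1)
        s_new = sigma + random.gauss(0.0, 0.05)
        lp_new = log_posterior(m_new, s_new)
        if math.log(random.random()) < lp_new - lp:  # accept/reject
            m, sigma, lp = m_new, s_new, lp_new
        if step > 5000:  # discard burn-in
            samples.append((m, sigma))

    m_mean = sum(s[0] for s in samples) / len(samples)
    s_mean = sum(s[1] for s in samples) / len(samples)
    print(f"posterior mean m ~ {m_mean:.2f}, sigma ~ {s_mean:.2f}")
    ```

    The sampler recovers both the parameter and the noise level, so the reported uncertainty in m reflects the data's actual error statistics rather than an assumed value.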
    The seismic structure and source characteristics, with rigorously estimated uncertainties from these Bayesian methods, provide enhanced monitoring and discrimination of seismic events in Northeast Asia.

448. Use of radar rainfall estimates and forecasts to prevent flash flood in real time by using a road inundation warning system

    NASA Astrophysics Data System (ADS)

    Versini, Pierre-Antoine

    2012-01-01

    Important damage occurs in small headwater catchments when they are hit by severe storms with complex spatio-temporal structure, sometimes resulting in flash floods. As these catchments are mostly not covered by sensor networks, it is difficult to forecast these floods; this is particularly true for road submersions, a major concern for flood event managers. Quantitative Precipitation Estimates and Forecasts (QPE/QPF), especially those based on radar measurements, are well suited to evaluating rainfall-induced risks. Although their characteristic time and space scales make them suitable for flash flood modelling, the impact of their uncertainties remains unclear and has to be evaluated. The Gard region (France), frequently affected by severe flash floods, was chosen as the case study, and a warning application devoted to the road network was recently developed for the northern part of this region. This warning system combines distributed hydro-meteorological modelling and susceptibility analysis to provide warnings of road inundations. The system was tested on the storm of 29-30 September 2007, during which around 200 mm of rain fell on the southern part of the Gard and many roads were submerged. Radar-based QPE and QPF were used to forecast the locations of road submersions, and the results were compared with the submersions that actually occurred during the event, as listed by the emergency services. Applied to an area for which it had not been calibrated, the road submersion warning system proved a promising tool for anticipating and quantifying the consequences of storm events on the ground. It rates the submersion risk with an acceptable level of accuracy and demonstrates the quality of high spatial and temporal resolution radar rainfall data in real time, and the possibility of using such data despite their uncertainties. However, because the quality of rainfall forecasts falls drastically with time, they often cannot provide valuable information for lead times exceeding 1 h.

449. Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis

    NASA Astrophysics Data System (ADS)

    Lari, S.; Frattini, P.; Crosta, G. B.

    2009-04-01

    We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km2 area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss in terms of economic damage and loss of life for a set of flood, earthquake, and industrial-accident risk scenarios with different occurrence probabilities and intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Because of the limited information available for evaluating the hazards, the value of the exposed elements (e.g., residential and industrial areas, population, lifelines, and sensitive elements such as schools and hospitals), and the process-specific vulnerability, and because of incomplete knowledge of the processes themselves (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables a homogeneous uncertainty was assigned over the whole study area, for instance for the number of buildings of various typologies and for the event occurrence probability. In other cases, such as phenomenon intensity (e.g., water depth during a flood) and probability of impact, the uncertainty was defined in relation to the census parcel area, because assuming variables to be homogeneously distributed or averaged over a parcel introduces a larger error for larger parcels. We propagated the uncertainty through the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions): a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected loss are obtained with the three models, and in all three cases the uncertainty of the final risk value is around 30% of the expected value.
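    The contrast between Monte Carlo and FOSM propagation can be illustrated on a toy multiplicative risk model (all input values invented, not from the study): FOSM carries only means and variances through a first-order expansion, while Monte Carlo samples the full input distributions.

    ```python
    import math
    import random

    random.seed(7)

    # Toy annual-loss model: risk = event probability x vulnerability x exposed value.
    means = {"p": 0.01, "v": 0.4, "w": 5.0e6}  # illustrative means
    cvs = {"p": 0.3, "v": 0.25, "w": 0.1}      # coefficients of variation

    # --- Monte Carlo propagation: sample each input, multiply, repeat ---
    def draw(name):
        mu = means[name]
        return max(0.0, random.gauss(mu, cvs[name] * mu))

    mc = [draw("p") * draw("v") * draw("w") for _ in range(100000)]
    mc_mean = sum(mc) / len(mc)
    mc_sd = math.sqrt(sum((x - mc_mean) ** 2 for x in mc) / len(mc))

    # --- FOSM: for R = p*v*w the first-order relative variance is the sum of
    # squared coefficients of variation of the inputs ---
    fosm_mean = means["p"] * means["v"] * means["w"]
    fosm_sd = fosm_mean * math.sqrt(sum(cv ** 2 for cv in cvs.values()))

    print(f"MC:   mean {mc_mean:,.0f}, sd {mc_sd:,.0f}")
    print(f"FOSM: mean {fosm_mean:,.0f}, sd {fosm_sd:,.0f}")
    ```

    For mildly uncertain inputs the two estimates agree closely, which mirrors the study's finding that the methods give similar expected losses; FOSM needs only a handful of evaluations, while Monte Carlo needs many but makes no linearity assumption.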
    Each of the models, nevertheless, requires different assumptions and computational effort, and provides results with a different level of detail.

450. Updated methods for assessing the impacts of nearby gas drilling and production on neighborhood air quality and human health

    PubMed

    Olaguer, Eduardo P; Erickson, Matthew; Wijesinghe, Asanga; Neish, Brad; Williams, Jeff; Colvin, John

    2016-02-01

    An explosive growth in natural gas production within the last decade has fueled concern over the public health impacts of air pollutant emissions from oil and gas sites in the Barnett and Eagle Ford shale regions of Texas. Commonly acknowledged sources of uncertainty are the lack of sustained monitoring of ambient concentrations of pollutants associated with gas mining, poor quantification of their emissions, and the inability to correlate health symptoms with specific emission events. These uncertainties are best addressed not by conventional monitoring and modeling technology, but by increasingly available advanced techniques for real-time mobile monitoring, microscale modeling and source attribution, and real-time broadcasting of air quality and human health data over the World Wide Web. The combination of contemporary scientific and social media approaches can be used to develop a strategy to detect and quantify emission events from oil and gas facilities, alert nearby residents to these events, and collect associated human health data, all in real time or near-real time. The various technical elements of this strategy are demonstrated based on the results of past, current, and planned future monitoring studies in the Barnett and Eagle Ford shale regions. Resources should not be invested in expanding the conventional air quality monitoring network in the vicinity of oil and gas exploration and production sites; rather, more contemporary monitoring and data analysis techniques should take the place of older methods to better protect the health of nearby residents and maintain the integrity of the surrounding environment.

451. Robust Engineering Designs for Infrastructure Adaptation to a Changing Climate

    NASA Astrophysics Data System (ADS)

    Samaras, C.; Cook, L.

    2015-12-01

    Infrastructure systems are expected to be functional, durable and safe over long service lives of 50 to over 100 years. Observations and models of climate science show that greenhouse gas emissions resulting from human activities have changed climate, weather and extreme events. Projections of future changes (albeit with uncertainties caused by inadequacies of current climate/weather models) can be made based on scenarios for future emissions, but actual future emissions are themselves uncertain. Most current engineering standards and practices for infrastructure assume that the probabilities of future extreme climate and weather events will match those of the past. Climate science shows that this assumption is invalid, but it is unable, at present, to define these probabilities over the service lives of existing and new infrastructure systems. Engineering designs, plans, institutions and regulations will therefore need to be adaptable to a range of future conditions (of climate, weather and extreme events, as well as changing societal demands for infrastructure services). For their current and future projects, engineers should: involve all stakeholders (owners, financers, insurers, regulators, the affected public, climate/weather scientists, etc.) in key decisions; use low-regret, adaptive strategies, such as robust decision making and the observational method; comply with relevant standards and regulations, exceeding their requirements where appropriate; and publish design studies and performance/failure investigations to extend the body of knowledge for the advancement of practice. The engineering community should conduct observational and modeling research with climate/weather/social scientists and the concerned communities, and account rationally for climate change in revised engineering standards and codes. This presentation presents initial research on decision making under uncertainty for climate-resilient infrastructure design.

452. Uncertainty in the Timing of Origin of Animals and the Limits of Precision in Molecular Timescales

    PubMed Central

    dos Reis, Mario; Thawornwattana, Yuttapong; Angelis, Konstantinos; Telford, Maximilian J.; Donoghue, Philip C.J.; Yang, Ziheng

    2015-01-01

    The timing of divergences among metazoan lineages is integral to understanding the processes of animal evolution, placing the biological events of species divergences into the correct geological timeframe. Recent fossil discoveries and molecular clock dating studies have suggested a divergence of bilaterian phyla >100 million years before the Cambrian, when the first definite crown-bilaterian fossils occur. Most previous molecular clock dating studies, however, have suffered from limited data and methodological biases, and virtually all have failed to acknowledge the large uncertainties associated with the fossil record of early animals, leading to inconsistent estimates among studies. Here we use an unprecedented amount of molecular data, combined with four fossil calibration strategies (reflecting disparate and controversial interpretations of the metazoan fossil record), to obtain Bayesian estimates of metazoan divergence times. Our results indicate that the uncertain nature of ancient fossils and violations of the molecular clock impose a limit on the precision that can be achieved in estimates of ancient molecular timescales. For example, although we can assert that crown Metazoa originated during the Cryogenian (with most crown-bilaterian phyla diversifying during the Ediacaran), it is not possible with current data to pinpoint the divergence events with sufficient accuracy to test for correlations between geological and biological events in the history of animals. Although a Cryogenian origin of crown Metazoa agrees with current geological interpretations, the divergence dates of the bilaterians remain controversial; thus, attempts to build evolutionary narratives of early animal evolution based on molecular clock timescales appear to be premature. PMID:26603774

453. Vulnerability of bridges to scour: insights from an international expert elicitation workshop

    NASA Astrophysics Data System (ADS)

    Lamb, Rob; Aspinall, Willy; Odbert, Henry; Wagener, Thorsten

    2017-08-01

    Scour (localised erosion) during flood events is one of the most significant threats to bridges over rivers and estuaries, and has been the cause of numerous bridge failures with damaging consequences. Mitigating the risk of bridges being damaged by scour is therefore important to many infrastructure owners and is supported by industry guidance. Even after mitigation some residual risk remains, though its extent is difficult to quantify because of the uncertainties inherent in the prediction of scour and the assessment of scour risk. This paper summarises findings from an international expert workshop on bridge scour risk assessment that explores uncertainties about the vulnerability of bridges to scour. Two specialised structured elicitation methods were applied to explore the factors that experts in the field consider important when assessing scour risk, and to derive pooled expert judgements of bridge failure probabilities conditional on a range of assumed scenarios describing flood event severity, bridge and watercourse types, and risk mitigation protocols. The experts' judgements broadly align with industry good practice, but indicate significant uncertainty in quantitative estimates of bridge failure probabilities, reflecting the difficulty of assessing the residual risk of failure.
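    A weighted linear opinion pool is one simple way to combine expert judgements of a failure probability for a given scenario. The structured elicitation methods used in the workshop are more sophisticated, and the experts, weights, and estimates below are invented for illustration.

    ```python
    # Hypothetical experts, each with a performance weight and a judged
    # probability of bridge failure for one assumed flood scenario.
    experts = {
        "A": {"weight": 0.5, "p_fail": 0.02},
        "B": {"weight": 0.3, "p_fail": 0.05},
        "C": {"weight": 0.2, "p_fail": 0.10},
    }

    # Linear opinion pool: weight-normalised average of the judged probabilities.
    total_w = sum(e["weight"] for e in experts.values())
    pooled = sum(e["weight"] * e["p_fail"] for e in experts.values()) / total_w
    print(f"pooled failure probability: {pooled:.3f}")
    ```

    Performance-based weighting (as in structured elicitation protocols) would derive the weights from each expert's calibration on seed questions rather than assigning them by hand.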
    The data and findings presented here could provide a useful context for the development of generic scour fragility models and their associated uncertainties.

454. 2D Flood Modelling Using Advanced Terrain Analysis Techniques And A Fully Continuous DEM-Based Rainfall-Runoff Algorithm

    NASA Astrophysics Data System (ADS)

    Nardi, F.; Grimaldi, S.; Petroselli, A.

    2012-12-01

    Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research implements fully continuous hydrologic-hydraulic modelling optimized for ungauged basins with limited river flow measurements. In the proposed procedure, a rainfall generator feeds a continuous rainfall-runoff model producing flow time series that are routed along the channel using a two-dimensional hydraulic model for detailed representation of the inundation process. The main advantage of this approach is the characterization of the entire physical process during hydrologic extreme events: channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model avoids the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjective analysis and uncertainty in standard flood mapping methods. Selected case studies show the results and performance of the proposed procedure relative to standard event-based approaches.

455. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    NASA Astrophysics Data System (ADS)

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad, and the focus of the existing literature varies across a wide range of problems. Some studies focus on hazards that occur simultaneously or are closely correlated with each other, such as seismically induced flooding or seismically induced fires; others consider hazards that are not dependent or correlated but have a strong likelihood of occurring at different times during the lifetime of a structure. Current approaches to risk assessment need enhancement to account for multi-hazard risks: they must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation proposes such enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework, which allows propagation of risk as well as of the uncertainties in the risk estimates within a systems analysis.
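    The value of capturing dependence among events can be shown with a toy common-cause example (all probabilities invented): an OR-gate top event computed exactly from the joint distribution, versus the estimate obtained from marginal probabilities under an independence assumption, as a fault tree with independent basic events would do.

    ```python
    from itertools import product

    # Tiny Bayesian network: a common cause C (e.g. an extreme flood) raises the
    # failure probabilities of two components A and B; top event = "A or B fails".
    p_c = 0.01                           # P(common cause occurs)
    p_a = {True: 0.3, False: 0.001}      # P(A fails | C)
    p_b = {True: 0.2, False: 0.002}      # P(B fails | C)

    # Exact top-event probability by summing over the joint distribution.
    top = 0.0
    for c, a, b in product([True, False], repeat=3):
        pr = p_c if c else 1 - p_c
        pr *= p_a[c] if a else 1 - p_a[c]
        pr *= p_b[c] if b else 1 - p_b[c]
        if a or b:
            top += pr

    # Fault-tree style estimate that ignores the dependence through C:
    ma = p_c * p_a[True] + (1 - p_c) * p_a[False]   # marginal P(A fails)
    mb = p_c * p_b[True] + (1 - p_c) * p_b[False]   # marginal P(B fails)
    indep = ma + mb - ma * mb                        # OR-gate with independence

    print(f"with dependence: {top:.5f}, independence assumption: {indep:.5f}")
    ```

    For this OR-gate the independence assumption overstates the top-event probability, because the common cause concentrates the component failures into the same scenarios; for an AND-gate the bias would run the other way, which is why modelling the dependence explicitly matters.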
Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. The significance of multi-hazard risk is also illustrated for the uncorrelated hazards of earthquakes and high winds, which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1304161-measurement-tt-production-cross-section-using-events-tagged-jets-pp-collisions-atlas-detector','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1304161-measurement-tt-production-cross-section-using-events-tagged-jets-pp-collisions-atlas-detector"><span>Measurement of the t t ¯ production cross-section using eμ events with b-tagged jets in pp collisions at s = 13 TeV with the ATLAS detector</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Aaboud, M.; Aad, G.; Abbott, B.</p> <p></p> <p>Here, this paper describes a measurement of the inclusive top quark pair 
production cross-section σ$$t\\bar{t}$$ with a data sample of 3.2 fb⁻¹ of proton–proton collisions at a centre-of-mass energy of √s = 13 TeV, collected in 2015 by the ATLAS detector at the LHC. This measurement uses events with an opposite-charge electron–muon pair in the final state. Jets containing b-quarks are tagged using an algorithm based on track impact parameters and reconstructed secondary vertices. The numbers of events with exactly one and exactly two b-tagged jets are counted and used to determine simultaneously σ$$t\\bar{t}$$ and the efficiency to reconstruct and b-tag a jet from a top quark decay, thereby minimising the associated systematic uncertainties.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/17362658','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/17362658"><span>Two-tier Haddon matrix approach to fault analysis of accidents and cybernetic search for relationship to effect operational control: a case study at a large construction site.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Mazumdar, Atmadeep; Sen, Krishna Nirmalya; Lahiri, Balendra Nath</p> <p>2007-01-01</p> <p>The Haddon matrix is a potential tool for recognizing hazards in any operating engineering system. This paper presents a case study of operational hazards at a large construction site. The fishbone structure helps to visualize and relate the chain of events, which led to the failure of the system. The two-tier Haddon matrix approach helps to analyze the problem and subsequently prescribes preventive steps. The cybernetic approach has been undertaken to establish the relationship among event variables and to identify the ones with the most potential. 
The event variables identified in this case study, based on cybernetic concepts such as control responsiveness and controllability salience, are (a) uncontrolled swing of sheet contributing to energy, (b) slippage of sheet from anchor, (c) restricted longitudinal and transverse swing or rotation about the suspension, (d) guilt or uncertainty of the crane driver, (e) safe working practices and environment.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1304161-measurement-tt-production-cross-section-using-events-tagged-jets-pp-collisions-atlas-detector','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1304161-measurement-tt-production-cross-section-using-events-tagged-jets-pp-collisions-atlas-detector"><span>Measurement of the t t ¯ production cross-section using eμ events with b-tagged jets in pp collisions at s = 13 TeV with the ATLAS detector</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Aaboud, M.; Aad, G.; Abbott, B.; ...</p> <p>2016-08-16</p> <p>Here, this paper describes a measurement of the inclusive top quark pair production cross-section σ$$t\\bar{t}$$ with a data sample of 3.2 fb⁻¹ of proton–proton collisions at a centre-of-mass energy of √s = 13 TeV, collected in 2015 by the ATLAS detector at the LHC. This measurement uses events with an opposite-charge electron–muon pair in the final state. Jets containing b-quarks are tagged using an algorithm based on track impact parameters and reconstructed secondary vertices. 
The numbers of events with exactly one and exactly two b-tagged jets are counted and used to determine simultaneously σ$$t\\bar{t}$$ and the efficiency to reconstruct and b-tag a jet from a top quark decay, thereby minimising the associated systematic uncertainties.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1918057S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1918057S"><span>Development of a flash flood warning system based on real-time radar data and process-based erosion modelling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Schindewolf, Marcus; Kaiser, Andreas; Buchholtz, Arno; Schmidt, Jürgen</p> <p>2017-04-01</p> <p>Extreme rainfall events and resulting flash floods led to massive devastation in Germany during spring 2016. The study presented aims at the development of an early warning system that allows the simulation and assessment of negative effects on infrastructure, using radar-based heavy rainfall predictions as input data for the process-based soil loss and deposition model EROSION 3D. Our approach enables a detailed identification of runoff and sediment fluxes in agriculturally used landscapes. In a first step, documented historical events were analyzed with respect to the agreement between measured radar rainfall and large-scale erosion risk maps. A second step focused on small-scale erosion monitoring via UAV of the source areas of heavy flooding events and on a model reconstruction of the processes involved. In all examples, damage was caused to local infrastructure. Both analyses are promising for detecting runoff- and sediment-delivering areas even at high temporal and spatial resolution. Results prove the important role of late-covering crops such as maize, sugar beet or potatoes in runoff generation. 
While crops such as winter wheat have a favourable effect on runoff generation on undulating landscapes, massive soil loss and thus muddy flows are nevertheless observed and depicted in model results. Future research aims at large-scale model parameterization and application in real time, uncertainty estimation of the precipitation forecast, and interface development.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.H11O..04S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.H11O..04S"><span>High-resolution multimodel projections of soil moisture drought in Europe under 1.5, 2 and 3 degree global warming</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Samaniego, L. E.; Kumar, R.; Zink, M.; Pan, M.; Wanders, N.; Marx, A.; Sheffield, J.; Wood, E. F.; Thober, S.</p> <p>2017-12-01</p> <p>Droughts are creeping hydro-meteorological events that may bring societies and natural systems to their limits by inducing significant environmental changes and large socio-economic losses. Little is known about the effects of various degrees of warming (i.e., 1.5, 2 and 3 K) and their respective uncertainties on extreme characteristics such as drought duration and area under drought in general, and in Europe in particular. In this study we investigate the evolution of drought characteristics under three levels of warming using an unprecedented high-resolution multi-model hydrologic ensemble over the Pan-EU domain at a scale of 5 x 5 km² from 1950 until 2100. This multi-model ensemble comprises four hydrologic models (HMs: mHM, Noah-MP, PCR-GLOBWB, VIC) which are forced by five CMIP-5 Global Climate Models (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, NorESM1-M) under three RCP scenarios 2.6, 6.0, and 8.5. This results in a 60-member ensemble. 
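A crude way to see how such a GCM-by-HM matrix ensemble lets the two sources of spread be separated is to compare the variance of the GCM-means with the variance of the HM-means. This is a hand-rolled sketch, not the sequential sampling algorithm of Samaniego et al. (2016), and all drought-frequency numbers are invented.

```python
# Hypothetical sketch of partitioning ensemble spread between climate models
# (GCMs) and hydrologic models (HMs) in a small matrix ensemble.
# The numbers below are invented, not EDgE results.
from statistics import mean, pvariance

# ensemble[hm][gcm]: invented drought frequencies (%) per model combination
ensemble = {
    "mHM":        {"GFDL": 24, "HadGEM2": 31, "IPSL": 28, "MIROC": 35, "NorESM": 26},
    "Noah-MP":    {"GFDL": 23, "HadGEM2": 30, "IPSL": 27, "MIROC": 33, "NorESM": 25},
    "PCR-GLOBWB": {"GFDL": 26, "HadGEM2": 33, "IPSL": 30, "MIROC": 37, "NorESM": 28},
    "VIC":        {"GFDL": 22, "HadGEM2": 29, "IPSL": 26, "MIROC": 32, "NorESM": 24},
}

gcms = list(next(iter(ensemble.values())))

# Average over HMs for each GCM -> spread attributable to the climate models
gcm_means = [mean(ensemble[hm][g] for hm in ensemble) for g in gcms]
# Average over GCMs for each HM -> spread attributable to the hydrologic models
hm_means = [mean(ensemble[hm][g] for g in gcms) for hm in ensemble]

print("GCM variance:", pvariance(gcm_means))
print("HM variance: ", pvariance(hm_means))
```

With these invented numbers the GCM spread dominates the HM spread, mirroring the qualitative finding reported below that GCM uncertainty tends to be larger.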
The contributions of the GCM/HM uncertainties were analyzed based on a sequential sampling algorithm proposed by Samaniego et al. (2016). This study is carried out within the EDgE project funded by the Copernicus Climate Change Service (edge.climate.copernicus.eu) and the HOKLIM project funded by the German Federal Ministry of Education and Research (BMBF) (www.ufz.de/hoklim). The changes under three levels of warming indicate a significant increase (more than 10%) in the number of droughts and the area under drought with respect to 30-year climatological means obtained with E-OBS observations. Furthermore, we found that: 1) the number of drought events exhibits significant regional changes. The largest changes are observed in the Mediterranean, where the frequency of droughts increases from 25% under 1.5 K to 33% under 2 K, and to more than 50% under 3 K warming. Minor changes are seen in Central Europe and the British Isles. 2) The GCM/HM uncertainties show marked regional differences too, with GCM uncertainty appearing to be larger everywhere. The uncertainty of the HMs is, however, similar to that of the GCMs in the Iberian peninsula due to different representations of evapotranspiration and soil moisture dynamics. 
3) Despite the large uncertainty in the full ensemble, significant positive trends have been observed in all drought characteristics, and these intensify with increased global warming.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_23 --> <div id="page_24" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="461"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22922435','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22922435"><span>Novel health economic evaluation of a vaccination strategy to prevent HPV-related diseases: the BEST study.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Favato, Giampiero; Baio, Gianluca; Capone, Alessandro; Marcellusi, Andrea; Costa, Silvano; Garganese, Giorgia; 
Picardo, Mauro; Drummond, Mike; Jonsson, Bengt; Scambia, Giovanni; Zweifel, Peter; Mennini, Francesco S</p> <p>2012-12-01</p> <p>The development of human papillomavirus (HPV)-related diseases is not perfectly understood, and the uncertainties associated with commonly utilized probabilistic models must be considered. The study assessed the cost-effectiveness of a quadrivalent-based multicohort HPV vaccination strategy within a Bayesian framework. A full Bayesian multicohort Markov model was used, in which all unknown quantities were associated with suitable probability distributions reflecting the state of currently available knowledge. These distributions were informed by observed data or expert opinion. The model cycle lasted 1 year, whereas the follow-up time horizon was 90 years. Precancerous cervical lesions, cervical cancers, and anogenital warts were considered as outcomes. The base case scenario (2 cohorts of girls aged 12 and 15 y) and other multicohort vaccination strategies (additional cohorts aged 18 and 25 y) were cost-effective, with a discounted cost per quality-adjusted life-year gained that corresponded to €12,013, €13,232, and €15,890 for vaccination programs based on 2, 3, and 4 cohorts, respectively. With multicohort vaccination strategies, the reduction in the number of HPV-related events occurred earlier (range, 3.8-6.4 y) when compared with a single cohort. The analysis of the expected value of information showed that the results of the model were subject to limited uncertainty (cost per patient = €12.6). This methodological approach is designed to incorporate the uncertainty associated with HPV vaccination. Modeling the cost-effectiveness of a multicohort vaccination program with Bayesian statistics confirmed the value for money of quadrivalent-based HPV vaccination. 
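The discounted cost-per-QALY figure that such models report reduces to present-value arithmetic over the incremental cost and QALY streams. A minimal sketch, with an assumed 3% discount rate and invented cash flows (not BEST-study data):

```python
# Minimal illustration of the discounted incremental cost-effectiveness
# ratio (ICER) arithmetic behind a Markov cohort model.
# Discount rate and yearly increments are invented example values.

def discounted_sum(values, rate=0.03):
    """Present value of a yearly stream; values[t] accrues in year t."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(values))

# Invented incremental yearly costs (EUR) and QALY gains of vaccination vs none
delta_costs = [50.0, 5.0, 5.0, 5.0, 5.0]
delta_qalys = [0.0, 0.004, 0.004, 0.004, 0.004]

icer = discounted_sum(delta_costs) / discounted_sum(delta_qalys)
print(f"cost per QALY gained: {icer:.0f}")
```

A real evaluation would run the streams over the full 90-year horizon and propagate the Bayesian parameter uncertainty through them; the discounting step itself is unchanged.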
The expected value of information gave the most appropriate and feasible representation of the true value of this program.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26877771','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26877771"><span>Attribution of extreme weather and climate-related events.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Stott, Peter A; Christidis, Nikolaos; Otto, Friederike E L; Sun, Ying; Vanderlinden, Jean-Paul; van Oldenborgh, Geert Jan; Vautard, Robert; von Storch, Hans; Walton, Peter; Yiou, Pascal; Zwiers, Francis W</p> <p>2016-01-01</p> <p>Extreme weather and climate-related events occur in a particular place, by definition, infrequently. It is therefore challenging to detect systematic changes in their occurrence given the relative shortness of observational records. However, there is a clear interest from outside the climate science community in the extent to which recent damaging extreme events can be linked to human-induced climate change or natural climate variability. Event attribution studies seek to determine to what extent anthropogenic climate change has altered the probability or magnitude of particular events. They have shown clear evidence for human influence having increased the probability of many extremely warm seasonal temperatures and reduced the probability of extremely cold seasonal temperatures in many parts of the world. The evidence for human influence on the probability of extreme precipitation events, droughts, and storms is more mixed. Although the science of event attribution has developed rapidly in recent years, geographical coverage of events remains patchy and based on the interests and capabilities of individual research groups. 
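Event attribution studies of the kind surveyed above commonly summarise a changed likelihood through the probability ratio and the fraction of attributable risk (FAR), comparing the event probability with (p1) and without (p0) anthropogenic forcing. A minimal sketch with invented probabilities, not values from any particular study:

```python
# Standard attribution summary statistics; p0 and p1 would come from
# large ensembles of climate simulations with and without human forcing.
# The probabilities below are illustrative assumptions.

def probability_ratio(p0, p1):
    """How many times more likely the event became: PR = p1 / p0."""
    return p1 / p0

def far(p0, p1):
    """Fraction of attributable risk: FAR = 1 - p0 / p1."""
    return 1.0 - p0 / p1

p0, p1 = 0.01, 0.04   # assumed exceedance probabilities from the two ensembles
print(probability_ratio(p0, p1))  # event made 4x more likely
print(far(p0, p1))                # 75% of the risk attributable to forcing
```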
The development of operational event attribution would allow a more timely and methodical production of attribution assessments than currently obtained on an ad hoc basis. For event attribution assessments to be most useful, remaining scientific uncertainties need to be robustly assessed and the results clearly communicated. This requires the continuing development of methodologies to assess the reliability of event attribution results and further work to understand the potential utility of event attribution for stakeholder groups and decision makers. WIREs Clim Change 2016, 7:23-41. doi: 10.1002/wcc.380 For further resources related to this article, please visit the WIREs website.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.S22A..06C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.S22A..06C"><span>A Bayesian-Based Novel Methodology to Generate Reliable Site Response Mapping Sensitive to Data Uncertainties</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Chakraborty, A.; Goto, H.</p> <p>2017-12-01</p> <p>The 2011 off the Pacific coast of Tohoku earthquake caused severe damage in many areas further inside the mainland because of site amplification. Furukawa district in Miyagi Prefecture, Japan, recorded significant spatial differences in ground motion even at sub-kilometer scales. The site responses in the damage zone far exceeded the levels in the hazard maps. One reason for the mismatch is that maps follow only the mean value at the measurement locations, with no regard to the data uncertainties, and are thus not always reliable. Our research objective is to develop a methodology to incorporate data uncertainties in mapping and propose a reliable map. 
The methodology is based on a hierarchical Bayesian modeling of normally-distributed site responses in space, where the mean (μ), site-specific variance (σ2) and between-sites variance (s2) parameters are treated as unknowns with a prior distribution. The observation data are artificially created site responses with varying means and variances for 150 seismic events across 50 locations in one-dimensional space. Spatially auto-correlated random effects were added to the mean (μ) using a conditionally autoregressive (CAR) prior. Inferences on the unknown parameters are drawn from the posterior distribution using Markov chain Monte Carlo methods. The goal is to find reliable estimates of μ sensitive to uncertainties. During initial trials, we observed that the tau (= 1/s2) parameter of the CAR prior controls the estimation of μ. Using a constraint, s = 1/(k×σ), five spatial models with varying k-values were created. We define reliability as measured by the model likelihood and propose the maximum-likelihood model as highly reliable. The model with maximum likelihood was selected using a 5-fold cross-validation technique. The results show that the maximum-likelihood model (μ*) follows the site-specific mean at low uncertainties and converges to the model mean at higher uncertainties (Fig.1). This result is highly significant as it successfully incorporates the effect of data uncertainties in mapping. This novel approach can be applied to any research field using mapping techniques. 
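The MCMC step described above can be illustrated with a toy Metropolis sampler for the posterior mean of a single site, a drastic simplification of the paper's hierarchical CAR model. The data, observation variance, and prior are invented.

```python
# Toy Metropolis sampler: posterior of the mean mu of normally distributed
# site responses with known observation std dev and a weak normal prior.
# All data and hyperparameters are invented illustrations.
import math
import random

random.seed(1)
data = [2.1, 1.9, 2.4, 2.0, 2.2]   # invented site responses (one location)
sigma = 0.3                         # assumed known observation std dev

def log_post(mu, prior_mu=0.0, prior_sd=10.0):
    """Log posterior: normal likelihood times a weak normal prior on mu."""
    ll = sum(-0.5 * ((x - mu) / sigma) ** 2 for x in data)
    lp = -0.5 * ((mu - prior_mu) / prior_sd) ** 2
    return ll + lp

def metropolis(n_iter=20000, step=0.2):
    mu, samples = 0.0, []
    for _ in range(n_iter):
        prop = mu + random.gauss(0.0, step)        # symmetric random-walk proposal
        if math.log(random.random()) < log_post(prop) - log_post(mu):
            mu = prop                              # accept the proposal
        samples.append(mu)
    return samples[n_iter // 2:]                   # discard first half as burn-in

post = metropolis()
print(sum(post) / len(post))   # posterior mean, close to the data mean 2.12
```

With the weak prior the posterior mean essentially recovers the sample mean of the data; the hierarchical model adds spatial pooling across locations on top of this same machinery.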
The methodology is now being applied to real records from a very dense seismic network in Furukawa district, Miyagi Prefecture, Japan to generate a reliable map of the site responses.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..15.3190W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..15.3190W"><span>Extension of volcanic forcing data back to 100 BC using the Analog method</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wagner, Sebastian; Zorita, Eduardo</p> <p>2013-04-01</p> <p>Present reconstructions of volcanic forcing to be used for climate simulations so far extend back to 500 AD for stratospheric aerosol sulphate injection (Gao et al., 2008), and back to 800 AD for aerosol optical depth and effective radius (Crowley et al. 2012; ICI5 data set). Here, we aim to extend the volcanic data set of Crowley et al. (2012) back to 100 BC. This data set originally starts in 800 AD for aerosol optical depth and effective radius. The method we apply is the Analog method, which uses information in the already existing reconstruction and extends it back in time using long records of volcanic sulphate contained in Greenland and Antarctic ice cores published in previous studies. The reconstruction of the volcanic forcing in the first millennium is based on a search for analogs in the second millennium. The pool of analogs includes the ICI5 data set for the period 800-2000 AD. The basic philosophy is to find volcanic events with the same or similar magnitude in terms of volcanic sulphate deposition in Greenland and Antarctic ice cores. For the Northern Hemisphere the estimated maximum total stratospheric sulphate loading from Zielinski (1995) is used. For the Southern Hemisphere the Plummer et al. (2012) data set and the Ferris et al. 
(2011) data set are used in terms of sulphate deposition. To ensure that the volcanic event was large enough in magnitude, a certain threshold is applied to the analog selection. The extension, i.e. the analog search, is carried out separately for the four different latitudinal bands of the ICI5 data set. The method can be reapplied when better records than the Zielinski et al. (1995) record for the Northern Hemisphere become available. The analogs are selected based on the comparison between the information contained in the ice cores in the pre-800 AD period and the post-800 AD period. For each event in the pre-800 AD period (the target), the most similar event (the analog) in the post-800 AD pool in terms of ice-core sulphate is identified. The forcing data (effective radius and aerosol optical depth) of the ICI5 data set for that analog event are then used as a surrogate for the target event. In the case that the analog does not exactly match the amplitude of the pre-800 AD event, a scaling correction factor is applied, taking into account the relative difference of ice-core sulphate between the analog and the target. Although the method does not take into account the specific structure of each volcanic event, the basic patterns are reproduced reasonably well for a validation period in the second millennium AD. The largest uncertainties relate to the dating of each volcanic event, including the season of the eruption, the synchronization of hemispheric versus global eruptions and the translation of the ice-core volcanic sulphate concentrations into stratospheric aerosol loadings. 
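The analog search with scaling correction described above amounts to a nearest-neighbour lookup in deposition space followed by a ratio rescaling. A schematic sketch; the pool values are invented, not ICI5 data:

```python
# Schematic of the analog method: for each pre-800 AD sulphate event, pick
# the post-800 AD event with the closest deposition and rescale its forcing
# by the deposition ratio. All values below are invented.

# (year, ice-core sulphate deposition, aerosol optical depth) -- post-800 AD pool
pool = [
    (939, 60.0, 0.18),
    (1258, 250.0, 0.65),
    (1458, 120.0, 0.34),
    (1815, 150.0, 0.41),
]

def analog_forcing(target_sulphate, threshold=30.0):
    """Scaled AOD surrogate for a pre-800 AD event, or None below threshold."""
    if target_sulphate < threshold:
        return None                    # event too small to be matched reliably
    year, analog_sulphate, aod = min(pool, key=lambda e: abs(e[1] - target_sulphate))
    # scaling correction for the amplitude mismatch between target and analog
    return aod * target_sulphate / analog_sulphate

print(analog_forcing(130.0))   # matched to the closest analog, scaled up slightly
```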
However, these uncertainties will essentially remain for any method based on the sulphate information contained in Antarctic and Greenland ice cores.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.ars.usda.gov/research/publications/publication/?seqNo115=332224','TEKTRAN'); return false;" href="http://www.ars.usda.gov/research/publications/publication/?seqNo115=332224"><span>Breaks in MODIS time series portend vegetation change: verification using long-term data in an arid grassland ecosystem</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ars.usda.gov/research/publications/find-a-publication/">USDA-ARS's Scientific Manuscript database</a></p> <p></p> <p></p> <p>Frequency and severity of extreme climatic events are forecast to increase in the 21st century. Predicting how managed ecosystems may respond to climatic extremes is intensified by uncertainty associated with knowing when, where, and how long effects of the extreme events will be manifest in the eco...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.S12A..07W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.S12A..07W"><span>Sensitivity of Coulomb stress changes to slip models of source faults: A case study for the 2011 Mw 9.0 Tohoku-oki earthquake</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wang, J.; Xu, C.; Furlong, K.; Zhong, B.; Xiao, Z.; Yi, L.; Chen, T.</p> <p>2017-12-01</p> <p>Although Coulomb stress changes induced by earthquake events have been used to quantify stress transfers and to retrospectively explain stress triggering among earthquake sequences, realistic and reliable prospective earthquake forecasting remains scarce. 
To generate a robust Coulomb stress map for earthquake forecasting, uncertainties in Coulomb stress changes associated with the source fault, receiver fault, friction coefficient and Skempton's coefficient need to be exhaustively considered. In this paper, we specifically explore the uncertainty in slip models of the source fault of the 2011 Mw 9.0 Tohoku-oki earthquake as a case study. This earthquake was chosen because of its wealth of finite-fault slip models. Based on those slip models, we compute the coseismic Coulomb stress changes induced by this mainshock. Our results indicate that nearby Coulomb stress changes for each slip model can be quite different, both for the Coulomb stress map at a given depth and on the Pacific subducting slab. The triggering rates for three months of aftershocks of the mainshock, with and without considering the uncertainty in slip models, differ significantly, decreasing from 70% to 18%. Reliable Coulomb stress changes in the three seismogenic zones of Nankai, Tonankai and Tokai are insignificant, approximately only 0.04 bar. By contrast, the portions of the Pacific subducting slab at a depth of 80 km and beneath Tokyo received a positive Coulomb stress change of approximately 0.2 bar. 
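The quantity being mapped here is commonly written ΔCFS = Δτ + μ′Δσn, where Δτ is the shear stress change in the slip direction, Δσn the normal stress change (unclamping positive), and μ′ = μ(1 − B) an effective friction coefficient folding in Skempton's coefficient B. A minimal sketch with illustrative values, not the paper's computation:

```python
# Coulomb failure stress change on a receiver fault:
#   Delta_CFS = Delta_tau + mu_eff * Delta_sigma_n,  mu_eff = mu * (1 - B)
# mu (friction) and B (Skempton's coefficient) below are assumed values.

def coulomb_stress_change(d_shear, d_normal, mu=0.4, skempton=0.5):
    """d_shear: shear stress change in the slip direction (bar);
    d_normal: normal stress change, unclamping positive (bar)."""
    mu_eff = mu * (1.0 - skempton)
    return d_shear + mu_eff * d_normal

# A positive change promotes failure; ~0.1 bar is often taken as significant
print(coulomb_stress_change(d_shear=0.15, d_normal=0.25))
```

The uncertainty analysis in the abstract amounts to propagating the spread of Δτ and Δσn across the different slip models through this expression.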
The standard errors of the seismicity rate and earthquake probability based on the Coulomb rate-and-state model (CRS) decay much faster with elapsed time in stress triggering zones than in stress shadows, meaning that the uncertainties in Coulomb stress changes in stress triggering zones would not drastically affect assessments of the seismicity rate and earthquake probability based on the CRS in the intermediate to long term.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015QSRv..129....1G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015QSRv..129....1G"><span>Sequence of events from the onset to the demise of the Last Interglacial: Evaluating strengths and limitations of chronologies used in climatic archives</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Govin, A.; Capron, E.; Tzedakis, P. C.; Verheyden, S.; Ghaleb, B.; Hillaire-Marcel, C.; St-Onge, G.; Stoner, J. S.; Bassinot, F.; Bazin, L.; Blunier, T.; Combourieu-Nebout, N.; El Ouahabi, A.; Genty, D.; Gersonde, R.; Jimenez-Amat, P.; Landais, A.; Martrat, B.; Masson-Delmotte, V.; Parrenin, F.; Seidenkrantz, M.-S.; Veres, D.; Waelbroeck, C.; Zahn, R.</p> <p>2015-12-01</p> <p>The Last Interglacial (LIG) represents an invaluable case study to investigate the response of components of the Earth system to global warming. However, the scarcity of absolute age constraints in most archives leads to extensive use of various stratigraphic alignments to different reference chronologies. This feature sets limitations to the accuracy of the stratigraphic assignment of the climatic sequence of events across the globe during the LIG. 
Here, we review the strengths and limitations of the methods that are commonly used to date or develop chronologies in various climatic archives for the time span (∼140-100 ka) encompassing the penultimate deglaciation, the LIG and the glacial inception. Climatic hypotheses underlying record alignment strategies and the interpretation of tracers are explicitly described. Quantitative estimates of the associated absolute and relative age uncertainties are provided. Recommendations are subsequently formulated on how best to define absolute and relative chronologies. Future climato-stratigraphic alignments should provide (1) a clear statement of climate hypotheses involved, (2) a detailed understanding of environmental parameters controlling selected tracers and (3) a careful evaluation of the synchronicity of aligned paleoclimatic records. We underscore the need to (1) systematically report quantitative estimates of relative and absolute age uncertainties, (2) assess the coherence of chronologies when comparing different records, and (3) integrate these uncertainties in paleoclimatic interpretations and comparisons with climate simulations. 
Finally, we provide a sequence of major climatic events with associated age uncertainties for the period 140-105 ka, which should serve as a new benchmark to disentangle mechanisms of the Earth system's response to orbital forcing and evaluate transient climate simulations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1711427B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1711427B"><span>Rainfall estimates for hydrological models: Comparing rain gauge, radar and microwave link data as input for the Wageningen Lowland Runoff Simulator (WALRUS)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Brauer, Claudia; Overeem, Aart; Uijlenhoet, Remko</p> <p>2015-04-01</p> <p>Several rainfall measurement techniques are available for hydrological applications, each with its own spatial and temporal resolution. We investigated the effect of differences in rainfall estimates on discharge simulations in a lowland catchment by forcing a novel rainfall-runoff model (WALRUS) with rainfall data from gauges, radars and microwave links. The hydrological model used for this analysis is the recently developed Wageningen Lowland Runoff Simulator (WALRUS). WALRUS is a rainfall-runoff model accounting for hydrological processes relevant to areas with shallow groundwater (e.g. groundwater-surface water feedback). Here, we used WALRUS for case studies in the Hupsel Brook catchment. We used two automatic rain gauges with hourly resolution, one located inside the catchment (the base run) and one 30 km to the northeast. Operational (real-time) and climatological (gauge-adjusted) C-band radar products and country-wide rainfall maps derived from microwave link data from a cellular telecommunication network were also used. 
Discharges simulated with these different inputs were compared to observations. Traditionally, the precipitation research community places emphasis on quantifying spatial errors and uncertainty, but for hydrological applications, temporal errors and uncertainty should be quantified as well. The memory of the hydrologic system makes it sensitive to missed or badly timed rainfall events and also emphasizes the effect of a bias in rainfall estimates. Systematic underestimation of rainfall by the uncorrected operational radar product leads to very dry model states and an increasing underestimation of discharge. Using the rain gauge 30 km northeast of the catchment yields good results for climatological studies, but not for forecasting individual floods. Simulating discharge using the maps derived from microwave link data and the gauge-adjusted radar product yields good results both for individual events and for climatological studies. This indicates that these products can be used in catchments without rain gauges in or near the catchment. Uncertainty in rainfall forcing is a major source of uncertainty in discharge predictions, both with lumped and with distributed models. For lumped rainfall-runoff models, the main source of input uncertainty is associated with the way in which (effective) catchment-average rainfall is estimated. 
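One common way to score such discharge simulations against observations is the Nash-Sutcliffe efficiency (NSE): 1 is a perfect fit, 0 means no better than predicting the observed mean. The abstract does not state which skill score was used, so this is a generic sketch with invented series:

```python
# Nash-Sutcliffe efficiency: NSE = 1 - sum((obs - sim)^2) / sum((obs - mean_obs)^2)
# The discharge series below are invented illustrations.

def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [0.5, 1.2, 3.4, 2.1, 0.8]         # observed discharge (mm/h), invented
sim_good = [0.6, 1.1, 3.1, 2.3, 0.7]    # e.g. with gauge-adjusted radar forcing
sim_biased = [0.3, 0.7, 2.0, 1.2, 0.4]  # e.g. with uncorrected radar (dry bias)

print(nse(obs, sim_good))     # close to 1: good fit
print(nse(obs, sim_biased))   # much lower: systematic underestimation hurts
```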
Improving rainfall measurements can improve the performance of rainfall-runoff models, indicating their potential for reducing flood damage through real-time control.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70190140','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70190140"><span>Integrating landslide and liquefaction hazard and loss estimates with existing USGS real-time earthquake information products</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Allstadt, Kate E.; Thompson, Eric M.; Hearne, Mike; Nowicki Jessee, M. Anna; Zhu, J.; Wald, David J.; Tanyas, Hakan</p> <p>2017-01-01</p> <p>The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products despite their significant contributions to hazard and losses for many events worldwide. We are currently running parallel global statistical models for landslides and liquefaction developed with our collaborators in testing mode, but much work remains in order to operationalize these systems. We are expanding our efforts in this area by not only improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. 
Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is built on an expanding, openly available case-history database that we are compiling along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary hazards products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products. </p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.H31A1474W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.H31A1474W"><span>Flood risk analysis for flood control and sediment transportation: a case study in the catchments of the Loess Plateau, China</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wang, Y.; Chang, J.; Guo, A.</p> <p>2017-12-01</p> <p>Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on flood control systems. To address this gap, a univariate and copula-based bivariate hydrological risk framework for flood control and sediment transport is proposed in the current work. Additionally, the conditional probabilities of occurrence of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula model. 
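A conditional exceedance probability of the kind estimated above can be sketched with a Gumbel copula. The dependence parameter and quantile levels below are illustrative assumptions, not the fitted values from the study:

```python
import math

# Gumbel copula: C(u, v) = exp(-[(-ln u)^t + (-ln v)^t]^(1/t)), theta >= 1.
# theta = 1 gives independence; larger theta gives upper-tail dependence.
def gumbel_copula(u, v, theta):
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-(s ** (1.0 / theta)))

# Conditional probability that the flood variable exceeds its v-quantile
# given that precipitation exceeds its u-quantile:
# P(V > v | U > u) = (1 - u - v + C(u, v)) / (1 - u).
def cond_exceedance(u, v, theta):
    return (1.0 - u - v + gumbel_copula(u, v, theta)) / (1.0 - u)

# Illustrative numbers: probability that the annual maximum flood exceeds
# its 10-year level given a 10-year rainfall event (theta = 2 assumed).
print(round(cond_exceedance(0.9, 0.9, 2.0), 3))
```

With theta = 1 the same formula collapses to 1 - v, the unconditional exceedance probability, which is a convenient sanity check on an implementation.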
Moreover, a Monte Carlo-based algorithm is used to evaluate the uncertainties of univariate and bivariate hydrological risk. Two catchments located on the Loess Plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The results indicate that (1) 2-day and 3-day consecutive rainfall are highly correlated with the annual maximum flood discharge (AMF) in UCX and UCH, respectively; and (2) univariate and bivariate return periods, risk and reliability for the purposes of flood control and sediment transport are successfully estimated. Sedimentation poses a higher risk to the safety of local flood control systems than the AMF exceeding the design flood of downstream hydraulic structures in the UCX and UCH. Most importantly, there was considerable sampling uncertainty in the univariate and bivariate hydrologic risk analysis, which greatly challenges future flood mitigation measures. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA566208','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA566208"><span>Estimating the Uncertainty and Predictive Capabilities of Three-Dimensional Earth Models (Postprint)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2012-03-22</p> <p>www.isc.ac.uk). This global database includes more than 7,000 events whose epicentral location accuracy is known to at least 5 km. GT events with...region, which illustrates the difficulty of validating a model with travel times alone. However, the IASPEI REL database is currently the highest...S (right) paths in the IASPEI REL ground-truth database. 
Stations are represented by purple triangles and events by gray circles. Note the sparse</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018HESS...22..889F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018HESS...22..889F"><span>Searching for the optimal drought index and timescale combination to detect drought: a case study from the lower Jinsha River basin, China</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Fluixá-Sanmartín, Javier; Pan, Deng; Fischer, Luzia; Orlowsky, Boris; García-Hernández, Javier; Jordan, Frédéric; Haemmig, Christoph; Zhang, Fangwei; Xu, Jijun</p> <p>2018-02-01</p> <p>Drought indices based on precipitation are commonly used to identify and characterize droughts. Due to the general complexity of droughts, the comparison of index-identified events with droughts at different levels of the complete system, including soil humidity or river discharges, relies typically on model simulations of the latter, entailing potentially significant uncertainties. The present study explores the potential of using precipitation-based indices to reproduce observed droughts in the lower part of the Jinsha River basin (JRB), proposing an innovative approach for a catchment-wide drought detection and characterization. Two indicators, namely the Overall Drought Extension (ODE) and the Overall Drought Indicator (ODI), have been defined. These indicators aim at identifying and characterizing drought events on the basin scale, using results from four meteorological drought indices (standardized precipitation index, SPI; rainfall anomaly index, RAI; percent of normal precipitation, PN; deciles, DEC) calculated at different locations of the basin and for different timescales. 
Collected historical information on drought events is used to contrast results obtained with the indicators. This method has been successfully applied to the lower Jinsha River basin in China, a region prone to frequent and severe droughts. Historical drought events that occurred from 1960 to 2014 have been compiled and cataloged from different sources, in a challenging process. The analysis of the indicators shows a good agreement with the recorded historical drought events on the basin scale. It has been found that the timescale that best reproduces observed events across all the indices is the 6-month timescale.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ClDy..tmp.2340R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ClDy..tmp.2340R"><span>Observed increase in extreme daily rainfall in the French Mediterranean</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ribes, Aurélien; Thao, Soulivanh; Vautard, Robert; Dubuisson, Brigitte; Somot, Samuel; Colin, Jeanne; Planton, Serge; Soubeyroux, Jean-Michel</p> <p>2018-04-01</p> <p>We examine long-term trends in the historical record of extreme precipitation events occurring over the French Mediterranean area. Extreme events are considered in terms of their intensity, frequency, extent and precipitated volume. Changes in intensity are analysed via an original statistical approach where the annual maximum rainfall amounts observed at each measurement station are aggregated into a univariate time-series according to their dependence. The mean intensity increase is significant and estimated at + 22% (+ 7 to + 39% at the 90% confidence level) over the 1961-2015 period. 
Given the observed warming over the considered area, this increase is consistent with a rate of about one to three times that implied by the Clausius-Clapeyron relationship. Changes in frequency and other spatial features are investigated through a Generalised Linear Model. Changes in frequency for events exceeding high thresholds (about 200 mm in 1 day) are found to be significant, typically near a doubling of the frequency, but with large uncertainties in this change ratio. The area affected by severe events and the water volume precipitated during those events also exhibit significant trends, with an increase by a factor of about 4 for a 200 mm threshold, again with large uncertainties. All diagnoses consistently point toward an intensification of the most extreme events over the last decades. We argue that it is difficult to explain the diagnosed trends without invoking the human influence on climate.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1415910-linking-models-human-behaviour-climate-alters-projected-climate-change','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1415910-linking-models-human-behaviour-climate-alters-projected-climate-change"><span>Linking models of human behaviour and climate alters projected climate change</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Beckage, Brian; Gross, Louis J.; Lacasse, Katherine; ...</p> <p>2018-01-01</p> <p>Although not considered in climate models, perceived risk stemming from extreme climate events may induce behavioural changes that alter greenhouse gas emissions. Here, we link the C-ROADS climate model to a social model of behavioural change to examine how interactions between perceived risk and emissions behaviour influence projected climate change. 
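The Clausius-Clapeyron (CC) comparison made in the French Mediterranean rainfall study above can be reproduced in a few lines. The ~7% per °C scaling rate is standard; the regional warming value below is an illustrative assumption, not a figure taken from the study:

```python
# Clausius-Clapeyron scaling: saturation vapour pressure (and hence the
# moisture available to extreme rainfall) rises by roughly 7% per degree
# Celsius of warming.
CC_RATE = 0.07  # ~7% per degree C

def cc_expected_increase(delta_t_celsius):
    """Fractional rainfall-intensity increase implied by CC scaling."""
    return (1.0 + CC_RATE) ** delta_t_celsius - 1.0

observed_increase = 0.22   # +22% over 1961-2015, from the study above
warming = 1.5              # assumed regional warming in degrees C (illustrative)

expected = cc_expected_increase(warming)
print(round(expected, 3))                       # CC-expected fractional increase
print(round(observed_increase / expected, 1))   # observed-to-CC ratio
```

For an assumed 1.5 °C of regional warming the observed +22% is roughly twice the CC-expected increase, consistent with the "one to three times" range stated above.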
Our coupled climate and social model resulted in a global temperature change ranging from 3.4–6.2 °C by 2100 compared with 4.9 °C for the C-ROADS model alone, and led to behavioural uncertainty that was of a similar magnitude to physical uncertainty (2.8 °C versus 3.5 °C). Model components with the largest influence on temperature were the functional form of response to extreme events, interaction of perceived behavioural control with perceived social norms, and behaviours leading to sustained emissions reductions. Lastly, our results suggest that policies emphasizing the appropriate attribution of extreme events to climate change and infrastructural mitigation may reduce climate change the most.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20150019911','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20150019911"><span>Trending in Pc Measurements via a Bayesian Zero-Inflated Mixed Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Vallejo, Jonathon; Hejduk, Matthew; Stamey, James</p> <p>2015-01-01</p> <p>When two satellites are predicted to come within close proximity of one another (usually a high-value satellite and a piece of space debris), moving the active satellite is a means of reducing collision risk, but it reduces satellite lifetime, perturbs the satellite mission, and introduces its own risks. It is therefore important to obtain a good statement of the risk of collision in order to determine whether a maneuver is truly necessary. Two aspects of this are the calculation of the probability of collision (Pc) based on the most recent set of position, velocity, and uncertainty data for both satellites, and the examination of changes in the Pc value as the event develops. Events should follow a canonical development of Pc versus time to closest approach (TCA). 
It is helpful to be able to estimate where the present data point fits in the canonical development in order to guide the operational response.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..17.9923H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..17.9923H"><span>Uncertainties on the definition of critical rainfall patterns for debris-flows triggering. Results from the Rebaixader monitoring site (Central Pyrenees)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hürlimann, Marcel; Abancó, Clàudia; Moya, Jose; Berenguer, Marc</p> <p>2015-04-01</p> <p>Empirical rainfall thresholds are a widespread technique in debris-flow hazard assessment and can be established by statistical analysis of historic data. Typically, data from one or several rain gauges located near the affected catchment are used to define the triggering conditions. However, this procedure has been demonstrated not to be accurate enough due to the spatial variability of convective rainstorms. In 2009, a monitoring system was installed in the Rebaixader catchment, Central Pyrenees (Spain). Since then, 28 torrential flows (debris flows and debris floods) have occurred and rainfall data of 25 of them are available with a 5-minute recording interval ("event rainfalls"). Another 142 rainfall episodes that did not trigger events ("no event rainfalls") were also collected and analysed. 
The goal of this work was threefold: a) characterize rainfall episodes in the Rebaixader catchment and compare rainfall data that triggered torrential events and others that did not; b) define and test Intensity-Duration (ID) thresholds using rainfall data measured inside the catchment; c) estimate the uncertainty derived from the use of rain gauges located outside the catchment based on the spatial correlation depicted by radar rainfall maps. The results of the statistical analysis showed that the parameters that best distinguish between the two populations of rainfalls are the rainfall intensities, the mean rainfall and the total precipitation. On the other hand, the storm duration and the antecedent rainfall are not significantly different between "event rainfalls" and "no event rainfalls". Four different ID rainfall thresholds were derived based on the dataset of the first 5 years and tested using the 2014 dataset. The results of the test indicated that the threshold corresponding to the 90th percentile showed the best performance. Weather radar data were used to analyse the spatial variability of the triggering rainfalls. The analysis indicates that rain gauges outside the catchment may or may not be useful for describing the rainfall, depending on the type of rainfall. For widespread rainfalls, more distant rain gauges can give a reliable measurement, because the spatial correlation decreases slowly with the distance between the rain gauge and the debris-flow initiation area. In contrast, local storm cells show higher space-time variability and, therefore, representative rainfall measurements are obtained only by the closest rain gauges. In conclusion, the definition of rainfall thresholds is a delicate task. 
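An Intensity-Duration threshold of the power-law form commonly used for debris-flow triggering, as tested above, can be sketched as follows. The coefficients are invented for illustration, not the Rebaixader values:

```python
# Empirical Intensity-Duration (ID) threshold: I = a * D**(-b).
# Storms plotting above the curve are flagged as potential triggers.
A, B = 6.0, 0.6  # hypothetical coefficients (I in mm/h, D in h)

def exceeds_threshold(duration_h, mean_intensity_mmh, a=A, b=B):
    """True if a storm of the given duration and mean intensity
    lies above the ID threshold curve."""
    return mean_intensity_mmh > a * duration_h ** (-b)

# A short, intense convective storm vs. a long, weak stratiform event.
print(exceeds_threshold(1.0, 10.0))   # above the curve
print(exceeds_threshold(24.0, 0.5))   # below the curve
```

In practice the exponent and intercept would be fitted to separate the "event rainfalls" from the "no event rainfalls" populations, e.g. at the 90th percentile found to perform best above.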
When rainfall records come from gauges outside the catchment under consideration, the data should be carefully analysed and cross-checked with radar data (especially for small convective cells).</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.S32C..08Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.S32C..08Z"><span>Constraining the Long-Term Average of Earthquake Recurrence Intervals From Paleo- and Historic Earthquakes by Assimilating Information From Instrumental Seismicity</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zoeller, G.</p> <p>2017-12-01</p> <p>Paleo- and historic earthquakes are the most important source of information for the estimation of long-term recurrence intervals in fault zones, because sequences of paleoearthquakes cover more than one seismic cycle. On the other hand, these events are often rare, dating uncertainties are large, and missing or misinterpreted events cause further problems. Taking these shortcomings into account, estimates of long-term recurrence intervals are usually unstable as long as no additional information is included. In the present study, we assume that the time to the next major earthquake depends on the rate of small and intermediate events between the large ones in terms of a "clock-change" model that leads to a Brownian Passage Time distribution for recurrence intervals. We take advantage of an earlier finding that the aperiodicity of this distribution can be related to the Gutenberg-Richter b-value, which is usually around one and can be estimated easily from instrumental seismicity in the region under consideration. 
This makes it possible to significantly reduce the uncertainty in the estimated mean recurrence interval, especially for short paleoearthquake sequences with large dating uncertainties. We present illustrative case studies from Southern California and compare the method with the commonly used approach of exponentially distributed recurrence times assuming a stationary Poisson process.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015APS..APR.Y4008C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015APS..APR.Y4008C"><span>Entering the Two-Detector Phase of Double Chooz: First Near Detector Data and Prospects for Future Analyses</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Carr, Rachel; Double Chooz Collaboration</p> <p>2015-04-01</p> <p>In 2011, Double Chooz reported the first evidence for θ13-driven reactor antineutrino oscillation, derived from observations of inverse beta decay (IBD) events in a single detector located ~ 1 km from two nuclear reactors. Since then, the collaboration has honed the precision of its sin2 2θ13 measurement by reducing backgrounds, improving detection efficiency and systematics, and including additional statistics from IBD events with neutron captures on hydrogen. By 2014, the overwhelmingly dominant contribution to sin2 2θ13 uncertainty was reactor flux uncertainty, which is irreducible in a single-detector experiment. Now, as Double Chooz collects the first data with a near detector, we can begin to suppress that uncertainty and approach the experiment's full potential. In this talk, we show quality checks on initial data from the near detector. We also present our two-detector sensitivity to both sin2 2θ13 and sterile neutrino mixing, which are enhanced by analysis strategies developed in our single-detector phase. 
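Returning to the recurrence-interval study above: the Brownian Passage Time density underlying the "clock-change" model can be sketched directly. The interval data, candidate means, and fixed aperiodicity below are invented for illustration, with the aperiodicity standing in for the value one would derive from the regional b-value:

```python
import math

# Brownian Passage Time (inverse Gaussian) density for recurrence
# intervals, with mean mu and aperiodicity alpha.
def bpt_pdf(t, mu, alpha):
    coeff = math.sqrt(mu / (2.0 * math.pi * alpha ** 2 * t ** 3))
    return coeff * math.exp(-((t - mu) ** 2) / (2.0 * mu * alpha ** 2 * t))

# Log-likelihood of a (hypothetical) paleoearthquake interval sequence.
# Fixing alpha from independent information and maximizing over mu alone
# is the stabilizing idea described above.
def log_likelihood(intervals, mu, alpha):
    return sum(math.log(bpt_pdf(t, mu, alpha)) for t in intervals)

intervals = [180.0, 240.0, 150.0, 260.0]   # invented intervals, years
candidates = [150.0, 225.0, 300.0]          # candidate mean recurrence times
best = max(candidates, key=lambda mu: log_likelihood(intervals, mu, 0.5))
print(best)
```

With only four intervals, a free two-parameter fit would be poorly constrained; fixing alpha leaves a one-dimensional, well-behaved likelihood in mu.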
In particular, we discuss prospects for the first two-detector results from Double Chooz, expected in 2015.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li class="active"><span>24</span></li> <li><a href="#" onclick='return showDiv("page_25");'>25</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_24 --> <div id="page_25" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li><a href="#" onclick='return showDiv("page_24");'>24</a></li> <li class="active"><span>25</span></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="481"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19890019087','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19890019087"><span>Current models of the intensely ionizing particle environment in space</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Adams, James H., Jr.</p> <p>1988-01-01</p> <p>The Cosmic Ray Effects on MicroElectronics (CREME) model that is currently in use to estimate single event effect rates in spacecraft is described. 
The CREME model provides a description of the radiation environment in interplanetary space near the orbit of the earth that contains no major deficiencies. The accuracy of the galactic cosmic ray model is limited by the uncertainties in solar modulation. The model for solar energetic particles could be improved by making use of all the data that has been collected on solar energetic particle events. There remain major uncertainties about the environment within the earth's magnetosphere, because of the uncertainties over the charge states of the heavy ions in the anomalous component and solar flares, and because of trapped heavy ions. The present CREME model is valid only at 1 AU, but it could be extended to other parts of the heliosphere. There is considerable data on the radiation environment from 0.2 to 35 AU in the ecliptic plane. This data could be used to extend the CREME model.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMOS21B..04W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMOS21B..04W"><span>Analyzing extreme sea levels for broad-scale impact and adaptation studies</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A.</p> <p>2017-12-01</p> <p>Coastal impact and adaptation assessments require detailed knowledge on extreme sea levels (ESL), because increasing damage due to extreme events is one of the major consequences of sea-level rise (SLR) and climate change. 
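Extreme sea levels of the kind just introduced are typically summarized as return levels. A minimal Gumbel return-level sketch (all parameter values invented) shows how statistical uncertainty in the extreme-value fit compares with simply displacing the distribution by an SLR scenario:

```python
import math

# Gumbel return level: z_T = loc + scale * y_T,
# with reduced variate y_T = -ln(-ln(1 - 1/T)).
def gumbel_return_level(T, loc, scale):
    y = -math.log(-math.log(1.0 - 1.0 / T))
    return loc + scale * y

# Illustrative (invented) parameters for a tide-gauge site, in metres.
loc, scale = 2.0, 0.15
T = 100.0

base = gumbel_return_level(T, loc, scale)
# Statistical uncertainty: suppose the scale parameter is only known
# to within +/-20%...
hi = gumbel_return_level(T, loc, scale * 1.2)
# ...versus displacing the whole distribution by a 0.3 m SLR scenario.
slr = base + 0.3

print(round(base, 2))        # present-day 100-year level
print(round(hi - base, 2))   # spread from the statistical model alone
print(round(slr - base, 2))  # spread from the SLR scenario
```

Whether the statistical spread or the SLR displacement dominates depends on the site and the horizon, which is precisely the comparison the study above carries out globally.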
Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future SLR; different scenarios were developed with process-based or semi-empirical models and used for coastal impact studies at various temporal and spatial scales to guide coastal management and adaptation efforts. Uncertainties in future SLR are typically accounted for by analyzing the impacts associated with a range of scenarios and model ensembles. ESL distributions are then displaced vertically according to the SLR scenarios under the inherent assumption that we have perfect knowledge of the statistics of extremes. However, there is still a limited understanding of present-day ESL, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) numerical models that are used to generate long time series of storm surge water levels, and (2) statistical models used for determining present-day ESL exceedance probabilities. There is no universally accepted approach to obtain such values for broad-scale flood risk assessments, and while substantial research has explored SLR uncertainties, we quantify, for the first time globally, key uncertainties in ESL estimates. We find that contemporary ESL uncertainties exceed those from SLR projections and, assuming that we meet the Paris agreement, the projected SLR itself by the end of the century. Our results highlight the necessity to further improve our understanding of uncertainties in ESL estimates through (1) continued improvement of numerical and statistical models to simulate and analyze coastal water levels and (2) exploitation of the rich observational database and continued data archeology to obtain longer time series and remove model biases. Finally, ESL uncertainties need to be integrated with SLR uncertainties. 
Otherwise, important improvements in providing more robust SLR projections are of less benefit for broad-scale impact and adaptation studies and decision processes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1713271N','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1713271N"><span>Assessing the importance of rainfall uncertainty on hydrological models with different spatial and temporal scale</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Nossent, Jiri; Pereira, Fernando; Bauwens, Willy</p> <p>2015-04-01</p> <p>Precipitation is one of the key inputs for hydrological models. As long as the values of the hydrological model parameters are fixed, a variation of the rainfall input is expected to induce a change in the model output. Given the increased awareness of uncertainty in rainfall records, it becomes more important to understand the impact of this input-output dynamic. Yet, modellers often still have the intention to mimic the observed flow, whatever the deviation of the employed records from the actual rainfall might be, by recklessly adapting the model parameter values. But is it actually possible to vary the model parameter values in such a way that a certain (observed) model output can be generated based on inaccurate rainfall inputs? Thus, how important is the rainfall uncertainty for the model output with respect to the model parameter importance? To address this question, we apply the Sobol' sensitivity analysis method to assess and compare the importance of the rainfall uncertainty and the model parameters on the output of the hydrological model. 
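Treating rainfall uncertainty as a multiplier parameter alongside ordinary model parameters, as proposed above, a brute-force first-order Sobol' index can be sketched on a toy runoff model. All quantities here are invented; real applications use more efficient estimators (e.g. Saltelli sampling) and real rainfall-runoff models:

```python
import random
import statistics

random.seed(42)

# Toy "runoff" response: a rainfall multiplier m scales a fixed storm
# depth, and c is a regular model parameter (runoff coefficient).
RAIN = 50.0  # fixed storm depth, mm (illustrative)

def model(m, c):
    return m * RAIN * c

def sample_m():  # rainfall multiplier: +/-30% input uncertainty
    return random.uniform(0.7, 1.3)

def sample_c():  # runoff coefficient
    return random.uniform(0.3, 0.5)

# Brute-force first-order Sobol' index for the rainfall multiplier:
# S_m = Var_m( E_c[Y | m] ) / Var(Y), estimated with a double loop.
def first_order_index(n_outer=200, n_inner=200):
    cond_means, all_y = [], []
    for _ in range(n_outer):
        m = sample_m()
        ys = [model(m, sample_c()) for _ in range(n_inner)]
        cond_means.append(statistics.fmean(ys))
        all_y.extend(ys)
    return statistics.pvariance(cond_means) / statistics.pvariance(all_y)

print(round(first_order_index(), 2))
```

For these (invented) uniform ranges the rainfall multiplier carries the larger share of the output variance, which is exactly the kind of statement the Sobol' analysis above aims to make about rainfall uncertainty versus model parameters.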
In order to be able to treat the regular model parameters and input uncertainty in the same way, and to allow a comparison of their influence, a possible approach is to represent the rainfall uncertainty by a parameter. To this end, we apply so-called rainfall multipliers to hydrologically independent storm events, as a probabilistic parameter representation of the possible rainfall variation. As available rainfall records are very often point measurements at a discrete time step (hourly, daily, monthly,…), they contain uncertainty due to a latent lack of spatial and temporal variability. The influence of the latter variability can also be different for hydrological models with different spatial and temporal scale. Therefore, we perform the sensitivity analyses on a semi-distributed model (SWAT) and a lumped model (NAM). The assessment and comparison of the importance of the rainfall uncertainty and the model parameters is achieved by considering different scenarios for the included parameters and the state of the models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1390441-quantifying-black-carbon-deposition-over-greenland-ice-sheet-from-forest-fires-canada-bc-deposition-from-forest-fires','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1390441-quantifying-black-carbon-deposition-over-greenland-ice-sheet-from-forest-fires-canada-bc-deposition-from-forest-fires"><span>Quantifying black carbon deposition over the Greenland ice sheet from forest fires in Canada: BC DEPOSITION FROM FOREST FIRES</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Thomas, J. L.; Polashenski, C. M.; Soja, A. 
J.</p> <p></p> <p>We identify an important Black Carbon (BC) aerosol deposition event that was observed in snow stratigraphy and dated to between 27 July and 2 August 2013. This event comprises a significant portion (~60%) of total deposition over a 10-month period (July 2013 – April 2014). Here we link this event to forest fires burning in Canada during summer 2013 using modeling and remote sensing tools. Aerosols were detected by both the CALIOP and MODIS instruments during transport between Canada and Greenland, confirming that this event involved emissions from forest fires in Canada. We use high-resolution regional chemical transport modeling (WRF-Chem) combined with high-resolution fire emissions (FINNv1.5) to study aerosol emissions, transport, and deposition during this event. The model accurately captures the timing of the BC deposition event and shows that the major contribution to deposition during this event is emissions originating from fires in Canada. However, the model under-predicts aerosol deposition compared to measurements at all sites by a factor of 2–100. Under-prediction of modeled BC deposition originates from uncertainties in fire emissions combined with uncertainties in aerosol scavenging by clouds. This study suggests that it is possible to describe the transport of an exceptional smoke event on regional and continental scales.
Improvements in model descriptions of precipitation scavenging and emissions from wildfires are needed to correctly predict deposition, which is critical for determining the climate impacts of aerosols that originate from fires.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1713145S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1713145S"><span>The application of Global Sensitivity Analysis to quantify the dominant input factors for hydraulic model simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten</p> <p>2015-04-01</p> <p>Predicting flood inundation extents using hydraulic models is subject to a number of critical uncertainties. For a specific event, these uncertainties are known to have a large influence on model outputs and any subsequent analyses made by risk managers. Hydraulic modellers often approach such problems by applying uncertainty analysis techniques such as the Generalised Likelihood Uncertainty Estimation (GLUE) methodology. However, these methods do not allow one to identify which source of uncertainty has the most influence on the various model outputs that inform flood risk decision making. Another issue facing modellers is the amount of computational resource that is available to spend on producing flood inundation predictions that are 'fit for purpose' for the modelling objectives. Therefore a balance needs to be struck between computation time, realism and spatial resolution, and effectively characterising the uncertainty spread of predictions (for example from boundary conditions and model parameterisations).
However, it is not fully understood how much of an impact each factor has on model performance, for example how much influence changing the spatial resolution of a model has on inundation predictions in comparison to other uncertainties inherent in the modelling process. Furthermore, when resampling fine-scale topographic data in the form of a Digital Elevation Model (DEM) to coarser resolutions, there are a number of possible coarser DEMs that can be produced. Deciding which DEM is chosen to represent the surface elevations in the model could therefore also influence model performance. In this study we model a flood event using the hydraulic model LISFLOOD-FP and apply Sobol' Sensitivity Analysis to estimate which input factors, among the uncertainty in model boundary conditions, uncertain model parameters, the spatial resolution of the DEM and the choice of resampled DEM, have the most influence on a range of model outputs. These outputs include whole domain maximum inundation indicators and flood wave travel time in addition to temporally and spatially variable indicators. This enables us to assess whether the sensitivity of the model to various input factors is stationary in both time and space. Furthermore, competing models are assessed against observations of water depths from a historical flood event. Consequently we are able to determine which of the input factors has the most influence on model performance. Initial findings suggest the sensitivity of the model to different input factors varies depending on the type of model output and on the stage of the flood hydrograph at which the output is assessed.
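The point that several distinct coarser DEMs can be derived from the same fine-scale data can be sketched with simple block averaging: shifting the (hypothetical) anchor offset of the averaging blocks yields a different coarse DEM from identical fine data. The function name, offsets, and elevations below are illustrative, not from the study.

```python
def coarsen_dem(dem, factor, row_offset=0, col_offset=0):
    """Block-average a fine-resolution DEM (list of rows of elevations)
    to a coarser grid.  Different (row_offset, col_offset) choices yield
    different, equally plausible coarse DEMs from the same fine data."""
    nrows, ncols = len(dem), len(dem[0])
    coarse = []
    for r0 in range(row_offset, nrows - factor + 1, factor):
        row = []
        for c0 in range(col_offset, ncols - factor + 1, factor):
            block = [dem[r][c]
                     for r in range(r0, r0 + factor)
                     for c in range(c0, c0 + factor)]
            row.append(sum(block) / len(block))
        coarse.append(row)
    return coarse

fine = [[1.0, 2.0, 3.0, 4.0],
        [5.0, 6.0, 7.0, 8.0],
        [9.0, 10.0, 11.0, 12.0],
        [13.0, 14.0, 15.0, 16.0]]
aligned = coarsen_dem(fine, 2)        # blocks anchored at (0, 0)
shifted = coarsen_dem(fine, 2, 1, 1)  # same factor, shifted anchor
```

Production workflows would typically use a GIS resampler rather than hand-rolled averaging, but the ambiguity in the resampling choice is the same.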
We have also found that initial decisions regarding the characterisation of the input factors, for example defining the upper and lower bounds of the parameter sample space, can be significant in influencing the implied sensitivities.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFM.H41J..08M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFM.H41J..08M"><span>Systems Reliability Framework for Surface Water Sustainability and Risk Management</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Myers, J. R.; Yeghiazarian, L.</p> <p>2016-12-01</p> <p>With microbial contamination posing a serious threat to the availability of clean water across the world, it is necessary to develop a framework that evaluates the safety and sustainability of water systems in respect to non-point source fecal microbial contamination. The concept of water safety is closely related to the concept of failure in reliability theory. In water quality problems, the event of failure can be defined as the concentration of microbial contamination exceeding a certain standard for usability of water. It is pertinent in watershed management to know the likelihood of such an event of failure occurring at a particular point in space and time. Microbial fate and transport are driven by environmental processes taking place in complex, multi-component, interdependent environmental systems that are dynamic and spatially heterogeneous, which means these processes and therefore their influences upon microbial transport must be considered stochastic and variable through space and time. 
A physics-based stochastic model of microbial dynamics is presented that propagates uncertainty using a unique sampling method based on artificial neural networks to produce a correlation between watershed characteristics and spatial-temporal probabilistic patterns of microbial contamination. These results are used to address the question of water safety through several sustainability metrics: reliability, vulnerability, resilience and a composite sustainability index. System reliability is described uniquely through the temporal evolution of risk along watershed points or pathways. Probabilistic resilience describes how long the system is above a certain probability of failure, and the vulnerability metric describes how the temporal evolution of risk changes throughout a hierarchy of failure levels. Additionally, our approach allows for the identification of contributions in microbial contamination and uncertainty from specific pathways and sources. We expect that this framework will significantly improve the efficiency and precision of sustainable watershed management strategies through providing a better understanding of how watershed characteristics and environmental parameters affect surface water quality and sustainability.
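As an illustration of the kind of metrics listed above, here is a minimal sketch with simplified stand-in definitions (the study's actual formulations are richer and explicitly probabilistic): reliability as the fraction of time the failure probability stays within an acceptable level, resilience via the longest excursion above that level, and vulnerability as the mean exceedance while unsafe. The threshold and time series are hypothetical.

```python
def sustainability_metrics(p_fail, p_max=0.1):
    """Illustrative reliability/resilience/vulnerability metrics for a
    time series of failure probabilities p_fail[t] (the probability that
    microbial concentration exceeds the usability standard at time t)."""
    n = len(p_fail)
    safe = [p <= p_max for p in p_fail]
    reliability = sum(safe) / n            # fraction of time "safe"
    # resilience proxy: longest consecutive stretch spent unsafe
    longest = run = 0
    for ok in safe:
        run = 0 if ok else run + 1
        longest = max(longest, run)
    # vulnerability proxy: mean exceedance of the threshold while unsafe
    exceed = [p - p_max for p in p_fail if p > p_max]
    vulnerability = sum(exceed) / len(exceed) if exceed else 0.0
    return reliability, longest, vulnerability

r, l, v = sustainability_metrics([0.02, 0.05, 0.3, 0.4, 0.08, 0.01])
```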
</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2006PhDT.......174A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2006PhDT.......174A"><span>A simulation based optimization approach to model and design life support systems for manned space missions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Aydogan, Selen</p> <p></p> <p>This dissertation considers the problem of process synthesis and design of life-support systems for manned space missions. A life-support system is a set of technologies to support human life for short- and long-term spaceflights, via providing the basic life-support elements, such as oxygen, potable water, and food. The design of the system needs to meet the crewmember demand for the basic life-support elements (products of the system) and it must process the loads generated by the crewmembers. The system is subject to a myriad of uncertainties because most of the technologies involved are still under development. The result is high levels of uncertainty in the estimates of the model parameters, such as recovery rates or process efficiencies. Moreover, due to the high recycle rates within the system, the uncertainties are amplified and propagated, resulting in a complex problem. In this dissertation, two algorithms have been successfully developed to help make design decisions for life-support systems.
The algorithms utilize a simulation-based optimization approach that combines a stochastic discrete-event simulation and a deterministic mathematical programming approach to generate multiple, unique realizations of the controlled evolution of the system. The timelines are analyzed using time series data mining techniques and statistical tools to determine the necessary technologies, their deployment schedules and capacities, and the necessary basic life-support element amounts to support crew life and activities for the mission duration.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4086686','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4086686"><span>Should health insurers target prevention of cardiovascular disease?: a cost-effectiveness analysis of an individualised programme in Germany based on routine data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p></p> <p>2014-01-01</p> <p>Background Cardiovascular diseases are the main cause of death worldwide, making their prevention a major health care challenge. In 2006, a German statutory health insurance company presented a novel individualised prevention programme (KardioPro), which focused on coronary heart disease (CHD) screening, risk factor assessment, early detection and secondary prevention. This study evaluates KardioPro in CHD risk subgroups, and analyses the cost-effectiveness of different individualised prevention strategies. Methods The CHD risk subgroups were assembled based on routine data from the statutory health insurance company, making use of a quasi-beta regression model for risk prediction. 
The control group was selected via propensity score matching based on logistic regression and an approximate nearest neighbour approach. The main outcome was cost-effectiveness. Effectiveness was measured as event-free time, and events were defined as myocardial infarction, stroke and death. Incremental cost-effectiveness ratios comparing participants with non-participants were calculated for each subgroup. To assess the uncertainty of results, a bootstrapping approach was applied. Results The cost-effectiveness of KardioPro in the group at high risk of CHD was €20,901 per event-free year; in the medium-risk group, €52,323 per event-free year; in the low-risk group, €186,074 per event-free year; and in the group with known CHD, €26,456 per event-free year. KardioPro was associated with a significant health gain but also a significant cost increase. However, statistical significance could not be shown for all subgroups. Conclusion The cost-effectiveness of KardioPro differs substantially according to the group being targeted. Depending on the willingness-to-pay, it may be reasonable to only offer KardioPro to patients at high risk of further cardiovascular events. This high-risk group could be identified from routine statutory health insurance data. However, the long-term consequences of KardioPro still need to be evaluated. 
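The bootstrapping approach used above to assess uncertainty in the incremental cost-effectiveness ratios can be sketched as follows. The data are synthetic, the function name is hypothetical, and the percentile interval is only one of several possible bootstrap intervals for ICERs.

```python
import random

def bootstrap_icer(costs_t, eff_t, costs_c, eff_c, n_boot=2000, seed=1):
    """Nonparametric bootstrap of the incremental cost-effectiveness ratio:
    resample each arm with replacement, recompute
    (mean cost difference) / (mean effect difference), and report a
    95% percentile interval."""
    rng = random.Random(seed)
    icers = []
    for _ in range(n_boot):
        t_idx = [rng.randrange(len(costs_t)) for _ in costs_t]
        c_idx = [rng.randrange(len(costs_c)) for _ in costs_c]
        d_cost = (sum(costs_t[i] for i in t_idx) / len(t_idx)
                  - sum(costs_c[i] for i in c_idx) / len(c_idx))
        d_eff = (sum(eff_t[i] for i in t_idx) / len(t_idx)
                 - sum(eff_c[i] for i in c_idx) / len(c_idx))
        icers.append(d_cost / d_eff)
    icers.sort()
    return icers[int(0.025 * n_boot)], icers[int(0.975 * n_boot)]

# Synthetic illustration: costs in EUR, effects in event-free years.
costs_t = [1300.0, 1250.0, 1350.0, 1280.0, 1320.0]  # programme participants
eff_t = [4.4, 4.5, 4.6, 4.5, 4.55]
costs_c = [1000.0, 980.0, 1020.0, 1010.0, 990.0]    # matched controls
eff_c = [3.9, 4.0, 4.1, 4.0, 4.05]
lo, hi = bootstrap_icer(costs_t, eff_t, costs_c, eff_c)
```

Note that percentile intervals for ICERs can misbehave when the effect difference may change sign across resamples; the separated synthetic arms here avoid that complication.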
PMID:24938674</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015PhDT.......121D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015PhDT.......121D"><span>Nuclear Effects in Quasi-Elastic and Delta Resonance Production at Low Momentum Transfer</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Demgen, John Gibney</p> <p></p> <p>Data collected by the MINERvA experiment are analysed by examining the distribution of charged hadron energy for interactions with low momentum transfer. This distribution reveals major discrepancies between the detector data and the standard MINERvA interaction model, which includes only a simple global Fermi gas model. Adding further model elements, the random phase approximation (RPA) and meson exchange currents (MEC), together with a reduction of delta resonance production, reduces this discrepancy. Special attention is paid to the systematic uncertainties of delta resonance production, which do not account for these discrepancies even when combined with resolution and biasing systematic uncertainties.
Eye-scanning of events in this region also shows a discrepancy, but we were insensitive to two-proton events, the predicted signature of the MEC process.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015PhLB..743..198F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015PhLB..743..198F"><span>Uncertainty relation in Schwarzschild spacetime</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng</p> <p>2015-04-01</p> <p>We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally through a proper uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which is dependent on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole.
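For flat-spacetime measurements without a quantum memory, the entropic uncertainty bound discussed here takes the Maassen–Uffink form H(Q) + H(R) ≥ -log2 c, where c is the maximal squared overlap between the eigenvectors of the two observables. A minimal sketch for the Pauli X and Z observables, where the bound is exactly 1 bit (this is the memoryless special case, not the black-hole-modified bound of the paper):

```python
import math

def max_overlap(basis_a, basis_b):
    """c = max_{i,j} |<a_i|b_j>|^2: maximal squared overlap between the
    eigenvectors of two measured observables (real vectors here)."""
    c = 0.0
    for a in basis_a:
        for b in basis_b:
            inner = sum(x * y for x, y in zip(a, b))
            c = max(c, abs(inner) ** 2)
    return c

s = 1 / math.sqrt(2)
x_basis = [(s, s), (s, -s)]         # eigenvectors of Pauli X
z_basis = [(1.0, 0.0), (0.0, 1.0)]  # eigenvectors of Pauli Z
c = max_overlap(x_basis, z_basis)
bound = -math.log2(c)  # Maassen-Uffink lower bound on H(X) + H(Z)
```

For mutually unbiased bases of a qubit, c = 1/2, so the entropy sum can never drop below one bit; entanglement with a quantum memory is what allows violations of this memoryless limit.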
For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/FR-2010-03-11/pdf/2010-5222.pdf','FEDREG'); return false;" href="https://www.gpo.gov/fdsys/pkg/FR-2010-03-11/pdf/2010-5222.pdf"><span>75 FR 11589 - Order Extending Temporary Exemptions Under the Securities Exchange Act of 1934 in Connection with...</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.gpo.gov/fdsys/browse/collection.action?collectionCode=FR">Federal Register 2010, 2011, 2012, 2013, 2014</a></p> <p></p> <p>2010-03-11</p> <p>... make payments under a CDS contract is triggered by a default or other credit event as to such entity or... CDS clearing by ICE Trust.
We recognize, however, that there could be legal uncertainty in the event..., and the access to clearing services by independent CDS exchanges or CDS trading platforms.\\15\\ \\15...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/CFR-2010-title12-vol3/pdf/CFR-2010-title12-vol3-part225-appB.pdf','CFR'); return false;" href="https://www.gpo.gov/fdsys/pkg/CFR-2010-title12-vol3/pdf/CFR-2010-title12-vol3-part225-appB.pdf"><span>12 CFR Appendix B to Part 225 - Capital Adequacy Guidelines for Bank Holding Companies and State Member Banks: Leverage Measure</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.gpo.gov/fdsys/browse/collectionCfr.action?selectedYearFrom=2010&page.go=Go">Code of Federal Regulations, 2010 CFR</a></p> <p></p> <p>2010-01-01</p> <p>... uncertainty and from the possibility that, in the event an organization experiences financial difficulties... instruments to qualify as primary capital. —Allowance for possible loan and lease losses (exclusive of... dividends or interest payments in the event of a deterioration in the financial condition of the issuer. The...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012IJTJE..29..111Y','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012IJTJE..29..111Y"><span>Evidential Networks for Fault Tree Analysis with Imprecise Knowledge</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Yang, Jianping; Huang, Hong-Zhong; Liu, Yu; Li, Yan-Feng</p> <p>2012-06-01</p> <p>Fault tree analysis (FTA), as one of the powerful tools in reliability engineering, has been widely used to enhance system quality attributes. 
In most fault tree analyses, precise values are adopted to represent the probabilities of occurrence of those events. Due to the lack of sufficient data or the imprecision of existing data at the early stage of product design, it is often difficult to accurately estimate the failure rates of individual events or the probabilities of occurrence of the events. Therefore, such imprecision and uncertainty need to be taken into account in reliability analysis. In this paper, evidential networks (EN) are employed to quantify and propagate the aforementioned uncertainty and imprecision in fault tree analysis. The detailed processes for converting some fault tree (FT) logic gates to EN are described. The figures of the logic gates and the converted equivalent EN, together with the associated truth tables and the conditional belief mass tables, are also presented in this work. A new epistemic importance measure is proposed to describe the effect of the degree of ignorance about an event. The fault tree of an aircraft engine damaged by oil filter plugs is presented to demonstrate the proposed method.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28097904','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28097904"><span>Estimated burden of cardiovascular disease and value-based price range for evolocumab in a high-risk, secondary-prevention population in the US payer context.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Toth, Peter P; Danese, Mark; Villa, Guillermo; Qian, Yi; Beaubrun, Anne; Lira, Armando; Jansen, Jeroen P</p> <p>2017-06-01</p> <p>To estimate real-world cardiovascular disease (CVD) burden and value-based price range of evolocumab for a US-context, high-risk, secondary-prevention population.
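The evidential-network treatment in the paper works with belief masses; a simpler illustration of the same idea of propagating imprecise knowledge is to carry interval bounds on independent basic-event probabilities through AND/OR gates, exploiting the monotonicity of both gate functions. The event names and numbers below are hypothetical, and this interval sketch is not the paper's EN method.

```python
def and_gate(*events):
    """AND gate: all inputs must fail.  Each event is a (lo, hi) interval
    on its occurrence probability; inputs assumed independent."""
    lo = hi = 1.0
    for a, b in events:
        lo *= a
        hi *= b
    return (lo, hi)

def or_gate(*events):
    """OR gate: at least one input fails (independent events).
    Both endpoints are monotone in the input probabilities."""
    lo = hi = 1.0
    for a, b in events:
        lo *= 1 - a
        hi *= 1 - b
    return (1 - lo, 1 - hi)

# Hypothetical imprecise basic events (probability intervals):
pump = (0.01, 0.03)
valve = (0.02, 0.05)
sensor = (0.001, 0.004)

# Top event: (pump AND valve) OR sensor
top = or_gate(and_gate(pump, valve), sensor)
```

The resulting interval on the top-event probability plays the role that the belief/plausibility pair plays in the EN formulation.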
Burden of CVD was assessed using the UK-based Clinical Practice Research Datalink (CPRD) in order to capture complete CV burden including CV mortality. Patients on standard of care (SOC; high-intensity statins) in CPRD were selected based on eligibility criteria of FOURIER, a phase 3 CV outcomes trial of evolocumab, and categorized into four cohorts: high-risk prevalent atherosclerotic CVD (ASCVD) cohort (n = 1448), acute coronary syndrome (ACS) (n = 602), ischemic stroke (IS) (n = 151), and heart failure (HF) (n = 291) incident cohorts. The value-based price range for evolocumab was assessed using a previously published economic model. The model incorporated CPRD CV event rates and considered CV event reduction rate ratios per 1 mmol/L reduction in low-density lipoprotein-cholesterol (LDL-C) from a meta-analysis of statin trials by the Cholesterol Treatment Trialists Collaboration (CTTC), i.e. CTTC relationship. Multiple-event rates of composite CV events (ACS, IS, or coronary revascularization) per 100 patient-years were 12.3 for the high-risk prevalent ASCVD cohort, and 25.7, 13.3, and 23.3, respectively, for incident ACS, IS, and HF cohorts. Approximately one-half (42%) of the high-risk ASCVD patients with a new CV event during follow-up had a subsequent CV event. Combining these real-world event rates and the CTTC relationship in the economic model, the value-based price range (credible interval) under a willingness-to-pay threshold of $150,000/quality-adjusted life-year gained for evolocumab was $11,990 ($9,341-$14,833) to $16,856 ($12,903-$20,678) in ASCVD patients with baseline LDL-C levels ≥70 mg/dL and ≥100 mg/dL, respectively. Real-world CVD burden is substantial. 
Using the observed CVD burden in CPRD and the CTTC relationship, the cost-effectiveness analysis showed that, accounting for uncertainties, the expected value-based price for evolocumab is higher than its current annual cost, as long as the payer discount off list price is greater than 20%.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011AGUFMNH53A1724P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011AGUFMNH53A1724P"><span>Towards a Multi-Resolution Model of Seismic Risk in Central Asia. Challenge and perspectives</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Pittore, M.; Wieland, M.; Bindi, D.; Parolai, S.</p> <p>2011-12-01</p> <p>Assessing seismic risk, defined as the probability of occurrence of economic and social losses as a consequence of an earthquake, both at regional and at local scales, is a challenging, multi-disciplinary task. In order to provide a reliable estimate, diverse information must be gathered by seismologists, geologists, engineers and civil authorities, and carefully integrated, taking into account the different levels of uncertainty. The search for an integrated methodology able to seamlessly describe seismic risk at different spatial scales is challenging, but it opens new application perspectives, particularly in countries that face significant seismic hazard but lack the resources for a standard assessment. Central Asian countries in particular, which exhibit some of the highest seismic hazard in the world, are experiencing steady demographic growth, often accompanied by informal settlements and urban sprawl.
A reliable evaluation of how these factors affect the seismic risk, together with a realistic assessment of the assets exposed to seismic hazard and their structural vulnerability, is of particular importance in order to undertake proper mitigation actions and to promptly and efficiently react to a catastrophic event. New strategies are needed to efficiently cope with a systematic lack of information and uncertainties. An original approach is presented to assess seismic risk based on the integration of information coming from remote-sensing and ground-based panoramic imaging, in situ measurements, expert knowledge and already available data. Efficient sampling strategies based on freely available medium-resolution multi-spectral satellite images are adopted to optimize data collection and validation, in a multi-scale approach. Panoramic imaging is also considered a valuable ground-based visual data collection technique, suitable both for manual and automatic analysis. A fully probabilistic framework based on Bayesian networks is proposed to integrate the available information, taking into account both aleatory and epistemic uncertainties. An improved risk model for Bishkek, the capital of the Kyrgyz Republic, has been developed following this approach and tested based on different earthquake scenarios.
Preliminary results will be presented and discussed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20110001661','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20110001661"><span>Biological Based Risk Assessment for Space Exploration</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Cucinotta, Francis A.</p> <p>2011-01-01</p> <p>Exposures from galactic cosmic rays (GCR), made up of high-energy protons and high charge and energy (HZE) nuclei, and from solar particle events (SPEs), comprised largely of low- to medium-energy protons, are the primary health concern for astronauts on long-term space missions. Experimental studies have shown that HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation, making risk assessments for cancer and degenerative risks, such as central nervous system effects and heart disease, highly uncertain. The goal for space radiation protection at NASA is to be able to reduce the uncertainties in risk assessments for Mars exploration to be small enough to ensure acceptable levels of risks are not exceeded and to adequately assess the efficacy of mitigation measures such as shielding or biological countermeasures. We review the recent BEIR VII and UNSCEAR-2006 models of cancer risks and their uncertainties. These models are shown to have an inherent 2-fold uncertainty, as defined by the ratio of the 95% confidence level to the mean projection, even before radiation quality is considered. In order to overcome the uncertainties in these models, new approaches to risk assessment are warranted. We consider new computational biology approaches to modeling cancer risks.
A basic program of research is described, ranging from stochastic descriptions of the physics and chemistry of radiation tracks and the biochemistry of metabolic pathways to emerging biological understanding of the cellular and tissue modifications leading to cancer.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70161946','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70161946"><span>Uncertainty in spatially explicit animal dispersal models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Mooij, Wolf M.; DeAngelis, Donald L.</p> <p>2003-01-01</p> <p>Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three levels of complexity: (1) an event-based binomial model that considers only the occurrence of mortality or arrival, (2) a temporally explicit exponential model that employs mortality and arrival rates, and (3) a spatially explicit grid-walk model that simulates the movement of animals through an artificial landscape. Each model was fitted to the same set of field data. A first objective of the paper is to illustrate how the maximum-likelihood method can be used in all three cases to estimate the means and confidence limits for the relevant model parameters, given a particular set of data on dispersal survival. Using this framework we show that the structure of the uncertainty for all three models is strikingly similar. In fact, the results of our unified approach imply that spatially explicit dispersal models, which take advantage of information on landscape details, suffer less from uncertainty than do simpler models.
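For the temporally explicit model (level 2 above), treating mortality and arrival as competing independent exponential risks gives closed-form maximum-likelihood estimates: each rate is its event count divided by the total exposure time. The tracking data and function name below are hypothetical, and the study's actual likelihood may differ in detail (e.g. censoring).

```python
import math

def mle_competing_rates(times, arrived):
    """ML estimates for a temporally explicit dispersal model in which
    arrival (rate lam) and mortality (rate mu) compete as independent
    exponential risks, with each animal followed to its first event."""
    total_time = sum(times)
    n_arr = sum(arrived)
    n_dead = len(times) - n_arr
    lam_hat = n_arr / total_time   # arrivals per unit exposure time
    mu_hat = n_dead / total_time   # deaths per unit exposure time
    # Log-likelihood at the MLE (useful for profile-likelihood CIs).
    loglik = (n_arr * math.log(lam_hat) + n_dead * math.log(mu_hat)
              - (lam_hat + mu_hat) * total_time)
    return lam_hat, mu_hat, loglik

# Hypothetical telemetry records: (days until first event, arrived?)
times = [2.0, 5.0, 1.5, 3.5, 8.0, 4.0]
arrived = [1, 0, 1, 1, 0, 1]
lam, mu, ll = mle_competing_rates(times, arrived)
```

Dispersal survival then follows as lam / (lam + mu), the probability that arrival precedes mortality.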
Moreover, we show that the proposed strategy of model development safeguards one from error propagation in these more complex models. Finally, our approach shows that all models related to animal dispersal, ranging from simple to complex, can be related in a hierarchical fashion, so that the various approaches to modeling such dispersal can be viewed from a unified perspective.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1196193','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1196193"><span>Review of the GMD Benchmark Event in TPL-007-1</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Backhaus, Scott N.; Rivera, Michael Kelly</p> <p>2015-07-21</p> <p>Los Alamos National Laboratory (LANL) examined the approaches suggested in NERC Standard TPL-007-1 for defining the geo-electric field for the Benchmark Geomagnetic Disturbance (GMD) Event. Specifically: 1. the estimation of the 100-year exceedance geo-electric field magnitude; 2. the scaling of the GMD Benchmark Event to geomagnetic latitudes below 60 degrees north; and 3. the effect of uncertainties in earth conductivity data on the conversion from geomagnetic field to geo-electric field.
This document summarizes the review and presents recommendations for consideration.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/23692228','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/23692228"><span>A short note on probability in clinical medicine.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Upshur, Ross E G</p> <p>2013-06-01</p> <p>Probability claims are ubiquitous in clinical medicine, yet exactly how clinical events relate to interpretations of probability has not been well explored. This brief essay examines the major interpretations of probability and how these interpretations may account for the probabilistic nature of clinical events. It is argued that there are significant problems with the unquestioned application of interpretations of probability to clinical events. The essay concludes by suggesting other avenues to understand uncertainty in clinical medicine. © 2013 John Wiley & Sons Ltd.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19980027029','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19980027029"><span>Uncertainty Modeling for Structural Control Analysis and Synthesis</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Campbell, Mark E.; Crawley, Edward F.</p> <p>1996-01-01</p> <p>The development of an accurate model of uncertainties for the control of structures that undergo a change in operational environment, based solely on modeling and experimentation in the original environment, is studied. The application used throughout this work is the development of an on-orbit uncertainty model based on ground modeling and experimentation. 
A ground-based uncertainty model consisting of mean errors and bounds on critical structural parameters is developed. The uncertainty model is created using multiple data sets to observe all relevant uncertainties in the system. The Discrete Extended Kalman Filter is used as an identification/parameter estimation method for each data set, in addition to providing a covariance matrix that aids in the development of the uncertainty model. Once ground-based modal uncertainties have been developed, they are localized to specific degrees of freedom in the form of mass and stiffness uncertainties. Two techniques are presented: a matrix method, which develops the mass and stiffness uncertainties in a mathematical manner; and a sensitivity method, which assumes a form for the mass and stiffness uncertainties in macroelements and scaling factors. This form allows the derivation of mass and stiffness uncertainties in a more physical manner. The mass and stiffness uncertainties of the ground-based system are then mapped onto the on-orbit system, and projected to create an analogous on-orbit uncertainty model in the form of mean errors and bounds on critical parameters. The Middeck Active Control Experiment is introduced as experimental verification for the localization and projection methods developed. 
In addition, closed loop results from on-orbit operations of the experiment verify the use of the uncertainty model for control analysis and synthesis in space.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_25 --> </div><!-- container --> </body> </html>