Fuller, Robert William; Wong, Tony E; Keller, Klaus
2017-01-01
The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections.
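The following is a minimal sketch of the two-stage idea described above (expert-informed prior plus Bayesian updating against observations). It is not the authors' AIS model or code: the linear response model, the prior hyperparameters standing in for an "expert-consistent" prior, the synthetic observations, and the plain Metropolis sampler are all assumptions for illustration.

```python
# Minimal sketch (not the authors' code): calibrate a toy ice-sheet response model
# S(T) = alpha * T + beta (AIS sea-level contribution vs. temperature anomaly) by
# (i) choosing a prior meant to stand in for one inferred from expert assessments
# and (ii) updating it against synthetic "observations" with a Metropolis sampler.
import numpy as np

rng = np.random.default_rng(0)

def model(alpha, beta, T):
    return alpha * T + beta              # toy AIS contribution (m sea-level equivalent)

# stand-in "expert prior": Gaussian hyperparameters chosen for illustration only
prior_mu = np.array([0.15, 0.0])
prior_sd = np.array([0.08, 0.02])

# Bayesian inversion against hypothetical instrumental/paleo constraints
T_obs = np.array([0.3, 0.6, 0.9])        # temperature anomalies (K), hypothetical
y_obs = np.array([0.05, 0.10, 0.14])     # AIS contributions (m), hypothetical
sigma = 0.02                             # assumed observational error

def log_post(theta):
    lp = -0.5 * np.sum(((theta - prior_mu) / prior_sd) ** 2)   # Gaussian prior
    resid = y_obs - model(theta[0], theta[1], T_obs)
    return lp - 0.5 * np.sum((resid / sigma) ** 2)             # Gaussian likelihood

theta, lp, chain = prior_mu.copy(), log_post(prior_mu), []
for _ in range(20000):                   # random-walk Metropolis
    prop = theta + rng.normal(0, 0.01, size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain[5000:])           # discard burn-in
print("posterior alpha:", chain[:, 0].mean(), "+/-", chain[:, 0].std())
```

Comparing the posterior spread with the prior spread is what "tighter hindcasts and projections" refers to: the observational constraint shrinks the parameter distributions inferred from the expert assessments.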
Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian
2008-07-08
This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It provides probabilistic estimates of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed and covers the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km and the recurrence relationships of these sources were generated. Finally, four maps have been prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels using the SEISRISK III software.
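As a reminder of how return-period ground motions relate to a hazard curve, the sketch below interpolates a hypothetical annual-exceedance-rate curve at the return periods quoted above and converts each rate to a Poisson exceedance probability over a 50-year window. The hazard-curve values are illustrative only, not the Shahrekord results.

```python
# Minimal sketch: read PGA at target return periods off a hazard curve and
# convert annual rates to 50-year exceedance probabilities (illustrative values).
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.3, 0.4, 0.5, 0.7])                 # g
annual_rate = np.array([0.2, 0.05, 1.3e-2, 5.0e-3, 2.2e-3, 1.1e-3, 3.5e-4])  # lambda(PGA >= x)

for rp in (75, 225, 475, 2475):
    target = 1.0 / rp
    # interpolate the hazard curve in log-log space to find PGA at the target rate
    pga_rp = np.exp(np.interp(np.log(target),
                              np.log(annual_rate[::-1]), np.log(pga[::-1])))
    # Poisson occurrence: probability of at least one exceedance in 50 years
    p50 = 1.0 - np.exp(-50.0 * target)
    print(f"RP {rp:4d} yr: PGA ~ {pga_rp:.2f} g, P(exceed in 50 yr) = {p50:.1%}")
```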
Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels
NASA Astrophysics Data System (ADS)
Kurniasih, R.; Sujadi, I.
2017-09-01
This paper investigates students' levels of probabilistic thinking, that is, their thinking about probabilistic or uncertain matters in probability material. The subjects were 8th-grade junior high school students. The main instrument was the researcher, with a probabilistic thinking skills test and interview guidelines as supporting instruments. Data were analyzed using method triangulation. The results showed that before instruction the students' probabilistic thinking was at the subjective and transitional levels, and that it changed after the learning. Some 8th-grade students reached the numerical level, the highest of the levels. Students' levels of probabilistic thinking can be used as a reference for designing learning materials and strategies.
NASA Astrophysics Data System (ADS)
Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi
2017-08-01
The formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, one of which is probabilistic thinking. Students need probabilistic thinking to make decisions, and elementary school students are expected to develop it as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a mathematical ability test as students with high mathematical ability. The subjects were given probability tasks covering sample space, the probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion; credibility of the data was established using time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level, indicating that the boy's level of probabilistic thinking was higher than the girl's. These results can inform curriculum developers in formulating probability learning goals for elementary school students, and teachers could take gender differences into account when teaching probability.
Probabilistic tsunami hazard analysis: Multiple sources and global applications
Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie
2017-01-01
Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
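A core computational step described above is combining several tsunamigenic source types "in an integrated and consistent probabilistic framework". The sketch below aggregates hypothetical per-source exceedance rates at one site and converts the total rate to an exceedance probability over an exposure time; the rates, threshold and logic-tree-style resampling are assumptions for illustration, not values from any published PTHA.

```python
# Minimal sketch of multi-source aggregation in a PTHA: the mean rate of
# exceeding a run-up threshold is the sum of per-source rates; a Poisson model
# converts the rate to a probability over the exposure time (hypothetical rates).
import numpy as np

rates = {"earthquake": 2.0e-3, "landslide": 4.0e-4,
         "volcanic": 1.0e-4, "meteotsunami": 5.0e-5}   # annual rates, run-up > 2 m
exposure = 50.0                                        # years

lam = sum(rates.values())                              # aggregated exceedance rate
p_exceed = 1.0 - np.exp(-lam * exposure)
print(f"P(run-up > 2 m in {exposure:.0f} yr) = {p_exceed:.1%}")

# epistemic uncertainty can be carried by sampling alternative rate sets
# (e.g., logic-tree branches) and reporting percentiles of the result
samples = np.random.default_rng(1).lognormal(mean=np.log(lam), sigma=0.5, size=10000)
print("16th-84th percentile:", np.percentile(1 - np.exp(-samples * exposure), [16, 84]))
```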
Lloyd, G T; Bapst, D W; Friedman, M; Davis, K E
2016-11-01
Branch lengths-measured in character changes-are an essential requirement of clock-based divergence estimation, regardless of whether the fossil calibrations used represent nodes or tips. However, a separate set of divergence time approaches are typically used to date palaeontological trees, which may lack such branch lengths. Among these methods, sophisticated probabilistic approaches have recently emerged, in contrast with simpler algorithms relying on minimum node ages. Here, using a novel phylogenetic hypothesis for Mesozoic dinosaurs, we apply two such approaches to estimate divergence times for: (i) Dinosauria, (ii) Avialae (the earliest birds) and (iii) Neornithes (crown birds). We find: (i) the plausibility of a Permian origin for dinosaurs to be dependent on whether Nyasasaurus is the oldest dinosaur, (ii) a Middle to Late Jurassic origin of avian flight regardless of whether Archaeopteryx or Aurornis is considered the first bird and (iii) a Late Cretaceous origin for Neornithes that is broadly congruent with other node- and tip-dating estimates. Demonstrating the feasibility of probabilistic time-scaling further opens up divergence estimation to the rich histories of extinct biodiversity in the fossil record, even in the absence of detailed character data. © 2016 The Authors.
Probabilistic assessment method of the non-monotonic dose-responses-Part I: Methodological approach.
Chevillotte, Grégoire; Bernard, Audrey; Varret, Clémence; Ballet, Pascal; Bodin, Laurent; Roudot, Alain-Claude
2017-08-01
More and more studies aim to characterize non-monotonic dose-response curves (NMDRCs). The greatest difficulty is to assess the statistical plausibility of NMDRCs from previously conducted dose-response studies. This difficulty is linked to the fact that these studies typically involve (i) few tested doses, (ii) a low sample size per dose, and (iii) the absence of any raw data. In this study, we propose a new methodological approach to probabilistically characterize NMDRCs. The methodology is composed of three main steps: (i) sampling from summary data, to cover all the possibilities that may be presented by the responses measured at each dose and to obtain a new raw database, (ii) statistical analysis of each sampled dose-response curve to characterize the slopes and their signs, and (iii) characterization of these dose-response curves according to the variation of the sign of the slope. This method can characterize all types of dose-response curves and can be applied both to continuous and to discrete data. The aim of this study is to present the general principle of this probabilistic method for assessing non-monotonic dose-response curves, and to present some results. Copyright © 2017 Elsevier Ltd. All rights reserved.
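The sketch below is an assumed, simplified rendering of the three steps just listed: resample plausible group means from per-dose summary statistics, compute successive slopes, and classify each resampled curve by the sign pattern of those slopes. The dose levels, means, standard deviations and classification rule are hypothetical, not the authors' implementation.

```python
# Minimal sketch: resample dose-group means from summary statistics, then tally
# how often the resampled curve is monotonic vs. non-monotonic (hypothetical data).
import numpy as np

rng = np.random.default_rng(0)
doses = np.array([0.0, 0.1, 1.0, 10.0])
mean = np.array([1.0, 1.4, 1.1, 0.8])      # reported per-dose means
sd = np.array([0.30, 0.35, 0.30, 0.25])    # reported per-dose standard deviations
n = np.array([8, 8, 8, 8])                 # animals per dose

def classify(slopes, tol=1e-9):
    signs = np.sign(slopes)
    signs = signs[np.abs(slopes) > tol]    # ignore essentially flat segments
    return "monotonic" if len(set(signs)) <= 1 else "non-monotonic"

counts = {"monotonic": 0, "non-monotonic": 0}
for _ in range(5000):
    # draw a plausible set of group means consistent with the summary data
    sim_means = np.array([rng.normal(m, s / np.sqrt(k)) for m, s, k in zip(mean, sd, n)])
    slopes = np.diff(sim_means) / np.diff(doses)
    counts[classify(slopes)] += 1

total = sum(counts.values())
print({k: round(v / total, 3) for k, v in counts.items()})
```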
Risk Assessment Guidance for Superfund (RAGS) Volume III: Part A
EPA's Risk Assessment Guidance for Superfund (RAGS) Volume 3A provides policies and guiding principles on the application of probabilistic risk assessment (PRA) methods to human health and ecological risk assessment in the EPA Superfund Program.
Koutsoumanis, Konstantinos; Angelidis, Apostolos S
2007-08-01
Among the new microbiological criteria that have been incorporated in EU Regulation 2073/2005, of particular interest are those concerning Listeria monocytogenes in ready-to-eat (RTE) foods, because for certain food categories, they no longer require zero tolerance but rather specify a maximum allowable concentration of 100 CFU/g or ml. This study presents a probabilistic modeling approach for evaluating the compliance of RTE sliced meat products with the new safety criteria for L. monocytogenes. The approach was based on the combined use of (i) growth/no-growth boundary models, (ii) kinetic growth models, (iii) product characteristics data (pH, a(w), shelf life) collected from 160 meat products from the Hellenic retail market, and (iv) storage temperature data recorded from 50 retail stores in Greece. This study shows that probabilistic analysis of the above components using Monte Carlo simulation, which takes into account the variability of factors affecting microbial growth, can lead to a realistic estimation of the behavior of L. monocytogenes throughout the food supply chain, and the quantitative output generated can be further used by food managers as a decision-making tool regarding the design or modification of a product's formulation or its "use-by" date in order to ensure its compliance with the new safety criteria. The study also argues that compliance of RTE foods with the new safety criteria should not be considered a parameter with a discrete and binary outcome because it depends on factors such as product characteristics, storage temperature, and initial contamination level, which display considerable variability even among different packages of the same RTE product. Rather, compliance should be expressed and therefore regulated in a more probabilistic fashion.
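The sketch below shows the general shape of such a Monte Carlo compliance calculation: sample initial contamination, storage temperature and shelf life from distributions, apply a growth model with a growth/no-growth switch, and report the fraction of packages exceeding 2 log10 CFU/g at the end of shelf life. The growth model, parameter values and distributions are illustrative assumptions, not the study's fitted models or survey data.

```python
# Minimal sketch of a Monte Carlo compliance estimate for L. monocytogenes in an
# RTE product (illustrative growth model and input distributions).
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

n0 = rng.normal(0.5, 0.5, N)                    # initial contamination, log10 CFU/g
temp = np.clip(rng.normal(5.5, 2.0, N), 0, 15)  # retail storage temperature, deg C
shelf_life = rng.uniform(20, 40, N)             # days

# square-root-type growth rate (illustrative coefficients), zero below Tmin
tmin = -1.2
mu = np.maximum(0.03 * (temp - tmin), 0.0) ** 2        # log10 CFU/g per day
growth_possible = rng.uniform(size=N) < 0.8            # growth/no-growth boundary stand-in

n_end = n0 + np.where(growth_possible, mu * shelf_life, 0.0)
print("P(exceeding 100 CFU/g, i.e. 2 log10, at end of shelf life):",
      np.mean(n_end > 2.0))
```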
Brain function during probabilistic learning in relation to IQ and level of education.
van den Bos, Wouter; Crone, Eveline A; Güroğlu, Berna
2012-02-15
Knowing how to adapt your behavior based on feedback lies at the core of successful learning. We investigated the relation between brain function, grey matter volume, educational level and IQ in a Dutch adolescent sample. In total 45 healthy volunteers between ages 13 and 16 were recruited from schools for pre-vocational and pre-university education. For each individual, IQ was estimated using two subtests from the WISC-III-R (similarities and block design). While in the magnetic resonance imaging (MRI) scanner, participants performed a probabilistic learning task. Behavioral comparisons showed that participants with higher IQ used a more adaptive learning strategy after receiving positive feedback. Analysis of neural activation revealed that higher IQ was associated with increased activation in DLPFC and dACC when receiving positive feedback, specifically for rules with low reward probability (i.e., unexpected positive feedback). Furthermore, VBM analyses revealed that IQ correlated positively with grey matter volume within these regions. These results provide support for IQ-related individual differences in the developmental time courses of neural circuitry supporting feedback-based learning. Current findings are interpreted in terms of a prolonged window of flexibility and opportunity for adolescents with higher IQ scores. Copyright © 2011 Elsevier Ltd. All rights reserved.
Middlebrooks, E H; Tuna, I S; Grewal, S S; Almeida, L; Heckman, M G; Lesser, E R; Foote, K D; Okun, M S; Holanda, V M
2018-06-01
Although globus pallidus internus deep brain stimulation is a widely accepted treatment for Parkinson disease, there is persistent variability in outcomes that is not yet fully understood. In this pilot study, we aimed to investigate the potential role of globus pallidus internus segmentation using probabilistic tractography as a supplement to traditional targeting methods. Eleven patients undergoing globus pallidus internus deep brain stimulation were included in this retrospective analysis. Using multidirection diffusion-weighted MR imaging, we performed probabilistic tractography at all individual globus pallidus internus voxels. Each globus pallidus internus voxel was then assigned to the 1 ROI with the greatest number of propagated paths. On the basis of deep brain stimulation programming settings, the volume of tissue activated was generated for each patient using a finite element method solution. For each patient, the volume of tissue activated within each of the 10 segmented globus pallidus internus regions was calculated and examined for association with a change in the Unified Parkinson Disease Rating Scale, Part III score before and after treatment. Increasing volume of tissue activated was most strongly correlated with a change in the Unified Parkinson Disease Rating Scale, Part III score for the primary motor region (Spearman r = 0.74, P = .010), followed by the supplementary motor area/premotor cortex (Spearman r = 0.47, P = .15). In this pilot study, we assessed a novel method of segmentation of the globus pallidus internus based on probabilistic tractography as a supplement to traditional targeting methods. Our results suggest that our method may be an independent predictor of deep brain stimulation outcome, and evaluation of a larger cohort or prospective study is warranted to validate these findings. © 2018 by American Journal of Neuroradiology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jie; Draxl, Caroline; Hopson, Thomas
Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computationally expensive. An alternative approach is to use data generated by a low resolution NWP model, in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on the MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on the combination of the analog ensemble with intermediate resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden, while providing accurate deterministic estimates and reliable probabilistic assessments.
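The core of the analog ensemble idea mentioned in (ii) is simple: for each coarse-model value at the target site, find the most similar historical coarse-model values and let the concurrent observations form the ensemble. The sketch below uses toy synthetic data and a single predictor variable; an operational configuration would match on several variables over a time window.

```python
# Minimal sketch of an analog ensemble for wind resource estimation (toy data).
import numpy as np

rng = np.random.default_rng(0)
hist_model = rng.weibull(2.0, 5000) * 8.0                           # historical coarse-model wind (m/s)
hist_obs = np.clip(0.9 * hist_model + rng.normal(0, 1.0, 5000), 0, None)  # concurrent observations

def analog_ensemble(x, n_analogs=25):
    idx = np.argsort(np.abs(hist_model - x))[:n_analogs]   # closest analogs in model space
    members = hist_obs[idx]                                 # their concurrent obs form the ensemble
    return members.mean(), np.percentile(members, [10, 90])

mean, (p10, p90) = analog_ensemble(6.3)
print(f"deterministic estimate {mean:.1f} m/s, 10-90% range [{p10:.1f}, {p90:.1f}] m/s")
```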
Paixão, Enny S; Harron, Katie; Andrade, Kleydson; Teixeira, Maria Glória; Fiaccone, Rosemeire L; Costa, Maria da Conceição N; Rodrigues, Laura C
2017-07-17
Due to the increasing availability of individual-level information across different electronic datasets, record linkage has become an efficient and important research tool. High quality linkage is essential for producing robust results. The objective of this study was to describe the process of preparing and linking national Brazilian datasets, and to compare the accuracy of different linkage methods for assessing the risk of stillbirth due to dengue in pregnancy. We linked mothers and stillbirths in two routinely collected datasets from Brazil for 2009-2010: for dengue in pregnancy, notifications of infectious diseases (SINAN); for stillbirths, mortality (SIM). Since there was no unique identifier, we used probabilistic linkage based on maternal name, age and municipality. We compared two probabilistic approaches, each with two thresholds: 1) a bespoke linkage algorithm; 2) a standard linkage software widely used in Brazil (ReclinkIII), and used manual review to identify further links. Sensitivity and positive predictive value (PPV) were estimated using a subset of gold-standard data created through manual review. We examined the characteristics of false-matches and missed-matches to identify any sources of bias. From records of 678,999 dengue cases and 62,373 stillbirths, the gold-standard linkage identified 191 cases. The bespoke linkage algorithm with a conservative threshold produced 131 links, with sensitivity = 64.4% (68 missed-matches) and PPV = 92.5% (8 false-matches). Manual review of uncertain links identified an additional 37 links, increasing sensitivity to 83.7%. The bespoke algorithm with a relaxed threshold identified 132 true matches (sensitivity = 69.1%), but introduced 61 false-matches (PPV = 68.4%). ReclinkIII produced lower sensitivity and PPV than the bespoke linkage algorithm. Linkage error was not associated with any recorded study variables. Despite a lack of unique identifiers for linking mothers and stillbirths, we demonstrate a high standard of linkage of large routine databases from a middle income country. Probabilistic linkage and manual review were essential for accurately identifying cases for a case-control study, but this approach may not be feasible for larger databases or for linkage of more common outcomes.
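Below is a minimal sketch of the two ingredients described above: a Fellegi-Sunter-style probabilistic linkage score over the comparison fields (name, age, municipality) and the sensitivity/PPV calculation against a manually reviewed gold standard. The m/u probabilities, example records and error counts are hypothetical and this is a standard formulation, not the study's bespoke algorithm or ReclinkIII.

```python
# Minimal sketch of probabilistic record-linkage scoring and accuracy measures.
import numpy as np

def agreement_weight(match, m, u):
    # log-likelihood-ratio contribution of one compared field
    return np.log2(m / u) if match else np.log2((1 - m) / (1 - u))

def link_score(rec_a, rec_b):
    # (field, m = P(agree | true match), u = P(agree | non-match)), hypothetical
    fields = [("name", 0.95, 0.01), ("age", 0.90, 0.10), ("municipality", 0.90, 0.05)]
    return sum(agreement_weight(rec_a[f] == rec_b[f], m, u) for f, m, u in fields)

a = {"name": "maria silva", "age": 27, "municipality": "salvador"}
b = {"name": "maria silva", "age": 28, "municipality": "salvador"}
print("score:", round(link_score(a, b), 2))   # accept the pair if score >= chosen threshold

def sensitivity_ppv(tp, fp, fn):
    return tp / (tp + fn), tp / (tp + fp)

# hypothetical counts from manual review of a gold-standard subset
sens, ppv = sensitivity_ppv(tp=120, fp=10, fn=70)
print(f"sensitivity {sens:.1%}, PPV {ppv:.1%}")
```

Raising the score threshold trades sensitivity for PPV, which is the conservative-versus-relaxed threshold comparison reported in the abstract.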
NASA Astrophysics Data System (ADS)
Dai, Hong-Yi; Kuang, Le-Man; Li, Cheng-Zu
2005-07-01
We propose a scheme to probabilistically teleport an unknown arbitrary three-level two-particle state by using two partially entangled three-level two-particle states as the quantum channel. The classical communication cost required in the ideal probabilistic teleportation process is also calculated. This scheme can be directly generalized to teleport an unknown, arbitrary three-level K-particle state by using K partially entangled three-level two-particle states as the quantum channel. The project was supported by the National Fundamental Research Program of China under Grant No. 2001CB309310 and the National Natural Science Foundation of China under Grant Nos. 10404039 and 10325523.
Hanson, Stanley L.; Perkins, David M.
1995-01-01
The construction of a probabilistic ground-motion hazard map for a region follows a sequence of analyses beginning with the selection of an earthquake catalog and ending with the mapping of calculated probabilistic ground-motion values (Hanson and others, 1992). An integral part of this process is the creation of sources used for the calculation of earthquake recurrence rates and ground motions. These sources consist of areas and lines that are representative of geologic or tectonic features and faults. After the design of the sources, it is necessary to arrange the coordinate points in a particular order compatible with the input format for the SEISRISK-III program (Bender and Perkins, 1987). Source zones are usually modeled as a point-rupture source. Where applicable, linear rupture sources are modeled with articulated lines, representing known faults, or a field of parallel lines, representing a generalized distribution of hypothetical faults. Based on the distribution of earthquakes throughout the individual source zones (or a collection of several sources), earthquake recurrence rates are computed for each of the sources, and a minimum and maximum magnitude is assigned. Over a period of time from 1978 to 1980 several conferences were held by the USGS to solicit information on regions of the United States for the purpose of creating source zones for computation of probabilistic ground motions (Thenhaus, 1983). As a result of these regional meetings and previous work in the Pacific Northwest, (Perkins and others, 1980), California continental shelf, (Thenhaus and others, 1980), and the Eastern outer continental shelf, (Perkins and others, 1979) a consensus set of source zones was agreed upon and subsequently used to produce a national ground motion hazard map for the United States (Algermissen and others, 1982). In this report and on the accompanying disk we provide a complete list of source areas and line sources as used for the 1982 and later 1990 seismic hazard maps for the conterminous U.S. and Alaska. These source zones are represented in the input form required for the hazard program SEISRISK-III, and they include the attenuation table and several other input parameter lines normally found at the beginning of an input data set for SEISRISK-III.
Robust Control Design for Systems With Probabilistic Uncertainty
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.
2005-01-01
This paper presents a reliability- and robustness-based formulation for robust control synthesis for systems with probabilistic uncertainty. In a reliability-based formulation, the probability of violating design requirements prescribed by inequality constraints is minimized. In a robustness-based formulation, a metric which measures the tendency of a random variable/process to cluster close to a target scalar/function is minimized. A multi-objective optimization procedure, which combines stability and performance requirements in time and frequency domains, is used to search for robustly optimal compensators. Some of the fundamental differences between the proposed strategy and conventional robust control methods are: (i) unnecessary conservatism is eliminated since there is no need for convex supports, (ii) the most likely plants are favored during synthesis allowing for probabilistic robust optimality, (iii) the tradeoff between robust stability and robust performance can be explored numerically, (iv) the uncertainty set is closely related to parameters with clear physical meaning, and (v) compensators with improved robust characteristics for a given control structure can be synthesized.
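To make the reliability-based criterion concrete, the sketch below estimates, by Monte Carlo over probabilistic plant uncertainty, the probability that closed-loop requirements are violated and picks the gain that minimizes it. The first-order plant, the two requirements and the parameter distributions are toy assumptions, not the paper's multi-objective synthesis procedure.

```python
# Minimal sketch of a reliability-based gain selection: minimize the Monte Carlo
# estimate of P(requirement violation) over an uncertain first-order plant.
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(1.0, 0.2, 20000)          # uncertain plant pole
b = rng.normal(1.0, 0.1, 20000)          # uncertain input gain

def p_violation(k, min_speed=2.0, max_speed=6.0):
    pole = a + b * k                     # closed-loop pole of x' = -(a + b*k) x
    too_slow = pole < min_speed          # requirement 1: respond fast enough
    too_aggressive = pole > max_speed    # requirement 2: limit actuator demand (proxy)
    return np.mean(too_slow | too_aggressive)

gains = np.linspace(0.5, 6.0, 56)
probs = [p_violation(k) for k in gains]
best = gains[int(np.argmin(probs))]
print(f"best gain k = {best:.2f}, estimated violation probability = {min(probs):.4f}")
```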
Probabilistic sizing of laminates with uncertainties
NASA Technical Reports Server (NTRS)
Shah, A. R.; Liaw, D. G.; Chamis, C. C.
1993-01-01
A reliability based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of constituent materials (fiber and matrix) to predict macroscopic behavior are simulated using probabilistic theory. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load and environment dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into a computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). Versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for design structural reliability of random type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000, to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
Modeling and Control of State-Affine Probabilistic Systems for Atomic-Scale Dynamics
2007-06-01
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This report is the final report covering all the work done on this project. The goal of the project is the transfer of methodologies that improve the design process. The specific objectives are: 1. to learn and understand probabilistic design analysis using NESSUS; 2. to assign design projects on the application of NESSUS to undergraduate or graduate students; 3. to integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula; 4. to develop courseware in probabilistic design methodology to be included in a graduate-level Design Methodology course; and 5. to study the relationship between probabilistic design methodology and axiomatic design methodology.
Reasoning in Reference Games: Individual- vs. Population-Level Probabilistic Modeling
Franke, Michael; Degen, Judith
2016-01-01
Recent advances in probabilistic pragmatics have achieved considerable success in modeling speakers’ and listeners’ pragmatic reasoning as probabilistic inference. However, these models are usually applied to population-level data, and so implicitly suggest a homogeneous population without individual differences. Here we investigate potential individual differences in Theory-of-Mind related depth of pragmatic reasoning in so-called reference games that require drawing ad hoc Quantity implicatures of varying complexity. We show by Bayesian model comparison that a model that assumes a heterogenous population is a better predictor of our data, especially for comprehension. We discuss the implications for the treatment of individual differences in probabilistic models of language use. PMID:27149675
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method, at the component level, and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the cord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using software packages under development, such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
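The response surface method mentioned above can be summarized as: fit a cheap polynomial surrogate to a handful of expensive deterministic analyses, then run the Monte Carlo on the surrogate. The sketch below does exactly that with a stand-in stress function in place of an FEA run; it is an illustration of the technique, not the PRODAF or NESSUS interface.

```python
# Minimal sketch of the response-surface idea: quadratic surrogate fitted to a
# small design of experiments, then Monte Carlo on the surrogate.
import numpy as np

rng = np.random.default_rng(0)

def expensive_stress(cord, load):          # stand-in for a deterministic FEA run
    return 200.0 + 35.0 * cord + 0.8 * load + 0.4 * cord * load

cords = np.array([0.9, 1.0, 1.1])          # small design of experiments
loads = np.array([80.0, 100.0, 120.0])
X, y = [], []
for c in cords:
    for L in loads:
        X.append([1.0, c, L, c * L, c**2, L**2])
        y.append(expensive_stress(c, L))
coef, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)

def surrogate(c, L):
    return np.column_stack([np.ones_like(c), c, L, c * L, c**2, L**2]) @ coef

c = rng.normal(1.0, 0.02, 100_000)         # geometric variation (e.g., cord length)
L = rng.normal(100.0, 10.0, 100_000)       # load variation
stress = surrogate(c, L)
print("P(stress > 370):", np.mean(stress > 370.0))
```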
VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data
Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel
2014-01-01
This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198
Zhang, Kejiang; Achari, Gopal; Li, Hua
2009-11-03
Traditionally, uncertainty in parameters is represented as probability distributions and incorporated into groundwater flow and contaminant transport models. With the advent of newer uncertainty theories, it is now understood that stochastic methods cannot properly represent non-random uncertainties. In the groundwater flow and contaminant transport equations, uncertainty in some parameters may be random, whereas that in others may be non-random. The objective of this paper is to develop a fuzzy-stochastic partial differential equation (FSPDE) model to simulate conditions where both random and non-random uncertainties are involved in groundwater flow and solute transport. Three potential solution techniques are proposed and compared: (a) transforming a probability distribution into a possibility distribution (Method I), in which case the FSPDE becomes a fuzzy partial differential equation (FPDE); (b) transforming a possibility distribution into a probability distribution (Method II), in which case the FSPDE becomes a stochastic partial differential equation (SPDE); and (c) combining Monte Carlo methods with FPDE solution techniques (Method III). The effects of these three methods on the predictive results are investigated using two case studies. The results show that the predictions obtained from Method II are a specific case of those obtained from Method I. When an exact probabilistic result is needed, Method II is suggested. As the loss or gain of information during a probability-possibility (or vice versa) transformation cannot be quantified, its influence on the predictive results is not known. Thus, Method III should probably be preferred for risk assessments.
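The probability-to-possibility direction (the transformation underlying Method I) has a standard discrete form, often attributed to Dubois and Prade: sort outcomes by decreasing probability and set each outcome's possibility to the sum of all probabilities not larger than its own. The sketch below applies it to a hypothetical discretized parameter; the values are illustrative and this is a textbook transform, not necessarily the exact variant used in the paper.

```python
# Minimal sketch of the discrete probability-to-possibility transformation.
import numpy as np

values = np.array([1e-6, 5e-6, 1e-5, 5e-5, 1e-4])   # e.g. hydraulic conductivity bins (m/s)
probs = np.array([0.10, 0.25, 0.40, 0.20, 0.05])    # discretized probability distribution

order = np.argsort(probs)[::-1]                     # sort outcomes by decreasing probability
p_sorted = probs[order]
pi_sorted = np.cumsum(p_sorted[::-1])[::-1]         # possibility = tail sum (largest gets 1.0)
possibility = np.empty_like(probs)
possibility[order] = pi_sorted

for v, p, pi in zip(values, probs, possibility):
    print(f"K = {v:.0e}  P = {p:.2f}  possibility = {pi:.2f}")
```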
A closed-loop neurobotic system for fine touch sensing
NASA Astrophysics Data System (ADS)
Bologna, L. L.; Pinoteau, J.; Passot, J.-B.; Garrido, J. A.; Vogel, J.; Ros Vidal, E.; Arleo, A.
2013-08-01
Objective. Fine touch sensing relies on peripheral-to-central neurotransmission of somesthetic percepts, as well as on active motion policies shaping tactile exploration. This paper presents a novel neuroengineering framework for robotic applications based on the multistage processing of fine tactile information in the closed action-perception loop. Approach. The integrated system modules focus on (i) neural coding principles of spatiotemporal spiking patterns at the periphery of the somatosensory pathway, (ii) probabilistic decoding mechanisms mediating cortical-like tactile recognition and (iii) decision-making and low-level motor adaptation underlying active touch sensing. We probed the resulting neural architecture through a Braille reading task. Main results. Our results on the peripheral encoding of primary contact features are consistent with experimental data on human slow-adapting type I mechanoreceptors. They also suggest second-order processing by cuneate neurons may resolve perceptual ambiguities, contributing to a fast and highly performing online discrimination of Braille inputs by a downstream probabilistic decoder. The implemented multilevel adaptive control provides robustness to motion inaccuracy, while making the number of finger accelerations covariate with Braille character complexity. The resulting modulation of fingertip kinematics is coherent with that observed in human Braille readers. Significance. This work provides a basis for the design and implementation of modular neuromimetic systems for fine touch discrimination in robotics.
Mittmann, Nicole; Chan, Brian C; Craven, B Cathy; Isogai, Pierre K; Houghton, Pamela
2011-06-01
To evaluate the incremental cost-effectiveness of electrical stimulation (ES) plus standard wound care (SWC) as compared with SWC only in a spinal cord injury (SCI) population with grade III/IV pressure ulcers (PUs) from the public payer perspective. A decision analytic model was constructed for a 1-year time horizon to determine the incremental cost-effectiveness of ES plus SWC to SWC in a cohort of participants with SCI and grade III/IV PUs. Model inputs for clinical probabilities were based on published literature. Model inputs, namely clinical probabilities and direct health system and medical resources were based on a randomized controlled trial of ES plus SWC versus SWC. Costs (Can $) included outpatient (clinic, home care, health professional) and inpatient management (surgery, complications). One way and probabilistic sensitivity (1000 Monte Carlo iterations) analyses were conducted. The perspective of this analysis is from a Canadian public health system payer. Model target population was an SCI cohort with grade III/IV PUs. Not applicable. Incremental cost per PU healed. ES plus SWC were associated with better outcomes and lower costs. There was a 16.4% increase in the PUs healed and a cost savings of $224 at 1 year. ES plus SWC were thus considered a dominant economic comparator. Probabilistic sensitivity analysis resulted in economic dominance for ES plus SWC in 62%, with another 35% having incremental cost-effectiveness ratios of $50,000 or less per PU healed. The largest driver of the economic model was the percentage of PU healed with ES plus SWC. The addition of ES to SWC improved healing in grade III/IV PU and reduced costs in an SCI population. Copyright © 2011 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
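The probabilistic sensitivity analysis described above has a simple computational core: draw costs and effectiveness parameters from assumed distributions, form incremental cost and incremental effect per iteration, and summarize how often the intervention is dominant or below a willingness-to-pay threshold. The sketch below uses hypothetical distributions, not the trial's inputs.

```python
# Minimal sketch of a probabilistic sensitivity analysis for ES + SWC vs SWC only.
import numpy as np

rng = np.random.default_rng(0)
N = 1000                                          # Monte Carlo iterations

p_heal_swc = rng.beta(40, 60, N)                  # probability PU healed, SWC only
p_heal_es = rng.beta(56, 44, N)                   # probability PU healed, ES + SWC
cost_swc = rng.gamma(shape=50, scale=200, size=N) # Can$ per patient-year, hypothetical
cost_es = rng.gamma(shape=50, scale=195, size=N)  # slightly cheaper on average

d_cost = cost_es - cost_swc                       # incremental cost
d_effect = p_heal_es - p_heal_swc                 # incremental effect (PUs healed)

dominant = (d_cost < 0) & (d_effect > 0)          # cheaper and more effective
print(f"probability ES+SWC dominates: {dominant.mean():.0%}")

acceptable = dominant | ((d_effect > 0) & (d_cost / d_effect <= 50_000))
print(f"probability dominant or ICER <= Can$50,000 per PU healed: {acceptable.mean():.0%}")
```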
One-dimensional statistical parametric mapping in Python.
Pataky, Todd C
2012-01-01
Statistical parametric mapping (SPM) is a topological methodology for detecting field changes in smooth n-dimensional continua. Many classes of biomechanical data are smooth and contained within discrete bounds and as such are well suited to SPM analyses. The current paper accompanies release of 'SPM1D', a free and open-source Python package for conducting SPM analyses on a set of registered 1D curves. Three example applications are presented: (i) kinematics, (ii) ground reaction forces and (iii) contact pressure distribution in probabilistic finite element modelling. In addition to offering a high-level interface to a variety of common statistical tests like t tests, regression and ANOVA, SPM1D also emphasises fundamental concepts of SPM theory through stand-alone example scripts. Source code and documentation are available at: www.tpataky.net/spm1d/.
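For readers unfamiliar with the idea, the sketch below shows the essence of a 1D SPM-style analysis without using the spm1d API itself: a pointwise two-sample t-statistic along registered curves, with the field-wide false-positive rate controlled here by a permutation distribution of the field maximum (spm1d's parametric inference instead uses random field theory). The synthetic "ground reaction force" curves and group difference are assumptions for illustration.

```python
# Minimal sketch of pointwise t-statistics on registered 1D curves with a
# permutation-based field-wide threshold (not the spm1d interface).
import numpy as np

rng = np.random.default_rng(0)
nA, nB, nodes = 12, 12, 101
t_axis = np.linspace(0, 1, nodes)
A = np.sin(2 * np.pi * t_axis) + rng.normal(0, 0.4, (nA, nodes))                  # group A curves
B = np.sin(2 * np.pi * t_axis) + 0.3 * t_axis + rng.normal(0, 0.4, (nB, nodes))   # group B curves

def tstat(a, b):
    va, vb = a.var(axis=0, ddof=1), b.var(axis=0, ddof=1)
    sp = np.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb) / (len(a) + len(b) - 2))
    return (a.mean(axis=0) - b.mean(axis=0)) / (sp * np.sqrt(1 / len(a) + 1 / len(b)))

t_obs = tstat(A, B)
pooled = np.vstack([A, B])
max_null = []
for _ in range(2000):                              # permutation distribution of the field maximum
    perm = rng.permutation(len(pooled))
    max_null.append(np.abs(tstat(pooled[perm[:nA]], pooled[perm[nA:]])).max())
thresh = np.percentile(max_null, 95)
print("critical |t*| =", round(thresh, 2),
      "; suprathreshold nodes:", np.where(np.abs(t_obs) > thresh)[0])
```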
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czejdo, Bogdan; Bhattacharya, Sambit; Ferragut, Erik M
2012-01-01
This paper describes the syntax and semantics of multi-level state diagrams to support probabilistic behavior of cooperating robots. Techniques are presented for analyzing these diagrams by querying combined robot behaviors. It is shown how to use state abstraction and transition abstraction to create, verify and process large probabilistic state diagrams.
NASA Astrophysics Data System (ADS)
Yuan, J.; Kopp, R. E.
2017-12-01
Quantitative risk analysis of regional climate change is crucial for risk management and impact assessment of climate change. Two major challenges to assessing the risks of climate change are that (i) the CMIP5 model runs which drive the EURO-CORDEX downscaling runs do not cover the full range of uncertainty of future projections, and (ii) climate models may underestimate the probability of tail risks (i.e., extreme events). To overcome these difficulties, this study offers a viable avenue in which a probabilistic climate ensemble is generated using the Surrogate/Model Mixed Ensemble (SMME) method. The probabilistic ensembles for temperature and precipitation are used to assess the range of uncertainty covered by five bias-corrected simulations from the high-resolution (0.11º) EURO-CORDEX database, which were selected by the PESETA (Projection of Economic impacts of climate change in Sectors of the European Union based on bottom-up Analysis) III project. Results show that the distribution of the SMME ensemble is notably wider than both the distribution of the raw GCM ensemble and the spread of the five EURO-CORDEX runs under RCP8.5. Tail risks are well represented by the SMME ensemble. Both the SMME ensemble and the EURO-CORDEX projections are aggregated to the administrative level and integrated into the impact functions of PESETA III to assess climate risks in Europe. To further evaluate the uncertainties introduced by the downscaling process, we compare the five runs from EURO-CORDEX with runs from the corresponding GCMs. Time series of regional means, spatial patterns, and climate indices are examined for the future climate (2080-2099) relative to the present climate (1981-2010). The downscaling processes do not appear to be trend-preserving; e.g., the increase in regional mean temperature from EURO-CORDEX is slower than that from the corresponding GCM. The spatial pattern comparison reveals that the differences between each pair of GCM and EURO-CORDEX runs are small in winter. In summer, the temperatures of EURO-CORDEX are generally lower than those of the GCMs, while the drying trends in precipitation of EURO-CORDEX are smaller than those of the GCMs. Climate indices are significantly affected by the bias-correction and downscaling processes. Our study provides valuable information for selecting climate indices in different regions over Europe.
Probabilistic Estimates of Global Mean Sea Level and its Underlying Processes
NASA Astrophysics Data System (ADS)
Hay, C.; Morrow, E.; Kopp, R. E.; Mitrovica, J. X.
2015-12-01
Local sea level can vary significantly from the global mean value due to a suite of processes that includes ongoing sea-level changes due to the last ice age, land water storage, ocean circulation changes, and non-uniform sea-level changes that arise when modern-day land ice rapidly melts. Understanding these sources of spatial and temporal variability is critical to estimating past and present sea-level change and projecting future sea-level rise. Using two probabilistic techniques, a multi-model Kalman smoother and Gaussian process regression, we have reanalyzed 20th century tide gauge observations to produce a new estimate of global mean sea level (GMSL). Our methods allow us to extract global information from the sparse tide gauge field by taking advantage of the physics-based and model-derived geometry of the contributing processes. Both methods provide constraints on the sea-level contribution of glacial isostatic adjustment (GIA). The Kalman smoother tests multiple discrete models of glacial isostatic adjustment (GIA), probabilistically computing the most likely GIA model given the observations, while the Gaussian process regression characterizes the prior covariance structure of a suite of GIA models and then uses this structure to estimate the posterior distribution of local rates of GIA-induced sea-level change. We present the two methodologies, the model-derived geometries of the underlying processes, and our new probabilistic estimates of GMSL and GIA.
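The Gaussian process regression mentioned above reduces, computationally, to conditioning a prior covariance on noisy observations. The sketch below does this with a generic RBF kernel on synthetic "tide gauge" annual means; the paper's approach instead builds the prior covariance from physics-based process geometries (e.g., GIA models), which this toy example does not reproduce.

```python
# Minimal sketch of Gaussian process regression of a sea-level-like signal from
# noisy, irregularly sampled observations (synthetic data, plain RBF kernel).
import numpy as np

rng = np.random.default_rng(0)
t_obs = np.sort(rng.uniform(1900, 2010, 40))              # observation years
truth = 0.0017 * (t_obs - 1900)                           # ~1.7 mm/yr trend, in metres
y = truth + rng.normal(0, 0.01, t_obs.size)               # noisy annual means

def rbf(x1, x2, amp=0.05, ell=30.0):
    return amp**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / ell**2)

t_star = np.linspace(1900, 2010, 111)
K = rbf(t_obs, t_obs) + 0.01**2 * np.eye(t_obs.size)      # prior + observation noise
Ks = rbf(t_star, t_obs)
mean = Ks @ np.linalg.solve(K, y)                         # posterior mean
cov = rbf(t_star, t_star) - Ks @ np.linalg.solve(K, Ks.T)
sd = np.sqrt(np.clip(np.diag(cov), 0, None))              # posterior standard deviation
print("estimated rise 1900-2010: %.3f m (posterior sd at 2010: %.3f m)"
      % (mean[-1] - mean[0], sd[-1]))
```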
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
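To illustrate the contrast drawn above between probabilistic and non-probabilistic approaches, the sketch below applies Monte Carlo simulation and a simple interval (bounding) analysis to the standard ingestion-risk expression risk = C * IR * EF * ED * SF / (BW * AT). All concentrations, exposure factors and the slope factor are hypothetical placeholders, not regulatory values.

```python
# Minimal sketch: Monte Carlo vs. interval analysis for a lifetime ingestion risk.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
C = rng.lognormal(np.log(0.05), 0.4, N)     # DBP concentration, mg/L (hypothetical)
IR, EF, ED = 2.0, 350.0, 30.0               # L/day, days/yr, years (fixed here)
BW = rng.normal(70.0, 10.0, N)              # body weight, kg
AT = 70.0 * 365.0                           # averaging time, days
SF = 0.0079                                 # slope factor, (mg/kg-day)^-1 (assumed)

risk_mc = C * IR * EF * ED * SF / (BW * AT)
print("Monte Carlo 95th percentile risk: %.1e" % np.percentile(risk_mc, 95))

# interval analysis: propagate only lower/upper bounds of the uncertain inputs
C_lo, C_hi, BW_lo, BW_hi = 0.02, 0.15, 50.0, 95.0
risk_lo = C_lo * IR * EF * ED * SF / (BW_hi * AT)
risk_hi = C_hi * IR * EF * ED * SF / (BW_lo * AT)
print("interval bound: [%.1e, %.1e]" % (risk_lo, risk_hi))
```

The interval result is guaranteed to bracket the answer but carries no likelihood information, which is exactly the trade-off the review discusses.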
Orhan, A Emin; Ma, Wei Ji
2017-07-26
Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.
Comparison of probabilistic and deterministic fiber tracking of cranial nerves.
Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H
2017-09-01
OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods usable for the elimination of noise from the resulting depictions. The authors have hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors have adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided with this work between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CN. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking (p < 0.001; Wilcoxon signed-rank test). The false-positive error of the last obtained depiction was also significantly lower in probabilistic than in deterministic tracking (p < 0.001). The HCP data yielded significantly better results in terms of the Dice coefficient in probabilistic tracking (p < 0.001, Mann-Whitney U-test) and in deterministic tracking (p = 0.02). The false-positive errors were smaller in HCP data in deterministic tracking (p < 0.001) and showed a strong trend toward significance in probabilistic tracking (p = 0.06). In the clinical cases, the probabilistic method visualized 7 of 10 attempted CNs accurately, compared with 3 correct depictions with deterministic tracking. CONCLUSIONS High angular resolution DTI scans are preferable for the DTI-based depiction of the cranial nerves. Probabilistic tracking with a gradual PICo threshold increase is more effective for this task than the previously described deterministic tracking with a gradual FA threshold increase and might represent a method that is useful for depicting cranial nerves with DTI since it eliminates the erroneous fibers without manual intervention.
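The two evaluation quantities used above (Dice similarity coefficient and false-positive error) and the gradual-threshold-increase procedure are easy to state computationally. The sketch below applies them to a synthetic 1D stand-in for the tract maps, thresholding a generic visitation map rather than actual PICo or FA values; it illustrates the metrics, not the tractography itself.

```python
# Minimal sketch: Dice coefficient and false-positive error of a thresholded
# tract map against a ground-truth mask, as the threshold is gradually raised.
import numpy as np

rng = np.random.default_rng(0)
truth = np.zeros(1000, dtype=bool)
truth[400:500] = True                               # ground-truth nerve voxels
track_map = rng.uniform(0, 0.2, 1000)               # background (noise) values
track_map[390:510] += rng.uniform(0.2, 1.0, 120)    # tract plus some spillover

def dice_and_fpe(threshold):
    seg = track_map > threshold
    tp = np.sum(seg & truth)
    dice = 2 * tp / (seg.sum() + truth.sum()) if seg.any() else 0.0
    fpe = np.sum(seg & ~truth) / seg.sum() if seg.any() else 0.0
    return dice, fpe

for thr in (0.1, 0.2, 0.4, 0.6, 0.8):
    d, f = dice_and_fpe(thr)
    print(f"threshold {thr:.1f}: Dice {d:.2f}, false-positive error {f:.2f}")
```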
Expert judgments about RD&D and the future of nuclear energy.
Anadón, Laura D; Bosetti, Valentina; Bunn, Matthew; Catenacci, Michela; Lee, Audrey
2012-11-06
Probabilistic estimates of the cost and performance of future nuclear energy systems under different scenarios of government research, development, and demonstration (RD&D) spending were obtained from 30 U.S. and 30 European nuclear technology experts. We used a novel elicitation approach which combined individual and group elicitation. With no change from current RD&D funding levels, experts on average expected current (Gen. III/III+) designs to be somewhat more expensive in 2030 than they were in 2010, and they expected the next generation of designs (Gen. IV) to be more expensive still as of 2030. Projected costs of proposed small modular reactors (SMRs) were similar to those of Gen. IV systems. The experts almost unanimously recommended large increases in government support for nuclear RD&D (generally 2-3 times current spending). The majority expected that such RD&D would have only a modest effect on cost, but would improve performance in other areas, such as safety, waste management, and uranium resource utilization. The U.S. and E.U. experts were in relative agreement regarding how government RD&D funds should be allocated, placing particular focus on very high temperature reactors, sodium-cooled fast reactors, fuels and materials, and fuel cycle technologies.
Probabilistic models of cognition: conceptual foundations.
Chater, Nick; Tenenbaum, Joshua B; Yuille, Alan
2006-07-01
Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, 'sophisticated' probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of the approach, explore how the approach relates to studies of explicit probabilistic reasoning, and give a brief overview of the field as it stands today.
A New Scheme for Probabilistic Teleportation and Its Potential Applications
NASA Astrophysics Data System (ADS)
Wei, Jia-Hua; Dai, Hong-Yi; Zhang, Ming
2013-12-01
We propose a novel scheme to probabilistically teleport an unknown two-level quantum state when the information about the partially entangled state is available only to the sender. This contrasts with previous typical teleportation schemes, in which the receiver must know the non-maximally entangled state. Additionally, we illustrate two potential applications of the novel scheme for probabilistic teleportation from a sender to a receiver with the help of an assistant, who plays distinct roles under different communication conditions, and our results show that the novel proposal could enlarge the range of application of probabilistic teleportation.
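For context, the textbook result for probabilistic (conclusive) teleportation of a qubit through a partially entangled pure channel is summarized below; it is a standard reference point, not a result specific to the scheme above.

```latex
% Standard success probability for faithful probabilistic teleportation of a
% qubit through a partially entangled pure channel (textbook result).
\[
  |\Phi\rangle_{\mathrm{ch}} = a\,|00\rangle + b\,|11\rangle,
  \qquad |a|^{2} + |b|^{2} = 1, \quad |b| \le |a|,
\]
\[
  P_{\mathrm{succ}} = 2\,|b|^{2},
\]
% so the protocol is deterministic (P_succ = 1) only for the maximally
% entangled channel |a| = |b| = 1/sqrt(2).
```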
NASA Astrophysics Data System (ADS)
Haven, Emmanuel; Khrennikov, Andrei
2013-01-01
Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.
Probabilistic description of probable maximum precipitation
NASA Astrophysics Data System (ADS)
Ben Alaya, Mohamed Ali; Zwiers, Francis W.; Zhang, Xuebin
2017-04-01
Probable Maximum Precipitation (PMP) is the key parameter used to estimate the Probable Maximum Flood (PMF). PMP and PMF are important for dam safety and civil engineering purposes. Although current knowledge of storm mechanisms remains insufficient to properly evaluate limiting values of extreme precipitation, PMP estimation methods are still based on deterministic considerations and give only single values. This study aims to provide a probabilistic description of the PMP based on the commonly used method, the so-called moisture maximization. To this end, a probabilistic bivariate extreme values model is proposed to address the limitations of traditional PMP estimates via moisture maximization, namely: (i) the inability to evaluate uncertainty and to provide a range of PMP values, (ii) the interpretation of the maximum of a data series as a physical upper limit, and (iii) the assumption that a PMP event has maximum moisture availability. Results from simulation outputs of the Canadian Regional Climate Model CanRCM4 over North America reveal the high uncertainties inherent in PMP estimates and the non-validity of the assumption that PMP events have maximum moisture availability. This latter assumption leads to overestimation of the PMP by an average of about 15% over North America, which may have serious implications for engineering design.
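As an illustration of how the moisture-maximization product can be treated probabilistically, the sketch below fits GEV distributions to synthetic annual maxima of storm precipitation efficiency and precipitable water and samples their product. It is only a minimal stand-in for the paper's approach: the data are hypothetical, the two variables are sampled independently (whereas the study's bivariate extreme value model captures their dependence), and all names are invented for the example.

```python
# Illustrative sketch (not the authors' code): a probabilistic analogue of
# moisture maximization, PMP-like value ~ PE * PW, where PE is a storm
# precipitation efficiency and PW is precipitable water. Both are treated as
# random variables fitted to annual maxima instead of fixed record maxima.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical annual-maximum series (stand-ins for CanRCM4-derived data).
pe_annual_max = rng.gamma(shape=8.0, scale=5.0, size=60)    # mm per unit PW
pw_annual_max = rng.gamma(shape=20.0, scale=1.5, size=60)   # mm

# Fit GEV distributions to each series (scipy's genextreme).
pe_gev = stats.genextreme.fit(pe_annual_max)
pw_gev = stats.genextreme.fit(pw_annual_max)

# Monte Carlo: sample PE and PW independently and form the product,
# yielding a distribution of "PMP-like" values rather than a single number.
n = 100_000
pe_s = stats.genextreme.rvs(*pe_gev, size=n, random_state=rng)
pw_s = stats.genextreme.rvs(*pw_gev, size=n, random_state=rng)
pmp_like = pe_s * pw_s

for q in (0.90, 0.99, 0.999):
    print(f"{q:.3f} quantile of PMP-like distribution: {np.quantile(pmp_like, q):.1f} mm")
```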
Reasoning about Probabilistic Security Using Task-PIOAs
NASA Astrophysics Data System (ADS)
Jaggard, Aaron D.; Meadows, Catherine; Mislove, Michael; Segala, Roberto
Task-structured probabilistic input/output automata (Task-PIOAs) are concurrent probabilistic automata that, among other things, have been used to provide a formal framework for the universal composability paradigms of protocol security. One of their advantages is that they allow one to distinguish high-level nondeterminism that can affect the outcome of the protocol from low-level choices, which cannot. We present an alternative approach to analyzing the structure of Task-PIOAs that relies on ordered sets. We focus on two of the components that are required to define and apply Task-PIOAs: discrete probability theory and automata theory. We believe our development gives insight into the structure of Task-PIOAs and how they can be utilized to model crypto-protocols. We illustrate our approach with an example from anonymity, an area that has not previously been addressed using Task-PIOAs. We model Chaum's Dining Cryptographers Protocol at a level that does not require cryptographic primitives in the analysis. We show via this example how our approach can leverage a proof of security in the case where a principal behaves deterministically to prove security when that principal behaves probabilistically.
System Risk Assessment and Allocation in Conceptual Design
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)
2003-01-01
As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools that synthesize multidisciplinary integration, probabilistic analysis, and optimization are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high-fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.
Probabilistic brains: knowns and unknowns
Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E
2015-01-01
There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561
Impact of refining the assessment of dietary exposure to cadmium in the European adult population.
Ferrari, Pietro; Arcella, Davide; Heraud, Fanny; Cappé, Stefano; Fabiansson, Stefan
2013-01-01
Exposure assessment constitutes an important step in any risk assessment of potentially harmful substances present in food. The European Food Safety Authority (EFSA) first assessed dietary exposure to cadmium in Europe using a deterministic framework, resulting in mean values of exposure in the range of health-based guidance values. Since then, the characterisation of foods has been refined to better match occurrence and consumption data, and a new strategy to handle left-censoring in occurrence data was devised. A probabilistic assessment was performed and compared with deterministic estimates, using occurrence values at the European level and consumption data from 14 national dietary surveys. Mean estimates in the probabilistic assessment ranged from 1.38 (95% CI = 1.35-1.44) to 2.08 (1.99-2.23) µg kg⁻¹ bodyweight (bw) week⁻¹ across the different surveys, which were less than 10% lower than deterministic (middle bound) mean values that ranged from 1.50 to 2.20 µg kg⁻¹ bw week⁻¹. Probabilistic 95th percentile estimates of dietary exposure ranged from 2.65 (2.57-2.72) to 4.99 (4.62-5.38) µg kg⁻¹ bw week⁻¹, which were, with the exception of one survey, between 3% and 17% higher than middle-bound deterministic estimates. Overall, the proportion of subjects exceeding the tolerable weekly intake of 2.5 µg kg⁻¹ bw ranged from 14.8% (13.6-16.0%) to 31.2% (29.7-32.5%) according to the probabilistic assessment. The results of this work indicate that mean values of dietary exposure to cadmium in the European population were of similar magnitude using deterministic or probabilistic assessments. For higher exposure levels, probabilistic estimates were almost consistently larger than their deterministic counterparts, thus reflecting the impact of using the full distribution of occurrence values to determine exposure levels. It is considered prudent to use probabilistic methodology should exposure estimates be close to or exceed health-based guidance values.
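The kind of probabilistic exposure calculation described above can be sketched in a few lines: weekly exposure is the sum over foods of consumption times occurrence, divided by bodyweight, with left-censored occurrence values set to half the limit of detection (the middle-bound convention). All numbers below are hypothetical placeholders, not EFSA data.

```python
# Sketch of a probabilistic dietary exposure calculation (hypothetical data).
# Weekly exposure (ug/kg bw/week) = sum over foods of consumption * occurrence / bodyweight.
import numpy as np

rng = np.random.default_rng(1)
n_subjects = 10_000
n_foods = 5

# Hypothetical 7-day consumption per food (g/week) and bodyweight (kg).
consumption = rng.lognormal(mean=np.log(200), sigma=0.6, size=(n_subjects, n_foods))
bodyweight = rng.normal(70, 12, size=n_subjects).clip(40, 130)

# Hypothetical cadmium occurrence (ug/g). Left-censored results (< LOD) are
# replaced by LOD/2, i.e. the "middle bound" convention mentioned in the abstract.
lod = 0.01
measured = rng.lognormal(mean=np.log(0.008), sigma=0.8, size=(n_subjects, n_foods))
occurrence = np.where(measured < lod, lod / 2.0, measured)

weekly_exposure = (consumption * occurrence).sum(axis=1) / bodyweight

twi = 2.5  # tolerable weekly intake, ug/kg bw/week
print("mean exposure:", weekly_exposure.mean().round(2))
print("95th percentile:", np.percentile(weekly_exposure, 95).round(2))
print("share above TWI: {:.1%}".format((weekly_exposure > twi).mean()))
```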
2013-01-01
Background: Magnesium plays an essential role in the synthesis and metabolism of vitamin D and magnesium supplementation substantially reversed the resistance to vitamin D treatment in patients with magnesium-dependent vitamin-D-resistant rickets. We hypothesized that dietary magnesium alone, particularly its interaction with vitamin D intake, contributes to serum 25-hydroxyvitamin D (25(OH)D) levels, and the associations between serum 25(OH)D and risk of mortality may be modified by magnesium intake level. Methods: We tested these novel hypotheses utilizing data from the National Health and Nutrition Examination Survey (NHANES) 2001 to 2006, a population-based cross-sectional study, and the NHANES III cohort, a population-based cohort study. Serum 25(OH)D was used to define vitamin D status. Mortality outcomes in the NHANES III cohort were determined by using probabilistic linkage with the National Death Index (NDI). Results: High intake of total, dietary or supplemental magnesium was independently associated with significantly reduced risks of vitamin D deficiency and insufficiency respectively. Intake of magnesium significantly interacted with intake of vitamin D in relation to risk of both vitamin D deficiency and insufficiency. Additionally, the inverse association between total magnesium intake and vitamin D insufficiency primarily appeared among populations at high risk of vitamin D insufficiency. Furthermore, the associations of serum 25(OH)D with mortality, particularly due to cardiovascular disease (CVD) and colorectal cancer, were modified by magnesium intake, and the inverse associations were primarily present among those with magnesium intake above the median. Conclusions: Our preliminary findings indicate it is possible that magnesium intake alone or its interaction with vitamin D intake may contribute to vitamin D status. The associations between serum 25(OH)D and risk of mortality may be modified by the intake level of magnesium. Future studies, including cohort studies and clinical trials, are necessary to confirm the findings. PMID:23981518
Koenig, Agnès; Bügler, Jürgen; Kirsch, Dieter; Köhler, Fritz; Weyermann, Céline
2015-01-01
An ink dating method based on solvent analysis was recently developed using thermal desorption followed by gas chromatography/mass spectrometry (GC/MS) and is currently implemented in several forensic laboratories. The main aims of this work were to implement this method in a new laboratory and to evaluate whether results were comparable at three levels: (i) validation criteria, (ii) aging curves, and (iii) results interpretation. While the results were indeed comparable in terms of validation, the method proved to be very sensitive to instrument maintenance. Moreover, the aging curves were influenced by ink composition, as well as storage conditions (particularly when the samples were not stored in "normal" room conditions). Finally, as current interpretation models showed limitations, an alternative model based on slope calculation was proposed. However, in the future, a probabilistic approach may represent a better solution to deal with ink sample inhomogeneity. © 2014 American Academy of Forensic Science.
NASA Astrophysics Data System (ADS)
Hoffmann, K.; Srouji, R. G.; Hansen, S. O.
2017-12-01
The technology development within the structural design of long-span bridges in Norwegian fjords has created a need for reformulating the calculation format and the physical quantities used to describe the properties of wind and the associated wind-induced effects on bridge decks. Parts of a new probabilistic format describing the incoming, undisturbed wind are presented. It is expected that a fixed probabilistic format will facilitate a more physically consistent and precise description of the wind conditions, which in turn increases the accuracy and considerably reduces uncertainties in wind load assessments. Because the format is probabilistic, a quantification of the level of safety and uncertainty in predicted wind loads is readily accessible. A simple buffeting response calculation demonstrates the use of probabilistic wind data in the assessment of wind loads and responses. Furthermore, vortex-induced fatigue damage is discussed in relation to probabilistic wind turbulence data and response measurements from wind tunnel tests.
Probabilistic Structural Analysis Program
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
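The headline quantity such a code computes is a probability of failure for a limit-state function of random inputs. The sketch below estimates it with plain Monte Carlo for a toy strength-versus-stress limit state; it is not the advanced mean value or adaptive importance sampling algorithms used in NASA/NESSUS, and the distributions are hypothetical.

```python
# Minimal illustration of the quantity a probabilistic structural code estimates:
# probability of failure P[g(X) <= 0] for a limit-state function g.
# This plain Monte Carlo stand-in is NOT the advanced mean value or adaptive
# importance sampling algorithms implemented in NASA/NESSUS.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Hypothetical random inputs: material strength R and applied stress S.
R = rng.normal(350.0, 25.0, n)   # MPa
S = rng.normal(250.0, 40.0, n)   # MPa

g = R - S                        # failure when g <= 0
pf = np.mean(g <= 0.0)
se = np.sqrt(pf * (1 - pf) / n)  # Monte Carlo standard error

print(f"estimated probability of failure: {pf:.2e} +/- {se:.1e}")
```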
NASA Technical Reports Server (NTRS)
Price J. M.; Ortega, R.
1998-01-01
Probabilistic methods are not a universally accepted approach for the design and analysis of aerospace structures. The validity of this approach must be demonstrated to encourage its acceptance as a viable design and analysis tool to estimate structural reliability. The objective of this study is to develop a well characterized finite population of similar aerospace structures that can be used to (1) validate probabilistic codes, (2) demonstrate the basic principles behind probabilistic methods, (3) formulate general guidelines for characterization of material drivers (such as elastic modulus) when limited data are available, and (4) investigate how the drivers affect the results of sensitivity analysis at the component/failure mode level.
NASA Astrophysics Data System (ADS)
Rumetshofer, M.; Heim, P.; Thaler, B.; Ernst, W. E.; Koch, M.; von der Linden, W.
2018-06-01
Ultrafast dynamical processes in photoexcited molecules can be observed with pump-probe measurements, in which information about the dynamics is obtained from the transient signal associated with the excited state. Background signals provoked by pump and/or probe pulses alone often obscure these excited-state signals. Simple subtraction of pump-only and/or probe-only measurements from the pump-probe measurement, as commonly applied, results in a degradation of the signal-to-noise ratio and, in the case of coincidence detection, the danger of overestimated background subtraction. Coincidence measurements additionally suffer from false coincidences, requiring long data-acquisition times to keep erroneous signals at an acceptable level. Here we present a probabilistic approach based on Bayesian probability theory that overcomes these problems. For a pump-probe experiment with photoelectron-photoion coincidence detection, we reconstruct the excited-state spectrum of interest from pump-probe and pump-only measurements. This approach allows us to treat background and false coincidences consistently and on the same footing. We demonstrate that the Bayesian formalism has the following advantages over simple signal subtraction: (i) the signal-to-noise ratio is significantly increased, (ii) the pump-only contribution is not overestimated, (iii) false coincidences are excluded, (iv) prior knowledge, such as positivity, is consistently incorporated, (v) confidence intervals are provided for the reconstructed spectrum, and (vi) it is applicable to any experimental situation and noise statistics. Most importantly, by accounting for false coincidences, the Bayesian approach allows us to run experiments at higher ionization rates, resulting in a significant reduction of data acquisition times. The probabilistic approach is thoroughly scrutinized with challenging mock data. The application to pump-probe coincidence measurements on acetone molecules enables quantitative interpretations about the molecular decay dynamics and fragmentation behavior. All results underline the superiority of a consistent probabilistic approach over ad hoc estimations.
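A stripped-down, single-bin version of the idea can be written with a Poisson counting model: the pump-probe counts contain signal plus background, the pump-only counts contain background alone, and a posterior for the signal (with positivity enforced by the prior support) replaces simple subtraction. This sketch uses flat priors on a grid and ignores the coincidence and false-coincidence structure handled by the full formalism; the counts are hypothetical.

```python
# Simplified single-bin version of Bayesian background handling in a pump-probe
# experiment (not the authors' full photoelectron-photoion coincidence model).
# Model: N_pp ~ Poisson(s + b), N_pump_only ~ Poisson(b), with s >= 0 enforced
# by the prior support. Simple subtraction would give N_pp - N_pump_only, which
# can be negative and has inflated variance.
import numpy as np
from scipy import stats

N_pp, N_po = 42, 35                      # hypothetical counts in one energy bin

s_grid = np.linspace(0.0, 60.0, 1201)    # candidate signal rates, s >= 0 (flat prior)
b_grid = np.linspace(0.1, 80.0, 1601)    # candidate background rates (flat prior)
ds = s_grid[1] - s_grid[0]

S, B = np.meshgrid(s_grid, b_grid, indexing="ij")
log_post = (stats.poisson.logpmf(N_pp, S + B)   # pump-probe likelihood
            + stats.poisson.logpmf(N_po, B))    # pump-only likelihood
post = np.exp(log_post - log_post.max())
post_s = post.sum(axis=1)                       # marginalize out the background
post_s /= post_s.sum() * ds

mean_s = (s_grid * post_s).sum() * ds
cdf = np.cumsum(post_s) * ds
lo, hi = np.interp([0.025, 0.975], cdf, s_grid)

print(f"posterior mean signal: {mean_s:.1f} counts")
print(f"95% credible interval: [{lo:.1f}, {hi:.1f}]")
print("naive subtraction:", N_pp - N_po)
```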
Probabilistic Learning by Rodent Grid Cells
Cheung, Allen
2016-01-01
Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition, but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite this being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population readout of a set of probabilistic spatial computations. PMID:27792723
NASA Astrophysics Data System (ADS)
Selva, Jacopo; Lorito, Stefano; Basili, Roberto; Tonini, Roberto; Tiberti, Mara Monica; Romano, Fabrizio; Perfetti, Paolo; Volpe, Manuela
2017-04-01
Most SPTHA studies and applications rely on several working assumptions: i) the - mostly offshore - tsunamigenic faults are sufficiently well known; ii) subduction zone earthquakes dominate the hazard; iii) and their location and geometry are sufficiently well constrained. Hence, a probabilistic model is constructed as regards the magnitude-frequency distribution and sometimes the slip distribution of earthquakes occurring on assumed known faults. Then, tsunami scenarios are usually constructed for all earthquake locations, sizes, and slip distributions included in the probabilistic model, through deterministic numerical modelling of tsunami generation, propagation and impact on realistic bathymetries. Here, we adopt a different approach (Selva et al., GJI, 2016) that releases some of the above assumptions, considering i) that non-subduction earthquakes may also contribute significantly to SPTHA, depending on the local tectonic context; ii) that not all the offshore faults are known or sufficiently well constrained; iii) and that the faulting mechanism of future earthquakes cannot be considered strictly predictable. This approach uses as much information as possible from known faults which, depending on the amount of available information and on the local tectonic complexity, among other things, are either modelled as Predominant Seismicity (PS) or as Background Seismicity (BS). PS is used when it is possible to assume a sufficiently known geometry and mechanism (e.g. for the main subduction zones). Conversely, within the BS approach, information on faults is merged with that on past seismicity, dominant stress regime, and tectonic characterisation, to determine a probability density function for the faulting mechanism. To illustrate the methodology and its impact on the hazard estimates, we present an application in the NEAM region (Northeast Atlantic, Mediterranean and connected seas), initially designed during the ASTARTE project and now applied for the regional-scale SPTHA in the TSUMAPS-NEAM project funded by DG-ECHO.
NASA Astrophysics Data System (ADS)
Setiawan, R.
2018-03-01
In this paper, the Economic Order Quantity (EOQ) of a probabilistic two-level supply-chain system for items with imperfect quality is analyzed under a service level constraint. A firm applies an active service level constraint to avoid unpredictable shortage terms in the objective function. Mathematical analysis of the optimal result is delivered using two equilibrium concepts from game theory: Stackelberg equilibrium for the cooperative strategy and Stackelberg equilibrium for the noncooperative strategy. This is a new approach to game-theoretic results in inventory systems in which a service level constraint is applied by a firm in its moves.
Constructing Sample Space with Combinatorial Reasoning: A Mixed Methods Study
ERIC Educational Resources Information Center
McGalliard, William A., III.
2012-01-01
Recent curricular developments suggest that students at all levels need to be statistically literate and able to efficiently and accurately make probabilistic decisions. Furthermore, statistical literacy is a requirement to being a well-informed citizen of society. Research also recognizes that the ability to reason probabilistically is supported…
Paladino, Ombretta; Seyedsalehi, Mahdi; Massabò, Marco
2018-09-01
The use of fertilizers in greenhouse-grown crops can pose a threat to groundwater quality and, consequently, to human beings and the subterranean ecosystem, where intensive farming produces pollutant leaching. The Albenga plain (Liguria, Italy) is an alluvial area of about 45 km² historically devoted to farming. Recently the crops have evolved to greenhouse horticulture and floriculture production. High levels of nitrates have been detected in the area's groundwater. Lysimeters with three types of reconstituted soils (loamy sand, sandy clay loam and sandy loam) collected from different areas of the Albenga plain were used in this study to evaluate the leaching loss of nitrate (NO3-) over a period of 12 weeks. Leaf lettuce (Lactuca sativa L.) was selected as a representative greenhouse-grown crop. Each of the soil samples was treated with a slow-release fertilizer, simulating the real fertilizing strategy of the tillage. In order to estimate the potential risk for aquifers as well as for organisms exposed via pore water, nitrate concentrations in groundwater were evaluated by applying a simplified attenuation model to the experimental data. Results were refined and extended from the comparison of single effect and exposure values (Tier I level) up to the evaluation of probabilistic distributions of exposure and related effects (Tier II, III and IV levels). HHRA suggested HI > 1 and about 20% probability of exceeding the RfD for all the greenhouses, regardless of the soil. ERA suggested HQ > 100 for all the greenhouses and a 93% probability of PNEC exceedance for greenhouses containing sandy clay loam. The probability of exceeding the LC50 for 5% of the species was about 40%, and the probability corresponding to a DBQ of DEC/EC50 > 0.001 was > 90% for all the greenhouses. The significantly high risk, related to the detected nitrate leaching loss, can be attributed to excessive and inappropriate fertigation strategies. Copyright © 2018 Elsevier B.V. All rights reserved.
DISCOUNTING OF DELAYED AND PROBABILISTIC LOSSES OVER A WIDE RANGE OF AMOUNTS
Green, Leonard; Myerson, Joel; Oliveira, Luís; Chang, Seo Eun
2014-01-01
The present study examined delay and probability discounting of hypothetical monetary losses over a wide range of amounts (from $20 to $500,000) in order to determine how amount affects the parameters of the hyperboloid discounting function. In separate conditions, college students chose between immediate payments and larger, delayed payments and between certain payments and larger, probabilistic payments. The hyperboloid function accurately described both types of discounting, and amount of loss had little or no systematic effect on the degree of discounting. Importantly, the amount of loss also had little systematic effect on either the rate parameter or the exponent of the delay and probability discounting functions. The finding that the parameters of the hyperboloid function remain relatively constant across a wide range of amounts of delayed and probabilistic loss stands in contrast to the robust amount effects observed with delayed and probabilistic rewards. At the individual level, the degree to which delayed losses were discounted was uncorrelated with the degree to which probabilistic losses were discounted, and delay and probability loaded on two separate factors, similar to what is observed with delayed and probabilistic rewards. Taken together, these findings argue that although delay and probability discounting involve fundamentally different decision-making mechanisms, the discounting of delayed and probabilistic losses nevertheless shares an insensitivity to amount that distinguishes it from the discounting of delayed and probabilistic gains. PMID:24745086
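For readers unfamiliar with the functional form, the hyperboloid discounting function used in this literature is typically written V = A / (1 + bX)^s, where A is the nominal amount, X the delay (or odds against receipt), b the rate parameter and s the exponent discussed above. The sketch below fits it to hypothetical indifference points; the data and starting values are invented for illustration.

```python
# Sketch: fitting the hyperboloid discounting function V = A / (1 + b*X)**s
# to hypothetical indifference points (X = delay, or odds against receipt for
# probabilistic outcomes). Not the authors' data or fitting code.
import numpy as np
from scipy.optimize import curve_fit

A = 1000.0                                          # nominal amount
delays = np.array([1, 7, 30, 90, 180, 365])         # days
indiff = np.array([930, 860, 720, 560, 430, 300])   # hypothetical indifference values

def hyperboloid(X, b, s):
    return A / (1.0 + b * X) ** s

(b_hat, s_hat), _ = curve_fit(hyperboloid, delays, indiff, p0=[0.01, 1.0])
print(f"rate parameter b = {b_hat:.4f}, exponent s = {s_hat:.3f}")
```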
Probabilistic Teleportation of One-Particle State of S-level
NASA Astrophysics Data System (ADS)
Yan, Feng-Li; Bai, Yan-Kui
2003-09-01
A scheme for probabilistically teleporting an unknown one-particle S-level state via a group of pairs of partially entangled 2-level particle states is proposed. In this scheme unitary transformations and local measurements take the place of the Bell state measurement; then a proper unitary transformation and a measurement on an auxiliary qubit, with the aid of classical communication, are performed. In this way the unknown one-particle S-level state can be transferred onto a group of remote 2-level particles with a certain probability. Furthermore, the receiver can recover the initial signal state on an S-level particle at hand. This project was supported by the Natural Science Foundation of Hebei Province of China.
An investigation into the probabilistic combination of quasi-static and random accelerations
NASA Technical Reports Server (NTRS)
Schock, R. W.; Tuell, L. P.
1984-01-01
The development of design load factors for aerospace and aircraft components and experiment support structures, which are subject to simultaneous vehicle dynamic vibration (quasi-static) and acoustically generated random vibration, requires the selection of a combination methodology. Typically, the procedure is to define the quasi-static and the randomly generated responses separately, and to arithmetically add or root-sum-square them to get combined accelerations. Since the combination of a probabilistic and a deterministic function yields a probabilistic function, a viable alternative approach is to determine the characteristics of the combined acceleration probability density function and select an appropriate percentile level for the combined acceleration. The following paper develops this mechanism and provides graphical data for selecting combined accelerations at most popular percentile levels.
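The percentile-selection idea can be illustrated with a deliberately simplified model in which the quasi-static acceleration is a fixed offset and the random-vibration response is zero-mean Gaussian; the combined load factor is then read from the shifted distribution at a chosen percentile and compared with the arithmetic-sum and root-sum-square rules. The numbers and the Gaussian assumption are illustrative only, not the paper's derivation.

```python
# Sketch of the percentile-combination idea: the quasi-static acceleration is
# deterministic, the random-vibration response is modeled here as zero-mean
# Gaussian (a simplifying assumption), and the combined load factor is read
# off the resulting distribution at a chosen percentile.
import numpy as np
from scipy import stats

a_qs = 6.0        # quasi-static acceleration, g (hypothetical)
sigma_rv = 2.5    # 1-sigma random vibration response, g (hypothetical)

combined = stats.norm(loc=a_qs, scale=sigma_rv)

for p in (0.95, 0.9973):                      # popular percentile levels
    print(f"{p:.4f} percentile combined acceleration: {combined.ppf(p):.2f} g")

# Compare with the two ad hoc rules mentioned in the abstract:
print("arithmetic sum  (a_qs + 3*sigma):", a_qs + 3 * sigma_rv)
print("root-sum-square (sqrt(a_qs^2 + (3*sigma)^2)):",
      np.hypot(a_qs, 3 * sigma_rv).round(2))
```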
Sensor Analytics: Radioactive gas Concentration Estimation and Error Propagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Dale N.; Fagan, Deborah K.; Suarez, Reynold
2007-04-15
This paper develops the mathematical statistics of a radioactive gas quantity measurement and the associated error propagation. The probabilistic development is a different approach to deriving attenuation equations and offers easy extensions to more complex gas analysis components through simulation. The mathematical development assumes a sequential process of three components: (I) the collection of an environmental sample, (II) component gas extraction from the sample through the application of gas separation chemistry, and (III) the estimation of the radioactivity of the component gases.
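The paper derives the measurement statistics analytically; a simulation-based counterpart of the same three-stage structure is sketched below, with hypothetical collection, separation and detection efficiencies and Poisson counting statistics. It illustrates the kind of error propagation the abstract refers to rather than reproducing its equations.

```python
# Monte Carlo sketch of error propagation through a three-stage measurement:
# (I) sample collection, (II) gas separation chemistry, (III) activity counting.
# Efficiencies and uncertainties are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

true_conc = 1.0e3                      # true activity concentration (arbitrary units)
vol = rng.normal(1.00, 0.02, n)        # sampled volume, with metering uncertainty
eff_collect = rng.normal(0.95, 0.02, n).clip(0, 1)   # collection efficiency
eff_separate = rng.normal(0.80, 0.05, n).clip(0, 1)  # separation yield
eff_detect = rng.normal(0.30, 0.01, n).clip(0, 1)    # detector efficiency

expected_counts = true_conc * vol * eff_collect * eff_separate * eff_detect
counts = rng.poisson(expected_counts)  # counting (Poisson) statistics

# Invert the measurement equation sample-by-sample to get the estimated concentration.
est_conc = counts / (vol * eff_collect * eff_separate * eff_detect)

print("relative bias: {:.2%}".format(est_conc.mean() / true_conc - 1))
print("relative std:  {:.2%}".format(est_conc.std() / true_conc))
```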
Application of cause-and-effect analysis to potentiometric titration.
Kufelnicki, A; Lis, S; Meinrath, G
2005-08-01
A first attempt has been made to interpret physicochemical data from potentiometric titration analysis in accordance with the complete measurement-uncertainty budget approach (bottom-up) of ISO and Eurachem. A cause-and-effect diagram is established and discussed. Titration data for arsenazo III are used as a basis for this discussion. The commercial software Superquad is used and applied within a computer-intensive resampling framework. The cause-and-effect diagram is applied to the evaluation of seven protonation constants of arsenazo III in the pH range 2-10.7. The data interpretation is based on empirical probability distributions and their analysis by second-order correct confidence estimates. The evaluated data are applied in the calculation of a speciation diagram, including uncertainty estimates, using the probabilistic speciation software Ljungskile.
Nonlinear probabilistic finite element models of laminated composite shells
NASA Technical Reports Server (NTRS)
Engelstad, S. P.; Reddy, J. N.
1993-01-01
A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of the mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation from the macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data are compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
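The first-order second-moment technique itself is compact: the response mean is approximated by the response at the mean inputs, and the response variance by the gradient-weighted sum of input variances. The sketch below applies it to a cheap placeholder function standing in for the shell finite element model, with invented means and standard deviations and independence assumed.

```python
# First-order second-moment (FOSM) sketch: approximate the mean and variance of
# a response g(x) from its gradient at the mean of the random inputs.
# Here g is a cheap placeholder standing in for a nonlinear shell FE response.
import numpy as np

def g(x):
    E1, E2, t = x                            # hypothetical ply moduli and thickness
    return E1 * t**3 / 12.0 + 0.1 * E2 * t   # placeholder "stiffness" response

mu = np.array([140e9, 10e9, 1.2e-3])       # mean values of the random variables
sigma = np.array([14e9, 1.5e9, 0.05e-3])   # standard deviations (assumed independent)

# Finite-difference gradient at the mean.
grad = np.zeros_like(mu)
h = 1e-6 * mu
for i in range(len(mu)):
    xp, xm = mu.copy(), mu.copy()
    xp[i] += h[i]
    xm[i] -= h[i]
    grad[i] = (g(xp) - g(xm)) / (2 * h[i])

mean_g = g(mu)                               # first-order mean
var_g = np.sum((grad * sigma) ** 2)          # second-moment variance (independent inputs)
print(f"mean response: {mean_g:.3e}")
print(f"c.o.v. of response: {np.sqrt(var_g) / mean_g:.2%}")
```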
Dynamic Probabilistic Instability of Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2009-01-01
A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load and the loading rate are the dominant uncertainties, in that order.
Huisman, J.A.; Breuer, L.; Bormann, H.; Bronstert, A.; Croke, B.F.W.; Frede, H.-G.; Graff, T.; Hubrechts, L.; Jakeman, A.J.; Kite, G.; Lanini, J.; Leavesley, G.; Lettenmaier, D.P.; Lindstrom, G.; Seibert, J.; Sivapalan, M.; Viney, N.R.; Willems, P.
2009-01-01
An ensemble of 10 hydrological models was applied to the same set of land use change scenarios. There was general agreement about the direction of changes in the mean annual discharge and 90% discharge percentile predicted by the ensemble members, although a considerable range in the magnitude of predictions for the scenarios and catchments under consideration was obvious. Differences in the magnitude of the increase were attributed to the different mean annual actual evapotranspiration rates for each land use type. The ensemble of model runs was further analyzed with deterministic and probabilistic ensemble methods. The deterministic ensemble method based on a trimmed mean resulted in a single somewhat more reliable scenario prediction. The probabilistic reliability ensemble averaging (REA) method allowed a quantification of the model structure uncertainty in the scenario predictions. It was concluded that the use of a model ensemble has greatly increased our confidence in the reliability of the model predictions. © 2008 Elsevier Ltd.
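The deterministic trimmed-mean combination mentioned above is straightforward to reproduce; the sketch below applies it to hypothetical per-model changes in mean annual discharge for one scenario, showing how trimming damps the influence of an outlying ensemble member.

```python
# Sketch of the deterministic "trimmed mean" ensemble combination mentioned in
# the abstract, applied to hypothetical per-model changes in mean annual
# discharge (%) for one land use scenario.
import numpy as np
from scipy import stats

changes = np.array([4.1, 5.8, 3.2, 6.4, 5.1, 2.7, 7.9, 4.6, 12.3, 5.5])  # 10 models

plain_mean = changes.mean()
trimmed = stats.trim_mean(changes, proportiontocut=0.1)  # drop top/bottom 10%

print(f"plain ensemble mean:   {plain_mean:.2f} %")
print(f"trimmed ensemble mean: {trimmed:.2f} %  (less sensitive to the outlier)")
```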
NASA Astrophysics Data System (ADS)
Yan, Wang-Ji; Ren, Wei-Xin
2018-01-01
This study applies the theoretical findings on the circularly-symmetric complex normal ratio distribution of Yan and Ren (2016) [1,2] to transmissibility-based modal analysis from a statistical viewpoint. A probabilistic model of the transmissibility function in the vicinity of the resonant frequency is formulated in the modal domain, and some insightful comments are offered. It theoretically reveals that the statistics of the transmissibility function around the resonant frequency are solely dependent on the 'noise-to-signal' ratio and the mode shapes. As a sequel to the development of the probabilistic model of the transmissibility function in the modal domain, this study poses the process of modal identification in the context of a Bayesian framework by borrowing a novel paradigm. Implementation issues unique to the proposed approach are resolved by a Lagrange multiplier approach. This study also explores the possibility of applying Bayesian analysis to distinguishing harmonic components from structural ones. The approaches are verified through simulated data and experimental test data. The uncertainty behavior due to variation of different factors is also discussed in detail.
Probabilistic composite micromechanics
NASA Technical Reports Server (NTRS)
Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.
1988-01-01
Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
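As a flavor of what such a Monte Carlo micromechanics simulation does, the sketch below propagates assumed scatter in fiber modulus, matrix modulus and fiber volume ratio through the longitudinal rule of mixtures. The rule of mixtures stands in for the fuller set of micromechanics equations used in the actual codes, and the input statistics are hypothetical.

```python
# Monte Carlo sketch of micromechanics uncertainty propagation: scatter in fiber
# modulus, matrix modulus and fiber volume ratio propagated through the
# longitudinal rule of mixtures E11 = Vf*Ef + (1 - Vf)*Em. (The actual codes use
# a fuller set of micromechanics equations; values below are hypothetical.)
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

Ef = rng.normal(230e9, 12e9, n)        # graphite fiber modulus, Pa
Em = rng.normal(3.5e9, 0.35e9, n)      # epoxy matrix modulus, Pa
Vf = rng.normal(0.60, 0.03, n).clip(0.4, 0.75)

E11 = Vf * Ef + (1.0 - Vf) * Em

print(f"mean E11: {E11.mean()/1e9:.1f} GPa")
print(f"c.o.v.:   {E11.std()/E11.mean():.2%}")
print("5th-95th percentile: {:.1f} - {:.1f} GPa".format(
    np.percentile(E11, 5)/1e9, np.percentile(E11, 95)/1e9))
```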
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Thio, H. K.; Løvholt, F.; Harbitz, C. B.; Polet, J.; Lorito, S.; Basili, R.; Volpe, M.; Romano, F.; Selva, J.; Piatanesi, A.; Davies, G.; Griffin, J.; Baptista, M. A.; Omira, R.; Babeyko, A. Y.; Power, W. L.; Salgado Gálvez, M.; Behrens, J.; Yalciner, A. C.; Kanoglu, U.; Pekcan, O.; Ross, S.; Parsons, T.; LeVeque, R. J.; Gonzalez, F. I.; Paris, R.; Shäfer, A.; Canals, M.; Fraser, S. A.; Wei, Y.; Weiss, R.; Zaniboni, F.; Papadopoulos, G. A.; Didenkulova, I.; Necmioglu, O.; Suppasri, A.; Lynett, P. J.; Mokhtari, M.; Sørensen, M.; von Hillebrandt-Andrade, C.; Aguirre Ayerbe, I.; Aniel-Quiroga, Í.; Guillas, S.; Macias, J.
2016-12-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.
2017-12-01
The tsunamis that have occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Løvholt, Finn
2017-04-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
A Bayesian Developmental Approach to Robotic Goal-Based Imitation Learning.
Chung, Michael Jae-Yoon; Friesen, Abram L; Fox, Dieter; Meltzoff, Andrew N; Rao, Rajesh P N
2015-01-01
A fundamental challenge in robotics today is building robots that can learn new skills by observing humans and imitating human actions. We propose a new Bayesian approach to robotic learning by imitation inspired by the developmental hypothesis that children use self-experience to bootstrap the process of intention recognition and goal-based imitation. Our approach allows an autonomous agent to: (i) learn probabilistic models of actions through self-discovery and experience, (ii) utilize these learned models for inferring the goals of human actions, and (iii) perform goal-based imitation for robotic learning and human-robot collaboration. Such an approach allows a robot to leverage its increasing repertoire of learned behaviors to interpret increasingly complex human actions and use the inferred goals for imitation, even when the robot has very different actuators from humans. We demonstrate our approach using two different scenarios: (i) a simulated robot that learns human-like gaze following behavior, and (ii) a robot that learns to imitate human actions in a tabletop organization task. In both cases, the agent learns a probabilistic model of its own actions, and uses this model for goal inference and goal-based imitation. We also show that the robotic agent can use its probabilistic model to seek human assistance when it recognizes that its inferred actions are too uncertain, risky, or impossible to perform, thereby opening the door to human-robot collaboration.
A Bayesian Developmental Approach to Robotic Goal-Based Imitation Learning
Chung, Michael Jae-Yoon; Friesen, Abram L.; Fox, Dieter; Meltzoff, Andrew N.; Rao, Rajesh P. N.
2015-01-01
A fundamental challenge in robotics today is building robots that can learn new skills by observing humans and imitating human actions. We propose a new Bayesian approach to robotic learning by imitation inspired by the developmental hypothesis that children use self-experience to bootstrap the process of intention recognition and goal-based imitation. Our approach allows an autonomous agent to: (i) learn probabilistic models of actions through self-discovery and experience, (ii) utilize these learned models for inferring the goals of human actions, and (iii) perform goal-based imitation for robotic learning and human-robot collaboration. Such an approach allows a robot to leverage its increasing repertoire of learned behaviors to interpret increasingly complex human actions and use the inferred goals for imitation, even when the robot has very different actuators from humans. We demonstrate our approach using two different scenarios: (i) a simulated robot that learns human-like gaze following behavior, and (ii) a robot that learns to imitate human actions in a tabletop organization task. In both cases, the agent learns a probabilistic model of its own actions, and uses this model for goal inference and goal-based imitation. We also show that the robotic agent can use its probabilistic model to seek human assistance when it recognizes that its inferred actions are too uncertain, risky, or impossible to perform, thereby opening the door to human-robot collaboration. PMID:26536366
Probabilistic Models for Solar Particle Events
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.
2009-01-01
Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
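The confidence-level logic at the heart of these models can be illustrated by treating the design environment as an upper quantile of a per-event fluence distribution. The sketch below does only that, with randomly generated placeholder fluences; the real models first fit parametric spectral forms and account for mission duration and database uncertainty, which the sketch omits.

```python
# Sketch of the confidence-level logic: given per-event fluences (random
# placeholders for one element and energy bin), the "worst case not exceeded at
# confidence level C" is an upper quantile of the event distribution.
import numpy as np

rng = np.random.default_rng(5)
event_fluence = rng.lognormal(mean=6.0, sigma=1.5, size=80)  # hypothetical events

for conf in (0.90, 0.95, 0.99):
    worst_case = np.quantile(event_fluence, conf)
    print(f"{conf:.0%} confidence design fluence: {worst_case:.3e} (arbitrary units)")
```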
A probabilistic analysis of the crystal oscillator behavior at low drive levels
NASA Astrophysics Data System (ADS)
Shmaliy, Yuriy S.; Brendel, Rémi
2008-03-01
The paper discusses a probabilistic model of a crystal oscillator at low drive levels, where the noise intensity is comparable with the oscillation amplitude. The stationary probability density of the oscillation envelope is derived and investigated for nonlinear resonator losses. A stochastic explanation is given for the well-known phenomenon termed sleeping sickness, associated with a crystal oscillator losing its ability to self-excite after long storage without a power supply. It is shown that, with low drive levels leading to insufficient feedback, a crystal oscillator generates noise-induced oscillations rather than "falling asleep" entirely.
A probabilistic approach to composite micromechanics
NASA Technical Reports Server (NTRS)
Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.
1988-01-01
Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Florita, Anthony R; Krishnan, Venkat K
Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start-time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
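The scenario-generation chain described above (GMM fit to forecast errors, numerical CDF, inverse transform sampling) can be sketched compactly; the swinging door ramp extraction is omitted and the error data are synthetic. scikit-learn's GaussianMixture is used for the fit, which may differ from the authors' implementation.

```python
# Sketch of the scenario-generation chain: fit a Gaussian mixture to historical
# wind power forecast errors, build its CDF numerically, then draw error
# scenarios by inverse transform sampling. Data are hypothetical.
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)
errors = np.concatenate([rng.normal(-0.05, 0.03, 3000),
                         rng.normal(0.02, 0.06, 7000)])   # stand-in forecast errors (p.u.)

gmm = GaussianMixture(n_components=3, random_state=0).fit(errors.reshape(-1, 1))

# Numerical CDF of the fitted mixture on a grid.
x = np.linspace(errors.min() - 0.1, errors.max() + 0.1, 4001)
pdf = np.zeros_like(x)
for w, mu, cov in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
    pdf += w * stats.norm.pdf(x, mu, np.sqrt(cov))
cdf = np.cumsum(pdf)
cdf /= cdf[-1]

# Inverse transform sampling: uniform draws mapped through the inverse CDF.
u = rng.uniform(size=(500, 24))               # 500 scenarios x 24 hourly steps
error_scenarios = np.interp(u.ravel(), cdf, x).reshape(u.shape)

print("scenario matrix shape:", error_scenarios.shape)
print("sampled error std vs fitted data std: %.3f vs %.3f"
      % (error_scenarios.std(), errors.std()))
```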
Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Florita, Anthony R; Krishnan, Venkat K
2017-08-31
Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
Yue, Meng; Wang, Xiaoyu
2015-07-01
It is well-known that responsive battery energy storage systems (BESSs) are an effective means to improve the grid inertial response to various disturbances including the variability of the renewable generation. One of the major issues associated with their implementation is the difficulty in determining the required BESS capacity, mainly due to the large amount of inherent uncertainties that cannot be accounted for deterministically. In this study, a probabilistic approach is proposed to properly size the BESS from the perspective of the system inertial response, as an application of probabilistic risk assessment (PRA). The proposed approach enables a risk-informed decision-making process regarding (1) the acceptable level of solar penetration in a given system and (2) the desired BESS capacity (and minimum cost) to achieve an acceptable grid inertial response with a certain confidence level.
NASA Astrophysics Data System (ADS)
Rovinelli, Andrea; Guilhem, Yoann; Proudhon, Henry; Lebensohn, Ricardo A.; Ludwig, Wolfgang; Sangid, Michael D.
2017-06-01
Microstructurally small cracks exhibit large variability in their fatigue crack growth rate. It is accepted that the inherent variability in microstructural features is related to the uncertainty in the growth rate. However, due to (i) the lack of cycle-by-cycle experimental data, (ii) the complexity of the short crack growth phenomenon, and (iii) the incomplete physics of constitutive relationships, only empirical damage metrics have been postulated to describe the short crack driving force metric (SCDFM) at the mesoscale level. The identification of the SCDFM of polycrystalline engineering alloys is a critical need, in order to achieve more reliable fatigue life prediction and improve material design. In this work, the first steps in the development of a general probabilistic framework are presented, which uses experimental results as input, retrieves missing experimental data through crystal plasticity (CP) simulations, and extracts correlations utilizing machine learning and Bayesian networks (BNs). More precisely, experimental results representing cycle-by-cycle data of a short crack growing through a beta-metastable titanium alloy, VST-55531, have been acquired via phase and diffraction contrast tomography. These results serve as an input for FFT-based CP simulations, which provide the micromechanical fields influenced by the presence of the crack, complementing the information available from the experiment. In order to assess the correlation between postulated SCDFMs and experimental observations, the data are mined and analyzed utilizing BNs. Results show the ability of the framework to autonomously capture relevant correlations and the equivalence in the prediction capability of different postulated SCDFMs for the high cycle fatigue regime.
Sánchez-Vizcaíno, Fernando; Perez, Andrés; Martínez-López, Beatriz; Sánchez-Vizcaíno, José Manuel
2012-08-01
Trade in animals and animal products imposes an uncertain and variable risk of exotic animal disease introduction into importing countries. Risk analysis provides importing countries with an objective, transparent, and internationally accepted method for assessing that risk. Over the last decades, European Union countries have conducted probabilistic risk assessments quite frequently to quantify the risk of rare animal disease introduction into their territories. Most probabilistic animal health risk assessments have typically been classified into one-level and multilevel binomial models. One-level models are simpler than multilevel models because they assume that animals or products originate from one single population. However, it is unknown whether such simplification may result in substantially different results compared to those obtained through the use of multilevel models. Here, data used in a probabilistic multilevel binomial model formulated to assess the risk of highly pathogenic avian influenza introduction into Spain were reanalyzed using a one-level binomial model and their outcomes were compared. An alternative ordinal model is also proposed here, which makes use of simpler assumptions and less information compared to those required by traditional one-level and multilevel approaches. Results suggest that, at least under certain circumstances, results of the one-level and ordinal approaches are similar to those obtained using multilevel models. Consequently, we argue that, when data are insufficient to run traditional probabilistic models, the ordinal approach presented here may be a suitable alternative to rank exporting countries in terms of the risk that they impose for the spread of rare animal diseases into disease-free countries. © 2012 Society for Risk Analysis.
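The difference between the one-level and multilevel binomial formulations can be made concrete with a toy import-risk calculation: the one-level model pools all animals into a single population with an overall prevalence, while the two-level model first samples infected herds and then infected animals within them. The prevalences and trade volumes below are hypothetical, not the data used in the study.

```python
# Sketch contrasting a one-level and a two-level binomial import-risk model
# (hypothetical prevalences, not the HPAI data used in the study).
import numpy as np

rng = np.random.default_rng(7)

n_animals = 500          # animals imported per year
herd_prev = 0.02         # probability a source herd is infected
within_prev = 0.30       # prevalence within an infected herd
animals_per_herd = 50

# One-level model: every animal comes from one pooled population with
# overall prevalence herd_prev * within_prev.
p_animal = herd_prev * within_prev
p_intro_one_level = 1.0 - (1.0 - p_animal) ** n_animals

# Two-level model (Monte Carlo): sample herds, then animals within each herd.
n_sims = 100_000
n_herds = n_animals // animals_per_herd
infected_herds = rng.random((n_sims, n_herds)) < herd_prev
infected_animals = rng.binomial(animals_per_herd, within_prev,
                                size=(n_sims, n_herds)) * infected_herds
p_intro_two_level = np.mean(infected_animals.sum(axis=1) > 0)

print(f"one-level model:  P(at least one infected animal) = {p_intro_one_level:.3f}")
print(f"two-level model:  P(at least one infected animal) = {p_intro_two_level:.3f}")
```

Because clustering of infection within herds changes the effective number of independent draws, the two formulations can give noticeably different introduction probabilities, which is the comparison the study quantifies.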
Probabilistic simulation of multi-scale composite behavior
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.
1993-01-01
A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
NASA Technical Reports Server (NTRS)
Canfield, R. C.; Ricchiazzi, P. J.
1980-01-01
An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete redistribution case, yet it achieves a major reduction in computational requirements.
Simulation Assisted Risk Assessment: Blast Overpressure Modeling
NASA Technical Reports Server (NTRS)
Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael
2006-01-01
A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or on Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwind effects.
Assessment of global flood exposures - developing an appropriate approach
NASA Astrophysics Data System (ADS)
Millinship, Ian; Booth, Naomi
2015-04-01
Increasingly complex probabilistic catastrophe models have become the standard for quantitative flood risk assessments by re/insurance companies. On the one hand, probabilistic modelling of this nature is extremely useful; a large range of risk metrics can be output. However, such models can be time consuming and computationally expensive to develop and run. Levels of uncertainty are persistently high despite, or perhaps because of, attempts to increase resolution and complexity. A cycle of dependency between modelling companies and re/insurers has developed whereby available models are purchased, models run, and both portfolio and model data 'improved' every year. This can lead to potential exposures in perils and territories that are not currently modelled being largely overlooked by companies, who may then face substantial and unexpected losses when large events occur in these areas. We present here an approach to assessing global flood exposures which reduces the scale and complexity of the approach used and begins with the identification of hotspots where there is a significant exposure to flood risk. The method comprises four stages: i) compiling consistent exposure information; ii) applying reinsurance terms and conditions to calculate values exposed; iii) assessing the potential hazard using a global set of flood hazard maps; and iv) identifying potential risk 'hotspots', which includes consideration of spatially and/or temporally clustered historical events and local flood defences. This global exposure assessment is designed as a scoping exercise, and reveals areas or cities where the potential for accumulated loss is of significant interest to a reinsurance company, and for which there is no existing catastrophe model. These regions are then candidates for the development of deterministic scenarios, or probabilistic models. The key advantages of this approach will be discussed. These include simplicity and the ability of business leaders to understand results, as well as ease and speed of analysis and the advantages this can offer in terms of monitoring changing exposures over time. Significantly, in many areas of the world, this increase in exposure is likely to have more of an impact on increasing catastrophe losses than potential anthropogenically driven changes in weather extremes.
Probabilistic Model Development
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.
2010-01-01
Objective: Develop a probabilistic model for the solar energetic particle environment, and a tool that provides a reference solar particle radiation environment that (1) will not be exceeded at a user-specified confidence level and (2) provides reference environments for peak flux, event-integrated fluence, and mission-integrated fluence. The reference environments consist of elemental energy spectra for protons, helium, and heavier ions.
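A minimal sketch of what a confidence-level reference environment means in practice, assuming a hypothetical lognormal distribution of event-integrated proton fluences rather than the model's actual statistics:

```python
import numpy as np

# Given many simulated (or historical) event-integrated proton fluences,
# report the level that will not be exceeded at a user-specified confidence.
rng = np.random.default_rng(3)

# Hypothetical event fluences (protons/cm^2 above some energy threshold),
# lognormally distributed as is commonly assumed for SPE magnitudes.
fluences = rng.lognormal(mean=np.log(1e8), sigma=1.2, size=50_000)

for conf in (0.90, 0.95, 0.99):
    level = np.quantile(fluences, conf)
    print(f"{conf:.0%} confidence reference fluence: {level:.2e} p/cm^2")
```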
2018-02-15
address the problem that probabilistic inference algorithms are difficult and tedious to implement, by expressing them in terms of a small number of ... building blocks, which are automatic transformations on probabilistic programs. On one hand, our curation of these building blocks reflects the way human ... reasoning with low-level computational optimization, so the speed and accuracy of the generated solvers are competitive with state-of-the-art systems.
A probabilistic approach to aircraft design emphasizing stability and control uncertainties
NASA Astrophysics Data System (ADS)
Delaurentis, Daniel Andrew
In order to address identified deficiencies in current approaches to aerospace systems design, a new method has been developed. This new method for design is based on the premise that design is a decision making activity, and that deterministic analysis and synthesis can lead to poor or misguided decision making. This is due to a lack of disciplinary knowledge of sufficient fidelity about the product, to the presence of uncertainty at multiple levels of the aircraft design hierarchy, and to a failure to focus on overall affordability metrics as measures of goodness. Design solutions are desired which are robust to uncertainty and are based on the maximum knowledge possible. The new method represents advances in the two following general areas. 1. Design models and uncertainty. The research performed completes a transition from a deterministic design representation to a probabilistic one through a modeling of design uncertainty at multiple levels of the aircraft design hierarchy, including: (1) Consistent, traceable uncertainty classification and representation; (2) Concise mathematical statement of the Probabilistic Robust Design problem; (3) Variants of the Cumulative Distribution Functions (CDFs) as decision functions for Robust Design; (4) Probabilistic Sensitivities which identify the most influential sources of variability. 2. Multidisciplinary analysis and design. Embedded in the probabilistic methodology is a new approach for multidisciplinary design analysis and optimization (MDA/O), employing disciplinary analysis approximations formed through statistical experimentation and regression. These approximation models are a function of design variables common to the system level as well as other disciplines. For aircraft, it is proposed that synthesis/sizing is the proper avenue for integrating multiple disciplines. Research hypotheses are translated into a structured method, which is subsequently tested for validity. Specifically, the implementation involves the study of the relaxed static stability technology for a supersonic commercial transport aircraft. The probabilistic robust design method is exercised, resulting in a series of robust design solutions based on different interpretations of "robustness". Insightful results are obtained, and the ability of the method to expose trends in the design space is noted as a key advantage.
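A minimal sketch of using a CDF-based decision function for robust design, with two hypothetical configurations and an invented weight target; it is not the dissertation's MDA/O implementation:

```python
import numpy as np

# Compare two hypothetical configurations by the probability that a
# performance metric (here, takeoff gross weight) stays below a target.
rng = np.random.default_rng(4)
n = 50_000
target = 320.0   # tonnes, hypothetical requirement

# Design A: lighter on average, but more sensitive to uncertain inputs
weight_A = rng.normal(310.0, 12.0, n)
# Design B (e.g., relaxed static stability): slightly heavier mean, tighter spread
weight_B = rng.normal(314.0, 4.0, n)

for name, w in [("A", weight_A), ("B", weight_B)]:
    p_ok = np.mean(w <= target)   # value of the empirical CDF at the target
    print(f"design {name}: P(weight <= {target} t) = {p_ok:.3f}")
```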
Affective and cognitive factors influencing sensitivity to probabilistic information.
Tyszka, Tadeusz; Sawicki, Przemyslaw
2011-11-01
In study 1 different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for the emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing: the rational deliberative versus the affective experiential and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.
Kamboj, Sunita; Yu, Charley; Johnson, Robert
2013-05-01
The Derived Concentration Guideline Levels for two building areas previously used in waste processing and storage at Argonne National Laboratory were developed using both probabilistic and deterministic radiological environmental pathway analysis. Four scenarios were considered. The two current uses considered were on-site industrial use and off-site residential use with farming. The two future uses (i.e., after an institutional control period of 100 y) were on-site recreational use and on-site residential use with farming. The RESRAD-OFFSITE code was used for the current-use off-site residential/farming scenario and RESRAD (onsite) was used for the other three scenarios. Contaminants of concern were identified from the past operations conducted in the buildings and the actual characterization done at the site. Derived Concentration Guideline Levels were developed for all four scenarios using deterministic and probabilistic approaches, which include both "peak-of-the-means" and "mean-of-the-peaks" analyses. The future-use on-site residential/farming scenario resulted in the most restrictive Derived Concentration Guideline Levels for most radionuclides.
Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.
2014-01-01
Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514
Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G
2014-12-10
Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.
Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter
2017-01-01
The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This includes considering the limitations of current operationalisations and recommendations for future research that align outcomes and subsequent work more closely to specific dual-process models.
A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.
Chiu, Weihsueh A; Slob, Wout
2015-12-01
When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
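A minimal sketch of how a probabilistic HDMI might be assembled from lognormal uncertainty and variability terms; the point of departure, adjustment factors, and incidence level below are illustrative assumptions, not values from the paper:

```python
import numpy as np
from scipy.stats import norm

# Combine an uncertain animal point of departure with lognormally
# distributed adjustment factors; the incidence I enters through a
# quantile of the human variability distribution.
rng = np.random.default_rng(5)
n = 100_000

bmd_animal = rng.lognormal(np.log(10.0), 0.3, n)    # mg/kg-d, uncertain POD
interspecies = rng.lognormal(np.log(4.0), 0.4, n)   # animal-to-human factor
human_var_gsd = rng.lognormal(np.log(2.0), 0.3, n)  # GSD of human variability

incidence = 0.01                  # protect all but I = 1% of the population
z = norm.ppf(1.0 - incidence)     # quantile of the variability distribution
hd_mi = bmd_animal / interspecies / human_var_gsd ** z

lo, med, hi = np.percentile(hd_mi, [5, 50, 95])
print(f"HD_M^I (I = 1%): median = {med:.2f} mg/kg-d, 90% CI = ({lo:.2f}, {hi:.2f})")
```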
Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter
2017-01-01
The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This includes considering the limitations of current operationalisations and recommendations for future research that align outcomes and subsequent work more closely to specific dual-process models. PMID:29062288
A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects
Slob, Wout
2015-01-01
Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063
Probabilistic Models for Solar Particle Events
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Xapsos, Michael
2009-01-01
Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to describe the radiation environment that can be expected at a specified confidence level. The task of the designer is then to choose a design that will operate in the model radiation environment. Probabilistic models have already been developed for solar proton events that describe the peak flux, event-integrated fluence and mission-integrated fluence. In addition, a probabilistic model has been developed that describes the mission-integrated fluence for the Z>2 elemental spectra. This talk will focus on completing this suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements.
Probabilistic assessment of smart composite structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael C.
1994-01-01
A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.
Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Nagpal, Vinod K.
2007-01-01
An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
Probabilistic micromechanics for metal matrix composites
NASA Astrophysics Data System (ADS)
Engelstad, S. P.; Reddy, J. N.; Hopkins, Dale A.
A probabilistic micromechanics-based nonlinear analysis procedure is developed to predict and quantify the variability in the properties of high temperature metal matrix composites. Monte Carlo simulation is used to model the probabilistic distributions of the constituent level properties including fiber, matrix, and interphase properties, volume and void ratios, strengths, fiber misalignment, and nonlinear empirical parameters. The procedure predicts the resultant ply properties and quantifies their statistical scatter. Graphite/copper and silicon carbide/titanium aluminide (SCS-6/Ti-15) unidirectional plies are considered to demonstrate the predictive capabilities. The procedure is believed to have a high potential for use in material characterization and selection to precede and assist in experimental studies of new high temperature metal matrix composites.
A probabilistic damage model of stress-induced permeability anisotropy during cataclastic flow
NASA Astrophysics Data System (ADS)
Zhu, Wenlu; Montési, Laurent G. J.; Wong, Teng-Fong
2007-10-01
A fundamental understanding of the effect of stress on permeability evolution is important for many fault mechanics and reservoir engineering problems. Recent laboratory measurements demonstrate that in the cataclastic flow regime, the stress-induced anisotropic reduction of permeability in porous rocks can be separated into 3 different stages. In the elastic regime (stage I), permeability and porosity reduction are solely controlled by the effective mean stress, with negligible permeability anisotropy. Stage II starts at the onset of shear-enhanced compaction, when a critical yield stress is attained. In stage II, the deviatoric stress exerts primary control over permeability and porosity evolution. The increase in deviatoric stress results in drastic permeability and porosity reduction and considerable permeability anisotropy. The transition from stage II to stage III takes place progressively during the development of pervasive cataclastic flow. In stage III, permeability and porosity reduction becomes gradual again, and permeability anisotropy diminishes. Microstructural observations on deformed samples using laser confocal microscopy reveal that stress-induced microcracking and pore collapse are the primary forms of damage during cataclastic flow. A probabilistic damage model is formulated to characterize the effects of stress on permeability and its anisotropy. In our model, the effects of both effective mean stress and differential stress on permeability evolution are calculated. By introducing stress sensitivity coefficients, we propose a first-order description of the dependence of permeability evolution on different loading paths. Built upon the micromechanisms of deformation in porous rocks, this unified model provides new insight into the coupling of stress and permeability.
Probabilistic modelling of drought events in China via 2-dimensional joint copula
NASA Astrophysics Data System (ADS)
Ayantobo, Olusola O.; Li, Yi; Song, Songbai; Javed, Tehseen; Yao, Ning
2018-04-01
Probabilistic modelling of drought events is a significant aspect of water resources management and planning. In this study, popularly applied and several relatively new bivariate Archimedean copulas were employed to derive regional and spatial copula-based models to appraise drought risk in mainland China over 1961-2013. Drought duration (Dd), severity (Ds), and peak (Dp), as indicated by the Standardized Precipitation Evapotranspiration Index (SPEI), were extracted according to the run theory and fitted with suitable marginal distributions. The maximum likelihood estimation (MLE) and curve fitting method (CFM) were used to estimate the copula parameters of nineteen bivariate Archimedean copulas. Drought probabilities and return periods were analysed based on the appropriate bivariate copula in sub-regions I-VII and the entire mainland China. The goodness-of-fit tests as indicated by the CFM showed that copula NN19 in sub-regions III, IV, V, VI and mainland China, NN20 in sub-region I, and NN13 in sub-region VII are the best for modeling drought variables. Bivariate drought probability across mainland China is relatively high, and the highest drought probabilities are found mainly in Northwestern and Southwestern China. The results also showed that different sub-regions may face varying drought risks; the drought risks observed in sub-regions III, VI and VII are significantly greater than in other sub-regions. Higher probability of droughts of longer durations in these sub-regions also corresponds to shorter return periods with greater drought severity. These results may imply tremendous challenges for water resources management in different sub-regions, particularly Northwestern and Southwestern China.
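A minimal sketch of a bivariate joint return period under one Archimedean (Gumbel) copula, with hypothetical marginal non-exceedance probabilities and copula parameter; the study itself fits nineteen candidate copulas per sub-region:

```python
import numpy as np

# Joint return period of drought duration and severity under a Gumbel copula.
def gumbel_copula(u, v, theta):
    """Gumbel (Archimedean) copula C(u, v) with dependence parameter theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

theta = 2.5            # hypothetical dependence parameter
mu_interarrival = 0.8  # mean years between drought events (hypothetical)

# Non-exceedance probabilities of a "design" drought from fitted marginals
u = 0.90               # F_D(d): P(duration <= d)
v = 0.85               # F_S(s): P(severity <= s)

# "AND" case: both duration and severity exceeded
p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
print(f"joint (AND) return period = {mu_interarrival / p_and:.1f} years")

# "OR" case: either variable exceeded
p_or = 1.0 - gumbel_copula(u, v, theta)
print(f"joint (OR) return period  = {mu_interarrival / p_or:.1f} years")
```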
A Multiatlas Segmentation Using Graph Cuts with Applications to Liver Segmentation in CT Scans
2014-01-01
An atlas-based segmentation approach is presented that combines low-level operations, an affine probabilistic atlas, and a multiatlas-based segmentation. The proposed combination provides highly accurate segmentation due to registrations and atlas selections based on the regions of interest (ROIs) and coarse segmentations. Our approach shares the following common elements between the probabilistic atlas and multiatlas segmentation: (a) the spatial normalisation and (b) the segmentation method, which is based on minimising a discrete energy function using graph cuts. The method is evaluated for the segmentation of the liver in computed tomography (CT) images. Low-level operations define a ROI around the liver from an abdominal CT. We generate a probabilistic atlas using an affine registration based on geometry moments from manually labelled data. Next, a coarse segmentation of the liver is obtained from the probabilistic atlas with low computational effort. Then, a multiatlas segmentation approach improves the accuracy of the segmentation. Both the atlas selections and the nonrigid registrations of the multiatlas approach use a binary mask defined by coarse segmentation. We experimentally demonstrate that this approach performs better than atlas selections and nonrigid registrations in the entire ROI. The segmentation results are comparable to those obtained by human experts and to other recently published results. PMID:25276219
NASA Astrophysics Data System (ADS)
Caglar, Mehmet Umut; Pal, Ranadip
2011-03-01
The central dogma of molecular biology states that "information cannot be transferred back from protein to either protein or nucleic acid". However, this assumption is not strictly correct in most cases: there are many feedback loops and interactions between different levels of the system. These types of interactions are hard to analyze due to the lack of cell-level data and the probabilistic, nonlinear nature of the interactions. Several models are widely used to analyze and simulate these types of nonlinear interactions. Stochastic Master Equation (SME) models capture the probabilistic nature of the interactions in a detailed manner, at a high computational cost. On the other hand, Probabilistic Boolean Network (PBN) models give a coarse-scale picture of the stochastic processes, at a lower computational cost. Differential Equation (DE) models give the time evolution of the mean values of the processes in a highly cost-effective way. Understanding the relations between the predictions of these models is important for assessing the reliability of simulations of genetic regulatory networks. In this work, the success of the mapping between SME, PBN and DE models is analyzed, and the accuracy and effectiveness of the control policies generated using the PBN and DE models are compared.
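A minimal sketch contrasting a coarse PBN with its mean-field ODE counterpart for a two-gene motif; the regulation rules and perturbation probability are invented for illustration and are far simpler than the networks analyzed in the study:

```python
import numpy as np

rng = np.random.default_rng(6)

def pbn_step(state, p_perturb=0.05):
    """One synchronous PBN update: x represses y, y represses x,
    with a small random perturbation probability per gene."""
    x, y = state
    new = np.array([1 - y, 1 - x])        # Boolean regulation rules
    flip = rng.random(2) < p_perturb       # random gene perturbation
    return np.where(flip, 1 - new, new)

state = np.array([1, 0])
samples = []
for _ in range(50_000):
    state = pbn_step(state)
    samples.append(state.copy())
samples = np.array(samples)
print("PBN long-run mean expression:", samples.mean(axis=0))

# Mean-field ODE counterpart: dx/dt = (1 - y) - x, dy/dt = (1 - x) - y,
# integrated with simple Euler steps.
x, y, dt = 0.5, 0.5, 0.01
for _ in range(10_000):
    x, y = x + dt * ((1 - y) - x), y + dt * ((1 - x) - y)
print("ODE steady-state mean expression:", (round(x, 3), round(y, 3)))
```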
Gaissmaier, Wolfgang; Giese, Helge; Galesic, Mirta; Garcia-Retamero, Rocio; Kasper, Juergen; Kleiter, Ingo; Meuth, Sven G; Köpke, Sascha; Heesen, Christoph
2018-01-01
A shared decision-making approach is suggested for multiple sclerosis (MS) patients. To properly evaluate benefits and risks of different treatment options accordingly, MS patients require sufficient numeracy - the ability to understand quantitative information. It is unknown whether MS affects numeracy. Therefore, we investigated whether patients' numeracy was impaired compared to a probabilistic national sample. As part of the larger prospective, observational, multicenter study PERCEPT, we assessed numeracy for a clinical study sample of German MS patients (N=725) with a standard test and compared them to a German probabilistic sample (N=1001), controlling for age, sex, and education. Within patients, we assessed whether disease variables (disease duration, disability, annual relapse rate, cognitive impairment) predicted numeracy beyond these demographics. MS patients showed a comparable level of numeracy as the probabilistic national sample (68.9% vs. 68.5% correct answers, P=0.831). In both samples, numeracy was higher for men and the highly educated. Disease variables did not predict numeracy beyond demographics within patients, and predictability was generally low. This sample of MS patients understood quantitative information on the same level as the general population. There is no reason to withhold quantitative information from MS patients. Copyright © 2017 Elsevier B.V. All rights reserved.
Probabilistic Analysis and Density Parameter Estimation Within Nessus
NASA Astrophysics Data System (ADS)
Godines, Cody R.; Manteufel, Randall D.
2002-12-01
This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.
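A minimal sketch of the MC versus LHS estimator-error comparison, applied to a simple analytic response rather than the SAE test cases or NESSUS itself:

```python
import numpy as np
from scipy.stats import qmc, norm

# Estimate the mean and 99th percentile of R = X1**2 + 3*X2 with normal
# inputs, and compare the spread of the estimates across replicates.
rng = np.random.default_rng(7)
n = 200                          # response evaluations per replicate
n_rep = 500                      # replicates used to measure estimator error

def response(x1, x2):
    return x1 ** 2 + 3.0 * x2

def mc_estimates():
    x1, x2 = rng.normal(10, 1, n), rng.normal(5, 0.5, n)
    r = response(x1, x2)
    return r.mean(), np.percentile(r, 99)

def lhs_estimates():
    u = qmc.LatinHypercube(d=2, seed=rng.integers(1 << 31)).random(n)
    x1 = norm.ppf(u[:, 0], loc=10, scale=1)
    x2 = norm.ppf(u[:, 1], loc=5, scale=0.5)
    r = response(x1, x2)
    return r.mean(), np.percentile(r, 99)

for name, f in [("MC", mc_estimates), ("LHS", lhs_estimates)]:
    est = np.array([f() for _ in range(n_rep)])
    print(f"{name:>3s}: std of mean estimate = {est[:, 0].std():.3f}, "
          f"std of 99th-pct estimate = {est[:, 1].std():.3f}")
```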
Probabilistic Analysis and Density Parameter Estimation Within Nessus
NASA Technical Reports Server (NTRS)
Godines, Cody R.; Manteufel, Randall D.; Chamis, Christos C. (Technical Monitor)
2002-01-01
This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.
First USGS urban seismic hazard maps predict the effects of soils
Cramer, C.H.; Gomberg, J.S.; Schweig, E.S.; Waldron, B.A.; Tucker, K.
2006-01-01
Probabilistic and scenario urban seismic hazard maps have been produced for Memphis, Shelby County, Tennessee covering a six-quadrangle area of the city. The nine probabilistic maps are for peak ground acceleration and 0.2 s and 1.0 s spectral acceleration and for 10%, 5%, and 2% probability of being exceeded in 50 years. Six scenario maps for these three ground motions have also been generated for both an M7.7 and M6.2 on the southwest arm of the New Madrid seismic zone ending at Marked Tree, Arkansas. All maps include the effect of local geology. Relative to the national seismic hazard maps, the effect of the thick sediments beneath Memphis is to decrease 0.2 s probabilistic ground motions by 0-30% and increase 1.0 s probabilistic ground motions by ~100%. Probabilistic peak ground accelerations remain at levels similar to the national maps, although the ground motion gradient across Shelby County is reduced and ground motions are more uniform within the county. The M7.7 scenario maps show ground motions similar to the 5%-in-50-year probabilistic maps. As an effect of local geology, both M7.7 and M6.2 scenario maps show a more uniform seismic ground-motion hazard across Shelby County than scenario maps with constant site conditions (i.e., NEHRP B/C boundary).
Sanni, Steinar; Lyng, Emily; Pampanin, Daniela M
2017-06-01
Offshore oil and gas activities are required not to cause adverse environmental effects, and risk based management has been established to meet environmental standards. In some risk assessment schemes, Risk Indicators (RIs) are parameters to monitor the development of risk affecting factors. RIs have not yet been established in the Environmental Risk Assessment procedures for management of oil based discharges offshore. This paper evaluates the usefulness of biomarkers as RIs, based on their properties, existing laboratory biomarker data and assessment methods. Data shows several correlations between oil concentrations and biomarker responses, and assessment principles exist that qualify biomarkers for integration into risk procedures. Different ways that these existing biomarkers and methods can be applied as RIs in a probabilistic risk assessment system when linked with whole organism responses are discussed. This can be a useful approach to integrate biomarkers into probabilistic risk assessment related to oil based discharges, representing a potential supplement to information that biomarkers already provide about environmental impact and risk related to these kind of discharges. Copyright © 2016 Elsevier Ltd. All rights reserved.
Nagai, Takashi; Horio, Takeshi; Yokoyama, Atsushi; Kamiya, Takashi; Takano, Hiroyuki; Makino, Tomoyuki
2012-06-01
On-site soil washing with iron(III) chloride reduces Cd levels in soil, and thus the human health risks caused by Cd in food. However, it may threaten aquatic organisms when soil washing effluent is discharged to open aquatic systems. Therefore, we conducted trial-scale on-site soil washing and ecological risk assessment in Nagano and Niigata prefectures, Japan. The ecological effect of effluent water was investigated by two methods. The first was bioassay using standard aquatic test organisms. Twice-diluted effluent water from the Nagano site and the original effluent water from the Niigata site had no significant effects on green algae, water flea, caddisfly, and fish. The safe dilution rates were estimated as 20 times and 10 times for the Nagano and Niigata sites, respectively, considering an assessment factor of 10. The second method was probabilistic effect analysis using chemical analysis and the species sensitivity distribution concept. The mixture effects of CaCl(2), Al, Zn, and Mn were considered by applying a response additive model. The safe dilution rates, assessed for a potentially affected fraction of species of 5%, were 7.1 times and 23.6 times for the Nagano and Niigata sites, respectively. The actual dilution rates of effluent water by river water at the Nagano and Niigata sites were 2200-67,000 times and 1300-110,000 times, respectively. These are much larger than the safe dilution rates derived from the two approaches. Consequently, the ecological risk to aquatic organisms of soil washing is evaluated as being below the concern level. Copyright © 2012 Elsevier Inc. All rights reserved.
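A minimal sketch of the species-sensitivity-distribution step with response addition for mixtures; the SSD parameters and concentrations are hypothetical placeholders, not the Nagano or Niigata data:

```python
import numpy as np
from scipy.stats import norm

# Log-normal species sensitivity distributions (SSDs) give a potentially
# affected fraction (PAF) per substance, combined by response addition.
def paf(conc, log10_mu, log10_sigma):
    """Potentially affected fraction of species at a given concentration."""
    if conc <= 0:
        return 0.0
    return norm.cdf((np.log10(conc) - log10_mu) / log10_sigma)

# Hypothetical SSD parameters (log10 of chronic effect concentrations, mg/L)
# and diluted effluent concentrations for a few constituents.
substances = {                # (log10 mean, log10 sd, concentration mg/L)
    "Zn": (0.5, 0.7, 0.40),
    "Al": (0.8, 0.6, 0.60),
    "Mn": (1.2, 0.8, 0.30),
}

paf_single = {s: paf(c, mu, sd) for s, (mu, sd, c) in substances.items()}
# Response addition: assume independent modes of action
msPAF = 1.0 - np.prod([1.0 - p for p in paf_single.values()])

for s, p in paf_single.items():
    print(f"PAF({s}) = {p:.3f}")
print(f"multi-substance PAF (response addition) = {msPAF:.3f}")
```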
Simons, Pascale A M; Ramaekers, Bram; Hoebers, Frank; Kross, Kenneth W; Marneffe, Wim; Pijls-Johannesma, Madelon; Vandijck, Dominique
2015-07-01
Compared with new technologies, the redesign of care processes is generally considered less attractive to improve patient outcomes. Nevertheless, it might result in better patient outcomes, without further increasing costs. Because early initiation of treatment is of vital importance for patients with head and neck cancer (HNC), these care processes were redesigned. This study aimed to assess patient outcomes and cost-effectiveness of this redesign. An economic (Markov) model was constructed to evaluate the biopsy of suspicious lesions under local instead of general anesthesia, and combining computed tomography and positron emission tomography for diagnostics and radiotherapy planning. Patients treated for HNC were included in the model stratified by disease location (larynx, oropharynx, hypopharynx, and oral cavity) and stage (I-II and III-IV). Probabilistic sensitivity analyses were performed. Waiting time before the start of treatment was reduced by 5 to 22 days for the included patient groups, resulting in 0.13 to 0.66 additional quality-adjusted life-years. The new workflow was cost-effective for all the included patient groups, using a ceiling ratio of €80,000 or €20,000. For patients treated for tumors located at the larynx and oral cavity, the new workflow resulted in additional quality-adjusted life-years, and costs decreased compared with the regular workflow. The health care payer benefited €14.1 million and €91.5 million, respectively, when individual net monetary benefits were extrapolated to an organizational level and a national level. The redesigned care process reduced the waiting time for the treatment of patients with HNC and proved cost-effective. Because care improved, implementation on a wider scale should be considered. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
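A minimal sketch of a probabilistic sensitivity analysis around a simple three-state Markov cohort model; all transition probabilities, costs, and utilities below are invented placeholders, not the study's inputs:

```python
import numpy as np

# Three health states (stable / progressed / dead); PSA draws parameters,
# runs the cohort model for two workflows, and summarizes incremental NMB.
rng = np.random.default_rng(8)
n_psa, n_cycles, wtp = 5_000, 20, 20_000   # PSA draws, yearly cycles, EUR/QALY

def run_model(p_prog, p_die, cost_cycle, utility_stable):
    """Discounted QALYs and costs for one parameter draw."""
    state = np.array([1.0, 0.0, 0.0])                 # start cohort in 'stable'
    P = np.array([[1 - p_prog - p_die, p_prog, p_die],
                  [0.0,               0.90,   0.10],
                  [0.0,               0.0,    1.0]])
    qaly = cost = 0.0
    for t in range(n_cycles):
        disc = 1.03 ** -t
        qaly += disc * (state[0] * utility_stable + state[1] * 0.6)
        cost += disc * (state[0] + state[1]) * cost_cycle
        state = state @ P
    return qaly, cost

inmb = []
for _ in range(n_psa):
    # new workflow: earlier treatment start -> lower progression probability
    q_new, c_new = run_model(rng.beta(20, 180), 0.02, rng.gamma(100, 50), 0.80)
    q_old, c_old = run_model(rng.beta(30, 170), 0.02, rng.gamma(100, 48), 0.78)
    inmb.append(wtp * (q_new - q_old) - (c_new - c_old))

inmb = np.array(inmb)
print(f"mean incremental NMB = {inmb.mean():.0f} EUR, "
      f"P(cost-effective) = {(inmb > 0).mean():.2f}")
```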
NASA Technical Reports Server (NTRS)
Fayssal, Safie; Weldon, Danny
2008-01-01
The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the International Space Station, to the Moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I Launch Vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had major impact on the design and manufacturing of the Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvements for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.
A novel probabilistic framework for event-based speech recognition
NASA Astrophysics Data System (ADS)
Juneja, Amit; Espy-Wilson, Carol
2003-10-01
One of the reasons for unsatisfactory performance of the state-of-the-art automatic speech recognition (ASR) systems is the inferior acoustic modeling of low-level acoustic-phonetic information in the speech signal. An acoustic-phonetic approach to ASR, on the other hand, explicitly targets linguistic information in the speech signal, but such a system for continuous speech recognition (CSR) is not known to exist. A probabilistic and statistical framework for CSR based on the idea of the representation of speech sounds by bundles of binary valued articulatory phonetic features is proposed. Multiple probabilistic sequences of linguistically motivated landmarks are obtained using binary classifiers of manner phonetic features-syllabic, sonorant and continuant-and the knowledge-based acoustic parameters (APs) that are acoustic correlates of those features. The landmarks are then used for the extraction of knowledge-based APs for source and place phonetic features and their binary classification. Probabilistic landmark sequences are constrained using manner class language models for isolated or connected word recognition. The proposed method could overcome the disadvantages encountered by the early acoustic-phonetic knowledge-based systems that led the ASR community to switch to systems highly dependent on statistical pattern analysis methods and probabilistic language or grammar models.
Inference in the brain: Statistics flowing in redundant population codes
Pitkow, Xaq; Angelaki, Dora E
2017-01-01
It is widely believed that the brain performs approximate probabilistic inference to estimate causal variables in the world from ambiguous sensory data. To understand these computations, we need to analyze how information is represented and transformed by the actions of nonlinear recurrent neural networks. We propose that these probabilistic computations function by a message-passing algorithm operating at the level of redundant neural populations. To explain this framework, we review its underlying concepts, including graphical models, sufficient statistics, and message-passing, and then describe how these concepts could be implemented by recurrently connected probabilistic population codes. The relevant information flow in these networks will be most interpretable at the population level, particularly for redundant neural codes. We therefore outline a general approach to identify the essential features of a neural message-passing algorithm. Finally, we argue that to reveal the most important aspects of these neural computations, we must study large-scale activity patterns during moderately complex, naturalistic behaviors. PMID:28595050
Using phase II data for the analysis of phase III studies: An application in rare diseases.
Wandel, Simon; Neuenschwander, Beat; Röver, Christian; Friede, Tim
2017-06-01
Clinical research and drug development in orphan diseases are challenging, since large-scale randomized studies are difficult to conduct. Formally synthesizing the evidence is therefore of great value, yet this is rarely done in the drug-approval process. Phase III designs that make better use of phase II data can facilitate drug development in orphan diseases. A Bayesian meta-analytic approach is used to inform the phase III study with phase II data. It is particularly attractive, since uncertainty of between-trial heterogeneity can be dealt with probabilistically, which is critical if the number of studies is small. Furthermore, it allows quantifying and discounting the phase II data through the predictive distribution relevant for phase III. A phase III design is proposed which uses the phase II data and considers approval based on a phase III interim analysis. The design is illustrated with a non-inferiority case study from a Food and Drug Administration approval in herpetic keratitis (an orphan disease). Design operating characteristics are compared to those of a traditional design, which ignores the phase II data. An analysis of the phase II data reveals good but insufficient evidence for non-inferiority, highlighting the need for a phase III study. For the phase III study supported by phase II data, the interim analysis is based on half of the patients. For this design, the meta-analytic interim results are conclusive and would justify approval. In contrast, based on the phase III data only, interim results are inconclusive and require further evidence. To accelerate drug development for orphan diseases, innovative study designs and appropriate methodology are needed. Taking advantage of randomized phase II data when analyzing phase III studies looks promising because the evidence from phase II supports informed decision-making. The implementation of the Bayesian design is straightforward with public software such as R.
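A minimal sketch of a meta-analytic predictive prior built from a single phase II estimate under a normal-normal hierarchical model; the estimate, its standard error, and the heterogeneity prior are hypothetical, and a full implementation would typically use dedicated Bayesian software:

```python
import numpy as np

# Monte Carlo approximation of the predictive distribution for the phase III
# study parameter, marginalizing over between-trial heterogeneity.
rng = np.random.default_rng(9)
n_draws = 200_000

y_phase2, se_phase2 = -0.10, 0.08     # observed effect estimate and its SE

# Between-trial heterogeneity: half-normal prior; with a single phase II
# study the data say little about tau, so the prior dominates.
tau = np.abs(rng.normal(0.0, 0.05, n_draws))

# Study-level mean given the single phase II study (vague prior on mu)
mu = rng.normal(y_phase2, np.sqrt(se_phase2 ** 2 + tau ** 2))

# Predictive distribution for the phase III study parameter
theta_phase3 = rng.normal(mu, tau)

lo, hi = np.percentile(theta_phase3, [2.5, 97.5])
print(f"meta-analytic predictive prior for phase III effect: "
      f"mean = {theta_phase3.mean():.3f}, 95% interval = ({lo:.3f}, {hi:.3f})")
```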
Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian
2012-04-01
Probabilistic approaches, such as Monte Carlo Sampling (MCS) and Latin Hypercube Sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory and variance propagation, were used to characterize uncertainties associated with the risk assessment of ΣPAH8 in surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of hazard quotients of ΣPAH8 in surface waters of Taihu Lake. These distributions indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval at the 90% confidence level was (0.0015, 0.0163) using fuzzy set theory and (0.00016, 0.88) based on variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms was low. Each method, based on a different theory, has its own advantages and limitations; therefore, the appropriate method should be selected on a case-by-case basis to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate method to assess the risk of ΣPAH8 in surface water of Taihu Lake, which provides an important scientific foundation for risk management and control of organic pollutants in water.
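A minimal sketch of two of the compared approaches, Monte Carlo propagation of a hazard quotient and a simple interval analysis, with hypothetical exposure and effect distributions rather than the Taihu Lake data:

```python
import numpy as np

# Hazard quotient HQ = exposure / effect concentration, propagated by
# Monte Carlo sampling and bounded by interval analysis.
rng = np.random.default_rng(10)
n = 100_000

exposure = rng.lognormal(np.log(50.0), 0.8, n)       # ng/L, hypothetical
effect_conc = rng.lognormal(np.log(1000.0), 1.0, n)  # ng/L, toxicity endpoint
hq = exposure / effect_conc

lo, hi = np.percentile(hq, [5, 95])
print(f"MC: mean HQ = {hq.mean():.3f}, 90% interval = ({lo:.4f}, {hi:.2f})")
print(f"MC: P(HQ > 1) = {(hq > 1).mean():.3%}")

# Interval analysis: bound HQ by combining the extreme values of the inputs
exp_lo, exp_hi = 5.0, 500.0          # ng/L
eff_lo, eff_hi = 200.0, 10_000.0     # ng/L
print(f"interval analysis: HQ in [{exp_lo / eff_hi:.4f}, {exp_hi / eff_lo:.2f}]")
```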
Saul: Towards Declarative Learning Based Programming
Kordjamshidi, Parisa; Roth, Dan; Wu, Hao
2015-01-01
We present Saul, a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction. PMID:26635465
Saul: Towards Declarative Learning Based Programming.
Kordjamshidi, Parisa; Roth, Dan; Wu, Hao
2015-07-01
We present Saul , a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction.
Campbell, Kieran R; Yau, Christopher
2017-03-15
Modeling bifurcations in single-cell transcriptomics data has become an increasingly popular field of research. Several methods have been proposed to infer bifurcation structure from such data, but all rely on heuristic non-probabilistic inference. Here we propose the first generative, fully probabilistic model for such inference, based on a Bayesian hierarchical mixture of factor analyzers. Our model exhibits competitive performance on large datasets despite implementing full Markov chain Monte Carlo sampling, and its unique hierarchical prior structure enables automatic determination of genes driving the bifurcation process. We additionally propose an Empirical Bayes-like extension that deals with the high levels of zero-inflation in single-cell RNA-seq data and quantify when such models are useful. We apply our model to both real and simulated single-cell gene expression data and compare the results to existing pseudotime methods. Finally, we discuss both the merits and weaknesses of such a unified, probabilistic approach in the context of practical bioinformatics analyses.
Non-Deterministic Dynamic Instability of Composite Shells
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2004-01-01
A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis. The solution method is an incrementally updated Lagrangian formulation. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower is the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load, and the loading rate are the dominant uncertainties, in that order.
Biedermann, Alex; Taroni, Franco; Margot, Pierre
2012-01-31
Prior probabilities represent a core element of the Bayesian probabilistic approach to relatedness testing. This letter offers opinions on the commentary 'Use of prior odds for missing persons identifications' by Budowle et al., published recently in this journal. Contrary to Budowle et al., we argue that the concept of prior probabilities (i) is not endowed with the notion of objectivity, (ii) is not a case for computation, and (iii) does not require new guidelines edited by the forensic DNA community--as long as probability is properly considered as an expression of personal belief.
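To make the role of prior probabilities concrete, here is a minimal worked example (with made-up numbers, not taken from the letter or from the Budowle et al. commentary) of how a personal prior combines with a DNA likelihood ratio to give a posterior probability via the odds form of Bayes' rule.

```python
def posterior_probability(prior_prob, likelihood_ratio):
    """Combine a personal prior probability with a likelihood ratio (odds form of Bayes' rule)."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = likelihood_ratio * prior_odds
    return posterior_odds / (1.0 + posterior_odds)

# Illustrative values only: several subjective priors and a strong DNA likelihood ratio.
for prior in (0.001, 0.01, 0.5):
    print(f"prior={prior}: posterior={posterior_probability(prior, likelihood_ratio=1e4):.4f}")
```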
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based on the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based on the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.
St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps
Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha
2016-01-01
We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near-surface shear-wave velocity model in a 1D equivalent-linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm-rock-site condition, the new probabilistic seismic-hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess-covered till and drift deposits) show up to twice the ground-motion values for peak ground acceleration (PGA) and similar ground-motion values for 1.0 s spectral acceleration (SA). Probabilistic ground-motion levels for lowland alluvial floodplain sites (generally the 20–40-m-thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground-motion levels for PGA, and up to three times the ground-motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%-in-50-year probabilistic ground-shaking model. The liquefaction hazard ranges from low in the uplands to high (60% of the area expected to liquefy) in the lowlands. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated ground deformation.
Comparison of the economic impact of different wind power forecast systems for producers
NASA Astrophysics Data System (ADS)
Alessandrini, S.; Davò, F.; Sperati, S.; Benini, M.; Delle Monache, L.
2014-05-01
Deterministic forecasts of wind production for the next 72 h at a single wind farm or at the regional level are among the main end-user requirements. However, for an optimal management of wind power production and distribution it is important to provide, together with a deterministic prediction, a probabilistic one. A deterministic forecast consists of a single value for each time in the future for the variable to be predicted, while probabilistic forecasting informs on probabilities for potential future events. This means providing information about uncertainty (i.e. a forecast of the PDF of power) in addition to the commonly provided single-valued power prediction. A significant probabilistic application is related to the trading of energy in day-ahead electricity markets. It has been shown that, when trading future wind energy production, using probabilistic wind power predictions can lead to higher benefits than those obtained by using deterministic forecasts alone. In fact, by using probabilistic forecasting it is possible to solve economic model equations trying to optimize the revenue for the producer depending, for example, on the specific penalties for forecast errors valid in that market. In this work we have applied a probabilistic wind power forecast system based on the "analog ensemble" method for bidding wind energy during the day-ahead market in the case of a wind farm located in Italy. The actual hourly income for the plant is computed considering the actual selling energy prices and penalties proportional to the unbalancing, defined as the difference between the day-ahead offered energy and the actual production. The economic benefit of using a probabilistic approach for the day-ahead energy bidding is evaluated, resulting in an increase of 23% of the annual income for a wind farm owner in the case of knowing "a priori" the future energy prices. The uncertainty on price forecasting partly reduces the economic benefit gained by using a probabilistic energy forecast system.
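A common way to exploit a probabilistic forecast in day-ahead bidding, in the spirit of the abstract above, is to bid a quantile of the predictive distribution chosen from the asymmetry between shortfall and surplus penalties (a newsvendor-type argument). The sketch below is a generic illustration with invented prices, penalties, and an invented ensemble forecast; it is not the authors' analog-ensemble system or the Italian market rules.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented ensemble forecast of hourly production (MWh) from a probabilistic system.
ensemble = rng.gamma(shape=4.0, scale=5.0, size=200)

# Invented market parameters: energy price and unit penalties on the imbalance.
price = 60.0          # EUR/MWh for the energy actually delivered
pen_short = 25.0      # EUR/MWh penalty when production < day-ahead offer
pen_surplus = 10.0    # EUR/MWh penalty when production > day-ahead offer

def expected_revenue(bid, outcomes):
    """Mean revenue over equally likely production outcomes for a given offered energy."""
    shortfall = np.maximum(bid - outcomes, 0.0)
    surplus = np.maximum(outcomes - bid, 0.0)
    return np.mean(price * outcomes - pen_short * shortfall - pen_surplus * surplus)

# Newsvendor-style optimal offer: a quantile set by the relative penalties.
tau = pen_surplus / (pen_surplus + pen_short)
bid_prob = np.quantile(ensemble, tau)    # offer derived from the probabilistic forecast
bid_det = ensemble.mean()                # a deterministic (point-forecast) offer

print(f"quantile bid (tau={tau:.2f}): {bid_prob:.1f} MWh, "
      f"expected revenue {expected_revenue(bid_prob, ensemble):.0f} EUR")
print(f"mean bid: {bid_det:.1f} MWh, "
      f"expected revenue {expected_revenue(bid_det, ensemble):.0f} EUR")
```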
Multi-Scale/Multi-Functional Probabilistic Composite Fatigue
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A multi-level (multi-scale/multi-functional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor and an aircraft wing. The results demonstrate that the blade will fail at the highest probability path, the engine two-stage rotor will fail by fracture at the rim, and the aircraft wing will fail at 10^9 fatigue cycles with a probability of 0.9967.
Probabilistic structural analysis methods and applications
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.
1988-01-01
An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
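The kind of question such a tool answers can be illustrated with a plain Monte Carlo version of a demand-versus-capacity check (the code described above uses fast probability integration with a finite-element model; the limit state, distributions, and values below are invented purely for illustration).

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

# Invented random inputs: applied load, cross-sectional area, and material strength.
load = rng.normal(100e3, 15e3, n)                                  # N
area = rng.normal(4.0e-4, 0.2e-4, n)                               # m^2
strength = rng.lognormal(mean=np.log(300e6), sigma=0.08, size=n)   # Pa

stress = load / area                  # trivial axial-stress "response model"
p_fail = np.mean(stress > strength)   # probability that demand exceeds capacity
print(f"estimated failure probability: {p_fail:.2e}")
```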
Probabilistic, Decision-theoretic Disease Surveillance and Control
Wagner, Michael; Tsui, Fuchiang; Cooper, Gregory; Espino, Jeremy U.; Harkema, Hendrik; Levander, John; Villamarin, Ricardo; Voorhees, Ronald; Millett, Nicholas; Keane, Christopher; Dey, Anind; Razdan, Manik; Hu, Yang; Tsai, Ming; Brown, Shawn; Lee, Bruce Y.; Gallagher, Anthony; Potter, Margaret
2011-01-01
The Pittsburgh Center of Excellence in Public Health Informatics has developed a probabilistic, decision-theoretic system for disease surveillance and control for use in Allegheny County, PA and later in Tarrant County, TX. This paper describes the software components of the system and its knowledge bases. The paper uses influenza surveillance to illustrate how the software components transform data collected by the healthcare system into population level analyses and decision analyses of potential outbreak-control measures. PMID:23569617
Probabilistic modeling of the evolution of gene synteny within reconciled phylogenies
2015-01-01
Background Most models of genome evolution concern either genetic sequences, gene content or gene order. They sometimes integrate two of the three levels, but rarely the three of them. Probabilistic models of gene order evolution usually have to assume constant gene content or adopt a presence/absence coding of gene neighborhoods which is blind to complex events modifying gene content. Results We propose a probabilistic evolutionary model for gene neighborhoods, allowing genes to be inserted, duplicated or lost. It uses reconciled phylogenies, which integrate sequence and gene content evolution. We are then able to optimize parameters such as phylogeny branch lengths, or probabilistic laws depicting the diversity of susceptibility of syntenic regions to rearrangements. We reconstruct a structure for ancestral genomes by optimizing a likelihood, keeping track of all evolutionary events at the level of gene content and gene synteny. Ancestral syntenies are associated with a probability of presence. We implemented the model with the restriction that at most one gene duplication separates two gene speciations in reconciled gene trees. We reconstruct ancestral syntenies on a set of 12 drosophila genomes, and compare the evolutionary rates along the branches and along the sites. We compare with a parsimony method and find a significant number of results not supported by the posterior probability. The model is implemented in the Bio++ library. It thus benefits from and enriches the classical models and methods for molecular evolution. PMID:26452018
Concurrent Probabilistic Simulation of High Temperature Composite Structural Response
NASA Technical Reports Server (NTRS)
Abdi, Frank
1996-01-01
A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, 'GENOA', is dedicated to parallel and high speed analysis to perform probabilistic evaluation of high temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives which were achieved in performing the development were: (1) utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of structure, material and processing of high temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism, and increasing convergence rates through high- and low-level processor assignment; (4) creation of the framework for a portable parallel architecture for machine-independent Multi Instruction Multi Data (MIMD), Single Instruction Multi Data (SIMD), hybrid and distributed workstation types of computers; and (5) market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.
Krejsa, Martin; Janas, Petr; Yilmaz, Işık; Marschalko, Marian; Bouchal, Tomas
2013-01-01
The load-carrying system of each construction should fulfill several conditions which represent reliability criteria in the assessment procedure. It is the theory of structural reliability which determines the probability of keeping the required properties of constructions. Using this theory, it is possible to apply probabilistic computations based on probability theory and mathematical statistics. These methods have become increasingly popular; they are used, in particular, in designs of load-carrying structures with the required level of reliability when at least some input variables in the design are random. The objective of this paper is to indicate the current scope which might be covered by the new method—Direct Optimized Probabilistic Calculation (DOProC)—in assessments of reliability of load-carrying structures. DOProC uses a purely numerical approach without any simulation techniques. This provides more accurate solutions to probabilistic tasks, and, in some cases, such an approach results in considerably faster completion of computations. DOProC can be used to solve a number of probabilistic computations efficiently. A very good sphere of application for DOProC is the assessment of bolt reinforcement in underground and mining workings. For the purposes above, a special software application—"Anchor"—has been developed. PMID:23935412
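In the same spirit of a purely numerical, simulation-free calculation, the sketch below discretizes a load effect and a resistance into probability histograms and combines them directly to obtain a failure probability. It is only a generic illustration of the idea of direct probabilistic calculation, not the DOProC algorithm or the "Anchor" software, and the distributions are invented.

```python
import numpy as np
from scipy.stats import norm

# Discretize load effect E and resistance R into probability histograms (invented parameters).
e_grid = np.linspace(0, 400, 2001)                       # MPa
r_grid = np.linspace(0, 600, 3001)                       # MPa
p_e = np.diff(norm.cdf(e_grid, loc=200, scale=30))       # P(E in each bin)
p_r = np.diff(norm.cdf(r_grid, loc=320, scale=35))       # P(R in each bin)
e_mid = 0.5 * (e_grid[:-1] + e_grid[1:])
r_mid = 0.5 * (r_grid[:-1] + r_grid[1:])

# Direct combination over all bin pairs (outer comparison instead of random sampling):
# P(failure) = P(R < E)
fail_matrix = r_mid[:, None] < e_mid[None, :]            # shape (len(r_mid), len(e_mid))
p_fail = float(p_r @ fail_matrix @ p_e)
print(f"P(R < E) = {p_fail:.3e}")
```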
Miran, Seyed M; Ling, Chen; James, Joseph J; Gerard, Alan; Rothfusz, Lans
2017-11-01
Effective design for presenting severe weather information is important to reduce the devastating consequences of severe weather. The Probabilistic Hazard Information (PHI) system for severe weather is being developed by the NOAA National Severe Storms Laboratory (NSSL) to communicate probabilistic hazardous weather information. This study investigates the effects of four PHI graphical designs for tornado threat, namely "four-color", "red-scale", "grayscale" and "contour", on users' perception, interpretation, and reaction to threat information. PHI is presented on either a map background or a radar background. Analysis showed that accuracy was significantly higher and response time faster when PHI was displayed on the map background as compared to the radar background due to better contrast. When displayed on a radar background, the "grayscale" design resulted in a higher accuracy of responses. Possibly due to familiarity, participants reported the four-color design as their favorite design, which also resulted in the fastest recognition of probability levels on both backgrounds. Our study shows the importance of using intuitive color-coding and sufficient contrast in conveying probabilistic threat information via graphical design. We also found that users follow a rational perceiving-judging-feeling-and-acting approach in processing probabilistic hazard information for tornadoes. Copyright © 2017 Elsevier Ltd. All rights reserved.
PySeqLab: an open source Python package for sequence labeling and segmentation.
Allam, Ahmed; Krauthammer, Michael
2017-11-01
Text and genomic data are composed of sequential tokens, such as words and nucleotides, that give rise to higher order syntactic constructs. In this work, we aim at providing a comprehensive Python library implementing conditional random fields (CRFs), a class of probabilistic graphical models, for robust prediction of these constructs from sequential data. Python Sequence Labeling (PySeqLab) is an open source package for performing supervised learning in structured prediction tasks. It implements CRF models, that is, discriminative models from (i) first-order to higher-order linear-chain CRFs, and from (ii) first-order to higher-order semi-Markov CRFs (semi-CRFs). Moreover, it provides multiple learning algorithms for estimating model parameters such as (i) stochastic gradient descent (SGD) and its multiple variations, (ii) structured perceptron with multiple averaging schemes supporting exact and inexact search using the 'violation-fixing' framework, (iii) the search-based probabilistic online learning algorithm (SAPO) and (iv) an interface for the Broyden-Fletcher-Goldfarb-Shanno (BFGS) and limited-memory BFGS algorithms. Viterbi and Viterbi A* are used for inference and decoding of sequences. Using PySeqLab, we built models (classifiers) and evaluated their performance in three different domains: (i) biomedical natural language processing (NLP), (ii) predictive DNA sequence analysis and (iii) human activity recognition (HAR). State-of-the-art performance comparable to machine-learning based systems was achieved in the three domains without feature engineering or the use of knowledge sources. PySeqLab is available through https://bitbucket.org/A_2/pyseqlab with tutorials and documentation. ahmed.allam@yale.edu or michael.krauthammer@yale.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
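Since the package relies on Viterbi decoding for inference, a compact generic version of first-order linear-chain Viterbi (written directly in NumPy, not using the PySeqLab API) may help clarify what that step does: it finds the label sequence maximizing the sum of per-position scores and transition scores.

```python
import numpy as np

def viterbi(scores, transitions):
    """Most likely label sequence for a linear-chain model.

    scores[t, y]       -- per-position (emission/feature) score of label y at position t
    transitions[y, y2] -- score of moving from label y to label y2
    """
    T, L = scores.shape
    best = np.zeros((T, L))
    back = np.zeros((T, L), dtype=int)
    best[0] = scores[0]
    for t in range(1, T):
        cand = best[t - 1][:, None] + transitions + scores[t][None, :]
        back[t] = cand.argmax(axis=0)     # best previous label for each current label
        best[t] = cand.max(axis=0)
    path = [int(best[-1].argmax())]
    for t in range(T - 1, 0, -1):         # backtrace
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Tiny toy example: 4 positions, 2 labels, "sticky" transitions.
scores = np.array([[2.0, 0.1], [0.2, 1.5], [0.3, 1.2], [1.8, 0.4]])
transitions = np.array([[0.5, -0.2], [-0.2, 0.5]])
print(viterbi(scores, transitions))
```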
Schurz, Matthias; Tholen, Matthias G; Perner, Josef; Mars, Rogier B; Sallet, Jerome
2017-09-01
In this quantitative review, we specified the anatomical basis of brain activity reported in the Temporo-Parietal Junction (TPJ) in Theory of Mind (ToM) research. Using probabilistic brain atlases, we labeled TPJ peak coordinates reported in the literature. This was carried out for four different atlas modalities: (i) gyral parcellation, (ii) sulco-gyral parcellation, (iii) cytoarchitectonic parcellation and (iv) connectivity-based parcellation. In addition, our review distinguished between two ToM task types (false belief and social animations) and a nonsocial task (attention reorienting). We estimated the mean probabilities of activation for each atlas label, and found that for all three task types part of the TPJ activations fell into the same areas: (i) Angular Gyrus (AG) and Lateral Occipital Cortex (LOC) in terms of a gyral atlas, (ii) AG and Superior Temporal Sulcus (STS) in terms of a sulco-gyral atlas, (iii) areas PGa and PGp in terms of cytoarchitecture and (iv) area TPJp in terms of a connectivity-based parcellation atlas. Besides these commonalities, we also found that individual task types showed preferential activation for particular labels. Main findings for the right hemisphere were preferential activation for false belief tasks in AG/PGa, and in Supramarginal Gyrus (SMG)/PFm for attention reorienting. Social animations showed the strongest selective activation in the left hemisphere, specifically in the left Middle Temporal Gyrus (MTG). We discuss how our results (i.e., identified atlas structures) can provide a new reference for describing future findings, with the aim to integrate different labels and terminologies used for studying brain activity around the TPJ. Hum Brain Mapp 38:4788-4805, 2017. © 2017 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four projects in the Probabilistic Methods area and two in the area of RMSL have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - Case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for what problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - Too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not true. There are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - probabilistic reliability literature search has been completed along with what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - Correct reliability computations both at the component and system level are needed so one can design an item based on its expected usage and life span; (4) Real World Applications of Probabilistic Methods (PM) - A draft of volume 1 comprising aerospace applications has been released. Volume 2, a compilation of real world applications of probabilistic methods with essential information demonstrating application type and time/cost savings by the use of probabilistic methods for generic applications, is in progress. Relevance to Industry & Government - Too often, we say, 'The Proof is in the Pudding'. With help from many contributors, we hope to produce such a document. Problem is - not too many people are coming forward due to proprietary nature. So, we are asking to document only minimum information including problem description, what method was used, did it result in any savings, and how much?; (5) Software Reliability - software reliability concept, program, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood & addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standards/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - Any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance & life.
Probabilistic Fiber Composite Micromechanics
NASA Technical Reports Server (NTRS)
Stock, Thomas A.
1996-01-01
Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
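A stripped-down version of this kind of simulation, shown below, samples constituent properties and propagates them through the longitudinal rule of mixtures for the ply modulus. The distributions and property values are generic placeholders rather than the report's graphite/epoxy inputs, and the micromechanics is reduced to a single relation for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

# Placeholder constituent uncertainties (moduli in GPa, volume fractions dimensionless).
E_fiber = rng.normal(230.0, 10.0, n)                       # fiber longitudinal modulus
E_matrix = rng.normal(3.5, 0.25, n)                        # matrix modulus
v_fiber = np.clip(rng.normal(0.60, 0.03, n), 0.0, 1.0)     # fiber volume ratio
v_void = np.clip(rng.normal(0.02, 0.01, n), 0.0, 0.2)      # void volume ratio

# Longitudinal ply modulus by the rule of mixtures, degraded by voids.
E11 = (E_fiber * v_fiber + E_matrix * (1.0 - v_fiber)) * (1.0 - v_void)

print(f"E11 mean = {E11.mean():.1f} GPa, std = {E11.std():.1f} GPa, "
      f"5th-95th percentile = ({np.percentile(E11, 5):.1f}, {np.percentile(E11, 95):.1f}) GPa")
```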
Bazo-Alvarez, Juan Carlos; Bazo-Alvarez, Oscar Alfredo; Aguila, Jeins; Peralta, Frank; Mormontoy, Wilfredo; Bennett, Ian M
2016-01-01
Our aim was to evaluate the psychometric properties of the FACES-III among Peruvian high school students. This is a psychometric cross-sectional study. A probabilistic sampling was applied, defined by three stages: stratum one (school), stratum two (grade) and cluster (section). The participants were 910 adolescent students of both sexes, between 11 and 18 years of age. The instrument was also the object of study: the Olson's FACES-III. The analysis included a review of the structure / construct validity of the measure by factor analysis and assessment of internal consistency (reliability). The real-cohesion scale had moderately high reliability (Ω=.85) while the real-flexibility scale had moderate reliability (Ω=.74). The reliability found for the ideal-cohesion was moderately high (Ω=.89) like for the scale of ideal-flexibility (Ω=.86). Construct validity was confirmed by the goodness of fit of a two factor model (cohesion and flexibility) with 10 items each [Adjusted goodness of fit index (AGFI) = 0.96; Expected Cross Validation Index (ECVI) = 0.87; Normed fit index (NFI) = 0.93; Goodness of fit index (GFI) = 0.97; Root mean square error of approximation (RMSEA) = 0.06]. FACES-III has sufficient reliability and validity to be used in Peruvian adolescents for the purpose of group or individual assessment.
Hunt, James; Birch, Gavin; Warne, Michael St J
2010-05-01
Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method that accounted for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, including four areas of the estuary based on three exposure scenarios (low tide, high tide, and both low and high tides) and two toxicity scenarios (chronic no-observed-effect concentrations [NOEC] and 50% effect concentrations [EC50]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario including data combined from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta=25%)-closest to the source area-followed by the inner estuary (delta=4%) and the Floodvale tributary (delta=2%), with the lowest risk in the outer estuary (delta=0.1%), farthest from the source area. Going from the screening level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment. Copyright (c) 2010 SETAC.
Development of a Probabilistic Decision-Support Model to Forecast Coastal Resilience
NASA Astrophysics Data System (ADS)
Wilson, K.; Safak, I.; Brenner, O.; Lentz, E. E.; Hapke, C. J.
2016-02-01
Site-specific forecasts of coastal change are a valuable management tool in preparing for and assessing storm-driven impacts in coastal areas. More specifically, understanding the likelihood of storm impacts, recovery following events, and the alongshore variability of both is central in evaluating vulnerability and resiliency of barrier islands. We introduce a probabilistic modeling framework that integrates hydrodynamic, anthropogenic, and morphologic components of the barrier system to evaluate coastal change at Fire Island, New York. The model is structured on a Bayesian network (BN), which utilizes observations to learn statistical relationships between system variables. In addition to predictive ability, probabilistic models convey the level of confidence associated with a prediction, an important consideration for coastal managers. Our model predicts the likelihood of morphologic change on the upper beach based on several decades of beach monitoring data. A coupled hydrodynamic BN combines probabilistic and deterministic modeling approaches; by querying nearly two decades of nested-grid wave simulations that account for both distant swells and local seas, we produce scenarios of event and seasonal wave climates. The wave scenarios of total water level - a sum of run up, surge and tide - and anthropogenic modification are the primary drivers of morphologic change in our model structure. Preliminary results show the hydrodynamic BN is able to reproduce time series of total water levels, a critical validation process before generating scenarios, and forecasts of geomorphic change over three month intervals are up to 70% accurate. Predictions of storm-induced change and recovery are linked to evaluate zones of persistent vulnerability or resilience and will help managers target restoration efforts, identify areas most vulnerable to habitat degradation, and highlight resilient zones that may best support relocation of critical infrastructure.
NASA Astrophysics Data System (ADS)
Bandini, Filippo; Butts, Michael; Vammen Jacobsen, Torsten; Bauer-Gottwein, Peter
2016-04-01
Integrated hydrological models are generally calibrated against observations of river discharge and piezometric head in groundwater aquifers. Integrated hydrological models are rarely calibrated against spatially distributed water level observations measured by either in-situ stations or spaceborne platforms. Indeed, in-situ observations derived from ground-based stations are generally spaced too far apart to capture spatial patterns in the water surface. On the other hand, spaceborne observations have limited spatial resolution. Additionally, satellite observations have a temporal resolution which is not ideal for observing the temporal patterns of the hydrological variables during extreme events. UAVs (Unmanned Aerial Vehicles) offer several advantages: i) high spatial resolution; ii) tracking of the water body better than any satellite technology; iii) timing of the sampling depending merely on the operators. In this case study the Mølleåen river (Denmark) and its catchment have been simulated through an integrated hydrological model (MIKE 11-MIKE SHE). This model was initially calibrated against observations of river discharge retrieved by in-situ stations and against piezometric head of the aquifers. Subsequently the hydrological model has been calibrated against dense spatially distributed water level observations, which could potentially be retrieved by UAVs. Error characteristics of synthetic UAV water level observations were taken from a recent proof-of-concept study. Since the technology for ranging water level is under development, UAV synthetic water level observations were extracted from another model of the river with higher spatial resolution (cross sections located every 10 m). This high-resolution model is assumed to be the absolute truth for the purpose of this work. The river model with the coarser resolution has been calibrated against the synthetic water level observations through the Differential Evolution Adaptive Metropolis (DREAM) algorithm, an efficient global Markov Chain Monte Carlo (MCMC) sampler for high-dimensional spaces. Calibration against water level has demonstrated a significant improvement of the estimation of the exchange flow between groundwater and the river branch. Groundwater flux and direction are now better simulated. Reliability and sharpness of the probabilistic forecasts are assessed with the sharpness, the interval skill score (ISS) of the 95% confidence interval, and the root mean square error (RMSE) of the maximum a posteriori probability (MAP) estimate. The binary outcome (either gaining or losing stream) of the flow direction is assessed with the Brier score (BS). After water level calibration the sharpness of the estimations is approximately doubled with respect to the model calibrated only against discharge, the ISS has improved from 2.4×10^-7 to 7.8×10^-8 m^3/s·m, the RMSE from 9.2×10^-8 to 2.4×10^-8 m^3/s·m, and the BS is halved from 0.58 to 0.25.
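For reference, the two probabilistic verification scores mentioned above can be computed as in the sketch below, using the standard Gneiting-Raftery interval score for the 95% interval and the usual definition of the Brier score; the toy numbers are invented, and the authors' exact ISS definition may differ in detail.

```python
import numpy as np

def interval_score(lower, upper, obs, alpha=0.05):
    """Gneiting-Raftery interval score for central (1 - alpha) prediction intervals (lower is better)."""
    lower, upper, obs = map(np.asarray, (lower, upper, obs))
    width = upper - lower
    below = (2.0 / alpha) * (lower - obs) * (obs < lower)   # penalty when the observation falls below
    above = (2.0 / alpha) * (obs - upper) * (obs > upper)   # penalty when it falls above
    return np.mean(width + below + above)

def brier_score(prob_event, event_occurred):
    """Mean squared difference between forecast probabilities and binary outcomes."""
    prob = np.asarray(prob_event, dtype=float)
    outcome = np.asarray(event_occurred, dtype=float)
    return np.mean((prob - outcome) ** 2)

# Toy example: 95% intervals for an exchange flux, and probabilities of a gaining stream.
lo = [0.1, 0.2, 0.0, 0.3]
hi = [0.9, 1.1, 0.8, 1.2]
obs = [0.5, 1.3, 0.4, 0.2]
print("interval score:", interval_score(lo, hi, obs))
print("Brier score:", brier_score([0.9, 0.4, 0.7, 0.2], [1, 0, 1, 1]))
```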
Displaying uncertainty: investigating the effects of display format and specificity.
Bisantz, Ann M; Marsiglio, Stephanie Schinzing; Munch, Jessica
2005-01-01
We conducted four studies regarding the representation of probabilistic information. Experiments 1 through 3 compared performance on a simulated stock purchase task, in which information regarding stock profitability was probabilistic. Two variables were manipulated: display format for probabilistic information (blurred and colored icons, linguistic phrases, numeric expressions, and combinations) and specificity level (in which the number and size of discrete steps into which the probabilistic information was mapped differed). Results indicated few performance differences attributable to display format; however, performance did improve with greater specificity. Experiment 4, in which participants generated membership functions corresponding to three display formats, found a high degree of similarity in functions across formats and participants and a strong relationship between the shape of the membership function and the intended meaning of the representation. These results indicate that participants can successfully interpret nonnumeric representations of uncertainty and can use such representations in a manner similar to the way numeric expressions are used in a decision-making task. Actual or potential applications of this research include the use of graphical representations of uncertainty in systems such as command and control and situation displays.
Probabilistic verification of cloud fraction from three different products with CALIPSO
NASA Astrophysics Data System (ADS)
Jung, B. J.; Descombes, G.; Snyder, C.
2017-12-01
In this study, we present how Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) can be used for probabilistic verification of cloud fraction, and apply this probabilistic approach to three cloud fraction products: a) The Air Force Weather (AFW) World Wide Merged Cloud Analysis (WWMCA), b) Satellite Cloud Observations and Radiative Property retrieval Systems (SatCORPS) from NASA Langley Research Center, and c) Multi-sensor Advection Diffusion nowCast (MADCast) from NCAR. Although they differ in their details, both WWMCA and SatCORPS retrieve cloud fraction from satellite observations, mainly of infrared radiances. MADCast utilizes in addition a short-range forecast of cloud fraction (provided by the Model for Prediction Across Scales, assuming cloud fraction is advected as a tracer) and a column-by-column particle filter implemented within the Gridpoint Statistical Interpolation (GSI) data-assimilation system. The probabilistic verification considers the retrieved or analyzed cloud fractions as predicting the probability of cloud at any location within a grid cell and the 5-km vertical feature mask (VFM) from CALIPSO level-2 products as a point observation of cloud.
NASA Astrophysics Data System (ADS)
Sujadi, Imam; Kurniasih, Rini; Subanti, Sri
2017-05-01
In the era of 21st-century learning, technology needs to be used as a learning medium. Using Edmodo as a learning medium is one option to complement the learning process. This research focuses on the effectiveness of learning materials delivered through Edmodo. The aim of this research is to determine whether the level of probabilistic thinking of grade 8 students who use learning materials with Edmodo is better than that of students who use the existing learning materials (books). This is a quasi-experimental study with a control-group pretest-posttest design. The population was grade 8 students of SMPN 12 Surakarta, and random sampling was used. Two independent samples were compared using the Kolmogorov-Smirnov test. The obtained test statistic, M = 0.38, exceeds the tabled critical one-tailed value M0.05 = 0.011. The results indicate that learning materials with Edmodo enhance learners' level of probabilistic thinking more effectively than the existing learning materials (books). Therefore, learning materials using Edmodo can be used in the learning process and can be developed further into other learning materials through Edmodo.
Probabilistic Space Weather Forecasting: a Bayesian Perspective
NASA Astrophysics Data System (ADS)
Camporeale, E.; Chandorkar, M.; Borovsky, J.; Carè, A.
2017-12-01
Most Space Weather forecasts, both at the operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven, machine learning algorithms, to physics-based approximation of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to finite-precision implementation of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is by embracing a Bayesian probabilistic approach. One of the simplest and most robust machine learning techniques in the Bayesian framework is Gaussian Process regression and classification. Here, we present the application of Gaussian Processes to the problems of the DST geomagnetic index forecast, the solar wind type classification, and the estimation of diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We show that we have achieved excellent results in all of the standard metrics to evaluate our models, with very modest computational cost.
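As a minimal example of Gaussian Process regression producing a predictive distribution rather than a point forecast, here is a generic scikit-learn sketch on synthetic data; it is not the authors' Dst or radiation-belt setup, and the kernel choice is only illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 10, 40)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.2, X.shape[0])   # noisy synthetic "index"

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_new = np.linspace(0, 12, 25).reshape(-1, 1)
mean, std = gp.predict(X_new, return_std=True)   # predictive mean and standard deviation
for x, m, s in zip(X_new.ravel()[:5], mean[:5], std[:5]):
    print(f"x={x:4.1f}  forecast={m:+.2f} ± {1.96 * s:.2f} (95%)")
```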
A Hybrid Probabilistic Model for Unified Collaborative and Content-Based Image Tagging.
Zhou, Ning; Cheung, William K; Qiu, Guoping; Xue, Xiangyang
2011-07-01
The increasing availability of large quantities of user contributed images with labels has provided opportunities to develop automatic tools to tag images to facilitate image search and retrieval. In this paper, we present a novel hybrid probabilistic model (HPM) which integrates low-level image features and high-level user provided tags to automatically tag images. For images without any tags, HPM predicts new tags based solely on the low-level image features. For images with user provided tags, HPM jointly exploits both the image features and the tags in a unified probabilistic framework to recommend additional tags to label the images. The HPM framework makes use of the tag-image association matrix (TIAM). However, since the number of images is usually very large and user-provided tags are diverse, TIAM is very sparse, thus making it difficult to reliably estimate tag-to-tag co-occurrence probabilities. We developed a collaborative filtering method based on nonnegative matrix factorization (NMF) for tackling this data sparsity issue. Also, an L1 norm kernel method is used to estimate the correlations between image features and semantic concepts. The effectiveness of the proposed approach has been evaluated using three databases containing 5,000 images with 371 tags, 31,695 images with 5,587 tags, and 269,648 images with 5,018 tags, respectively.
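The collaborative-filtering step can be illustrated with a generic nonnegative matrix factorization of a small, dense toy tag-image association matrix using scikit-learn; the real TIAM is far larger and sparser, and this is not the paper's exact formulation or kernel-based feature model.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)

# Toy tag-image association matrix: rows = images, columns = tags (0/1 associations).
tiam = (rng.random((30, 12)) < 0.15).astype(float)

model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(tiam)      # low-rank image factors
H = model.components_              # low-rank tag factors

reconstructed = W @ H              # smoothed association scores used for recommendation
image_id = 0
suggested = np.argsort(-reconstructed[image_id])[:3]
print("top suggested tag indices for image 0:", suggested,
      "| existing tags:", np.flatnonzero(tiam[image_id]))
```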
Valdor, Paloma F; Gómez, Aina G; Puente, Araceli
2015-01-15
Diffuse pollution from oil spills is a widespread problem in port areas (as a result of fuel supply, navigation and loading/unloading activities). This article presents a method to assess the environmental risk of oil handling facilities in port areas. The method is based on (i) identification of environmental hazards, (ii) characterization of meteorological and oceanographic conditions, (iii) characterization of environmental risk scenarios, and (iv) assessment of environmental risk. The procedure has been tested by application to the Tarragona harbor. The results show that the method is capable of representing (i) specific local pollution cases (i.e., discriminating between products and quantities released by a discharge source), (ii) oceanographic and meteorological conditions (selecting a representative data subset), and (iii) potentially affected areas in probabilistic terms. Accordingly, it can inform the design of monitoring plans to study and control the environmental impact of these facilities, as well as the design of contingency plans. Copyright © 2014 Elsevier Ltd. All rights reserved.
[The research protocol III. Study population].
Arias-Gómez, Jesús; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe
2016-01-01
The study population is defined as a set of cases that is determined, limited, and accessible, which will constitute the subjects for the selection of the sample, and it must fulfill several characteristics and distinct criteria. The objectives of this manuscript are focused on specifying each one of the elements required to select the participants of a research project during the elaboration of the protocol, including the concepts of study population, sample, selection criteria and sampling methods. After delineating the study population, the researcher must specify the criteria that each participant has to comply with. The criteria that include the specific characteristics are termed selection or eligibility criteria. These criteria are inclusion, exclusion and elimination criteria, and they delineate the eligible population. Sampling methods are divided into two large groups: 1) probabilistic or random sampling and 2) non-probabilistic sampling. The difference lies in the use of statistical methods to select the subjects. In every study, it is necessary to establish at the beginning the specific number of participants to be included to achieve the objectives of the study. This number is the sample size, and it can be calculated or estimated with mathematical formulas and statistical software.
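The sample-size step mentioned at the end can be made concrete with the textbook formula for estimating a proportion, n = z^2 p(1-p)/e^2, optionally adjusted by a finite-population correction; this is a standard illustration, not a formula prescribed by the article.

```python
import math

def sample_size_proportion(p=0.5, margin=0.05, confidence=0.95, population=None):
    """Sample size to estimate a proportion p within +/- margin at the given confidence level."""
    # two-sided standard normal quantiles for common confidence levels
    z = {0.90: 1.644854, 0.95: 1.959964, 0.99: 2.575829}[confidence]
    n = (z ** 2) * p * (1 - p) / margin ** 2
    if population is not None:                      # finite-population correction
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

print(sample_size_proportion())                     # infinite population: 385
print(sample_size_proportion(population=2000))      # corrected for a population of 2000
```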
Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L
2008-04-01
Recently, we developed a GIS-Integrated Integral Risk Index (IRI) to assess human health risks in areas with presence of environmental pollutants. Contaminants were previously ranked by applying a self-organizing map (SOM) to their characteristics of persistence, bioaccumulation, and toxicity in order to obtain the Hazard Index (HI). In the present study, the original IRI was substantially improved by allowing the entrance of probabilistic data. A neuroprobabilistic HI was developed by combining SOM and Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) were the pollutants showing the highest and lowest values of HI, respectively. However, the bioaccumulation value of heavy metals notably increased after considering a probability density function to describe the bioaccumulation factor. To check its applicability, a case study was investigated. The probabilistic integral risk was calculated in the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The risk change between 2002 and 2005 was evaluated on the basis of probabilistic data on the levels of various pollutants in soils. The results indicated that the risk of the chemicals under study did not follow a homogeneous tendency. However, the current levels of pollution do not constitute a relevant source of health risks for the local population. Moreover, the neuroprobabilistic HI seems to be an adequate tool to be taken into account in risk assessment processes.
Reddy, Lena Felice; Waltz, James A; Green, Michael F; Wynn, Jonathan K; Horan, William P
2016-07-01
Although individuals with schizophrenia show impaired feedback-driven learning on probabilistic reversal learning (PRL) tasks, the specific factors that contribute to these deficits remain unknown. Recent work has suggested several potential causes including neurocognitive impairments, clinical symptoms, and specific types of feedback-related errors. To examine this issue, we administered a PRL task to 126 stable schizophrenia outpatients and 72 matched controls, and patients were retested 4 weeks later. The task involved an initial probabilistic discrimination learning phase and subsequent reversal phases in which subjects had to adjust their responses to sudden shifts in the reinforcement contingencies. Patients showed poorer performance than controls for both the initial discrimination and reversal learning phases of the task, and performance overall showed good test-retest reliability among patients. A subgroup analysis of patients (n = 64) and controls (n = 49) with good initial discrimination learning revealed no between-group differences in reversal learning, indicating that the patients who were able to achieve all of the initial probabilistic discriminations were not impaired in reversal learning. Regarding potential contributors to impaired discrimination learning, several factors were associated with poor PRL, including higher levels of neurocognitive impairment, poor learning from both positive and negative feedback, and higher levels of indiscriminate response shifting. The results suggest that poor PRL performance in schizophrenia can be the product of multiple mechanisms. © The Author 2016. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Composite Load Spectra for Select Space Propulsion Structural Components
NASA Technical Reports Server (NTRS)
Ho, Hing W.; Newell, James F.
1994-01-01
Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for the dynamic, acoustic, high-pressure, high-rotational-speed, and other load components into a composite load simulation using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components in conjunction with the PSAM (Probabilistic Structural Analysis Method) to perform probabilistic load evaluation and life prediction evaluations are also described to illustrate the effectiveness of the coupled model approach.
Bayesian probabilistic population projections for all countries.
Raftery, Adrian E; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K
2012-08-28
Projections of countries' future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950-1990 are used for estimation, and applied to predict 1990-2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20-64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades.
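A heavily simplified illustration of the difference between deterministic variants and probabilistic projections: sample uncertain growth-rate trajectories and report prediction intervals for the projected population. The numbers are toys, and a single aggregate growth rate stands in for the paper's hierarchical, cohort-component machinery.

```python
import numpy as np

rng = np.random.default_rng(8)
n_traj, horizon = 10_000, 40                           # trajectories, years

pop0 = 10.0                                            # initial population in millions (toy value)
rates = rng.normal(0.005, 0.004, (n_traj, horizon))    # uncertain annual growth rates
pop = pop0 * np.cumprod(1.0 + rates, axis=1)           # one trajectory per row

final = pop[:, -1]
lo, med, hi = np.percentile(final, [2.5, 50, 97.5])
print(f"population after {horizon} years: median {med:.1f}M, 95% PI ({lo:.1f}M, {hi:.1f}M)")
```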
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bacvarov, D.C.
1981-01-01
A new method for probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes is presented. The utilized approach of applying the finite element method for probabilistic risk assessment is demonstrated to be very powerful. The reasons for this are two. First, the finite element method is inherently suitable for analysis of three dimensional spaces where the parameters, such as three variate probability densities of the lightning currents, are non-uniformly distributed. Second, the finite element method permits non-uniform discretization of the three dimensional probability spaces thus yielding high accuracy in critical regions, such as the area of the low probability events, while at the same time maintaining coarse discretization in the non-critical areas to keep the number of grid points and the size of the problem to a manageable low level. The finite element probabilistic risk assessment method presented here is based on a new multidimensional search algorithm. It utilizes an efficient iterative technique for finite element interpolation of the transmission line insulation flashover criteria computed with an electro-magnetic transients program. Compared to other available methods the new finite element probabilistic risk assessment method is significantly more accurate and approximately two orders of magnitude computationally more efficient. The method is especially suited for accurate assessment of rare, very low probability events.
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is 2-fold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case, where they were equal. The failed region patterns between methods are similar; however, differences arise because element elimination reduces stress, which causes probabilistic failed regions to continue to accumulate after no further deterministic failed regions would be predicted. Both the probabilistic and deterministic methods indicate similar trends with regards to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
Bayesian Analysis of a Reduced-Form Air Quality Model
Numerical air quality models are being used for assessing emission control strategies for improving ambient pollution levels across the globe. This paper applies probabilistic modeling to evaluate the effectiveness of emission reduction scenarios aimed at lowering ground-level ozone...
A SIMPLE CELLULAR AUTOMATON MODEL FOR HIGH-LEVEL VEGETATION DYNAMICS
We have produced a simple two-dimensional (ground-plan) cellular automata model of vegetation dynamics specifically to investigate high-level community processes. The model is probabilistic, with individual plant behavior determined by physiologically-based rules derived from a w...
Probabilistic/Fracture-Mechanics Model For Service Life
NASA Technical Reports Server (NTRS)
Watkins, T., Jr.; Annis, C. G., Jr.
1991-01-01
Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
NASA Astrophysics Data System (ADS)
Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor
2017-04-01
Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located about 5 km from the southern boundary of Budapest. The quake caused serious damage in the epicentral area and in the southern districts of the capital. The epicentral area of the earthquake is located along the Danube River. Sand boils were observed in some locations, indicating the occurrence of liquefaction. Because their exact locations were recorded at the time of the earthquake, in situ geotechnical measurements (CPT and SPT) could be performed at two sites (Dunaharaszti and Taksony). The different types of measurements enabled probabilistic liquefaction hazard computations at the two studied sites. We have compared the return periods of liquefaction computed using the different built-in simplified stress-based methods.
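As a rough illustration of the performance-based combination described above, the sketch below sums a conditional probability of liquefaction over PGA-magnitude bins to obtain an annual triggering rate and return period. The disaggregation rates and the logistic triggering model are purely hypothetical placeholders, not the Cetin or Boulanger-Idriss relationships used in the study.

```python
import numpy as np

# Hypothetical joint rate increments d_lambda[i, j] for PGA bin a[i] and magnitude bin m[j],
# as would be obtained from PSHA disaggregation (all values are illustrative).
pga_bins = np.array([0.05, 0.10, 0.20, 0.40])          # g
mag_bins = np.array([5.0, 6.0, 7.0])                    # Mw
d_lambda = np.array([[2e-2, 8e-3, 2e-3],
                     [6e-3, 3e-3, 1e-3],
                     [1e-3, 8e-4, 4e-4],
                     [1e-4, 1e-4, 8e-5]])               # annual rates per (a, m) bin

def prob_liquefaction(pga, mag, n160cs=12.0):
    """Placeholder conditional probability of liquefaction P[L | a, m].
    A real application would use e.g. Cetin et al. (2004) or Boulanger and
    Idriss (2014); this logistic form is illustrative only."""
    csr = 0.65 * pga * 1.2                               # crude cyclic stress ratio proxy
    msf = (mag / 7.5) ** -1.1                            # magnitude scaling proxy
    x = 3.0 * np.log(csr / msf) - 0.2 * n160cs + 4.0
    return 1.0 / (1.0 + np.exp(-x))

# Performance-based combination: total annual rate of liquefaction triggering.
rate = sum(prob_liquefaction(a, m) * d_lambda[i, j]
           for i, a in enumerate(pga_bins)
           for j, m in enumerate(mag_bins))
print(f"annual rate = {rate:.4e}, return period ≈ {1.0 / rate:.0f} years")
```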
Kwon, H G; Hong, J H; Jang, S H
2011-12-01
Little is known about the detailed anatomic location and somatotopic arrangement at the CP. Using DTT with FSL tools, we conducted an investigation of the anatomic location and somatotopic arrangement of the CST at the CP in the human brain. We recruited 43 healthy volunteers for this study. DTI was obtained by using 1.5T, and CSTs for the hand and leg were obtained by using the FSL tool. The somatotopic location of the CST was evaluated as the highest probabilistic location at the upper and lower midbrain. The posterior boundary was determined as the line between the interpeduncular fossa and the lateral sulcus; we then drew a rectangle on the basis of the boundary of the CP. In the mediolateral direction, the highest probabilistic locations for the hand and leg were an average of 60.46% and 69.98% from the medial boundary at the upper midbrain level and 53.44% and 62.76% at the lower midbrain level, respectively. As for the anteroposterior direction, the highest probabilistic locations for the hand and leg were an average of 28.26% and 32.03% from the anterior boundary at the upper midbrain level and 30.19% and 33.59% at the lower midbrain level, respectively. We found that the hand somatotopy for the CST is located at the middle portion of the CP and the leg somatotopy is located lateral to the hand somatotopy.
Probabilistic assessment of erosion and flooding risk in the northern Gulf of Mexico
NASA Astrophysics Data System (ADS)
Wahl, Thomas; Plant, Nathaniel G.; Long, Joseph W.
2016-05-01
We assess erosion and flooding risk in the northern Gulf of Mexico by identifying interdependencies among oceanographic drivers and probabilistically modeling the resulting potential for coastal change. Wave and water level observations are used to determine relationships between six hydrodynamic parameters that influence total water level and therefore erosion and flooding, through consideration of a wide range of univariate distribution functions and multivariate elliptical copulas. Using these relationships, we explore how different our interpretation of the present-day erosion/flooding risk could be if we had seen more or fewer extreme realizations of individual and combinations of parameters in the past by simulating 10,000 physically and statistically consistent sea-storm time series. We find that seasonal total water levels associated with the 100 year return period could be up to 3 m higher in summer and 0.6 m higher in winter relative to our best estimate based on the observational records. Impact hours of collision and overwash—where total water levels exceed the dune toe or dune crest elevations—could be on average 70% (collision) and 100% (overwash) larger than inferred from the observations. Our model accounts for non-stationarity in a straightforward, non-parametric way that can be applied (with minor adjustments) to many other coastlines. The probabilistic model presented here, which accounts for observational uncertainty, can be applied to other coastlines where short record lengths limit the ability to identify the full range of possible wave and water level conditions that coastal managers and planners must consider to develop sustainable management strategies.
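The copula-based simulation step described in this abstract can be sketched as follows. The marginal families, the synthetic "observations", and the two-variable setup are illustrative assumptions; the study itself screens many candidate marginals and models six hydrodynamic parameters.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical observed sea-storm parameters: significant wave height Hs (m) and surge (m).
hs_obs = 2.5 * rng.weibull(2.0, 500)
surge_obs = 0.15 * hs_obs + rng.gumbel(0.0, 0.2, 500)

# 1. Fit univariate marginals (families fixed here for brevity).
hs_params = stats.weibull_min.fit(hs_obs, floc=0)
surge_params = stats.gumbel_r.fit(surge_obs)

# 2. Fit an elliptical (Gaussian) copula from the rank-transformed observations.
u = np.column_stack([stats.rankdata(hs_obs), stats.rankdata(surge_obs)]) / (len(hs_obs) + 1)
z = stats.norm.ppf(u)
rho = np.corrcoef(z.T)

# 3. Simulate 10,000 statistically consistent synthetic sea storms.
z_sim = rng.multivariate_normal([0.0, 0.0], rho, size=10_000)
u_sim = stats.norm.cdf(z_sim)
hs_sim = stats.weibull_min.ppf(u_sim[:, 0], *hs_params)
surge_sim = stats.gumbel_r.ppf(u_sim[:, 1], *surge_params)
print(f"max simulated Hs: {hs_sim.max():.1f} m, max simulated surge: {surge_sim.max():.2f} m")
```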
Probabilistic assessment of erosion and flooding risk in the northern Gulf of Mexico
Plant, Nathaniel G.; Wahl, Thomas; Long, Joseph W.
2016-01-01
We assess erosion and flooding risk in the northern Gulf of Mexico by identifying interdependencies among oceanographic drivers and probabilistically modeling the resulting potential for coastal change. Wave and water level observations are used to determine relationships between six hydrodynamic parameters that influence total water level and therefore erosion and flooding, through consideration of a wide range of univariate distribution functions and multivariate elliptical copulas. Using these relationships, we explore how different our interpretation of the present-day erosion/flooding risk could be if we had seen more or fewer extreme realizations of individual and combinations of parameters in the past by simulating 10,000 physically and statistically consistent sea-storm time series. We find that seasonal total water levels associated with the 100 year return period could be up to 3 m higher in summer and 0.6 m higher in winter relative to our best estimate based on the observational records. Impact hours of collision and overwash—where total water levels exceed the dune toe or dune crest elevations—could be on average 70% (collision) and 100% (overwash) larger than inferred from the observations. Our model accounts for non-stationarity in a straightforward, non-parametric way that can be applied (with minor adjustments) to many other coastlines. The probabilistic model presented here, which accounts for observational uncertainty, can be applied to other coastlines where short record lengths limit the ability to identify the full range of possible wave and water level conditions that coastal managers and planners must consider to develop sustainable management strategies.
Probabilistic Mapping of Storm-induced Coastal Inundation for Climate Change Adaptation
NASA Astrophysics Data System (ADS)
Li, N.; Yamazaki, Y.; Roeber, V.; Cheung, K. F.; Chock, G.
2016-02-01
Global warming is posing an imminent threat to coastal communities worldwide. Under the IPCC RCP8.5 scenario, we utilize hurricane events downscaled from a CMIP5 global climate model using the stochastic-deterministic method of Emanuel (2013, Proc. Nat. Acad. Sci.) in a pilot study to develop an inundation map with projected sea-level rise for the urban Honolulu coast. The downscaling is performed for a 20-year period from 2081 to 2100 to capture ENSO, which strongly influences hurricane activity in the Pacific. A total of 50 simulations provide a quasi-stationary dataset of 1000 years for probabilistic analysis of the flood hazards toward the end of the century. We utilize the meta-model Hakou, which is based on precomputed hurricane scenarios using ADCIRC, SWAN, and a 1D Boussinesq model (Kennedy et al., 2012, Ocean Modelling), to estimate the annual maximum inundation along the project coastline at the present sea level. Screening of the preliminary results identifies the three most severe events for detailed inundation modeling using the package of Li et al. (2014, Ocean Modelling) at the projected sea level. For each event, the third generation spectral model WAVEWATCH III of Tolman (2008, Ocean Modelling) provides the hurricane waves and the circulation model NEOWAVE of Yamazaki et al. (2009, 2011, Int. J. Num. Meth. Fluids) computes the surge using a system of telescopic nested grids from the open ocean to the project coastline. The output defines the boundary conditions and initial still-water elevation for computation of phase-resolving surf-zone and inundation processes using the 2D Boussinesq model of Roeber and Cheung (2012, Coastal Engineering). Each computed inundation event corresponds to an annual maximum, and with 1000 years of data, has an occurrence probability of 0.1% in a given year. Barring the tail of the distribution, aggregation of the three computed events allows delineation of the inundation zone with annual exceedance probability equal to or greater than 0.2% (equivalent to a 500-year return period). An immediate application is to assess the inventory of buildings and structures in Honolulu that would be exposed to increased flood risks due to climate change and to identify potential revisions to the building code as part of the adaptation process.
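A minimal sketch of the probabilistic step described above: given a long record of simulated annual maxima (here synthetic Gumbel values standing in for the 1000 years of model output), empirical annual exceedance probabilities and the roughly 500-year inundation level follow from a simple plotting-position calculation.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical stand-in for 1000 years of simulated annual maximum inundation depth (m).
annual_max = rng.gumbel(loc=0.8, scale=0.5, size=1000)

# Empirical annual exceedance probability via the Weibull plotting position.
sorted_max = np.sort(annual_max)[::-1]
aep = np.arange(1, len(sorted_max) + 1) / (len(sorted_max) + 1)

# Inundation level with AEP <= 0.2% (roughly the 500-year event used in the study).
level_500yr = sorted_max[aep <= 0.002][-1]
print(f"~500-year inundation level: {level_500yr:.2f} m")
```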
Nishita, Toshiho; Tomita, Yuichiro; Yorifuji, Daisuke; Orito, Kensuke; Ochiai, Hideharu; Arishima, Kazuyosi
2011-11-26
The developmental profile of chicken carbonic anhydrase-III (CA-III) blood levels has not been previously determined or reported. We isolated CA-III from chicken muscle and investigated age-related changes in the levels of CA-III in blood. CA-III was purified from chicken muscle. The levels of CA-III in plasma and erythrocytes from 278 female chickens (aged 1-93 weeks) and 68 male chickens (aged 3-59 weeks) were determined by ELISA. The mean level of CA-III in female chicken erythrocytes (1 week old) was 4.6 μg/g of Hb, and the CA-III level did not change until 16 weeks of age. The level then increased until 63 weeks of age (11.8 μg/g of Hb), decreased to 4.7 μg/g of Hb at 73 weeks of age, and increased again until 93 weeks of age (8.6 μg/g of Hb). The mean level of CA-III in erythrocytes from male chickens (3 weeks old) was 2.4 μg/g of Hb, and this level remained steady until 59 weeks of age. The mean plasma level of CA-III in 1-week-old female chickens was 60 ng/mL, and this level was increased at 3 weeks of age (141 ng/mL) and then remained steady until 80 weeks of age (122 ng/mL). The mean plasma level of CA-III in 3-week-old male chickens was 58 ng/mL, and this level remained steady until 59 weeks of age. We observed both developmental changes and sex differences in CA-III concentrations in White Leghorn (WL) chicken erythrocytes and plasma. Simple linear regression analysis showed a significant association between the erythrocyte CA-III level and egg-laying rate in WL-chickens 16-63 weeks of age (p < 0.01).
Pouzou, Jane G.; Cullen, Alison C.; Yost, Michael G.; Kissel, John C.; Fenske, Richard A.
2018-01-01
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. PMID:29105804
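The Monte Carlo exposure calculation described here can be illustrated with a toy sketch. The distributions, the no-observable-effect level, and the margin of exposure below are hypothetical placeholders, not the AHED/PHED data or EPA values.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Hypothetical input distributions (all values illustrative):
unit_exposure = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=n)   # mg a.i. per lb handled
amount_handled = rng.uniform(200, 1200, size=n)                        # lb a.i. handled per day
body_weight = rng.normal(80, 12, size=n).clip(40, 150)                 # kg

dose = unit_exposure * amount_handled / body_weight                    # mg/kg/day

noel = 0.56            # hypothetical no-observable-effect level, mg/kg/day
margin = 100           # hypothetical required margin of exposure
level_of_concern = noel / margin

frac_exceeding = np.mean(dose > level_of_concern)
print(f"fraction of simulated workers above the level of concern: {frac_exceeding:.1%}")
```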
Bragman, Felix J.S.; McClelland, Jamie R.; Jacob, Joseph; Hurst, John R.; Hawkes, David J.
2017-01-01
A fully automated, unsupervised lobe segmentation algorithm is presented based on a probabilistic segmentation of the fissures and the simultaneous construction of a population model of the fissures. A two-class probabilistic segmentation segments the lung into candidate fissure voxels and the surrounding parenchyma. This was combined with anatomical information and a groupwise fissure prior to drive non-parametric surface fitting to obtain the final segmentation. The performance of our fissure segmentation was validated on 30 patients from the COPDGene cohort, achieving a high median F1-score of 0.90 and showed general insensitivity to filter parameters. We evaluated our lobe segmentation algorithm on the LOLA11 dataset, which contains 55 cases at varying levels of pathology. We achieved the highest score of 0.884 of the automated algorithms. Our method was further tested quantitatively and qualitatively on 80 patients from the COPDGene study at varying levels of functional impairment. Accurate segmentation of the lobes is shown at various degrees of fissure incompleteness for 96% of all cases. We also show the utility of including a groupwise prior in segmenting the lobes in regions of grossly incomplete fissures. PMID:28436850
Surrogate modeling of joint flood risk across coastal watersheds
NASA Astrophysics Data System (ADS)
Bass, Benjamin; Bedient, Philip
2018-03-01
This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and their peak joint flood response typically determined from physics based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.
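A hedged sketch of the surrogate idea: a Gaussian process is trained on hypothetical storm-parameter/peak-flood pairs standing in for coupled ADCIRC and rainfall-runoff output, and then returns a probabilistic flood estimate for a new storm. The storm parameters, training data, and kernel settings are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(3)

# Hypothetical training set: TC characteristics -> peak flood level at one location.
# Columns: central pressure deficit (hPa), radius of max winds (km), forward speed (m/s).
X = np.column_stack([rng.uniform(20, 90, 60),
                     rng.uniform(20, 80, 60),
                     rng.uniform(3, 12, 60)])
# Stand-in for peak water levels that would come from coupled numerical model runs.
y = 0.04 * X[:, 0] + 0.01 * X[:, 1] - 0.05 * X[:, 2] + rng.normal(0, 0.1, 60)

kernel = ConstantKernel(1.0) * RBF(length_scale=[30.0, 20.0, 3.0])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Probabilistic prediction for a new storm: mean and standard deviation of peak flood level.
x_new = np.array([[70.0, 45.0, 6.0]])
mean, std = gp.predict(x_new, return_std=True)
print(f"predicted peak flood level: {mean[0]:.2f} m ± {std[0]:.2f} m")
```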
Pouzou, Jane G; Cullen, Alison C; Yost, Michael G; Kissel, John C; Fenske, Richard A
2017-11-06
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. © 2017 Society for Risk Analysis.
Probabilistic properties of the Curve Number
NASA Astrophysics Data System (ADS)
Rutkowska, Agnieszka; Banasik, Kazimierz; Kohnova, Silvia; Karabova, Beata
2013-04-01
The determination of the Curve Number (CN) is fundamental for the hydrological rainfall-runoff SCS-CN method, which assesses the runoff volume in small catchments. The CN depends on geomorphologic and physiographic properties of the catchment, and traditionally it is assumed to be constant for each catchment. Many practitioners and researchers observe, however, that the parameter is characterized by variability. This sometimes causes inconsistency in river discharge prediction using the SCS-CN model. Hence probabilistic and statistical methods are advisable to investigate the CN as a random variable and to complement and improve the deterministic model. The results that will be presented contain the determination of the probabilistic properties of the CNs for various Slovakian and Polish catchments using statistical methods. The detailed study concerns the description of empirical distributions (characteristics, QQ-plots and coefficients of goodness of fit, histograms), testing of statistical hypotheses about some theoretical distributions (Kolmogorov-Smirnov, Anderson-Darling, Cramer-von Mises, χ2, Shapiro-Wilk), construction of confidence intervals and comparisons among catchments. The relationship between confidence intervals and the ARC soil classification will also be examined. The comparison between the border values of the confidence intervals and the ARC I and ARC III conditions is crucial for further modeling. The study of the response of the catchment to stormy rainfall depth when the variability of the CN arises is also of special interest. ACKNOWLEDGMENTS The investigation described in the contribution was initiated by the first Author's research visit to the Technical University of Bratislava in 2012 within a STSM of the COST Action ES0901. Data used here have been provided by research project no. N N305 396238 funded by the PL-Ministry of Science and Higher Education. The support provided by the organizations is gratefully acknowledged.
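The statistical treatment outlined above (empirical description, goodness-of-fit testing, confidence intervals) might look roughly like the following sketch. The CN sample is synthetic and the normal candidate distribution is an assumption for illustration, not the Slovak or Polish catchment data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical sample of event Curve Numbers back-calculated from rainfall-runoff pairs.
cn = np.clip(rng.normal(72, 8, size=60), 35, 98)

# Empirical description.
print(f"mean={cn.mean():.1f}, sd={cn.std(ddof=1):.1f}, median={np.median(cn):.1f}")

# Test a candidate theoretical distribution (here normal) with Kolmogorov-Smirnov
# and Anderson-Darling, as in the study.
loc, scale = stats.norm.fit(cn)
ks = stats.kstest(cn, "norm", args=(loc, scale))
ad = stats.anderson(cn, dist="norm")
print(f"KS p-value: {ks.pvalue:.3f}, AD statistic: {ad.statistic:.3f}")

# 90% confidence interval for the mean CN, to compare against tabulated ARC I/III bounds.
ci = stats.t.interval(0.90, len(cn) - 1, loc=cn.mean(), scale=stats.sem(cn))
print(f"90% CI for mean CN: ({ci[0]:.1f}, {ci[1]:.1f})")
```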
Probabilistic Analysis of Large-Scale Composite Structures Using the IPACS Code
NASA Technical Reports Server (NTRS)
Lemonds, Jeffrey; Kumar, Virendra
1995-01-01
An investigation was performed to ascertain the feasibility of using IPACS (Integrated Probabilistic Assessment of Composite Structures) for probabilistic analysis of a composite fan blade, the development of which is being pursued by various industries for the next generation of aircraft engines. A model representative of the class of fan blades used in the GE90 engine has been chosen as the structural component to be analyzed with IPACS. In this study, typical uncertainties are assumed at the ply level, and structural responses for ply stresses and frequencies are evaluated in the form of cumulative probability density functions. Because of the geometric complexity of the blade, the number of plies varies from several hundred at the root to about a hundred at the tip. This represents an extremely complex composites application for the IPACS code. A sensitivity study with respect to various random variables is also performed.
Probabilistic Methods for Structural Design and Reliability
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)
2002-01-01
This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
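A minimal sketch of the general idea of propagating scatter in primitive variables to a structural response and a reliability measure. The variables, distributions, and limit state are illustrative, not the turbine-engine model of the report.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Hypothetical primitive variables with scatter (values illustrative):
yield_strength = rng.normal(900e6, 45e6, n)          # Pa
load = rng.lognormal(np.log(40e3), 0.10, n)          # N
area = rng.normal(6.0e-5, 2.0e-6, n)                 # m^2

stress = load / area                                  # propagate to the structural response

# Probability of exceeding the strength limit state (a simple reliability measure).
p_fail = np.mean(stress > yield_strength)
print(f"estimated probability of failure: {p_fail:.2e}")

# Cumulative distribution of the response, analogous to the CDFs reported by such codes.
levels = np.percentile(stress, [5, 50, 95])
print("5th/50th/95th percentile stress [MPa]:", np.round(levels / 1e6, 1))
```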
Probabilistic vs linear blending approaches to shared control for wheelchair driving.
Ezeh, Chinemelu; Trautman, Pete; Devigne, Louise; Bureau, Valentin; Babel, Marie; Carlson, Tom
2017-07-01
Some people with severe mobility impairments are unable to operate powered wheelchairs reliably and effectively, using commercially available interfaces. This has sparked a body of research into "smart wheelchairs", which assist users to drive safely and create opportunities for them to use alternative interfaces. Various "shared control" techniques have been proposed to provide an appropriate level of assistance that is satisfactory and acceptable to the user. Most shared control techniques employ a traditional strategy called linear blending (LB), where the user's commands and wheelchair's autonomous commands are combined in some proportion. In this paper, however, we implement a more generalised form of shared control called probabilistic shared control (PSC). This probabilistic formulation improves the accuracy of modelling the interaction between the user and the wheelchair by taking into account uncertainty in the interaction. In this paper, we demonstrate the practical success of PSC over LB in terms of safety, particularly for novice users.
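The contrast between linear blending and a probabilistic formulation can be sketched as below. The precision-weighted Gaussian fusion is a simplified stand-in for the PSC formulation in the paper, and all commands and covariances are invented.

```python
import numpy as np

def linear_blend(u_user, u_auto, alpha=0.5):
    """Traditional linear blending: fixed-proportion mix of the two commands."""
    return alpha * np.asarray(u_user) + (1 - alpha) * np.asarray(u_auto)

def probabilistic_blend(u_user, cov_user, u_auto, cov_auto):
    """A minimal probabilistic shared control sketch: model both commands as Gaussians
    over (linear, angular) velocity and fuse them by precision weighting, so the more
    certain agent dominates. This is only the core idea, not the paper's full PSC model."""
    p_user, p_auto = np.linalg.inv(cov_user), np.linalg.inv(cov_auto)
    cov = np.linalg.inv(p_user + p_auto)
    return cov @ (p_user @ np.asarray(u_user) + p_auto @ np.asarray(u_auto))

u_user = [0.8, 0.3]                      # noisy joystick command (v, w)
u_auto = [0.5, -0.2]                     # planner command avoiding an obstacle
cov_user = np.diag([0.2, 0.2])           # uncertain user input
cov_auto = np.diag([0.02, 0.02])         # confident autonomy near the obstacle

print("linear blend:       ", linear_blend(u_user, u_auto))
print("probabilistic blend:", probabilistic_blend(u_user, cov_user, u_auto, cov_auto))
```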
Fast, noise-free memory for photon synchronization at room temperature.
Finkelstein, Ran; Poem, Eilon; Michel, Ohad; Lahad, Ohr; Firstenberg, Ofer
2018-01-01
Future quantum photonic networks require coherent optical memories for synchronizing quantum sources and gates of probabilistic nature. We demonstrate a fast ladder memory (FLAME) mapping the optical field onto the superposition between electronic orbitals of rubidium vapor. Using a ladder-level system of orbital transitions with nearly degenerate frequencies simultaneously enables high bandwidth, low noise, and long memory lifetime. We store and retrieve 1.7-ns-long pulses, containing 0.5 photons on average, and observe short-time external efficiency of 25%, memory lifetime (1/e) of 86 ns, and below 10⁻⁴ added noise photons. Consequently, coupling this memory to a probabilistic source would enhance the on-demand photon generation probability by a factor of 12, the highest number yet reported for a noise-free, room temperature memory. This paves the way toward the controlled production of large quantum states of light from probabilistic photon sources.
Gofer-Levi, M; Silberg, T; Brezner, A; Vakil, E
2014-09-01
Children learn to engage their surroundings skillfully, adopting implicit knowledge of complex regularities and associations. Probabilistic classification learning (PCL) is a type of cognitive procedural learning in which different cues are probabilistically associated with specific outcomes. Little is known about the effects of developmental disorders on cognitive skill acquisition. Twenty-four children and adolescents with cerebral palsy (CP) were compared to 24 typically developing (TD) youth in their ability to learn probabilistic associations. Performance was examined in relation to general cognitive abilities, level of motor impairment and age. Improvement in PCL was observed for all participants, with no relation to IQ. An age effect was found only among TD children. Learning curves of children with CP on a cognitive procedural learning task differ from those of TD peers and do not appear to be age sensitive. Copyright © 2014 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
NUCLEAR REGULATORY COMMISSION [NRC-2013-0047] Compendium of Analyses To Investigate Select Level 1... The U.S. Nuclear Regulatory Commission (NRC) has issued for public comment a document entitled: Compendium of Analyses to Investigate Select Level...
Dudley, Robert W.; Hodgkins, Glenn A.; Dickinson, Jesse
2017-01-01
We present a logistic regression approach for forecasting the probability of future groundwater levels declining or maintaining below specific groundwater-level thresholds. We tested our approach on 102 groundwater wells in different climatic regions and aquifers of the United States that are part of the U.S. Geological Survey Groundwater Climate Response Network. We evaluated the importance of current groundwater levels, precipitation, streamflow, seasonal variability, Palmer Drought Severity Index, and atmosphere/ocean indices for developing the logistic regression equations. Several diagnostics of model fit were used to evaluate the regression equations, including testing of autocorrelation of residuals, goodness-of-fit metrics, and bootstrap validation testing. The probabilistic predictions were most successful at wells with high persistence (low month-to-month variability) in their groundwater records and at wells where the groundwater level remained below the defined low threshold for sustained periods (generally three months or longer). The model fit was weakest at wells with strong seasonal variability in levels and with shorter duration low-threshold events. We identified challenges in deriving probabilistic-forecasting models and possible approaches for addressing those challenges.
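A hedged sketch of the forecasting approach: a logistic regression relating hypothetical predictors (current level, accumulated precipitation, seasonality) to a below-threshold indicator, in place of the Groundwater Climate Response Network records. All data and coefficients below are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 400

# Hypothetical monthly predictors for one well (values illustrative):
current_level = rng.normal(0, 1, n)        # standardized depth to water
precip_3mo = rng.gamma(2.0, 40.0, n)       # mm, 3-month accumulated precipitation
month_sin = np.sin(2 * np.pi * rng.integers(1, 13, n) / 12)

# Stand-in outcome: 1 if the level three months later is below the low threshold.
logit = -1.0 + 1.5 * current_level - 0.01 * precip_3mo + 0.3 * month_sin
below_threshold = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([current_level, precip_3mo, month_sin])
model = LogisticRegression(max_iter=1000).fit(X, below_threshold)

# Probabilistic forecast for the latest observation.
p = model.predict_proba(X[-1:, :])[0, 1]
print(f"probability of being below the low-groundwater threshold: {p:.2f}")
```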
A model-based test for treatment effects with probabilistic classifications.
Cavagnaro, Daniel R; Davis-Stober, Clintin P
2018-05-21
Within modern psychology, computational and statistical models play an important role in describing a wide variety of human behavior. Model selection analyses are typically used to classify individuals according to the model(s) that best describe their behavior. These classifications are inherently probabilistic, which presents challenges for performing group-level analyses, such as quantifying the effect of an experimental manipulation. We answer this challenge by presenting a method for quantifying treatment effects in terms of distributional changes in model-based (i.e., probabilistic) classifications across treatment conditions. The method uses hierarchical Bayesian mixture modeling to incorporate classification uncertainty at the individual level into the test for a treatment effect at the group level. We illustrate the method with several worked examples, including a reanalysis of the data from Kellen, Mata, and Davis-Stober (2017), and analyze its performance more generally through simulation studies. Our simulations show that the method is both more powerful and less prone to type-1 errors than Fisher's exact test when classifications are uncertain. In the special case where classifications are deterministic, we find a near-perfect power-law relationship between the Bayes factor, derived from our method, and the p value obtained from Fisher's exact test. We provide code in an online supplement that allows researchers to apply the method to their own data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Isolation Rearing Effects on Probabilistic Learning and Cognitive Flexibility in Rats
AMITAI, Nurith; YOUNG, Jared W.; HIGA, Kerin; SHARP, Richard F.; GEYER, Mark A.; POWELL, Susan B.
2013-01-01
Isolation rearing is a neurodevelopmental manipulation that produces neurochemical, structural, and behavioral alterations in rodents that have consistencies with schizophrenia. Symptoms induced by isolation rearing that mirror clinically relevant aspects of schizophrenia, such as cognitive deficits, open up the possibility of testing putative therapeutics in isolation-reared animals prior to clinical development. We investigated what effect isolation rearing would have on cognitive flexibility, a cognitive function characteristically disrupted in schizophrenia. For this purpose, we assessed cognitive flexibility using between- and within-session probabilistic reversal-learning tasks based on clinical tests. Isolation-reared rats required more sessions, though not more task trials, to acquire criterion performance in the reversal phase of the task and were slower to adjust their task strategy after reward contingencies were switched. Isolation-reared rats also completed fewer trials and exhibited lower levels of overall activity in the probabilistic reversal-learning task compared to socially reared rats. This finding contrasted with the elevated levels of unconditioned investigatory activity and reduced levels of locomotor habituation that isolation-reared rats displayed in the behavioral pattern monitor. Finally, isolation-reared rats also exhibited sensorimotor gating deficits, reflected by decreased prepulse inhibition of the startle response, consistent with previous studies. We conclude that isolation rearing constitutes a valuable, noninvasive manipulation for modeling schizophrenia-like cognitive deficits and assessing putative therapeutics. PMID:23943516
Global probabilistic projections of extreme sea levels show intensification of coastal flood hazard.
Vousdoukas, Michalis I; Mentaschi, Lorenzo; Voukouvalas, Evangelos; Verlaan, Martin; Jevrejeva, Svetlana; Jackson, Luke P; Feyen, Luc
2018-06-18
Global warming is expected to drive increasing extreme sea levels (ESLs) and flood risk along the world's coastlines. In this work we present probabilistic projections of ESLs for the present century taking into consideration changes in mean sea level, tides, wind-waves, and storm surges. Between the year 2000 and 2100 we project a very likely increase of the global average 100-year ESL of 34-76 cm under a moderate-emission-mitigation-policy scenario and of 58-172 cm under a business as usual scenario. Rising ESLs are mostly driven by thermal expansion, followed by contributions from ice mass-loss from glaciers, and ice-sheets in Greenland and Antarctica. Under these scenarios ESL rise would render a large part of the tropics exposed annually to the present-day 100-year event from 2050. By the end of this century this applies to most coastlines around the world, implying unprecedented flood risk levels unless timely adaptation measures are taken.
Evaluation of Dynamic Coastal Response to Sea-level Rise Modifies Inundation Likelihood
NASA Technical Reports Server (NTRS)
Lentz, Erika E.; Thieler, E. Robert; Plant, Nathaniel G.; Stippa, Sawyer R.; Horton, Radley M.; Gesch, Dean B.
2016-01-01
Sea-level rise (SLR) poses a range of threats to natural and built environments, making assessments of SLR-induced hazards essential for informed decision making. We develop a probabilistic model that evaluates the likelihood that an area will inundate (flood) or dynamically respond (adapt) to SLR. The broad-area applicability of the approach is demonstrated by producing 30x30m resolution predictions for more than 38,000 sq km of diverse coastal landscape in the northeastern United States. Probabilistic SLR projections, coastal elevation and vertical land movement are used to estimate likely future inundation levels. Then, conditioned on future inundation levels and the current land-cover type, we evaluate the likelihood of dynamic response versus inundation. We find that nearly 70% of this coastal landscape has some capacity to respond dynamically to SLR, and we show that inundation models over-predict land likely to submerge. This approach is well suited to guiding coastal resource management decisions that weigh future SLR impacts and uncertainty against ecological targets and economic constraints.
Rausch, Franziska; Mier, Daniela; Eifler, Sarah; Esslinger, Christine; Schilling, Claudia; Schirmbeck, Frederike; Englisch, Susanne; Meyer-Lindenberg, Andreas; Kirsch, Peter; Zink, Mathias
2014-07-01
Patients with schizophrenia suffer from deficits in monitoring and controlling their own thoughts. Within these so-called metacognitive impairments, alterations in probabilistic reasoning might be one cognitive phenomenon disposing to delusions. However, so far little is known about alterations in associated brain functionality. A previously established task for functional magnetic resonance imaging (fMRI), which requires a probabilistic decision after a variable amount of stimuli, was applied to 23 schizophrenia patients and 28 healthy controls matched for age, gender and educational levels. We compared activation patterns during decision-making under conditions of certainty versus uncertainty and evaluated the process of final decision-making in ventral striatum (VS) and ventral tegmental area (VTA). We replicated a pre-described extended cortical activation pattern during probabilistic reasoning. During final decision-making, activations in several fronto- and parietocortical areas, as well as in VS and VTA became apparent. In both of these regions schizophrenia patients showed a significantly reduced activation. These results further define the network underlying probabilistic decision-making. The observed hypo-activation in regions commonly associated with dopaminergic neurotransmission fits into current concepts of disrupted prediction error signaling in schizophrenia and suggests functional links to reward anticipation. Forthcoming studies with patients at risk for psychosis and drug-naive first episode patients are necessary to elucidate the development of these findings over time and the interplay with associated clinical symptoms. Copyright © 2014 Elsevier B.V. All rights reserved.
Bayesian probabilistic population projections for all countries
Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.
2012-01-01
Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950–1990 are used for estimation, and applied to predict 1990–2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249
Probabilistic Reasoning for Plan Robustness
NASA Technical Reports Server (NTRS)
Schaffer, Steve R.; Clement, Bradley J.; Chien, Steve A.
2005-01-01
A planning system must reason about the uncertainty of continuous variables in order to accurately project the possible system state over time. A method is devised for directly reasoning about the uncertainty in continuous activity duration and resource usage for planning problems. By representing random variables as parametric distributions, computing projected system state can be simplified in some cases. Common approximation and novel methods are compared for over-constrained and lightly constrained domains. The system compares a few common approximation methods for an iterative repair planner. Results show improvements in robustness over the conventional non-probabilistic representation by reducing the number of constraint violations witnessed by execution. The improvement is more significant for larger problems and problems with higher resource subscription levels but diminishes as the system is allowed to accept higher risk levels.
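A small sketch of why parametric representations help: with independent normal activity durations, the probability of violating a deadline has a closed form, against which a Monte Carlo approximation can be checked. All numbers are illustrative.

```python
import numpy as np
from scipy import stats

# Three sequential activities with uncertain durations, represented parametrically
# as independent normals (means and standard deviations in minutes; values illustrative).
means = np.array([30.0, 45.0, 20.0])
sds = np.array([5.0, 10.0, 4.0])

# Sums of independent normals stay normal, so the projected end time has a closed form,
# which is the simplification a parametric representation can buy.
total_mean = means.sum()
total_sd = np.sqrt((sds ** 2).sum())

deadline = 110.0
p_violation = 1.0 - stats.norm.cdf(deadline, loc=total_mean, scale=total_sd)
print(f"probability of missing the {deadline:.0f}-minute deadline: {p_violation:.3f}")

# Monte Carlo check (what a sampling-based approximation would do instead).
samples = np.random.default_rng(7).normal(means, sds, size=(100_000, 3)).sum(axis=1)
print(f"Monte Carlo estimate: {np.mean(samples > deadline):.3f}")
```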
Ceriello, A; Giugliano, D; Quatraro, A; Marchi, E; Barbanti, M; Lefèbvre, P
1990-03-01
In the presence of increased levels of fibrinopeptide A, decreased antithrombin III biological activity and thrombin-antithrombin III complex levels are seen in diabetic patients. Induced hyperglycaemia in diabetic and normal subjects decreased antithrombin III activity and thrombin-antithrombin III levels, and increased fibrinopeptide A plasma levels, while antithrombin III concentration did not change; heparin was shown to reduce these phenomena. In diabetic patients, euglycaemia induced by insulin infusion restored antithrombin III activity, thrombin-antithrombin III complex and fibrinopeptide A concentrations; heparin administration had the same effects. These data stress the role of a hyperglycaemia-dependent decrease of antithrombin III activity in precipitating thrombin hyperactivity in diabetes mellitus.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.
This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed by Brookhaven National Laboratory under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC. The DCPRA is a full scope Level I effort, and although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.
Programming Probabilistic Structural Analysis for Parallel Processing Computer
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.
1991-01-01
The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.
Decision analysis with approximate probabilities
NASA Technical Reports Server (NTRS)
Whalen, Thomas
1992-01-01
This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied. This is due to the fact that some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decisionmaking using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second order maximum entropy principle, performed best overall.
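The comparison of criteria under rounded probabilities might be sketched as follows. The payoff matrix and the rounding are invented, and the maximin criterion is approximated by sampling probability vectors consistent with the rounding, which is only one way to handle the linear constraints.

```python
import numpy as np

# Payoff matrix: rows are alternatives, columns are the three states of nature
# (all numbers illustrative).
payoffs = np.array([[100.0, 40.0, -20.0],
                    [60.0, 55.0, 10.0],
                    [30.0, 30.0, 30.0]])

# Rounded probabilities constrain the true vector to a region of the simplex;
# here each state probability is known only to the nearest tenth.
rounded = np.array([0.5, 0.3, 0.2])
lo, hi = np.maximum(rounded - 0.05, 0), np.minimum(rounded + 0.05, 1)

def maximin_expected(payoffs, lo, hi, n=20_000, seed=8):
    """Approximate the worst-case expected payoff of each alternative over all
    probability vectors consistent with the rounding (rejection sampling)."""
    rng = np.random.default_rng(seed)
    p = rng.uniform(lo, hi, size=(n, 3))
    p = p[np.abs(p.sum(axis=1) - 1.0) < 0.01]       # keep near-valid probability vectors
    p = p / p.sum(axis=1, keepdims=True)
    return (payoffs @ p.T).min(axis=1)

midpoint_expected = payoffs @ (rounded / rounded.sum())
worst_case = maximin_expected(payoffs, lo, hi)
print("midpoint criterion:", np.round(midpoint_expected, 1), "-> choice", midpoint_expected.argmax())
print("maximin criterion: ", np.round(worst_case, 1), "-> choice", worst_case.argmax())
```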
NASA Astrophysics Data System (ADS)
Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.
2013-12-01
Users of climate information often require probabilistic information on which to base their decisions. However, communicating information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, ternary probabilistic forecasts, which assign probabilities to a set of three outcomes (e.g. low, medium, and high risk), are considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour is used to represent one forecast category on a forecast map (e.g. red = 'dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions, calculated using historical data. Daily mortality data at the NUTS2 level were obtained for 16 countries in Europe for 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarities. Aggregations are restricted to fall within political boundaries to avoid problems related to varying adaptation policies between countries. A statistical model is fit to the cold and warm tails to estimate future mortality from forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales. We demonstrate the information gained from using this technique compared to more traditional methods of displaying ternary probabilistic forecasts. This technique allows decision makers to identify areas where the model predicts area-specific heat waves or cold snaps with certainty, in order to effectively target resources to those areas most at risk for a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of probabilistic forecasts not only for public health decision makers but also within a multi-sectoral climate service framework.
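A rough sketch of the colour-mapping idea: a ternary forecast is mixed barycentrically towards three corner colours and desaturated towards white as its information gain relative to the reference forecast shrinks. The corner colours and the saturation scaling are assumptions for illustration, not the exact scheme of the paper.

```python
import numpy as np

def ternary_colour(p, p_ref=(1/3, 1/3, 1/3)):
    """Map a ternary forecast p = (p_low, p_medium, p_high) to an RGB triple.
    Each category pulls towards one corner colour; saturation grows with the
    information gain (KL divergence) relative to the reference forecast."""
    p = np.asarray(p, dtype=float)
    corners = np.array([[0.0, 0.3, 1.0],    # low    -> blue
                        [1.0, 0.8, 0.0],    # medium -> yellow
                        [1.0, 0.0, 0.0]])   # high   -> red
    hue = p @ corners                                       # barycentric colour mix
    info_gain = np.sum(p * np.log(p / np.asarray(p_ref) + 1e-12))
    sat = min(info_gain / np.log(3), 1.0)                   # 0 = climatology, 1 = certain
    white = np.ones(3)
    return (1 - sat) * white + sat * hue

print(ternary_colour([0.33, 0.34, 0.33]))   # near-white: little information gain
print(ternary_colour([0.05, 0.15, 0.80]))   # saturated red: confident high-risk forecast
```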
Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering
NASA Astrophysics Data System (ADS)
Hynes-Griffin, M. E.; Buege, L. L.
1983-09-01
Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.
Preston, Daniel L; Jacobs, Abigail Z; Orlofske, Sarah A; Johnson, Pieter T J
2014-03-01
Most food webs use taxonomic or trophic species as building blocks, thereby collapsing variability in feeding linkages that occurs during the growth and development of individuals. This issue is particularly relevant to integrating parasites into food webs because parasites often undergo extreme ontogenetic niche shifts. Here, we used three versions of a freshwater pond food web with varying levels of node resolution (from taxonomic species to life stages) to examine how complex life cycles and parasites alter web properties, the perceived trophic position of organisms, and the fit of a probabilistic niche model. Consistent with prior studies, parasites increased most measures of web complexity in the taxonomic species web; however, when nodes were disaggregated into life stages, the effects of parasites on several network properties (e.g., connectance and nestedness) were reversed, due in part to the lower trophic generality of parasite life stages relative to free-living life stages. Disaggregation also reduced the trophic level of organisms with either complex or direct life cycles and was particularly useful when including predation on parasites, which can inflate trophic positions when life stages are collapsed. Contrary to predictions, disaggregation decreased network intervality and did not enhance the fit of a probabilistic niche model to the food webs with parasites. Although the most useful level of biological organization in food webs will vary with the questions of interest, our results suggest that disaggregating species-level nodes may refine our perception of how parasites and other complex life cycle organisms influence ecological networks.
Alemani, Davide; Pappalardo, Francesco; Pennisi, Marzio; Motta, Santo; Brusic, Vladimir
2012-02-28
In the last decades the Lattice Boltzmann method (LB) has been successfully used to simulate a variety of processes. The LB model describes the microscopic processes occurring at the cellular level and the macroscopic processes occurring at the continuum level with a unique function, the probability distribution function. Recently, attempts have been made to couple deterministic approaches with probabilistic cellular automata (probabilistic CA) methods, with the aim of modeling the temporal evolution of tumor growth and its three-dimensional spatial evolution, obtaining hybrid methodologies. Despite the good results attained by CA-PDE methods, there is one important issue which has not been completely solved: the intrinsic stochastic nature of the interactions at the interface between the cellular (microscopic) and continuum (macroscopic) levels. CA methods are able to cope with the stochastic phenomena because of their probabilistic nature, while PDE methods are fully deterministic. Even if the coupling is mathematically correct, there could be important statistical effects that are missed by the PDE approach. For such a reason, to be able to develop and manage a model that takes into account all three levels of complexity (cellular, molecular and continuum), we believe that the PDE should be replaced with a statistical and stochastic model based on the numerical discretization of the Boltzmann equation: the Lattice Boltzmann (LB) method. In this work we introduce a new hybrid method to simulate tumor growth and the immune system, by applying a Cellular Automata Lattice Boltzmann (CA-LB) approach. Copyright © 2011 Elsevier B.V. All rights reserved.
Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)
NASA Astrophysics Data System (ADS)
Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail
2014-05-01
Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential of threatening life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey. At the same time, the region was included in the World Heritage List by UNESCO in 1985 due to its natural, historical and cultural values. The region is undesirably affected by several environmental conditions, which have been the subject of many previous studies. However, there are limited studies on the seismic evaluation of the region. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration with a 10 percent probability of exceedance in 50 years on bedrock. In this connection, the seismic hazard of these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed using both deterministic and probabilistic approaches with local site conditions. A catalog of historical and instrumental earthquakes was prepared and used in this study. The seismic sources have been identified for seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level is calculated for different seismic sources using available attenuation relationships applicable to Turkey. The result of the present study reveals that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach, Deterministic Approach, Historical Heritage, Cappadocia.
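For orientation, a toy single-source PSHA calculation of the kind referred to above is sketched below. The recurrence parameters, source-to-site distance, and ground-motion model are invented and are not calibrated for Cappadocia or Turkey.

```python
import numpy as np
from scipy import stats

# Gutenberg-Richter recurrence for one areal source, magnitudes 5.0-7.5 (illustrative values).
m = np.linspace(5.0, 7.5, 26)
b, rate_m_min = 1.0, 0.05                        # b-value and annual rate of M >= 5.0
pdf_m = np.diff(1 - 10 ** (-b * (m - 5.0))) / (1 - 10 ** (-b * 2.5))
m_mid = 0.5 * (m[:-1] + m[1:])

dist_km = 30.0                                   # source-to-site distance

def gmpe_ln_pga(mag, r_km):
    """Toy ground-motion model: mean ln(PGA in g) and log-standard deviation.
    A real study would use attenuation relationships calibrated for Turkey."""
    return -3.5 + 0.9 * mag - 1.1 * np.log(r_km + 10.0), 0.6

pga_levels = np.array([0.05, 0.1, 0.2, 0.3, 0.4])
ln_mean, sigma = gmpe_ln_pga(m_mid, dist_km)
# Annual rate of exceeding each PGA level, summed over the magnitude distribution.
lam = np.array([np.sum(rate_m_min * pdf_m * (1 - stats.norm.cdf(np.log(a), ln_mean, sigma)))
                for a in pga_levels])
# Probability of exceedance in 50 years (Poisson assumption), cf. the 10%-in-50-years maps.
p50 = 1 - np.exp(-lam * 50)
for a, p in zip(pga_levels, p50):
    print(f"PGA {a:.2f} g: {p:.1%} probability of exceedance in 50 years")
```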
Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M
2014-01-01
Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis. © 2013.
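The fuzzy propagation step can be illustrated with triangular fuzzy inputs and alpha-cut interval arithmetic. The exposure scenario, fuzzy numbers, and attributable-burden formula below are hypothetical stand-ins, not the values used in the housing-ventilation case study.

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at membership level alpha."""
    low, mode, high = tri
    return np.array([low + alpha * (mode - low), high - alpha * (high - mode)])

# Hypothetical triangular fuzzy inputs for a poor-ventilation scenario (values illustrative):
relative_risk = (1.1, 1.4, 2.0)        # health effect per unit of poor-ventilation exposure
exposed_fraction = (0.10, 0.18, 0.30)  # fraction of dwellings with poor ventilation
baseline_cases = (800, 1000, 1300)     # annual baseline morbidity in the population

for a in np.linspace(0, 1, 6):
    rr, ef, bc = (alpha_cut(x, a) for x in (relative_risk, exposed_fraction, baseline_cases))
    # Attributable burden via interval arithmetic on the alpha-cuts:
    # cases = baseline * EF * (RR - 1) / (1 + EF * (RR - 1)), monotone in EF and RR.
    paf_lo = ef[0] * (rr[0] - 1) / (1 + ef[0] * (rr[0] - 1))
    paf_hi = ef[1] * (rr[1] - 1) / (1 + ef[1] * (rr[1] - 1))
    print(f"alpha={a:.1f}: attributable cases in [{bc[0]*paf_lo:.0f}, {bc[1]*paf_hi:.0f}]")
```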
2016-01-01
One of the most celebrated findings in complex systems in the last decade is that different indexes y (e.g. patents) scale nonlinearly with the population x of the cities in which they appear, i.e. y ∼ x^β with β ≠ 1. More recently, the generality of this finding has been questioned in studies that used new databases and different definitions of city boundaries. In this paper, we investigate the existence of nonlinear scaling using a probabilistic framework in which fluctuations are accounted for explicitly. In particular, we show that this allows not only to (i) estimate β and confidence intervals, but also to (ii) quantify the evidence in favour of β ≠ 1 and (iii) test the hypothesis that the observations are compatible with the nonlinear scaling. We employ this framework to compare five different models to 15 different datasets and we find that the answers to points (i)–(iii) crucially depend on the fluctuations contained in the data, on how they are modelled, and on the fact that the city sizes follow heavy-tailed distributions. PMID:27493764
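One of the simpler candidate models mentioned above, a log-log regression with lognormal fluctuations, can be sketched on synthetic city data as follows. The paper's probabilistic framework compares several such models, of which this is only one, and the data below are generated, not one of the 15 analysed datasets.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Hypothetical city data: population x and an urban indicator y generated with a true
# exponent of 1.15 plus lognormal fluctuations (purely illustrative).
x = 10 ** rng.uniform(4, 7, 300)
y = 2.0 * x ** 1.15 * rng.lognormal(0.0, 0.4, 300)

# Ordinary log-log regression estimate of beta with a confidence interval.
res = stats.linregress(np.log(x), np.log(y))
ci_half = 1.96 * res.stderr
print(f"beta = {res.slope:.3f} (95% CI {res.slope - ci_half:.3f} to {res.slope + ci_half:.3f})")

# A simple check of the evidence for beta != 1 under this model: t-statistic against beta = 1.
t = (res.slope - 1.0) / res.stderr
print(f"t-statistic for H0: beta = 1 -> {t:.1f}")
```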
Higgins, Helen M; Green, Laura E; Green, Martin J; Kaler, Jasmeet
2013-01-01
Footrot is a widespread, infectious cause of lameness in sheep, with major economic and welfare costs. The aims of this research were: (i) to quantify how veterinary surgeons' beliefs regarding the efficacy of two treatments for footrot changed following a review of the evidence (ii) to obtain a consensus opinion following group discussions (iii) to capture complementary qualitative data to place their beliefs within a broader clinical context. Grounded in a Bayesian statistical framework, probabilistic elicitation (roulette method) was used to quantify the beliefs of eleven veterinary surgeons during two one-day workshops. There was considerable heterogeneity in veterinary surgeons' beliefs before they listened to a review of the evidence. After hearing the evidence, seven participants quantifiably changed their beliefs. In particular, two participants who initially believed that foot trimming with topical oxytetracycline was the better treatment, changed to entirely favour systemic and topical oxytetracycline instead. The results suggest that a substantial amount of the variation in beliefs related to differences in veterinary surgeons' knowledge of the evidence. Although considerable differences in opinion still remained after the evidence review, with several participants having non-overlapping 95% credible intervals, both groups did achieve a consensus opinion. Two key findings from the qualitative data were: (i) veterinary surgeons believed that farmers are unlikely to actively seek advice on lameness, suggesting a proactive veterinary approach is required (ii) more attention could be given to improving the way in which veterinary advice is delivered to farmers. In summary this study has: (i) demonstrated a practical method for probabilistically quantifying how veterinary surgeons' beliefs change (ii) revealed that the evidence that currently exists is capable of changing veterinary opinion (iii) suggested that improved transfer of research knowledge into veterinary practice is needed (iv) identified some potential obstacles to the implementation of veterinary advice by farmers.
Higgins, Helen M.; Green, Laura E.; Green, Martin J.; Kaler, Jasmeet
2013-01-01
Footrot is a widespread, infectious cause of lameness in sheep, with major economic and welfare costs. The aims of this research were: (i) to quantify how veterinary surgeons’ beliefs regarding the efficacy of two treatments for footrot changed following a review of the evidence; (ii) to obtain a consensus opinion following group discussions; (iii) to capture complementary qualitative data to place their beliefs within a broader clinical context. Grounded in a Bayesian statistical framework, probabilistic elicitation (roulette method) was used to quantify the beliefs of eleven veterinary surgeons during two one-day workshops. There was considerable heterogeneity in veterinary surgeons’ beliefs before they listened to a review of the evidence. After hearing the evidence, seven participants quantifiably changed their beliefs. In particular, two participants who initially believed that foot trimming with topical oxytetracycline was the better treatment changed to entirely favour systemic and topical oxytetracycline instead. The results suggest that a substantial amount of the variation in beliefs related to differences in veterinary surgeons’ knowledge of the evidence. Although considerable differences in opinion still remained after the evidence review, with several participants having non-overlapping 95% credible intervals, both groups did achieve a consensus opinion. Two key findings from the qualitative data were: (i) veterinary surgeons believed that farmers are unlikely to actively seek advice on lameness, suggesting a proactive veterinary approach is required; (ii) more attention could be given to improving the way in which veterinary advice is delivered to farmers. In summary, this study has: (i) demonstrated a practical method for probabilistically quantifying how veterinary surgeons’ beliefs change; (ii) revealed that the evidence that currently exists is capable of changing veterinary opinion; (iii) suggested that improved transfer of research knowledge into veterinary practice is needed; (iv) identified some potential obstacles to the implementation of veterinary advice by farmers. PMID:23696869
Probabilistic seasonal Forecasts to deterministic Farm Level Decisions: Innovative Approach
NASA Astrophysics Data System (ADS)
Mwangi, M. W.
2015-12-01
Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for the uptake and scale-up of climate information services necessary for achieving climate-resilient development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data were analyzed using both quantitative and qualitative techniques. Quantitative data were analyzed using the Statistical Package for the Social Sciences (SPSS) software. Qualitative data were analyzed by establishing categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.
Wei, Yaqiang; Dong, Yanhui; Yeh, Tian-Chyi J; Li, Xiao; Wang, Liheng; Zha, Yuanyuan
2017-11-01
There have been widespread concerns about solute transport problems in fractured media, e.g. the disposal of high-level radioactive waste in geological fractured rocks. Numerical simulation of particle tracking is increasingly being employed to address these issues. Traditional predictions of radioactive waste transport using discrete fracture network (DFN) models often consider one particular realization of the fracture distribution based on fracture statistics. This significantly underestimates the uncertainty in evaluating the risk of a radioactive waste deposit. To adequately assess the uncertainty of DFN modeling at a potential site for the disposal of high-level radioactive waste, this paper utilized the probabilistic distribution method (PDM). The method was applied to evaluate the risk of the nuclear waste deposit in Beishan, China. Moreover, the impact of the number of realizations on the simulation results was analyzed. In particular, the differences between the modeling results of one realization and multiple realizations were demonstrated. Probabilistic distributions of 20 realizations at different times were also obtained. The results showed that the employed PDM can be used to describe the ranges of contaminant particle transport. After 5 × 10^6 days, the high-probability contaminated areas, covering 25,400 m^2, were concentrated near the release point rather than in more distant areas.
NASA Astrophysics Data System (ADS)
Umut Caglar, Mehmet; Pal, Ranadip
2010-10-01
The central dogma of molecular biology states that "information cannot be transferred back from protein to either protein or nucleic acid." However, this assumption is not exactly correct in most cases: there are many feedback loops and interactions between different levels of the system. These types of interactions are hard to analyze due to the lack of data at the cellular level and the probabilistic nature of the interactions. Probabilistic models like the Stochastic Master Equation (SME) or deterministic models like differential equations (DE) can be used to analyze these types of interactions. SME models based on the chemical master equation (CME) can provide a detailed representation of a genetic regulatory system, but their use is restricted by large data requirements and computational costs. Differential equation models, on the other hand, have low computational costs and are much better suited to generating control procedures for the system, but they are not adequate for investigating the probabilistic nature of the interactions. In this work the success of the mapping between SME and DE is analyzed, and the success of a control policy generated by the DE model is examined with respect to the SME model. Index Terms—Stochastic Master Equation models, Differential Equation Models, Control Policy Design, Systems biology
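To make the SME-versus-DE contrast concrete, the sketch below simulates a simple birth-death gene expression model with the Gillespie algorithm (a standard way to sample trajectories consistent with the chemical master equation) and compares it with its deterministic ODE counterpart; the rate constants are illustrative only:

    import numpy as np

    def gillespie_birth_death(k_prod=10.0, k_deg=0.1, x0=0, t_end=100.0, seed=0):
        """One stochastic trajectory: production at rate k_prod, decay at rate k_deg * X."""
        rng = np.random.default_rng(seed)
        t, x = 0.0, x0
        times, states = [t], [x]
        while t < t_end:
            a1, a2 = k_prod, k_deg * x        # reaction propensities
            a0 = a1 + a2
            t += rng.exponential(1.0 / a0)    # time to next reaction
            x += 1 if rng.random() < a1 / a0 else -1
            times.append(t)
            states.append(x)
        return np.array(times), np.array(states)

    # Deterministic counterpart: dx/dt = k_prod - k_deg * x, steady state k_prod / k_deg.
    t_grid = np.linspace(0.0, 100.0, 200)
    x_det = (10.0 / 0.1) * (1.0 - np.exp(-0.1 * t_grid))

    times, states = gillespie_birth_death()
    print("stochastic mean over last half of run:", states[times > 50].mean())
    print("deterministic steady state:", 10.0 / 0.1)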
Risk-based Spacecraft Fire Safety Experiments
NASA Technical Reports Server (NTRS)
Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.
1992-01-01
Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.
NASA Technical Reports Server (NTRS)
Stock, Thomas A.
1995-01-01
Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intraply level, and the related effects of these on composite properties.
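A minimal sketch of the Monte Carlo idea described above: sample uncertain constituent properties and volume ratios, push each sample through a simple rule-of-mixtures estimate of the longitudinal ply modulus, and summarize the induced scatter and input-output correlations. The distributions and the micromechanics relation are illustrative stand-ins, not the values or equations used in the study:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 20000

    # Illustrative uncertain inputs for a graphite/epoxy ply (moduli in GPa).
    E_fiber  = rng.normal(230.0, 12.0, n)                        # fiber longitudinal modulus
    E_matrix = rng.normal(3.5, 0.3, n)                           # matrix modulus
    v_fiber  = np.clip(rng.normal(0.60, 0.03, n), 0.0, 1.0)      # fiber volume ratio
    v_void   = np.clip(rng.normal(0.02, 0.01, n), 0.0, 0.2)      # void volume ratio

    # Rule of mixtures for the longitudinal modulus, degraded by voids.
    E11 = (E_fiber * v_fiber + E_matrix * (1.0 - v_fiber)) * (1.0 - v_void)

    print(f"mean E11 = {E11.mean():.1f} GPa, c.o.v. = {E11.std() / E11.mean():.3f}")
    print("5th / 95th percentiles:", np.percentile(E11, [5, 95]).round(1))

    # Crude sensitivity ranking via correlation of each input with the response.
    for name, arr in [("E_fiber", E_fiber), ("E_matrix", E_matrix),
                      ("v_fiber", v_fiber), ("v_void", v_void)]:
        print(f"corr({name}, E11) = {np.corrcoef(arr, E11)[0, 1]: .2f}")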
Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew
2018-01-01
This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4. PMID:29434562
Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A
2005-06-01
A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty employing Monte Carlo simulations for assessing parameter and risk probabilistic distributions. This simulation tool (ERA) includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, likelihood for adverse reproduction effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, the effects are expected to occur (>99% likelihood) through the reduction in size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake will not likely affect survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rat, white-throated woodrat, deer, and milfoil, body burden concentrations observed fall into the distributions simulated at both sites.
Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew
2018-01-01
This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4.
Probabilistic Assessment of Cancer Risk from Solar Particle Events
NASA Astrophysics Data System (ADS)
Kim, Myung-Hee Y.; Cucinotta, Francis A.
For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic ray (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from its 5th to 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
Probabilistic Assessment of Cancer Risk from Solar Particle Events
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Cucinotta, Francis A.
2010-01-01
For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic ray (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from its 5th to 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
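The occurrence model described in the two records above can be sketched with a thinning-based sampler for a non-homogeneous Poisson process whose rate follows the solar cycle; the rate function and the per-event fluence distribution below are placeholders, not the fitted historical model:

    import numpy as np

    rng = np.random.default_rng(2)

    def spe_rate(t_years):
        """Illustrative SPE rate (events/year) varying over an 11-year solar cycle."""
        return 4.0 + 3.5 * np.sin(2.0 * np.pi * t_years / 11.0)

    def sample_nhpp(t_end, rate_fn, rate_max, rng):
        """Thinning algorithm: sample event times of a non-homogeneous Poisson process."""
        t, events = 0.0, []
        while True:
            t += rng.exponential(1.0 / rate_max)
            if t > t_end:
                return np.array(events)
            if rng.random() < rate_fn(t) / rate_max:
                events.append(t)

    # Simulate many 1-year missions and record the total proton fluence per mission.
    fluences = []
    for _ in range(5000):
        events = sample_nhpp(1.0, spe_rate, rate_max=7.5, rng=rng)
        # Placeholder lognormal event fluences (arbitrary units).
        fluences.append(rng.lognormal(mean=19.0, sigma=1.5, size=events.size).sum())
    fluences = np.array(fluences)

    p5, p50, p95 = np.percentile(fluences, [5, 50, 95])
    print(f"mission fluence percentiles: 5th={p5:.2e}, 50th={p50:.2e}, 95th={p95:.2e}")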
Composite load spectra for select space propulsion structural components
NASA Technical Reports Server (NTRS)
Newell, J. F.; Ho, H. W.; Kurth, R. E.
1991-01-01
The work performed to develop composite load spectra (CLS) for the Space Shuttle Main Engine (SSME) using probabilistic methods is described. Three probabilistic methods were implemented for the engine system influence model. RASCAL was chosen as the principal method, as most component load models were implemented with it. Validation of RASCAL was performed; accuracy comparable to the Monte Carlo method can be obtained if a large enough bin size is used. Generic probabilistic models were developed and implemented for load calculations using the probabilistic methods discussed above. Each engine mission, either a real flight or a test, has three mission phases: the engine start transient phase, the steady state phase, and the engine cutoff transient phase. Power level and engine operating inlet conditions change during a mission. The load calculation module provides the steady-state and quasi-steady-state calculation procedures with a duty-cycle-data option. The quasi-steady-state procedure is for engine transient phase calculations. In addition, a few generic probabilistic load models were also developed for specific conditions. These include the fixed transient spike model, the Poisson arrival transient spike model, and the rare event model. These generic probabilistic load models provide sufficient latitude for simulating loads with specific conditions. For SSME components, turbine blades, transfer ducts, the LOX post, and the high pressure oxidizer turbopump (HPOTP) discharge duct were selected for application of the CLS program. The loads include static and dynamic pressure loads for all four components, centrifugal force for the turbine blade, temperatures for thermal loads for all four components, and structural vibration loads for the ducts and LOX posts.
A framework for the probabilistic analysis of meteotsunamis
Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.
2014-01-01
A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
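A compact sketch of the Monte Carlo aggregation step: draw a synthetic catalog of squall-line events from assumed parameter distributions, map each event to a maximum shore amplitude with a stand-in amplitude function (taking the place of the hydrodynamic model), and tabulate the annualized exceedance rate versus amplitude, i.e. the hazard curve. All distributions and the amplitude proxy are hypothetical:

    import numpy as np

    rng = np.random.default_rng(3)

    RATE_PER_YEAR = 6.0        # assumed Poisson rate of squall-line events affecting the coast
    N_YEARS = 10000            # length of the synthetic catalog

    n_events = rng.poisson(RATE_PER_YEAR * N_YEARS)

    # Hypothetical source parameters: pressure jump (hPa) and disturbance speed (m/s).
    dp    = rng.lognormal(mean=np.log(1.5), sigma=0.5, size=n_events)
    speed = rng.normal(25.0, 6.0, size=n_events)

    # Stand-in for the hydrodynamic model: amplitude grows with the pressure jump and with
    # how close the disturbance speed is to a resonant long-wave speed (~22 m/s here).
    resonance = np.exp(-((speed - 22.0) / 8.0) ** 2)
    amp = 0.1 * dp * (1.0 + 4.0 * resonance) * rng.lognormal(0.0, 0.3, n_events)

    # Annualized exceedance rate versus amplitude threshold (the hazard curve).
    thresholds = np.linspace(0.05, 1.5, 30)
    rates = [(amp > a).sum() / N_YEARS for a in thresholds]
    for a, r in list(zip(thresholds, rates))[::6]:
        print(f"amplitude > {a:.2f} m exceeded at {r:.4f} events/year")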
NASA Astrophysics Data System (ADS)
Dugar, Sumit; Smith, Paul; Parajuli, Binod; Khanal, Sonu; Brown, Sarah; Gautam, Dilip; Bhandari, Dinanath; Gurung, Gehendra; Shakya, Puja; Kharbuja, RamGopal; Uprety, Madhab
2017-04-01
Operationalising effective Flood Early Warning Systems (EWS) in developing countries like Nepal poses numerous challenges, with complex topography and geology, a sparse network of river and rainfall gauging stations and diverse socio-economic conditions. Despite these challenges, simple real-time monitoring based EWSs have been in place for the past decade. A key constraint of these simple systems is the very limited lead time for response - as little as 2-3 hours, especially for rivers originating from steep mountainous catchments. Efforts to increase lead time for early warning are focusing on embedding forecasts into the existing early warning systems. In 2016, the Nepal Department of Hydrology and Meteorology (DHM) piloted an operational Probabilistic Flood Forecasting Model in major river basins across Nepal. This comprised a low-data approach to forecast water levels, developed jointly through a research/practitioner partnership between Lancaster University, WaterNumbers (UK) and the international NGO Practical Action. Using Data-Based Mechanistic Modelling (DBM) techniques, the model assimilated rainfall and water levels to generate localised hourly flood predictions, which are presented as probabilistic forecasts, increasing lead times from 2-3 hours to 7-8 hours. The Nepal DHM has simultaneously started utilizing forecasts from the Global Flood Awareness System (GloFAS), which provides streamflow predictions at the global scale based upon distributed hydrological simulations using numerical ensemble weather forecasts from the ECMWF (European Centre for Medium-Range Weather Forecasts). The aforementioned global and local models have already affected the approach to early warning in Nepal, being operational during the 2016 monsoon in the West Rapti basin in Western Nepal. On 24 July 2016, GloFAS hydrological forecasts for the West Rapti indicated a sharp rise in river discharge above 1500 m3/s (equivalent to the river warning level at 5 meters) with a 53% probability of exceeding the Medium Level Alert in two days. Rainfall stations upstream of the West Rapti catchment recorded heavy rainfall on 26 July, and localized forecasts from the probabilistic model at 8 am suggested that the water level would cross a pre-determined warning level in the next 3 hours. The Flood Forecasting Section at DHM issued a flood advisory and disseminated SMS flood alerts to more than 13,000 at-risk people residing along the floodplains. Water levels crossed the danger threshold (5.4 meters) at 11 am, peaking at 8.15 meters at 10 pm. Extension of the warning lead time from probabilistic forecasts was significant in minimising the risk to lives and livelihoods, as communities gained extra time to prepare, evacuate and respond. Likewise, longer-timescale forecasts from GloFAS could potentially be linked with no-regret early actions, leading to improved preparedness and emergency response. These forecasting tools have contributed to enhancing the effectiveness and efficiency of existing community-based systems, increasing the lead time for response. Nevertheless, extensive work is required on appropriate ways to interpret and disseminate probabilistic forecasts with longer (2-14 days) and shorter (3-5 hours) time horizons for operational deployment, as there are numerous uncertainties associated with the predictions.
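The probabilistic warning step can be illustrated with a short sketch: given an ensemble of forecast water levels at a few hours of lead time, report the probability of crossing pre-determined warning and danger levels and trigger an advisory when that probability is high. The ensemble values and thresholds are illustrative, not DHM outputs:

    import numpy as np

    rng = np.random.default_rng(4)

    # Illustrative 3-hour-ahead ensemble forecast of river stage (meters).
    ensemble = rng.normal(loc=5.1, scale=0.35, size=200)

    WARNING_LEVEL = 5.0   # illustrative warning threshold (m)
    DANGER_LEVEL  = 5.4   # illustrative danger threshold (m)

    p_warn   = (ensemble >= WARNING_LEVEL).mean()
    p_danger = (ensemble >= DANGER_LEVEL).mean()

    print(f"P(stage >= warning level) = {p_warn:.0%}")
    print(f"P(stage >= danger level)  = {p_danger:.0%}")
    if p_warn > 0.5:
        print("Issue flood advisory / SMS alerts to at-risk communities.")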
Probabilistic simulation of stress concentration in composite laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, L.
1993-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties while probabilistic finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.
Probabilistic load simulation: Code development status
NASA Astrophysics Data System (ADS)
Newell, J. F.; Ho, H.
1991-05-01
The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
NASA Astrophysics Data System (ADS)
Feng, X.; Shen, S.
2014-12-01
The US coastline, over the past few years, has been overwhelmed by major storms including Hurricanes Katrina (2005), Ike (2008), Irene (2011), and Sandy (2012). Supported by a growing and extensive body of evidence, a majority of research agrees that hurricane activity has been enhanced by climate change. However, the precise prediction of hurricane-induced inundation remains a challenge. This study proposed a probabilistic inundation map based on a Statistically Modeled Storm Database (SMSD) to assess the probabilistic coastal inundation risk of Southwest Florida for a near-future (20-year) scenario considering climate change. This map was processed through a Joint Probability Method with Optimal Sampling (JPM-OS), developed by Condon and Sheng in 2012, and accompanied by a high-resolution storm surge modeling system, CH3D-SSMS. The probabilistic inundation map shows a 25.5-31.2% increase in spatially averaged inundation height compared to an inundation map for the present-day scenario. To estimate climate change impacts on coastal communities, socioeconomic analyses were conducted using both the SMSD-based probabilistic inundation map and the present-day inundation map. Combined with 2010 census data and 2012 parcel data from the Florida Geographic Data Library, the differences in economic loss between the near-future and present-day scenarios were used to generate an economic exposure map at the census block group level to reflect coastal communities' exposure to climate change. The results show that climate-change-induced increases in inundation have significant economic impacts. Moreover, the impacts are not equally distributed among different social groups, considering their social vulnerability to hazards. Social vulnerability indices at the census block group level were obtained from the Hazards and Vulnerability Research Institute. The demographic and economic variables in the index represent a community's adaptability to hazards. Local Moran's I was calculated to identify clusters of highly exposed and vulnerable communities. The economic-exposure cluster map was overlaid with the social-vulnerability cluster map to identify communities with low adaptive capability but high exposure. The result provides decision makers with an intuitive tool to identify the most susceptible communities for adaptation.
Quick probabilistic binary image matching: changing the rules of the game
NASA Astrophysics Data System (ADS)
Mustafa, Adnan A. Y.
2016-09-01
A Probabilistic Matching Model for Binary Images (PMMBI) is presented that predicts the probability of matching binary images with any level of similarity. The model relates the number of mappings, the amount of similarity between the images and the detection confidence. We show the advantage of using a probabilistic approach to matching in similarity space as opposed to a linear search in size space. With PMMBI a complete model is available to predict the quick detection of dissimilar binary images. Furthermore, the similarity between the images can be measured to a good degree if the images are highly similar. PMMBI shows that only a few pixels need to be compared to detect dissimilarity between images, as low as two pixels in some cases. PMMBI is image size invariant; images of any size can be matched at the same quick speed. Near-duplicate images can also be detected without much difficulty. We present tests on real images that show the prediction accuracy of the model.
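A simplified reading of the result above is that, if two binary images agree on a fraction s of their pixels and pixels are compared at independent random locations, the probability that k comparisons all match is s^k, so dissimilarity is detected with probability 1 - s^k. The sketch below (an interpretation, not the exact PMMBI formulation) computes this detection probability and the number of comparisons needed for a target confidence:

    import math

    def detection_probability(similarity: float, k: int) -> float:
        """Probability that k random pixel comparisons reveal at least one mismatch,
        assuming the two binary images agree on a fraction `similarity` of pixels."""
        return 1.0 - similarity ** k

    def comparisons_needed(similarity: float, confidence: float) -> int:
        """Smallest k with detection probability >= confidence (simplified model)."""
        return math.ceil(math.log(1.0 - confidence) / math.log(similarity))

    for s in (0.5, 0.8, 0.95):
        print(f"similarity {s:.2f}: P(detect with 2 comparisons) = "
              f"{detection_probability(s, 2):.2f}, "
              f"k for 99% confidence = {comparisons_needed(s, 0.99)}")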
A simulation-based probabilistic design method for arctic sea transport systems
NASA Astrophysics Data System (ADS)
Martin, Bergström; Ove, Erikstad Stein; Sören, Ehlers
2016-12-01
When designing an arctic cargo ship, it is necessary to consider multiple stochastic factors. This paper evaluates the merits of a simulation-based probabilistic design method specifically developed to deal with this challenge. The outcome of the paper indicates that the incorporation of simulations and probabilistic design parameters into the design process enables more informed design decisions. For instance, it enables the assessment of the stochastic transport capacity of an arctic ship, as well as of its long-term ice exposure that can be used to determine an appropriate level of ice-strengthening. The outcome of the paper also indicates that significant gains in transport system cost-efficiency can be obtained by extending the boundaries of the design task beyond the individual vessel. In the case of industrial shipping, this allows for instance the consideration of port-based cargo storage facilities allowing for temporary shortages in transport capacity and thus a reduction in the required fleet size / ship capacity.
Interrelation Between Safety Factors and Reliability
NASA Technical Reports Server (NTRS)
Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)
2001-01-01
An evaluation was performed to establish the relationship between safety factors and reliability. The results obtained show that the use of safety factors is not contradictory to the employment of probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. Establishing the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several forms of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that probabilistic methods can eliminate existing over-design or under-design. The report includes three parts: Part 1-Random Actual Stress and Deterministic Yield Stress; Part 2-Deterministic Actual Stress and Random Yield Stress; Part 3-Both Actual Stress and Yield Stress Are Random.
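Part 1 of the report (random actual stress, deterministic yield stress) can be illustrated numerically: with a normally distributed actual stress and a fixed yield stress, the reliability implied by a given central safety factor follows directly from the normal CDF, and vice versa. The numbers below are illustrative, not taken from the report:

    from statistics import NormalDist

    # Illustrative case: random actual stress, deterministic yield stress.
    mean_stress  = 200.0    # MPa
    cov_stress   = 0.10     # coefficient of variation of the actual stress
    yield_stress = 300.0    # MPa (deterministic)

    safety_factor = yield_stress / mean_stress
    std_stress = cov_stress * mean_stress

    # Failure occurs when the actual stress exceeds the yield stress.
    z = (yield_stress - mean_stress) / std_stress
    reliability = NormalDist().cdf(z)

    print(f"central safety factor = {safety_factor:.2f}")
    print(f"implied reliability   = {reliability:.6f}")
    print(f"failure probability   = {1.0 - reliability:.2e}")

    # Conversely, the safety factor required for a target reliability:
    target = 0.999999
    z_req = NormalDist().inv_cdf(target)
    sf_req = 1.0 + z_req * cov_stress
    print(f"safety factor needed for reliability {target}: {sf_req:.2f}")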
Ertosun, Mehmet Günhan; Rubin, Daniel L
2015-01-01
Brain glioma is the most common primary malignant brain tumor in adults, with different pathologic subtypes: Lower Grade Glioma (LGG) Grade II, Lower Grade Glioma (LGG) Grade III, and Glioblastoma Multiforme (GBM) Grade IV. The survival and treatment options are highly dependent on this glioma grade. We propose a deep learning-based, modular classification pipeline for automated grading of gliomas using digital pathology images. Whole tissue digitized images of pathology slides obtained from The Cancer Genome Atlas (TCGA) were used to train our deep learning modules. Our modular pipeline provides diagnostic quality statistics, such as precision, sensitivity and specificity, of the individual deep learning modules, and (1) facilitates training given the limited data in this domain, (2) enables exploration of different deep learning structures for each module, (3) leads to developing less complex modules that are simpler to analyze, and (4) provides flexibility, permitting use of single modules within the framework or use of other modeling or machine learning applications, such as probabilistic graphical models or support vector machines. Our modular approach helps us meet the requirements of minimum accuracy levels that are demanded by the context of different decision points within a multi-class classification scheme. Convolutional Neural Networks are trained for each module for each sub-task with more than 90% classification accuracy on the validation data set, and achieved a classification accuracy of 96% for the task of GBM vs LGG classification, and 71% for further identifying the grade of LGG as Grade II or Grade III on an independent data set coming from new patients from the multi-institutional repository.
Ertosun, Mehmet Günhan; Rubin, Daniel L.
2015-01-01
Brain glioma is the most common primary malignant brain tumor in adults, with different pathologic subtypes: Lower Grade Glioma (LGG) Grade II, Lower Grade Glioma (LGG) Grade III, and Glioblastoma Multiforme (GBM) Grade IV. The survival and treatment options are highly dependent on this glioma grade. We propose a deep learning-based, modular classification pipeline for automated grading of gliomas using digital pathology images. Whole tissue digitized images of pathology slides obtained from The Cancer Genome Atlas (TCGA) were used to train our deep learning modules. Our modular pipeline provides diagnostic quality statistics, such as precision, sensitivity and specificity, of the individual deep learning modules, and (1) facilitates training given the limited data in this domain, (2) enables exploration of different deep learning structures for each module, (3) leads to developing less complex modules that are simpler to analyze, and (4) provides flexibility, permitting use of single modules within the framework or use of other modeling or machine learning applications, such as probabilistic graphical models or support vector machines. Our modular approach helps us meet the requirements of minimum accuracy levels that are demanded by the context of different decision points within a multi-class classification scheme. Convolutional Neural Networks are trained for each module for each sub-task with more than 90% classification accuracy on the validation data set, and achieved a classification accuracy of 96% for the task of GBM vs LGG classification, and 71% for further identifying the grade of LGG as Grade II or Grade III on an independent data set coming from new patients from the multi-institutional repository. PMID:26958289
Evaluation of dynamic coastal response to sea-level rise modifies inundation likelihood
Lentz, Erika E.; Thieler, E. Robert; Plant, Nathaniel G.; Stippa, Sawyer R.; Horton, Radley M.; Gesch, Dean B.
2016-01-01
Sea-level rise (SLR) poses a range of threats to natural and built environments, making assessments of SLR-induced hazards essential for informed decision making. We develop a probabilistic model that evaluates the likelihood that an area will inundate (flood) or dynamically respond (adapt) to SLR. The broad-area applicability of the approach is demonstrated by producing 30 × 30 m resolution predictions for more than 38,000 km2 of diverse coastal landscape in the northeastern United States. Probabilistic SLR projections, coastal elevation and vertical land movement are used to estimate likely future inundation levels. Then, conditioned on future inundation levels and the current land-cover type, we evaluate the likelihood of dynamic response versus inundation. We find that nearly 70% of this coastal landscape has some capacity to respond dynamically to SLR, and we show that inundation models over-predict land likely to submerge. This approach is well suited to guiding coastal resource management decisions that weigh future SLR impacts and uncertainty against ecological targets and economic constraints.
Long-term strength and damage accumulation in laminates
NASA Astrophysics Data System (ADS)
Dzenis, Yuris A.; Joshi, Shiv P.
1993-04-01
A modified version of the probabilistic model developed by the authors for damage evolution analysis of laminates subjected to random loading is utilized to predict the long-term strength of laminates. The model assumes that each ply in a laminate consists of a large number of mesovolumes. Probabilistic variation functions for mesovolume stiffnesses as well as strengths are used in the analysis. Stochastic strains are calculated using lamination theory and random function theory. Deterioration of ply stiffnesses is calculated on the basis of the probabilities of mesovolume failures using the theory of excursions of a random process beyond the limits. Long-term strength and damage accumulation in a Kevlar/epoxy laminate under tension and complex in-plane loading are investigated. Effects of the mean level and stochastic deviation of loading on damage evolution and time-to-failure of the laminate are discussed. Long-term cumulative damage at the time of final failure is greater at low loading levels than at high loading levels. The effect of the deviation in loading is more pronounced at lower mean loading levels.
The probabilistic nature of preferential choice.
Rieskamp, Jörg
2008-11-01
Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies in different situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. By testing the theories with the data from three experimental studies, the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk becomes evident. When testing the probabilistic theories against each other, decision field theory provides the best account of the observed behavior.
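The deterministic-versus-probabilistic contrast can be illustrated with a generic sketch (not one of the specific models tested in the paper): a deterministic expected-value maximizer always picks the higher-valued gamble, while a logit (softmax) choice rule assigns a choice probability that grows with the value difference and with a sensitivity parameter:

    import math

    def expected_value(gamble):
        """gamble = list of (probability, payoff) pairs."""
        return sum(p * x for p, x in gamble)

    def logit_choice_prob(gamble_a, gamble_b, sensitivity=0.1):
        """Probability of choosing A under a softmax rule on expected values."""
        va, vb = expected_value(gamble_a), expected_value(gamble_b)
        return 1.0 / (1.0 + math.exp(-sensitivity * (va - vb)))

    A = [(0.8, 40.0), (0.2, 0.0)]    # EV = 32
    B = [(1.0, 30.0)]                # EV = 30

    print("deterministic choice:", "A" if expected_value(A) > expected_value(B) else "B")
    print(f"probabilistic choice: P(A) = {logit_choice_prob(A, B):.2f}")
    print(f"with higher sensitivity: P(A) = {logit_choice_prob(A, B, sensitivity=1.0):.2f}")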
Probabilistic evaluation of fuselage-type composite structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1992-01-01
A methodology is developed to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, natural frequencies, displacements, stress/strain etc., which are the consequences of the random variation (scatter) of the primitive (independent random) variables in the constituent, ply, laminate and structural levels. This methodology is implemented in the IPACS (Integrated Probabilistic Assessment of Composite Structures) computer code. A fuselage-type composite structure is analyzed to demonstrate the code's capability. The probability distribution functions of the buckling loads, natural frequency, displacement, strain and stress are computed. The sensitivity of each primitive (independent random) variable to a given structural response is also identified from the analyses.
NASA Technical Reports Server (NTRS)
Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.
1995-01-01
The application of the probabilistic risk assessment methodology to a Space Shuttle environment, particularly to the potential loss of the Shuttle during nominal operation, is addressed. The different related concerns are identified and combined to determine overall program risks. A fault tree model is used to allocate system probabilities to the subsystem level. The loss of the vehicle due to failure to contain energetic gas and debris or to maintain proper propulsion and configuration is analyzed, along with losses due to Orbiter failure, external tank failure, and landing failure or error.
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor, providing an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coppersmith , Kevin J.; Bommer, Julian J.; Bryce, Robert W.
Under the sponsorship of the US Department of Energy (DOE) and the electric utility Energy Northwest, the Pacific Northwest National Laboratory (PNNL) is conducting a probabilistic seismic hazard analysis (PSHA) within the framework of a SSHAC Level 3 procedure (Senior Seismic Hazard Analysis Committee; Budnitz et al., 1997). Specifically, the project is being conducted following the guidelines and requirements specified in NUREG-2117 (USNRC, 2012b) and consistent with the approach given in the American Nuclear Standard ANSI/ANS-2.29-2008, Probabilistic Seismic Hazard Analysis. The collaboration between DOE and Energy Northwest is spawned by the needs of both organizations for an accepted PSHA with high levels of regulatory assurance that can be used for the design and safety evaluation of nuclear facilities. DOE committed to this study after performing a ten-year review of the existing PSHA, as required by DOE Order 420.1C. The study will also be used by Energy Northwest as a basis for fulfilling the NRC’s 10CFR50.54(f) requirement that the western US nuclear power plants conduct PSHAs in conformance with SSHAC Level 3 procedures. The study was planned and is being carried out in conjunction with a project Work Plan, which identifies the purpose of the study, the roles and responsibilities of all participants, tasks and their associated schedules, Quality Assurance (QA) requirements, and project deliverables. New data collection and analysis activities are being conducted as a means of reducing the uncertainties in key inputs to the PSHA. It is anticipated that the results of the study will provide inputs to the site response analyses at multiple nuclear facility sites within the Hanford Site and at the Columbia Generating Station.
Students’ difficulties in probabilistic problem-solving
NASA Astrophysics Data System (ADS)
Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.
2018-03-01
Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students’ difficulties in solving probabilistic problems, focusing on analyzing and describing students’ errors during problem solving. This research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise students’ probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems, and were analyzed descriptively using the Miles and Huberman steps. The results show that students’ difficulties in solving probabilistic problems fall into three categories. The first relates to difficulties in understanding the probabilistic problem. The second concerns difficulties in choosing and using appropriate strategies for solving the problem. The third concerns difficulties with the computational process in solving the problem. The results suggest that students still have difficulties in solving probabilistic problems and have not yet been able to use their knowledge and ability to respond to such problems. Therefore, it is important for mathematics teachers to plan probabilistic learning that can optimize students’ probabilistic thinking ability.
A probabilistic Hu-Washizu variational principle
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Besterfield, G. H.
1987-01-01
A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
Mathematical Fundamentals of Probabilistic Semantics for High-Level Fusion
2013-12-02
understanding of the fundamental aspects of uncertainty representation and reasoning that a theory of hard and soft high-level fusion must encompass. Successful completion requires an unbiased, in-depth... and soft information is the lack of a fundamental HLIF theory, backed by a consistent mathematical framework and supporting algorithms. Although there...
Constraining ozone-precursor responsiveness using ambient measurements
This study develops probabilistic estimates of ozone (O3) sensitivities to precursor emissions by incorporating uncertainties in photochemical modeling and evaluating model performance based on ground-level observations of O3 and oxides of nitrogen (NOx). Uncertainties in model form...
Localization of the lumbar discs using machine learning and exact probabilistic inference.
Oktay, Ayse Betul; Akgul, Yusuf Sinan
2011-01-01
We propose a novel fully automatic approach to localize the lumbar intervertebral discs in MR images with a PHOG-based SVM and a probabilistic graphical model. At the local level, our method assigns a score to each pixel in the target image that indicates whether it is a disc center or not. At the global level, we define a chain-like graphical model that represents the lumbar intervertebral discs, and we use an exact inference algorithm to localize the discs. Our main contributions are the employment of the SVM with the PHOG-based descriptor, which is robust against variations of the discs, and a graphical model that reflects the linear nature of the vertebral column. Our inference algorithm runs in polynomial time and produces globally optimal results. The developed system is validated on a real spine MRI dataset and the final localization results are favorable compared to the results reported in the literature.
Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.
2002-01-01
Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
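A minimal sketch of the first two propagation methods named above, applied to a toy weight function of two uncertain design variables: a first-order method-of-moments (Taylor series) approximation of the output mean and standard deviation is checked against Monte Carlo sampling. The response function and input distributions are placeholders, not the aircraft analysis code:

    import numpy as np

    def weight(x1, x2):
        """Toy response: a nonlinear function of two configuration design variables."""
        return 1000.0 + 40.0 * x1 ** 1.5 + 15.0 * x1 * x2 + 5.0 * x2 ** 2

    mu = np.array([8.0, 3.0])        # means of the two design variables
    sigma = np.array([0.2, 0.15])    # standard deviations (assumed independent)

    # First-order method of moments: mean ~ f(mu), var ~ sum (df/dxi)^2 * sigma_i^2.
    h = 1e-5
    grad = np.array([
        (weight(mu[0] + h, mu[1]) - weight(mu[0] - h, mu[1])) / (2 * h),
        (weight(mu[0], mu[1] + h) - weight(mu[0], mu[1] - h)) / (2 * h),
    ])
    mom_mean = weight(*mu)
    mom_std = np.sqrt(np.sum((grad * sigma) ** 2))

    # Monte Carlo check.
    rng = np.random.default_rng(5)
    samples = weight(rng.normal(mu[0], sigma[0], 100000),
                     rng.normal(mu[1], sigma[1], 100000))

    print(f"method of moments: mean = {mom_mean:.1f}, std = {mom_std:.1f}")
    print(f"Monte Carlo:       mean = {samples.mean():.1f}, std = {samples.std():.1f}")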
van Capelleveen, Julian C; Bernelot Moens, Sophie J; Yang, Xiaohong; Kastelein, John J P; Wareham, Nicholas J; Zwinderman, Aeilko H; Stroes, Erik S G; Witztum, Joseph L; Hovingh, G Kees; Khaw, Kay-Tee; Boekholdt, S Matthijs; Tsimikas, Sotirios
2017-06-01
Apolipoprotein C-III (apoC-III) is a key regulator of triglyceride metabolism. Elevated triglyceride-rich lipoproteins and apoC-III levels are causally linked to coronary artery disease (CAD) risk. The mechanism(s) through which apoC-III increases CAD risk remains largely unknown. The aim was to confirm the association between apoC-III plasma levels and CAD risk and to explore which lipoprotein subfractions contribute to this relationship between apoC-III and CAD risk. Plasma apoC-III levels were measured in baseline samples from a nested case-control study in the European Prospective Investigation of Cancer (EPIC)-Norfolk study. The study comprised 2711 apparently healthy study participants, of whom 832 subsequently developed CAD. We studied the association of baseline apoC-III levels with incident CAD risk, lipoprotein subfractions measured by nuclear magnetic resonance spectroscopy and inflammatory biomarkers. ApoC-III levels were significantly associated with CAD risk (odds ratio, 1.91; 95% confidence interval, 1.48-2.48 for highest compared with lowest quintile), retaining significance after adjustment for traditional CAD risk factors (odds ratio, 1.47; 95% confidence interval, 1.11-1.94). ApoC-III levels were positively correlated with triglyceride levels (r = 0.39), particle numbers of very-low-density lipoprotein (r = 0.25), intermediate-density lipoprotein (r = 0.23), small dense low-density lipoprotein (r = 0.26), and high-sensitivity C-reactive protein (r = 0.15), whereas an inverse correlation was observed with large low-density lipoprotein particle number (r = -0.11), P < 0.001 for each. Mediation analysis indicated that the association between apoC-III and CAD risk could be explained by triglyceride elevation (triglyceride, very-low-density lipoprotein, and intermediate-density lipoprotein particles), small low-density lipoprotein particle size, and high-sensitivity C-reactive protein. ApoC-III levels are significantly associated with incident CAD risk. Elevated levels of remnant lipoproteins, small dense low-density lipoprotein, and low-grade inflammation may explain this association. © 2017 American Heart Association, Inc.
Probabilistic Aeroelastic Analysis of Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.
2004-01-01
A probabilistic approach is described for the aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on the aeroelastic instabilities and response.
Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.
Lau, Jey Han; Clark, Alexander; Lappin, Shalom
2017-07-01
The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgements are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is, in part, determined by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature in acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure. This is a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic. Copyright © 2016 Cognitive Science Society, Inc.
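The normalization idea can be sketched as follows, in the spirit of measures such as SLOR (sentence log probability minus a unigram baseline, divided by length), which is one of the acceptability measures used in this line of work; the toy unigram table and the sentence-level log probabilities below stand in for a trained language model:

    import math

    # Toy unigram frequencies standing in for a trained language model's lexicon.
    UNIGRAM = {"the": 0.06, "cat": 0.001, "sat": 0.0008, "on": 0.02,
               "mat": 0.0005, "colorless": 1e-6, "ideas": 1e-4, "sleep": 1e-4}

    def unigram_logprob(tokens):
        return sum(math.log(UNIGRAM.get(t, 1e-7)) for t in tokens)

    def slor(sentence_logprob, tokens):
        """SLOR-style acceptability measure: normalize a sentence log probability by
        subtracting the unigram baseline and dividing by sentence length."""
        return (sentence_logprob - unigram_logprob(tokens)) / len(tokens)

    # Hypothetical sentence-level log probabilities (from an LM in practice).
    examples = [
        (["the", "cat", "sat", "on", "the", "mat"], -21.0),
        (["colorless", "ideas", "sleep"], -34.0),
    ]
    for tokens, lp in examples:
        print(" ".join(tokens), "-> raw logprob", lp, ", SLOR", round(slor(lp, tokens), 2))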
Finley, B; Paustenbach, D
1994-02-01
Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed are also discussed.
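The contrast between a point estimate and a probabilistic analysis can be sketched for a tapwater ingestion pathway: a conventional estimate built from upper-bound inputs is compared with the Monte Carlo distribution of lifetime average daily dose. The concentration and input distributions are hypothetical, chosen only to illustrate the comparison:

    import numpy as np

    rng = np.random.default_rng(6)
    n = 100000

    CONC = 0.005      # hypothetical chemical concentration in tapwater (mg/L)

    # Monte Carlo exposure inputs (illustrative distributions).
    intake   = rng.lognormal(mean=np.log(1.4), sigma=0.4, size=n)     # L/day
    ef       = rng.triangular(200, 350, 365, size=n)                  # exposure days/year
    duration = rng.lognormal(mean=np.log(9.0), sigma=0.7, size=n)     # years
    weight   = np.clip(rng.normal(70.0, 12.0, size=n), 30.0, None)    # kg

    AT = 70.0 * 365.0                                     # averaging time (days)
    dose = CONC * intake * ef * duration / (weight * AT)  # lifetime average daily dose (mg/kg-day)

    # Conventional point estimate using upper-bound inputs.
    point = CONC * 2.0 * 365.0 * 30.0 / (70.0 * AT)

    p50, p95, p999 = np.percentile(dose, [50, 95, 99.9])
    print(f"point estimate dose     : {point:.2e} mg/kg-day")
    print(f"Monte Carlo 50th / 95th : {p50:.2e} / {p95:.2e}")
    print(f"Monte Carlo 99.9th      : {p999:.2e}")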
Probabilistic structural analysis methods for space propulsion system components
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
Probabilistic assessment of sea level during the last interglacial stage.
Kopp, Robert E; Simons, Frederik J; Mitrovica, Jerry X; Maloof, Adam C; Oppenheimer, Michael
2009-12-17
With polar temperatures approximately 3-5 degrees C warmer than today, the last interglacial stage (approximately 125 kyr ago) serves as a partial analogue for 1-2 degrees C global warming scenarios. Geological records from several sites indicate that local sea levels during the last interglacial were higher than today, but because local sea levels differ from global sea level, accurately reconstructing past global sea level requires an integrated analysis of globally distributed data sets. Here we present an extensive compilation of local sea level indicators and a statistical approach for estimating global sea level, local sea levels, ice sheet volumes and their associated uncertainties. We find a 95% probability that global sea level peaked at least 6.6 m higher than today during the last interglacial; it is likely (67% probability) to have exceeded 8.0 m but is unlikely (33% probability) to have exceeded 9.4 m. When global sea level was close to its current level (≥ -10 m), the millennial average rate of global sea level rise is very likely to have exceeded 5.6 m kyr⁻¹ but is unlikely to have exceeded 9.2 m kyr⁻¹. Our analysis extends previous last interglacial sea level studies by integrating literature observations within a probabilistic framework that accounts for the physics of sea level change. The results highlight the long-term vulnerability of ice sheets to even relatively low levels of sustained global warming.
ERIC Educational Resources Information Center
Cetintas, Suleyman; Si, Luo; Xin, Yan Ping; Zhang, Dake; Park, Joo Young; Tzur, Ron
2010-01-01
Estimating the difficulty level of math word problems is an important task for many educational applications. Identification of relevant and irrelevant sentences in math word problems is an important step for calculating the difficulty levels of such problems. This paper addresses a novel application of text categorization to identify two types of…
Using Response Times for Item Selection in Adaptive Testing
ERIC Educational Resources Information Center
van der Linden, Wim J.
2008-01-01
Response times on items can be used to improve item selection in adaptive testing provided that a probabilistic model for their distribution is available. In this research, the author used a hierarchical modeling framework with separate first-level models for the responses and response times and a second-level model for the distribution of the…
Anxiety towards Mathematics and Educational Level: A Study on Means Differences
ERIC Educational Resources Information Center
Escalera-Chávez, Milka Elena; García-Santillán, Arturo; Córdova-Rangel, Arturo; González-Gómez, Santiago; Tejada-Peña, Esmeralda
2016-01-01
The aim of this research work is to analyze whether there is a difference in the degree of anxiety towards mathematics among students of different educational levels. The study is non-experimental and cross-sectional, and it is based on differences of means between groups. The sample is non-probabilistic and consisted of 226 students from…
Probabilistic tsunami hazard assessment at Seaside, Oregon, for near-and far-field seismic sources
Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.
2009-01-01
The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
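As a small aside on the return-period language used above, the snippet below converts a return period into an annual exceedance probability and into the chance of at least one exceedance over a planning horizon, assuming independent years; it is a generic illustration rather than part of the PTHA methodology itself.

```python
# Relationship between return period, annual exceedance probability, and the
# chance of at least one exceedance during an exposure window (assuming
# independent years, i.e. a simple Bernoulli/Poisson model).
def prob_exceed(return_period_years: float, horizon_years: float) -> float:
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

for rp in (100, 500):
    print(f"{rp}-yr tsunami: annual P = {1 / rp:.1%}, "
          f"P(at least one in 50 yr) = {prob_exceed(rp, 50):.1%}")
```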
Gilsenan, M B; Lambe, J; Gibney, M J
2003-11-01
A key component of a food chemical exposure assessment using probabilistic analysis is the selection of the most appropriate input distribution to represent exposure variables. The study explored the type of parametric distribution that could be used to model variability in food consumption data likely to be included in a probabilistic exposure assessment of food additives. The goodness-of-fit of a range of continuous distributions to observed data of 22 food categories expressed as average daily intakes among consumers from the North-South Ireland Food Consumption Survey was assessed using the BestFit distribution fitting program. The lognormal distribution was most commonly accepted as a plausible parametric distribution to represent food consumption data when food intakes were expressed as absolute intakes (16/22 foods) and as intakes per kg body weight (18/22 foods). Results from goodness-of-fit tests were accompanied by lognormal probability plots for a number of food categories. The influence on food additive intake of using a lognormal distribution to model food consumption input data was assessed by comparing modelled intake estimates with observed intakes. Results from the present study advise some level of caution about the use of a lognormal distribution as a mode of input for food consumption data in probabilistic food additive exposure assessments and the results highlight the need for further research in this area.
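A minimal sketch of the kind of distribution-fitting step described above is shown below, using synthetic intake data in place of the survey data and SciPy in place of the BestFit program; all parameter values are assumptions for illustration only. Note that the Kolmogorov-Smirnov p-value is only approximate when the distribution parameters are estimated from the same data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Stand-in for observed average daily intakes (g/day) of one food category;
# a real assessment would use the survey data among consumers.
intakes = rng.lognormal(mean=np.log(55.0), sigma=0.8, size=400)

# Fit a lognormal with the location parameter fixed at zero, as is usual
# for strictly positive consumption amounts.
shape, loc, scale = stats.lognorm.fit(intakes, floc=0)

# Goodness of fit: Kolmogorov-Smirnov test against the fitted distribution
# (p-value is optimistic because the parameters were estimated from the data).
ks_stat, p_value = stats.kstest(intakes, "lognorm", args=(shape, loc, scale))
print(f"sigma={shape:.2f}, median={scale:.1f} g/day, KS p-value={p_value:.3f}")
```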
Long, Chengjiang; Hua, Gang; Kapoor, Ashish
2015-01-01
We present a noise resilient probabilistic model for active learning of a Gaussian process classifier from crowds, i.e., a set of noisy labelers. It explicitly models both the overall label noise and the expertise level of each individual labeler with two levels of flip models. Expectation propagation is adopted for efficient approximate Bayesian inference of our probabilistic model for classification, based on which, a generalized EM algorithm is derived to estimate both the global label noise and the expertise of each individual labeler. The probabilistic nature of our model immediately allows the adoption of the prediction entropy for active selection of data samples to be labeled, and active selection of high quality labelers based on their estimated expertise to label the data. We apply the proposed model for four visual recognition tasks, i.e., object category recognition, multi-modal activity recognition, gender recognition, and fine-grained classification, on four datasets with real crowd-sourced labels from the Amazon Mechanical Turk. The experiments clearly demonstrate the efficacy of the proposed model. In addition, we extend the proposed model with the Predictive Active Set Selection Method to speed up the active learning system, whose efficacy is verified by conducting experiments on the first three datasets. The results show our extended model can not only preserve a higher accuracy, but also achieve a higher efficiency. PMID:26924892
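The entropy-based active selection mentioned above can be sketched in a few lines: given posterior class probabilities from any probabilistic classifier, the most uncertain unlabeled samples are queried first. The probabilities below are made up for illustration, and the sketch omits the label-noise and labeler-expertise modeling that the full approach includes.

```python
import numpy as np

def predictive_entropy(probs: np.ndarray) -> np.ndarray:
    """Entropy of per-sample class probabilities, shape (n_samples, n_classes)."""
    p = np.clip(probs, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=1)

# Hypothetical posterior class probabilities from a probabilistic classifier.
probs = np.array([[0.95, 0.05],
                  [0.55, 0.45],
                  [0.70, 0.30],
                  [0.50, 0.50]])

# Query the unlabeled samples the model is least certain about first.
query_order = np.argsort(-predictive_entropy(probs))
print("samples to label next, most uncertain first:", query_order)
```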
System Level Uncertainty Assessment for Collaborative RLV Design
NASA Technical Reports Server (NTRS)
Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew
2002-01-01
A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limited financial resources of both government and industry, strategic decision makers need more than just traditional point designs; they need to be aware of the likelihood of these future designs to meet their objectives. This uncertainty, an ever-present character in the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system level output metrics of interest for a RLV conceptual design. The specific system being examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.
Trauma care at rural level III trauma centers in a state trauma system.
Helling, Thomas S
2007-02-01
Although much has been written about the benefits of trauma center care, most experiences are urban with large numbers of patients. Little is known about the smaller, rural trauma centers and how they function both independently and as part of a larger trauma system. The state of Missouri has designated three levels of trauma care. The cornerstone of rural trauma care is the state-designated Level III trauma center. These centers are required to have the presence of a trauma team and trauma surgeon but do not require orthopedic or neurosurgical coverage. The purpose of this retrospective study was to determine how Level III trauma centers compared with Level I and Level II centers in the Missouri trauma system and, secondly, how trauma surgeon experience at these centers might shape future educational efforts to optimize rural trauma care. During a 2-year period in 2002 and 2003, the state trauma registry was queried on all trauma admissions for centers in the trauma system. Demographics and patient care outcomes were assessed by level of designation. Trauma admissions to the Level III centers were examined for acuity, severity, and type of injury. The experiences with chest, abdominal, and neurologic trauma were examined in detail. A total of 24,392 patients from 26 trauma centers were examined, including all eight Level III centers. Acuity and severity of injuries were higher at Level I and II centers. A total of 2,910 patients were seen at the 8 Level III centers. Overall deaths were significantly lower at Level III centers (Level I, 4% versus Level II, 4% versus Level III, 2%, p < 0.001). Numbers of patients dying within 24 hours were no different among levels of trauma care (Level I, 37% versus Level II, 30% versus Level III, 32%). Among Level III centers, 45 (1.5%) patients were admitted in shock, and 48 (2%) had a Glasgow Coma Scale score <9. Twenty-six patients had a surgical head injury (7 epidural, 19 subdural hematomas). Twenty-eight patients (1%) needed a chest or abdominal operation. There were 15 spleen and 12 liver injuries with an Abbreviated Injury Score of 4 or 5. Level III trauma centers performed as expected in a state trauma system. Acuity and severity were lower, as was the corresponding mortality. There was a paucity of life-threatening head, chest, and abdominal injuries, which presents a challenge for the rural trauma surgeon to maintain the necessary skills in managing these critical injuries.
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2000-01-01
Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely and engines for new aircraft are progressively required to operate under more demanding technological and environmental requirements. Designs to effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multidisciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior, metal or composite; multi-level refers to integration of participating disciplines to describe a structural response at the scale of interest; multidisciplinary refers to an open-ended framework for the various existing and yet-to-be-developed discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat-transfer and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP). The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission and coupled structural/thermal, various composite property simulators and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of the proposed paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine and aircraft type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent, reduces noise by 15 percent, and improves reliability by an order of magnitude. Composite designs exist to increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.
NASA Astrophysics Data System (ADS)
Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.
2013-12-01
In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods are historically preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with the design values for extreme rainfall and floods. The objective of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distribution), (ii) regional flood frequency analysis (regional Gumbel and GEV distribution), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), the Agregee method (Margoum, 1992) and the Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013, Garavaglia et al., 2010), the Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than the standard flood frequency analysis. Another interesting result is that the differences between the various extreme flood quantile estimations of the compared methods increase with return period, remaining relatively moderate up to 100-year return levels. Results and discussion are illustrated throughout with the example of five watersheds located in the south of France. References: Cayla, O.: Probability calculation of design floods and inflows - SPEED. Waterpower 1995, San Francisco, California, 1995. CFGB: Design flood determination by the gradex method. Bulletin du Comité Français des Grands Barrages News 96, 18th congress CIGB-ICOLD n°2, nov:108, 1994. Garavaglia, F., et al.: Introducing a rainfall compound distribution model based on weather patterns subsampling. Hydrology and Earth System Sciences, 14, 951-964, 2010. Lavabre, J., et al.: SHYREG: une méthode pour l'estimation régionale des débits de crue. Application aux régions méditerranéennes françaises. Ingénierie EAT, 97-111, 2003. Margoum, M.: Estimation des crues rares et extrêmes: le modèle AGREGEE. Conception et premières validations. PhD thesis, Ecole des Mines de Paris, 1992. Naulet, R., et al.: Flood frequency analysis on the Ardèche river using French documentary sources from the last two centuries. Journal of Hydrology, 313:58-78, 2005. Paquet, E., et al.: The SCHADEX method: A semi-continuous rainfall-runoff simulation for extreme flood estimation. Journal of Hydrology, 495, 23-37, 2013.
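For readers unfamiliar with the "standard flood frequency analysis" baseline referred to above, the sketch below fits a GEV distribution to synthetic annual maxima and reads off return levels; the data and parameters are invented, and the regional, historical and rainfall-informed methods compared in the study are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Stand-in for a series of annual maximum discharges (m^3/s); a real study
# would use gauged annual maxima, possibly pooled regionally.
annual_max = stats.gumbel_r.rvs(loc=300.0, scale=80.0, size=60, random_state=rng)

# Standard flood frequency analysis: fit a GEV and read off return levels.
shape, loc, scale = stats.genextreme.fit(annual_max)
for T in (10, 100, 1000):
    q = stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
    print(f"{T:>4}-year flood quantile: {q:,.0f} m^3/s")
```

With only 60 years of data, the quantile estimates for long return periods carry large sampling uncertainty, which is consistent with the study's observation that differences between methods grow with return period.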
NASA Astrophysics Data System (ADS)
mouloud, Hamidatou
2016-04-01
The objective of this paper is to analyze the seismic activity of the Constantine region through the statistical treatment of a seismicity catalogue covering the period 1357 to 2014 and containing 7007 seismic events. Our research contributes to improved seismic risk management by evaluating the seismic hazard in northeastern Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach by using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for the region. In this study, the procedure for earthquake hazard evaluation developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
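A stripped-down illustration of the Cornell-style calculation mentioned above is sketched below: a truncated Gutenberg-Richter recurrence model combined with a toy attenuation relation and lognormal ground-motion scatter yields an annual rate of exceeding each PGA level. All numbers are illustrative assumptions and are not calibrated to the Constantine region or to the Kijko and Sellevoll procedure.

```python
import numpy as np
from scipy import stats

# Minimal Cornell-style hazard calculation for a single areal source:
# annual rate of exceeding a PGA level = sum over magnitude bins of
# rate(m) * P(PGA > a | m, r), with lognormal ground-motion scatter.
a_val, b_val = 3.5, 1.0          # Gutenberg-Richter parameters (log10 rates)
m_min, m_max = 4.5, 7.0
r_km = 30.0                      # single representative source-site distance
sigma_ln = 0.6                   # aleatory variability of ln(PGA)

mags = np.arange(m_min, m_max, 0.1)
# Annual rate of events in each 0.1-magnitude bin (incremental truncated G-R)
rates = 10 ** (a_val - b_val * mags) - 10 ** (a_val - b_val * (mags + 0.1))

def median_pga_g(m, r):
    # Toy attenuation relation for ln(PGA in g); a real study uses published GMPEs.
    return np.exp(-3.5 + 0.9 * m - 1.2 * np.log(r + 10.0))

for a in (0.05, 0.1, 0.2, 0.3, 0.4):
    p_exceed = 1 - stats.norm.cdf(np.log(a), np.log(median_pga_g(mags, r_km)), sigma_ln)
    annual_rate = np.sum(rates * p_exceed)
    print(f"PGA > {a:.2f} g: annual rate = {annual_rate:.4f}, "
          f"return period ≈ {1 / annual_rate:,.0f} yr")
```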
Probabilistic classifiers with high-dimensional data
Kim, Kyung In; Simon, Richard
2011-01-01
For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small n large p classification problems despite their importance in medical decision making. In this paper, we introduce two criteria for the assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed two extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated or at least not “anticonservative” using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
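The notion of well-calibratedness can be illustrated with a short sketch that bins predicted probabilities and compares the mean prediction in each bin with the observed event frequency; an "anticonservative" classifier shows bin means pushed toward 0 or 1 relative to the observed frequencies. The data below are simulated and the simple binning scheme is an assumption, not the evaluation measure developed in the paper.

```python
import numpy as np

def calibration_table(pred_probs, outcomes, n_bins=5):
    """Group predictions into probability bins and compare the mean predicted
    probability with the observed event frequency in each bin (a well-calibrated
    classifier has the two roughly equal)."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(pred_probs, bins) - 1, 0, n_bins - 1)
    rows = []
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            rows.append((b, mask.sum(), pred_probs[mask].mean(), outcomes[mask].mean()))
    return rows

rng = np.random.default_rng(3)
true_p = rng.uniform(0.05, 0.95, size=2000)
outcomes = rng.binomial(1, true_p)
# An "anticonservative" classifier that pushes probabilities toward 0 or 1:
pred = np.clip(true_p + 0.25 * np.sign(true_p - 0.5), 0.0, 1.0)

for b, n, mean_pred, obs_freq in calibration_table(pred, outcomes):
    print(f"bin {b}: n={n:4d}  mean predicted={mean_pred:.2f}  observed={obs_freq:.2f}")
```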
A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause’
2017-01-01
ABSTRACT Joseph Halpern and Judea Pearl ([2005]) draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation. 1 Introduction 2 Preemption 3 Structural Equation Models 4 The Halpern and Pearl Definition of ‘Actual Cause’ 5 Preemption Again 6 The Probabilistic Case 7 Probabilistic Causal Models 8 A Proposed Probabilistic Extension of Halpern and Pearl’s Definition 9 Twardy and Korb’s Account 10 Probabilistic Fizzling 11 Conclusion PMID:29593362
Chen, Wei-Yu; Lin, Hsing-Chieh
2018-05-01
Growing evidence indicates that ocean acidification has a significant impact on calcifying marine organisms. However, there is a lack of exposure risk assessments for aquatic organisms under future environmentally relevant ocean acidification scenarios. The objective of this study was to investigate the probabilistic effects of acidified seawater on the life-stage response dynamics of fertilization, larvae growth, and larvae mortality of the green sea urchin (Strongylocentrotus droebachiensis). We incorporated the regulation of primary body cavity (PBC) pH in response to seawater pH into the assessment by constructing an explicit model to assess effective life-stage response dynamics to seawater or PBC pH levels. The likelihood of exposure to ocean acidification was also evaluated by addressing the uncertainties of the risk characterization. For unsuccessful fertilization, the estimated 50% effect level of seawater acidification (EC50 SW ) was 0.55 ± 0.014 (mean ± SE) pH units. This life stage was more sensitive than growth inhibition and mortality, for which the EC50 values were 1.13 and 1.03 pH units, respectively. The estimated 50% effect levels of PBC pH (EC50 PBC ) were 0.99 ± 0.05 and 0.88 ± 0.006 pH units for growth inhibition and mortality, respectively. We also predicted the probability distributions for seawater and PBC pH levels in 2100. The level of unsuccessful fertilization had 50 and 90% probability risks of 5.07-24.51 (95% CI) and 0-6.95%, respectively. We conclude that this probabilistic risk analysis model is parsimonious enough to quantify the multiple vulnerabilities of the green sea urchin while addressing the systemic effects of ocean acidification. This study found a high potential risk of acidification affecting the fertilization of the green sea urchin, whereas there was no evidence for adverse effects on growth and mortality resulting from exposure to the predicted acidified environment.
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material, geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms and the addition of Monte Carlo simulation as an alternate. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wnek, Shawn M.; Kuhlman, Christopher L.; Camarillo, Jeannie M.
2011-11-15
Exposure of human bladder urothelial cells (UROtsa) to 50 nM of the arsenic metabolite, monomethylarsonous acid (MMA(III)), for 12 weeks results in irreversible malignant transformation. The ability of continuous, low-level MMA(III) exposure to cause an increase in genotoxic potential by inhibiting repair processes necessary to maintain genomic stability is unknown. Following genomic insult within cellular systems, poly(ADP-ribose) polymerase-1 (PARP-1), a zinc finger protein, is rapidly activated and recruited to sites of DNA strand breaks. When UROtsa cells are continuously exposed to 50 nM MMA(III), PARP-1 activity does not increase despite the increase in MMA(III)-induced DNA single-strand breaks through 12 weeks of exposure. When UROtsa cells are removed from continuous MMA(III) exposure (2 weeks), PARP-1 activity increases, coinciding with a subsequent decrease in DNA damage levels. Paradoxically, PARP-1 mRNA expression and protein levels are elevated in the presence of continuous MMA(III), indicating a possible mechanism to compensate for the inhibition of PARP-1 activity in the presence of MMA(III). The zinc finger domains of PARP-1 contain vicinal sulfhydryl groups which may act as a potential site for MMA(III) to bind, displace zinc ion, and render PARP-1 inactive. Mass spectrometry analysis demonstrates the ability of MMA(III) to bind a synthetic peptide representing the zinc-finger domain of PARP-1, and displace zinc from the peptide in a dose-dependent manner. In the presence of continuous MMA(III) exposure, continuous 4-week zinc supplementation restored PARP-1 activity levels and reduced the genotoxicity associated with MMA(III). Zinc supplementation did not produce an overall increase in PARP-1 protein levels, decrease the levels of MMA(III)-induced reactive oxygen species, or alter Cu-Zn superoxide dismutase levels. Overall, these results present two potential interdependent mechanisms by which MMA(III) may increase the susceptibility of UROtsa cells to genotoxic insult and/or malignant transformation: elevated levels of MMA(III)-induced DNA damage through the production of reactive oxygen species, and the direct MMA(III)-induced inhibition of PARP-1.
Sánchez-Muniz, F J; Bastida, S; Gutiérrez-García, O; Carbajal, A
2009-01-01
Concomitant intake of statins together with certain foods may affect their therapeutic effects. The purpose of this preliminary study was to determine the modulating effect of two culinary oils on the hypolipemic effect of statins. Twenty-five men with severe hypercholesterolemia and high estimated cardiovascular risk (> 20% according to the Adult Treatment Panel III of the USA National Institutes of Health, ATP-III) were enrolled in an observational follow-up study to test lipoprotein profile changes after six months of 20-mg/d Simvastatin treatment. Thirteen volunteers using sunflower oil as the habitual culinary fat, and 12 using olive oil, were selected by non-probabilistic incidental sampling. Volunteers consented to follow their habitual diets and to maintain diet characteristics throughout the study. Diet was evaluated throughout the study by three 24-h recalls and a food frequency questionnaire. The energy contribution of fat (P = 0.019) and MUFA (P < 0.001) was higher in the olive oil group, while that of PUFA (P = 0.001) and alcohol (P = 0.005) was higher in the sunflower oil group. TC/HDL-cholesterol and the ATP-III 10-year risk percent decreased more (P < 0.05) in the olive oil group. TC, the TC/HDL-cholesterol and LDL-cholesterol/HDL-cholesterol ratios, and the ATP-III 10-year risk percent decreased significantly more (P < 0.05) in the olive oil group after BMI, energy and alcohol intakes were adjusted for. Data suggest that although Simvastatin is a very effective hypolipemic drug, olive oil diets rather than sunflower oil diets should be consumed by patients with high cardiovascular risk.
New ShakeMaps for Georgia Resulting from Collaboration with EMME
NASA Astrophysics Data System (ADS)
Kvavadze, N.; Tsereteli, N. S.; Varazanashvili, O.; Alania, V.
2015-12-01
Correct assessment of probabilistic seismic hazard and risk maps is the first step in advance planning and action to reduce seismic risk. Seismic hazard maps for Georgia were calculated based on a modern approach developed in the frame of the EMME (Earthquake Model of the Middle East region) project. EMME was one of GEM's successful endeavors at the regional level. With EMME and GEM assistance, regional models were analyzed to identify the information and additional work needed for the preparation of national hazard models. Probabilistic seismic hazard (PSH) maps provide the critical basis for improved building codes and construction. The most serious deficiency in PSH assessment for the territory of Georgia is the lack of high-quality ground motion data. Due to this, an initial hybrid empirical ground motion model was developed for PGA and SA at selected periods. These ground motion models have been applied in the probabilistic seismic hazard assessment. The resulting seismic hazard maps show that there were gaps in earlier seismic hazard assessments and that the present normative seismic hazard map needs careful recalculation.
Stepp, J.C.; Wong, I.; Whitney, J.; Quittmeyer, R.; Abrahamson, N.; Toro, G.; Young, S.R.; Coppersmith, K.; Savy, J.; Sullivan, T.
2001-01-01
Probabilistic seismic hazard analyses were conducted to estimate both ground motion and fault displacement hazards at the potential geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain, Nevada. The study is believed to be the largest and most comprehensive analysis ever conducted for ground-shaking hazard and is a first-of-a-kind assessment of probabilistic fault displacement hazard. The major emphasis of the study was on the quantification of epistemic uncertainty. Six teams of three experts performed seismic source and fault displacement evaluations, and seven individual experts provided ground motion evaluations. State-of-the-practice expert elicitation processes involving structured workshops, consensus identification of parameters and issues to be evaluated, common sharing of data and information, and open exchanges about the basis for preliminary interpretations were implemented. Ground-shaking hazard was computed for a hypothetical rock outcrop at -300 m, the depth of the potential waste emplacement drifts, at the designated design annual exceedance probabilities of 10⁻³ and 10⁻⁴. The fault displacement hazard was calculated at the design annual exceedance probabilities of 10⁻⁴ and 10⁻⁵.
Probabilistic multi-resolution human classification
NASA Astrophysics Data System (ADS)
Tu, Jun; Ran, H.
2006-02-01
Recently there has been some interest in using infrared cameras for human detection because of their sharply decreasing prices. The training data used in our work for developing the probabilistic template consist of images known to contain humans in different poses and orientations but having the same height. Multi-resolution templates, based on contours and edges, are constructed. This is done so that the model does not learn the intensity variations among the background pixels and intensity variations among the foreground pixels. Each template at every level is then translated so that the centroid of the non-zero pixels matches the geometrical center of the image. After this normalization step, for each pixel of the template, the probability of it belonging to a pedestrian is calculated based on how frequently it appears as 1 in the training data. We also use gait periodicity to verify the pedestrian for the whole blob in a Bayesian, probabilistic manner. The videos contained considerable variation in scenes, sizes of people, amount of occlusion, and background clutter. Preliminary experiments show the robustness of the approach.
Jaiswal, Astha; Godinez, William J; Eils, Roland; Lehmann, Maik Jorg; Rohr, Karl
2015-11-01
Automatic fluorescent particle tracking is an essential task to study the dynamics of a large number of biological structures at a sub-cellular level. We have developed a probabilistic particle tracking approach based on multi-scale detection and two-step multi-frame association. The multi-scale detection scheme allows coping with particles in close proximity. For finding associations, we have developed a two-step multi-frame algorithm, which is based on a temporally semiglobal formulation as well as spatially local and global optimization. In the first step, reliable associations are determined for each particle individually in local neighborhoods. In the second step, the global spatial information over multiple frames is exploited jointly to determine optimal associations. The multi-scale detection scheme and the multi-frame association finding algorithm have been combined with a probabilistic tracking approach based on the Kalman filter. We have successfully applied our probabilistic tracking approach to synthetic as well as real microscopy image sequences of virus particles and quantified the performance. We found that the proposed approach outperforms previous approaches.
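The Kalman-filter backbone of such a tracker can be sketched as follows for a single particle with a constant-velocity motion model; the detection, multi-scale, and multi-frame association machinery described above is omitted, and the noise covariances and measurements below are illustrative assumptions.

```python
import numpy as np

# Constant-velocity Kalman filter for one 2-D particle: state is (x, y, vx, vy);
# measurements are noisy (x, y) detections.
dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)        # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)        # measurement model
Q = 0.01 * np.eye(4)                             # process noise covariance
R = 0.25 * np.eye(2)                             # measurement noise covariance

def kalman_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the detection z (the association step would supply z)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

x = np.array([0.0, 0.0, 1.0, 0.5])               # initial state guess
P = np.eye(4)
for z in np.array([[1.1, 0.4], [2.0, 1.1], [2.9, 1.6], [4.2, 2.0]]):
    x, P = kalman_step(x, P, z)
    print("estimated position:", np.round(x[:2], 2))
```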
Probabilistic exposure assessment to face and oral care cosmetic products by the French population.
Bernard, A; Dornic, N; Roudot, Ac; Ficheux, As
2018-01-01
Cosmetic exposure data for face and mouth are limited in Europe. The aim of the study was to assess the exposure to face cosmetics using recent French consumption data (Ficheux et al., 2016b, 2015). Exposure was assessed using a probabilistic method for thirty-one face products from four product lines (cleanser, care, make-up and make-up remover products) and two oral care products. Probabilistic exposure was assessed for different subpopulations according to sex and age in adults and children. Pregnant women were also studied. The levels of exposure to moisturizing cream, lip balm, mascara, eyeliner, cream foundation, toothpaste and mouthwash were higher than the values currently used by the Scientific Committee on Consumer Safety (SCCS). Exposure values found for eye shadow, lipstick, lotion and milk (make-up remover) were lower than SCCS values. These new French exposure values will be useful for safety assessors and for safety agencies in order to protect the general population and the at-risk populations. Copyright © 2017. Published by Elsevier Ltd.
Role of the He I and He II metastables in the resonance 2p ²P°1/2,3/2 B III level population
NASA Astrophysics Data System (ADS)
Djeniže, S.; Srećković, A.; Bukvić, S.
2007-01-01
Aims: The aim of this work is to present the atomic processes which lead to an extra population of the 2p ²P°1/2,3/2 B III resonance levels in helium plasma, generating intense radiation in the B III 206.578 nm and 206.723 nm lines. Methods: The line profiles were recorded using a step-by-step (7.3 pm) technique which provides monitoring of the line shapes continually during the plasma decay and makes it possible to compare line shapes at various times in the same plasma. Results: On the basis of the line intensity decays of the doubly ionized boron resonance spectral lines in laboratory nitrogen and helium plasmas, we have found the existence of a permanent energy transfer from the He I and He II metastables to the 2p ²P°1/2,3/2 B III resonance levels. The shapes of the mentioned lines are also observed. At electron temperatures of about 18 000 K and electron densities of about 1.1×10²³ m⁻³, Stark broadening was found to be the main B III line broadening mechanism. The measured Stark widths (W) are compared with the Doppler width (W_D) and with the splitting in the hyperfine structure (Δ_hfs). Our measured W data are found to be much higher than results obtained by means of various theoretical approaches. Conclusions: The He I and He II metastables overpopulate the B III resonance levels, leading to populations higher than predicted by the LTE model. Consequently, the emitted B III resonance lines are more intense than expected from the LTE model. This fact can be of importance if B III resonance line intensities are used for abundance determination purposes in astrophysics. Similar behavior can be expected for some lines emitted by astrophysically interesting emitters: Al III, Si III, Sc III, Cr III, V III, Ti III, Fe III, Co III, Ni III, Ga III, Zr III, Y III, Nb III, In III, Sn III, Sb III, Au III, Pb III and Bi III in hot and dense helium plasmas.
Agreement With Conjoined NPs Reflects Language Experience
Lorimor, Heidi; Adams, Nora C.; Middleton, Erica L.
2018-01-01
An important question within psycholinguistic research is whether grammatical features, such as number values on nouns, are probabilistic or discrete. Similarly, researchers have debated whether grammatical specifications are only set for individual lexical items, or whether certain types of noun phrases (NPs) also obtain number valuations at the phrasal level. Through a corpus analysis and an oral production task, we show that conjoined NPs can take both singular and plural verb agreement and that notional number (i.e., the numerosity of the referent of the subject noun phrase) plays an important role in agreement with conjoined NPs. In two written production tasks, we show that participants who are exposed to plural (versus singular or unmarked) agreement with conjoined NPs in a biasing story are more likely to produce plural agreement with conjoined NPs on a subsequent production task. This suggests that, in addition to their sensitivity to notional information, conjoined NPs have probabilistic grammatical specifications that reflect their distributional properties in language. These results provide important evidence that grammatical number reflects language experience, and that this language experience impacts agreement at the phrasal level, and not just the lexical level. PMID:29725311
Quantification of Campylobacter jejuni contamination on chicken carcasses in France.
Duqué, Benjamin; Daviaud, Samuel; Guillou, Sandrine; Haddad, Nabila; Membré, Jeanne-Marie
2018-04-01
Highly prevalent in poultry, Campylobacter is a foodborne pathogen which remains the primary cause of enteritis in humans. Several studies have determined the prevalence and contamination level of this pathogen throughout the food chain. However, this is generally done in a deterministic way, without considering the heterogeneity of the contamination level. The purpose of this study was to quantify, using probabilistic tools, the contamination level of Campylobacter spp. on chicken carcasses after the air-chilling step in several slaughterhouses in France. From a dataset (530 samples) containing censored data (concentration < 10 CFU/g), several factors were considered, including the month of sampling, the farming method (standard vs certified) and the sampling area (neck vs leg). All probabilistic analyses were performed in R using the fitdistrplus, mc2d and NADA packages. The uncertainty (i.e. error) generated by the presence of censored data was small (ca 1 log10) in comparison to the variability (i.e. heterogeneity) of the contamination level (3 log10 or more), strengthening the probabilistic analysis and facilitating result interpretation. The sampling period and sampling area (neck/leg) had a significant effect on the Campylobacter contamination level. More precisely, two "seasons" were distinguished: one from January to May, another one from June to December. During the June-to-December season, the mean Campylobacter concentration was estimated at 2.6 [2.4; 2.8] log10 CFU/g and 1.8 [1.5; 2.0] log10 CFU/g for neck and leg, respectively. The probability of exceeding 1000 CFU/g (the upper limit of the European microbial criterion) was estimated at 35.3% and 12.6% for neck and leg, respectively. In contrast, during the January-to-May season, the mean contamination level was estimated at 1.0 [0.6; 1.3] log10 CFU/g and 0.6 [0.3; 0.9] log10 CFU/g for neck and leg, respectively. The probability of exceeding 1000 CFU/g was estimated at 13.5% and 2.0% for neck and leg, respectively. An accurate quantification of the contamination level enables industry to better adapt processing and hygiene practices. These results will also help in refining exposure assessment models. Copyright © 2017 Elsevier Ltd. All rights reserved.
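A minimal sketch of fitting a lognormal concentration distribution to left-censored data (values reported only as below the limit of quantification) is given below. It uses synthetic data and a direct likelihood maximization in Python as a stand-in for the R workflow described above; the LOQ, sample size and parameter values are assumptions.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(4)
# Synthetic "true" concentrations (CFU/g); in practice only censored data exist.
true = rng.lognormal(mean=np.log(50.0), sigma=1.5, size=300)
loq = 10.0
observed = np.maximum(true, loq)          # censored values are reported at the LOQ
censored = true < loq                     # indicator: measurement below the LOQ

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    # Detected values contribute the density, censored values the CDF at the LOQ.
    ll_detect = stats.norm.logpdf(np.log(observed[~censored]), mu, sigma).sum()
    ll_censor = stats.norm.logcdf(np.log(loq), mu, sigma) * censored.sum()
    return -(ll_detect + ll_censor)

res = optimize.minimize(neg_log_lik, x0=[np.log(30.0), 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"mean ≈ {mu_hat / np.log(10):.2f} log10(CFU/g), "
      f"sd ≈ {sigma_hat / np.log(10):.2f} log10(CFU/g)")
print(f"P(concentration > 1000 CFU/g) ≈ "
      f"{1 - stats.norm.cdf(np.log(1000.0), mu_hat, sigma_hat):.1%}")
```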
Different Types of Sensation Seeking: A Person-Oriented Approach in Sensation-Seeking Research
ERIC Educational Resources Information Center
Suranyi, Zsuzsanna; Hitchcock, David B.; Hittner, James B.; Vargha, Andras; Urban, Robert
2013-01-01
Previous research on sensation seeking (SS) was dominated by a variable-oriented approach indicating that SS level has a linear relation with a host of problem behaviors. Our aim was to provide a person-oriented methodology--a probabilistic clustering--that enables examination of both inter- and intra-individual differences in not only the level,…
Himuro, Nobuaki; Mishima, Reiko; Seshimo, Takashi; Morishima, Toshibumi; Kosaki, Keisuke; Ibe, Shigeharu; Asagai, Yoshimi; Minematsu, Koji; Kurita, Kazuhiro; Okayasu, Tsutomu; Shimura, Tsukasa; Hoshino, Kotaro; Suzuki, Toshiro; Yanagizono, Taiichiro
2018-04-07
The prognosis for mobility function by Gross Motor Function Classification System (GMFCS) level is vital as a guide to rehabilitation for people with cerebral palsy. This study sought to investigate change in mobility function and its causes in adults with cerebral palsy by GMFCS level. We conducted a cross-sectional questionnaire study. A total of 386 participants (26 y 8 m, SD 5 y 10 m) with cerebral palsy were analyzed. Participant numbers by GMFCS level were: I (53), II (139), III (74) and IV (120). The median age of participants with peak mobility function in GMFCS level III was younger than that in the other levels. 48% had experienced a decline in mobility. A Kaplan-Meier plot showed the risk of mobility decline increased in GMFCS level III; the hazard ratio was 1.97 (95% CI, 1.20-3.23) compared with level I. The frequently reported causes of mobility decline were changes in environment, and illness and injury in GMFCS level III, stiffness and deformity in level IV, and reduced physical activity in level II and III. Peak mobility function and mobility decline occurred at a younger age in GMFCS level III, with the cause of mobility decline differing by GMFCS level.
NASA Astrophysics Data System (ADS)
Niestegge, Gerd
2010-12-01
In the quantum mechanical Hilbert space formalism, the probabilistic interpretation is a later ad-hoc add-on, more or less enforced by the experimental evidence, but not motivated by the mathematical model itself. A model involving a clear probabilistic interpretation from the very beginning is provided by the quantum logics with unique conditional probabilities. It includes the projection lattices in von Neumann algebras and here probability conditionalization becomes identical with the state transition of the Lüders-von Neumann measurement process. This motivates the definition of a hierarchy of five compatibility and comeasurability levels in the abstract setting of the quantum logics with unique conditional probabilities. Their meanings are: the absence of quantum interference or influence, the existence of a joint distribution, simultaneous measurability, and the independence of the final state after two successive measurements from the sequential order of these two measurements. A further level means that two elements of the quantum logic (events) belong to the same Boolean subalgebra. In the general case, the five compatibility and comeasurability levels appear to differ, but they all coincide in the common Hilbert space formalism of quantum mechanics, in von Neumann algebras, and in some other cases.
Subseasonal to Seasonal Predictions of U.S. West Coast High Water Levels
NASA Astrophysics Data System (ADS)
Khouakhi, A.; Villarini, G.; Zhang, W.; Slater, L. J.
2017-12-01
Extreme sea levels pose a significant threat to coastal communities, ecosystems, and assets, as they are conducive to coastal flooding, coastal erosion and inland salt-water intrusion. As sea levels continue to rise, these sea level extremes - including occasional minor coastal flooding experienced during high tide (nuisance floods) - are of concern. Extreme sea levels are increasing at many locations around the globe and have been attributed largely to rising mean sea levels associated with intra-seasonal to interannual climate processes such as the El Niño-Southern Oscillation (ENSO). Here, intra-seasonal to seasonal probabilistic forecasts of high water levels are computed at the Toke Point tide gage station on the US west coast. We first identify the main climate drivers that are responsible for high water levels and examine their predictability using General Circulation Models (GCMs) from the North American Multi-Model Ensemble (NMME). These drivers are then used to develop a probabilistic framework for the seasonal forecasting of high water levels. We focus on the climate controls on the frequency of high water levels using the number of exceedances above the 99.5th percentile and above the nuisance flood level established by the National Weather Service. Our findings indicate good forecast skill at the shortest lead time, with skill decreasing as lead time increases. In general, these models aptly capture the year-to-year variability in the observational records.
Is probabilistic bias analysis approximately Bayesian?
MacLehose, Richard F.; Gustafson, Paul
2011-01-01
Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
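A bare-bones version of the iterative sampling described above is sketched below: sensitivity and specificity are drawn from assumed beta distributions, the observed cell counts of a hypothetical case-control study are back-corrected, and the adjusted odds ratio is summarized over the draws. Non-differential misclassification is assumed for brevity, and all counts and distributions are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
n_iter = 50_000

# Observed (misclassified) 2x2 counts from a hypothetical case-control study:
#                exposed  unexposed
a_obs, b_obs = 215, 285      # cases
c_obs, d_obs = 150, 350      # controls

# Bias parameters drawn from distributions rather than fixed values; the same
# sensitivity/specificity is applied to cases and controls here (differential
# misclassification would use separate draws for each group).
se = rng.beta(80, 20, n_iter)        # sensitivity centered near 0.80
sp = rng.beta(95, 5, n_iter)         # specificity centered near 0.95

def correct_counts(x1, x0, se, sp):
    n = x1 + x0
    # Back-calculate the "true" exposed count from the observed one.
    true_exposed = (x1 - n * (1 - sp)) / (se + sp - 1)
    return true_exposed, n - true_exposed

A, B = correct_counts(a_obs, b_obs, se, sp)
C, D = correct_counts(c_obs, d_obs, se, sp)
valid = (A > 0) & (B > 0) & (C > 0) & (D > 0)    # discard impossible corrections
or_adj = (A * D) / (B * C)

print("conventional OR:", round((a_obs * d_obs) / (b_obs * c_obs), 2))
print("bias-adjusted OR, median [2.5th, 97.5th]:",
      np.round(np.percentile(or_adj[valid], [50, 2.5, 97.5]), 2))
```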
Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects
NASA Technical Reports Server (NTRS)
Nagpal, V. K.
1985-01-01
A probabilistic study was initiated to evaluate the effects of tolerances in the geometric and material properties on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are believed to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.
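The flat-plate study mentioned above can be caricatured with a quick Monte Carlo sketch in which a closed-form cantilever-plate frequency stands in for the finite element response; the nominal values and coefficients of variation are assumptions, and the correlation-based sensitivity ranking is only a crude proxy for the formal quantification planned in the study.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20_000

# Nominal (mean) values for a simple cantilevered flat-plate stand-in; the
# actual study used a finite element model of the blade geometry.
E0, rho0, t0, L0 = 200e9, 8200.0, 3.0e-3, 0.08   # Pa, kg/m^3, m, m

# Random perturbations about the means (coefficients of variation are assumed).
E = rng.normal(E0, 0.05 * E0, n)
rho = rng.normal(rho0, 0.03 * rho0, n)
t = rng.normal(t0, 0.02 * t0, n)
L = rng.normal(L0, 0.01 * L0, n)

# First bending natural frequency of a cantilever of thickness t:
# f ≈ 0.56 * (t / L^2) * sqrt(E / (12 * rho)).
freq = 0.56 * (t / L**2) * np.sqrt(E / (12.0 * rho))

print(f"mean frequency: {freq.mean():.0f} Hz, CoV: {freq.std() / freq.mean():.1%}")
# Crude sensitivity ranking: correlation of each input with the response.
for name, var in [("E", E), ("rho", rho), ("t", t), ("L", L)]:
    print(f"corr(freq, {name}) = {np.corrcoef(var, freq)[0, 1]:+.2f}")
```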
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eckert-Gallup, Aubrey Celia; Lewis, John R.; Brooks, Dusty Marie
This report describes the methods, results, and conclusions of the analysis of 11 scenarios defined to exercise various options available in the xLPR (Extremely Low Probability of Rupture) Version 2.0 code. The scope of the scenario analysis is three-fold: (i) exercise the various options and components comprising xLPR v2.0 and defining each scenario; (ii) develop and exercise methods for analyzing and interpreting xLPR v2.0 outputs; and (iii) exercise the various sampling options available in xLPR v2.0. The simulation workflow template developed during the course of this effort helps to form a basis for the application of the xLPR code to problems with similar inputs and probabilistic requirements, and addresses in a systematic manner the three points covered by the scope.
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
Antithrombin III in animal models of sepsis and organ failure.
Dickneite, G
1998-01-01
Antithrombin III (AT III) is the physiological inhibitor of thrombin and other serine proteases of the clotting cascade. In the development of sepsis, septic shock and organ failure, the plasma levels of AT III decrease considerably, suggesting the concept of a substitution therapy with the inhibitor. A decrease of AT III plasma levels might also be associated with other pathological disorders like trauma, burns, pancreatitis or preeclampsia. Activation of coagulation and consumption of AT III is the consequence of a generalized inflammation called SIRS (systemic inflammatory response syndrome). The clotting cascade is also frequently activated after organ transplantation, especially if organs are grafted between different species (xenotransplantation). During the past years AT III has been investigated in numerous corresponding disease models in different animal species, which are reviewed here. The bulk of evidence suggests that AT III substitution reduces morbidity and mortality in the diseased animals. As more experience has been gained with AT III, the concept of substitution therapy to maximal baseline plasma levels (100%) appears to be insufficient. Evidence from clinical and preclinical studies now suggests adjusting the AT III plasma levels to about 200%, i.e., doubling the normal value. During the last few years several authors have proposed that AT III is not only an anti-thrombotic agent but also has an anti-inflammatory effect.
Frontal and Parietal Contributions to Probabilistic Association Learning
Rushby, Jacqueline A.; Vercammen, Ans; Loo, Colleen; Short, Brooke
2011-01-01
Neuroimaging studies have shown both dorsolateral prefrontal (DLPFC) and inferior parietal cortex (iPARC) activation during probabilistic association learning. Whether these cortical brain regions are necessary for probabilistic association learning is presently unknown. Participants' ability to acquire probabilistic associations was assessed during disruptive 1 Hz repetitive transcranial magnetic stimulation (rTMS) of the left DLPFC, left iPARC, and sham using a crossover single-blind design. On subsequent sessions, performance improved relative to baseline except during DLPFC rTMS that disrupted the early acquisition beneficial effect of prior exposure. A second experiment examining rTMS effects on task-naive participants showed that neither DLPFC rTMS nor sham influenced naive acquisition of probabilistic associations. A third experiment examining consecutive administration of the probabilistic association learning test revealed early trial interference from previous exposure to different probability schedules. These experiments, showing disrupted acquisition of probabilistic associations by rTMS only during subsequent sessions with an intervening night's sleep, suggest that the DLPFC may facilitate early access to learned strategies or prior task-related memories via consolidation. Although neuroimaging studies implicate DLPFC and iPARC in probabilistic association learning, the present findings suggest that early acquisition of the probabilistic cue-outcome associations in task-naive participants is not dependent on either region. PMID:21216842
Probabilistic Ontology Architecture for a Terrorist Identification Decision Support System
2014-06-01
…in real-world problems requires probabilistic ontologies, which integrate the inferential reasoning power of probabilistic representations with the first-order expressivity of ontologies. The Reference Architecture for… Index terms: probabilistic ontology, terrorism, inferential reasoning, architecture.
Probabilistic Evaluation of Blade Impact Damage
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Abumeri, G. H.
2003-01-01
The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation focuses on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are presented in terms of cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.
Probabilistic Simulation of Stress Concentration in Composite Laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.
1994-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
Straub, Jürg Oliver
2002-05-10
UV filters in sunscreens and cosmetics protect the skin from damage through UV radiation. Many tonnes per year of UV filters are being used in Europe and will be present, at least seasonally, in detectable concentrations in surface waters similar to common pharmaceutically active substances. Predicted environmental concentrations (PECs) of ethylhexyl methoxycinnamate (EHMC; CAS 5466-77-3) were extrapolated for Switzerland, taking into consideration substance-specific environmental fate data and marketing estimates, by crude worst-case reckoning and by applying two environmental models (Mackay Level III; USES 3.0), both configured for Swiss hydrological and area data. By worst-case reckoning the summer PEC is 70.8-81.3 ng/l while for the remaining 8 months of the year the PEC is 13.1-15.1 ng/l. The Level III model results in concentrations of 2.4 ng/l during the summer and 0.44 ng/l during the rest of the year, while the USES 3.0 model gives an average PEC for the whole year of 7.6 ng/l. Pooling summer monitoring data (90 single analyses) from the River Rhine below Basel in the year 1997 (Water Protection Board of Basel) and from Lakes Zurich and Hüttner in 1998 (Poiger et al., in preparation) allowed a derivation of a probabilistic median concentration of 4.6 ng/l, a 95th-percentile concentration of 18.6 ng/l and a 99th-percentile concentration of 33.5 ng/l. The 6-fold range from the median value to the maximum calls for caution in interpreting published monitoring concentrations. Comparison of modelled PECs with realistic median concentrations shows that crude reckoning overestimates actual concentrations by a factor of about 10, probably through insufficient consideration of (further) degradation of EHMC in sewage works, surface waters, sediments or river banks. Both computer models, in contrast, are within the same order of magnitude as the actual summer concentrations. Based on the available data, both these environmental fate and distribution models give realistic PECs.
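The crude worst-case reckoning referred to above reduces to dividing an estimated load to surface water by the available dilution. A minimal Python sketch of that arithmetic is given below; all input values are hypothetical placeholders rather than the Swiss marketing and hydrological data used in the study.

# Minimal sketch of a crude worst-case PEC (predicted environmental
# concentration) reckoning for a UV filter. All numbers are hypothetical
# placeholders, not the Swiss data of the study.

def worst_case_pec(tonnes_per_season: float,
                   season_days: float,
                   fraction_to_water: float,
                   dilution_m3_per_day: float) -> float:
    """Return a worst-case PEC in ng/L."""
    grams_per_day = tonnes_per_season * 1e6 / season_days   # t -> g per day
    grams_to_water = grams_per_day * fraction_to_water      # fraction reaching water
    ng_per_day = grams_to_water * 1e9                        # g -> ng
    litres_per_day = dilution_m3_per_day * 1e3               # m3 -> L
    return ng_per_day / litres_per_day

if __name__ == "__main__":
    # Hypothetical: 10 t used over a 120-day summer, 5% reaching water,
    # diluted into 4e7 m3/day of receiving water.
    print(f"summer PEC ~ {worst_case_pec(10, 120, 0.05, 4e7):.1f} ng/L")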
NASA Astrophysics Data System (ADS)
Gromek, Katherine Emily
A novel computational and inference framework of the physics-of-failure (PoF) reliability modeling for complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real time simulation of system failure processes, so that the system level reliability modeling would constitute inferences from checking the status of component level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of the PoF-based system reliability modeling, new approaches to the learning and the autonomy properties of the intelligent agents, and modeling interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as an agent's ability to self-activate, deactivate or completely redefine its role in the analysis. This property of agents and the ability to model interacting failure mechanisms of the system elements make the agent autonomy approach fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.
Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof
2009-04-01
Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
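As a rough illustration of the approach (not the fault tree of the Swedish case study), the sketch below propagates uncertain event probabilities through a two-branch fault tree, quantity failure OR quality failure, by Monte Carlo simulation and reports an approximate Customer Minutes Lost figure; all distributions, downtimes, and customer counts are invented.

# Toy Monte Carlo fault tree: system failure = quantity failure OR quality
# failure. Event probabilities and downtimes are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Uncertain annual failure probabilities (Beta distributions as one way of
# combining hard data with expert judgement).
p_source  = rng.beta(2, 50, N)    # raw water source unavailable
p_treat   = rng.beta(2, 80, N)    # treatment plant outage
p_quality = rng.beta(1, 100, N)   # delivered water off-specification

# OR-gates: failure if any contributing event occurs.
p_quantity = 1 - (1 - p_source) * (1 - p_treat)
p_system   = 1 - (1 - p_quantity) * (1 - p_quality)

# Customer Minutes Lost: failure probability x downtime (minutes) x customers.
downtime_min = rng.lognormal(mean=np.log(300), sigma=0.5, size=N)
customers = 500_000
cml = p_system * downtime_min * customers

print(f"median system failure prob: {np.median(p_system):.4f}")
print(f"CML 5th/50th/95th percentile: {np.percentile(cml, [5, 50, 95]).round(0)}")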
Hong, J H; Kwon, H G; Jang, S H
2011-08-01
The STP has been regarded as the most plausible neural tract responsible for pathogenesis of central poststroke pain. The VPL nucleus has been a target for neurosurgical procedures for control of central poststroke pain. However, to our knowledge, no DTI studies have been conducted to investigate the somatotopic location of the STP at the VPL nucleus of the thalamus. In the current study, we attempted to investigate this location in the human brain by using a probabilistic tractography technique of DTI. DTI was performed at 1.5T by using a Synergy-L SENSE head coil. STPs for both the hand and leg were obtained by selection of fibers passing through 2 regions of interest (the area of the spinothalamic tract in the posterolateral medulla and the postcentral gyrus) for 41 healthy volunteers. Somatotopic mapping was obtained from the highest probabilistic location at the ACPC level. The highest probabilistic locations for the hand and leg were an average of 16.86 and 16.37 mm lateral to the ACPC line and 7.53 and 8.71 mm posterior to the midpoint of the ACPC line, respectively. Somatotopic locations for the hand and leg were different in the anteroposterior direction (P < .05); however, no difference was observed in the mediolateral direction (P > .05). We found the somatotopic locations for hand and leg of the STP at the VPL nucleus; these somatotopies were arranged in the anteroposterior direction.
PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rucker, D.F.
2000-09-01
This report presents a probabilistic safety assessment of radioactive doses as consequences from accident scenarios to complement the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Commission on Radiological Protection (ICRP) recommends both assessments be conducted to ensure that "an adequate level of safety has been achieved and that no major contributors to risk are overlooked" (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g. the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during the waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The WIPP Safety Analysis Report (SAR) calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDF) and were sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose. Statistical information was then derived from the 10,000-iteration batch, including the 5%, 50%, and 95% dose likelihoods and the sensitivity of each assumption to the calculated doses. As one would intuitively expect, the doses from the probabilistic assessment for most scenarios were found to be much less than the deterministic assessment. The lower dose of the probabilistic assessment can be attributed to a "smearing" of values from the high and low end of the PDF spectrum of the various input parameters. The analysis also found a potential weakness in the deterministic analysis used in the SAR: a detail on drum loading was not taken into consideration. Waste emplacement operations thus far have handled drums from each shipment as a single unit, i.e. drums from each shipment are kept together. Shipments typically come from a single waste stream, and therefore the curie loading of each drum can be considered nearly identical to that of its neighbor. Calculations show that if there are large numbers of drums used in the accident scenario assessment, e.g. 28 drums in the waste hoist failure scenario (CH5), then the probabilistic dose assessment calculations will diverge from the deterministically determined doses. As it is currently calculated, the deterministic dose assessment assumes one drum loaded to the maximum allowable (80 PE-Ci), and the remaining are 10% of the maximum. The effective average of drum curie content is therefore less in the deterministic assessment than in the probabilistic assessment for a large number of drums.
EEG recommends that the WIPP SAR calculations be revisited and updated to include a probabilistic safety assessment.
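The probabilistic procedure described above, replacing single-point parameters with probability density functions, sampling 10,000 times, and reporting the 5th, 50th, and 95th percentiles, can be illustrated generically; the dose model and every distribution in the sketch below are hypothetical stand-ins, not the WIPP accident-scenario equations.

# Generic sketch of a probabilistic dose assessment: replace point-value
# inputs with PDFs, sample, and summarize percentiles. The dose model and
# all distributions are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of Monte Carlo iterations, as in the report

# Hypothetical uncertain inputs
source_term_ci = rng.triangular(1.0, 8.0, 80.0, N)    # curies at risk
release_frac   = rng.uniform(1e-4, 1e-3, N)           # airborne release fraction
dilution       = rng.lognormal(np.log(1e-4), 0.7, N)  # atmospheric dispersion (s/m3)
dose_factor    = 5.0e4                                 # mrem per Ci inhaled (placeholder)
breathing_rate = 3.3e-4                                # m3/s

# Simple multiplicative dose model (placeholder for the SAR equations)
cede_mrem = source_term_ci * release_frac * dilution * breathing_rate * dose_factor

p5, p50, p95 = np.percentile(cede_mrem, [5, 50, 95])
print(f"CEDE percentiles (mrem): 5% {p5:.2e}, 50% {p50:.2e}, 95% {p95:.2e}")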
Graham, Mark J; Lee, Richard G; Bell, Thomas A; Fu, Wuxia; Mullick, Adam E; Alexander, Veronica J; Singleton, Walter; Viney, Nick; Geary, Richard; Su, John; Baker, Brenda F; Burkey, Jennifer; Crooke, Stanley T; Crooke, Rosanne M
2013-05-24
Elevated plasma triglyceride levels have been recognized as a risk factor for the development of coronary heart disease. Apolipoprotein C-III (apoC-III) represents both an independent risk factor and a key regulatory factor of plasma triglyceride concentrations. Furthermore, elevated apoC-III levels have been associated with metabolic syndrome and type 2 diabetes mellitus. To date, no selective apoC-III therapeutic agent has been evaluated in the clinic. To test the hypothesis that selective inhibition of apoC-III with antisense drugs in preclinical models and in healthy volunteers would reduce plasma apoC-III and triglyceride levels. Rodent- and human-specific second-generation antisense oligonucleotides were identified and evaluated in preclinical models, including rats, mice, human apoC-III transgenic mice, and nonhuman primates. We demonstrated the selective reduction of both apoC-III and triglyceride in all preclinical pharmacological evaluations. We also showed that inhibition of apoC-III was well tolerated and not associated with increased liver triglyceride deposition or hepatotoxicity. A double-blind, placebo-controlled, phase I clinical study was performed in healthy subjects. Administration of the human apoC-III antisense drug resulted in dose-dependent reductions in plasma apoC-III, concomitant lowering of triglyceride levels, and produced no clinically meaningful signals in the safety evaluations. Antisense inhibition of apoC-III in preclinical models and in a phase I clinical trial with healthy subjects produced potent, selective reductions in plasma apoC-III and triglyceride, 2 known risk factors for cardiovascular disease. This compelling pharmacological profile supports further clinical investigations in hypertriglyceridemic subjects.
Probabilistic objective functions for sensor management
NASA Astrophysics Data System (ADS)
Mahler, Ronald P. S.; Zajic, Tim R.
2004-08-01
This paper continues the investigation of a foundational and yet potentially practical basis for control-theoretic sensor management, using a comprehensive, intuitive, system-level Bayesian paradigm based on finite-set statistics (FISST). In this paper we report our most recent progress, focusing on multistep look-ahead -- i.e., allocation of sensor resources throughout an entire future time-window. We determine future sensor states in the time-window using a "probabilistically natural" sensor management objective function, the posterior expected number of targets (PENT). This objective function is constructed using a new "maxi-PIMS" optimization strategy that hedges against unknowable future observation-collections. PENT is used in conjunction with approximate multitarget filters: the probability hypothesis density (PHD) filter or the multi-hypothesis correlator (MHC) filter.
A multilevel probabilistic beam search algorithm for the shortest common supersequence problem.
Gallardo, José E
2012-01-01
The shortest common supersequence problem is a classical problem with many applications in different fields such as planning, Artificial Intelligence and especially Bioinformatics. Due to its NP-hardness, we cannot expect to efficiently solve this problem using conventional exact techniques. This paper presents a heuristic to tackle this problem based on the use, at different levels, of a probabilistic variant of a classical heuristic known as Beam Search. The proposed algorithm is empirically analysed and compared to current approaches in the literature. Experiments show that it provides better quality solutions in a reasonable time for medium and large instances of the problem. For very large instances, our heuristic also provides better solutions, but the required execution times may increase considerably.
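A minimal sketch of a probabilistic Beam Search for the shortest common supersequence is shown below; the scoring and sampling rules are illustrative choices, not the multilevel algorithm of the paper.

# Sketch of a probabilistic Beam Search heuristic for the Shortest Common
# Supersequence (SCS) problem. States sample their way into the next beam
# with probability proportional to how many strings the appended symbol
# advances; this is an illustration only.
import random

def probabilistic_beam_scs(strings, beam_width=10, seed=0):
    rng = random.Random(seed)
    # A state is (supersequence_so_far, positions_into_each_string).
    beam = [("", tuple(0 for _ in strings))]
    while True:
        finished = [s for s, pos in beam
                    if all(p == len(t) for p, t in zip(pos, strings))]
        if finished:
            return min(finished, key=len)
        candidates = []
        for seq, pos in beam:
            # Symbols that appear next in at least one unfinished string.
            next_syms = {t[p] for t, p in zip(strings, pos) if p < len(t)}
            for c in next_syms:
                new_pos = tuple(p + 1 if p < len(t) and t[p] == c else p
                                for t, p in zip(strings, pos))
                advanced = sum(np - op for np, op in zip(new_pos, pos))
                candidates.append((seq + c, new_pos, advanced))
        weights = [max(adv, 1) for _, _, adv in candidates]
        k = min(beam_width, len(candidates))
        chosen = set()
        while len(chosen) < k:
            i = rng.choices(range(len(candidates)), weights=weights, k=1)[0]
            chosen.add(i)
        beam = [(candidates[i][0], candidates[i][1]) for i in chosen]

if __name__ == "__main__":
    seqs = ["AGGTAB", "GXTXAYB", "GTAB"]
    scs = probabilistic_beam_scs(seqs)
    print(len(scs), scs)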
Symmetric nonnegative matrix factorization: algorithms and applications to probabilistic clustering.
He, Zhaoshui; Xie, Shengli; Zdunek, Rafal; Zhou, Guoxu; Cichocki, Andrzej
2011-12-01
Nonnegative matrix factorization (NMF) is an unsupervised learning method useful in various applications including image processing and semantic analysis of documents. This paper focuses on symmetric NMF (SNMF), which is a special case of NMF decomposition. Three parallel multiplicative update algorithms using level 3 basic linear algebra subprograms directly are developed for this problem. First, by minimizing the Euclidean distance, a multiplicative update algorithm is proposed, and its convergence under mild conditions is proved. Based on it, we further propose another two fast parallel methods: α-SNMF and β-SNMF algorithms. All of them are easy to implement. These algorithms are applied to probabilistic clustering. We demonstrate their effectiveness for facial image clustering, document categorization, and pattern clustering in gene expression.
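The Euclidean-distance multiplicative update for symmetric NMF can be sketched as follows; the relaxed update with a fixed β = 0.5 shown here is a common stabilization and is not a reproduction of the paper's parallel α-SNMF/β-SNMF implementations.

# Minimal multiplicative-update sketch for symmetric NMF (SNMF):
# find nonnegative H minimizing ||A - H H^T||_F^2 for a symmetric
# nonnegative A. Illustration only.
import numpy as np

def snmf(A, rank, n_iter=500, beta=0.5, eps=1e-9, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    H = rng.random((n, rank))
    for _ in range(n_iter):
        AH = A @ H
        HHtH = H @ (H.T @ H)
        # Relaxed multiplicative update; nonnegativity is preserved.
        H *= (1.0 - beta) + beta * AH / np.maximum(HHtH, eps)
    return H

if __name__ == "__main__":
    # Toy similarity matrix with two clear blocks; cluster by largest entry per row.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(3, 0.1, (5, 2))])
    A = np.exp(-np.square(X[:, None, :] - X[None, :, :]).sum(-1))
    H = snmf(A, rank=2)
    print("cluster labels:", H.argmax(axis=1))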
Xiao, Feng; Simcik, Matt F; Halbach, Thomas R; Gulliver, John S
2015-04-01
Perfluorooctane sulfonate (PFOS) and perfluorooctanoate (PFOA) are emerging anthropogenic compounds that have recently become the target of global concern due to their ubiquitous presence in the environment, persistence, and bioaccumulative properties. This study was carried out to investigate the migration of PFOS and PFOA in soils and groundwater in a U.S. metropolitan area. We observed elevated levels in surface soils (median: 12.2 ng PFOS/g dw and 8.0 ng PFOA/g dw), which were much higher than the soil-screening levels for groundwater protection developed in this study. The measured levels in subsurface soils show a general increase with depth, suggesting a downward movement toward the groundwater table and a potential risk of aquifer contamination. Furthermore, concentrations of PFOS and PFOA in monitoring wells in the source zone varied insignificantly over 5 years (2009-2013), suggesting limited or no change in either the source or the magnitude of the source. The analysis also shows that natural processes of dispersion and dilution can significantly attenuate the groundwater contamination; the adsorption on aquifer solids, on the other hand, appears to have limited effects on the transport of PFOS and PFOA in the aquifer. The probabilistic exposure assessment indicates that ingestion of contaminated groundwater constitutes a much more important exposure route than ingestion of contaminated soil. Overall, the results suggest that (i) the transport of PFOS and PFOA is retarded in the vadose zone, but not in the aquifer; (ii) the groundwater contamination of PFOS and PFOA often follows their release to surface soils by years, if not decades; and (iii) the aquifer can be a major source of exposure for communities living near point sources. Copyright © 2014 Elsevier Ltd. All rights reserved.
Oral Assessment Kit, Levels II & III. Draft.
ERIC Educational Resources Information Center
Agrelo-Gonzalez, Maria; And Others
The assessment packet includes a series of oral tests to help develop speaking as an integral part of second language instruction at levels II and III. It contains: 8 mini-tests for use at level II; 9 mini-tests for use at level III; a rating scale and score sheet masters for evaluating performance on these tests; and a collection of suggested…
NASA Astrophysics Data System (ADS)
Fatimah, F.; Rosadi, D.; Hakim, R. B. F.
2018-03-01
In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.
Learning Probabilistic Logic Models from Probabilistic Examples
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2009-01-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348
Learning Probabilistic Logic Models from Probabilistic Examples.
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2008-10-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.
DOT National Transportation Integrated Search
2013-06-01
This report summarizes a research project aimed at developing degradation models for bridge decks in the state of Michigan based on durability mechanics. A probabilistic framework to implement local-level mechanistic-based models for predicting the c...
These novel modeling approaches for screening, evaluating and classifying chemicals based on the potential for biologically-relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. The new modeling approach is derived from the Stocha...
Probabilistic simulation of uncertainties in thermal structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael
1990-01-01
Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.
Low, Kah Hin; Zain, Sharifuddin Md; Abas, Mhd Radzi; Md Salleh, Kaharudin; Teo, Yin Yin
2015-06-15
The trace metal concentrations in edible muscle of red tilapia (Oreochromis spp.) sampled from a former tin mining pool, concrete tank and earthen pond in Jelebu were analysed with microwave assisted digestion-inductively coupled plasma-mass spectrometry. Results were compared with established legal limits and the daily ingestion exposures simulated using the Monte Carlo algorithm for potential health risks. Among the metals investigated, arsenic was found to be the key contaminant, which may have arisen from the use of formulated feeding pellets. Although the risks of toxicity associated with consumption of red tilapia from the sites investigated were found to be within the tolerable range, the preliminary probabilistic estimation of As cancer risk shows that the 95th percentile risk level surpassed the benchmark level of 10^-5. In general, the probabilistic health risks associated with ingestion of red tilapia can be ranked as follows: former tin mining pool > concrete tank > earthen pond. Copyright © 2015 Elsevier Ltd. All rights reserved.
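The Monte Carlo ingestion-risk estimation follows the standard pattern of sampling concentration, intake, and body-weight distributions and applying a cancer slope factor; the sketch below uses invented distribution parameters rather than the measured tilapia concentrations or local consumption data.

# Generic Monte Carlo sketch of a probabilistic cancer-risk estimate from
# fish ingestion. All distribution parameters are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

conc_mg_per_kg = rng.lognormal(np.log(0.05), 0.6, N)  # As in fish muscle
intake_kg_day  = rng.lognormal(np.log(0.06), 0.4, N)  # fish consumption rate
body_weight_kg = rng.normal(62, 10, N).clip(min=35)
slope_factor   = 1.5                                   # (mg/kg/day)^-1, placeholder

add  = conc_mg_per_kg * intake_kg_day / body_weight_kg  # average daily dose
risk = add * slope_factor

p50, p95 = np.percentile(risk, [50, 95])
print(f"median risk {p50:.1e}, 95th percentile {p95:.1e}, "
      f"benchmark 1e-5 exceeded at 95th pct: {p95 > 1e-5}")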
Report of the Workshop on Extreme Ground Motions at Yucca Mountain, August 23-25, 2004
Hanks, T.C.; Abrahamson, N.A.; Board, M.; Boore, D.M.; Brune, J.N.; Cornell, C.A.
2006-01-01
This Workshop has its origins in the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain, the designated site of the underground repository for the nation's high-level radioactive waste. In 1998 the Nuclear Regulatory Commission's Senior Seismic Hazard Analysis Committee (SSHAC) developed guidelines for PSHA which were published as NUREG/CR-6372, 'Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and the use of experts' (SSHAC, 1997). This Level-4 study was the most complicated and complex PSHA ever undertaken at the time. The procedures, methods, and results of this PSHA are described in Stepp et al. (2001), mostly in the context of a probability of exceedance (hazard) of 10^-4/yr for ground motion at Site A, a hypothetical, reference rock outcrop site at the elevation of the proposed emplacement drifts within the mountain. Analysis and inclusion of both aleatory and epistemic uncertainty were significant and time-consuming aspects of the study, which took place over three years and involved several dozen scientists, engineers, and analysts.
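The hazard values quoted above (e.g., an exceedance rate of 10^-4/yr) come from hazard curves that, in a standard PSHA formulation (a textbook form, not quoted from the Yucca Mountain study itself), aggregate over sources, magnitudes, and distances:

\lambda(A > a) \;=\; \sum_{i=1}^{N_{\mathrm{src}}} \nu_i \int_m \int_r P\!\left(A > a \mid m, r\right) f_{M_i}(m)\, f_{R_i}(r)\, \mathrm{d}r\, \mathrm{d}m

where \nu_i is the mean annual rate of earthquakes on source i, f_{M_i} and f_{R_i} are the magnitude and source-to-site distance densities, and P(A > a | m, r) is supplied by the ground-motion model. A hazard level of 10^-4/yr is then the annual rate at which acceleration a is exceeded, i.e., a mean return period of 10,000 years.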
NASA Astrophysics Data System (ADS)
Romanova, Vanya; Hense, Andreas; Wahl, Sabrina; Brune, Sebastian; Baehr, Johanna
2016-04-01
The decadal variability and its predictability of the surface net freshwater fluxes are compared in a set of retrospective predictions, all using the same model setup and differing only in the implemented ocean initialisation method and ensemble generation method. The basic aim is to deduce the differences between the initialization/ensemble generation methods in view of the uncertainty of the verifying observational data sets. The analysis will give an approximation of the uncertainties of the net freshwater fluxes, which up to now appear to be one of the most uncertain products in observational data and model outputs. All ensemble generation methods are implemented into the MPI-ESM earth system model in the framework of the ongoing MiKlip project (www.fona-miklip.de). Hindcast experiments are initialised annually between 2000-2004, and from each start year 10 ensemble members are initialized for 5 years each. Four different ensemble generation methods are compared: (i) a method based on the Anomaly Transform method (Romanova and Hense, 2015), in which the initial oceanic perturbations represent orthogonal and balanced anomaly structures in space and time and between the variables taken from a control run; (ii) one-day-lagged ocean states from the MPI-ESM-LR baseline system; (iii) one-day-lagged ocean and atmospheric states with preceding full-field nudging to re-analysis in both the atmospheric and the oceanic components of the system (the baseline MPI-ESM-LR system); (iv) an Ensemble Kalman Filter (EnKF) implemented into the oceanic part of MPI-ESM (Brune et al. 2015), assimilating monthly subsurface oceanic temperature and salinity (EN3) using the Parallel Data Assimilation Framework (PDAF). The hindcasts are evaluated probabilistically using freshwater flux data sets from four different reanalysis data sets: MERRA, NCEP-R1, GFDL ocean reanalysis and GECCO2. The assessments show no clear differences in the evaluation scores on regional scales. However, on the global scale the physically motivated methods (i) and (iv) provide probabilistic hindcasts with a consistently higher reliability than the lagged initialization methods (ii)/(iii), despite the large uncertainties in the verifying observations and in the simulations.
Recent developments of the NESSUS probabilistic structural analysis computer program
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
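The reliability quantities such codes compute reduce, in the simplest setting, to the probability that a limit-state function g(X) falls below zero. A crude Monte Carlo estimate for a toy limit state is sketched below; it only illustrates the target quantity, since methods such as the advanced mean value and adaptive importance sampling mentioned above are far more efficient.

# Crude Monte Carlo estimate of a component failure probability
# P[g(X) <= 0] for a toy limit state g = capacity - demand. The
# distributions are hypothetical; NESSUS itself uses far more efficient
# probabilistic algorithms than plain Monte Carlo.
import numpy as np

rng = np.random.default_rng(3)
N = 1_000_000

capacity = rng.normal(100.0, 10.0, N)             # e.g. yield strength
demand   = rng.lognormal(np.log(60.0), 0.15, N)   # e.g. applied stress

failures = capacity - demand <= 0.0
pf = failures.mean()
se = np.sqrt(pf * (1 - pf) / N)                   # Monte Carlo standard error
print(f"P_f ~ {pf:.2e}  (MC standard error {se:.1e})")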
Uncertainty and the Social Cost of Methane Using Bayesian Constrained Climate Models
NASA Astrophysics Data System (ADS)
Errickson, F. C.; Anthoff, D.; Keller, K.
2016-12-01
Social cost estimates of greenhouse gases are important for the design of sound climate policies and are also plagued by uncertainty. One major source of uncertainty stems from the simplified representation of the climate system used in the integrated assessment models that provide these social cost estimates. We explore how uncertainty over the social cost of methane varies with the way physical processes and feedbacks in the methane cycle are modeled by (i) coupling three different methane models to a simple climate model, (ii) using MCMC to perform a Bayesian calibration of the three coupled climate models that simulates direct sampling from the joint posterior probability density function (pdf) of model parameters, and (iii) producing probabilistic climate projections that are then used to calculate the Social Cost of Methane (SCM) with the DICE and FUND integrated assessment models. We find that including a temperature feedback in the methane cycle acts as an additional constraint during the calibration process and results in a correlation between the tropospheric lifetime of methane and several climate model parameters. This correlation is not seen in the models lacking this feedback. Several of the estimated marginal pdfs of the model parameters also exhibit different distributional shapes and expected values depending on the methane model used. As a result, probabilistic projections of the climate system out to the year 2300 exhibit different levels of uncertainty and magnitudes of warming for each of the three models under an RCP8.5 scenario. We find these differences in climate projections result in differences in the distributions and expected values for our estimates of the SCM. We also examine uncertainty about the SCM by performing a Monte Carlo analysis using a distribution for the climate sensitivity while holding all other climate model parameters constant. Our SCM estimates using the Bayesian calibration are lower and exhibit less uncertainty about extremely high values in the right tail of the distribution compared to the Monte Carlo approach. This finding has important climate policy implications and suggests previous work that accounts for climate model uncertainty by only varying the climate sensitivity parameter may overestimate the SCM.
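A toy Metropolis-Hastings calibration of a single model parameter illustrates the kind of MCMC step described above; the one-parameter forcing-response model, synthetic observations, and prior are invented and bear no relation to the coupled methane/climate models of the study.

# Toy Metropolis-Hastings calibration of a single "sensitivity" parameter
# of a linear forcing-response model. Model, data, and prior are invented.
import numpy as np

rng = np.random.default_rng(11)

# Synthetic "observations": warming response to a known forcing series.
forcing = np.linspace(0.0, 3.0, 60)             # W/m2 (hypothetical)
true_lambda = 0.8                                # K per (W/m2)
obs = true_lambda * forcing + rng.normal(0, 0.1, forcing.size)

def log_posterior(lam, sigma=0.1):
    if lam <= 0 or lam > 3:                      # flat prior on (0, 3]
        return -np.inf
    resid = obs - lam * forcing
    return -0.5 * np.sum((resid / sigma) ** 2)

samples = []
lam, lp = 1.0, log_posterior(1.0)
for _ in range(20_000):
    prop = lam + rng.normal(0, 0.05)             # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:      # Metropolis acceptance
        lam, lp = prop, lp_prop
    samples.append(lam)

post = np.array(samples[5_000:])                 # discard burn-in
print(f"posterior mean {post.mean():.3f}, 90% CI "
      f"[{np.percentile(post, 5):.3f}, {np.percentile(post, 95):.3f}]")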
A Probabilistic Framework for the Validation and Certification of Computer Simulations
NASA Technical Reports Server (NTRS)
Ghanem, Roger; Knio, Omar
2000-01-01
The paper presents a methodology for quantifying, propagating, and managing the uncertainty in the data required to initialize computer simulations of complex phenomena. The purpose of the methodology is to permit the quantitative assessment of a certification level to be associated with the predictions from the simulations, as well as the design of a data acquisition strategy to achieve a target level of certification. The value of a methodology that can address the above issues is obvious, especially in light of the trend in the availability of computational resources, as well as the trend in sensor technology. These two trends make it possible to probe physical phenomena both with physical sensors and with complex models, at previously inconceivable levels. With these new abilities arises the need to develop the knowledge to integrate the information from sensors and computer simulations. This is achieved in the present work by tracing both activities back to a level of abstraction that highlights their commonalities, thus allowing them to be manipulated in a mathematically consistent fashion. In particular, the mathematical theory underlying computer simulations has long been associated with partial differential equations and functional analysis concepts such as Hilbert spaces and orthogonal projections. By relying on a probabilistic framework for the modeling of data, a Hilbert space framework emerges that permits the modeling of coefficients in the governing equations as random variables, or equivalently, as elements in a Hilbert space. This permits the development of an approximation theory for probabilistic problems that parallels that of deterministic approximation theory. According to this formalism, the solution of the problem is identified by its projection on a basis in the Hilbert space of random variables, as opposed to more traditional techniques where the solution is approximated by its first or second-order statistics. The present representation, in addition to capturing significantly more information than the traditional approach, facilitates the linkage between different interacting stochastic systems as is typically observed in real-life situations.
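One common concrete realization of the Hilbert-space projection described above (assumed here for illustration; the abstract does not name the basis) is a polynomial chaos expansion, in which the uncertain solution is represented on an orthogonal basis of the underlying random variables:

u(\theta) \;\approx\; \sum_{i=0}^{P} u_i\,\Psi_i\big(\xi(\theta)\big),
\qquad
u_i \;=\; \frac{\langle u\,\Psi_i \rangle}{\langle \Psi_i^2 \rangle},

where the \Psi_i are polynomials orthogonal with respect to the density of the random variables \xi and \langle\cdot\rangle denotes expectation. The coefficients u_i carry the full probabilistic description of the solution rather than only its first- or second-order statistics.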
Learning Sparse Feature Representations using Probabilistic Quadtrees and Deep Belief Nets
2015-04-24
Learning sparse feature representations is a useful instrument for solving an… novel framework for the classification of handwritten digits that learns sparse representations using probabilistic quadtrees and Deep Belief Nets…
ORNL Pre-test Analyses of A Large-scale Experiment in STYLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Paul T; Yin, Shengjun; Klasky, Hilda B
Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to also assess the level of confidence that can be placed in the best-estimate finite-element solutions.
NASA Astrophysics Data System (ADS)
Peeters, L. J.; Mallants, D.; Turnadge, C.
2017-12-01
Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management, on the other hand, typically uses single values (i.e., in a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here, the first of which would not require modification of existing "deterministic" trigger or guideline values, whereas the second example assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the previous deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases. We here illustrate how both the prediction uncertainty and management rules can be expressed in a probabilistic framework, using groundwater metrics derived for a highly stressed groundwater system.
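The probabilistic reading of a trigger value amounts to estimating, from the predictive ensemble, the probability that drawdown exceeds the threshold and comparing it with an acceptable exceedance probability. In the sketch below the 2 m threshold follows the NSW policy cited above, while the drawdown samples and the 20% acceptance level are invented.

# Minimal sketch: probabilistic interpretation of a drawdown trigger.
# Drawdown samples are synthetic; the 2 m threshold follows the NSW
# Aquifer Interference Policy cited above, while the 20% acceptable
# exceedance probability is an invented example.
import numpy as np

rng = np.random.default_rng(5)
drawdown_m = rng.lognormal(mean=np.log(1.2), sigma=0.5, size=10_000)

threshold_m = 2.0
p_exceed = float(np.mean(drawdown_m > threshold_m))
acceptable = 0.20

print(f"P(drawdown > {threshold_m} m) = {p_exceed:.2f}")
print("trigger breached (probabilistic rule):", p_exceed > acceptable)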
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balkey, K.; Witt, F.J.; Bishop, B.A.
1995-06-01
Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various different codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.
Exploring the calibration of a wind forecast ensemble for energy applications
NASA Astrophysics Data System (ADS)
Heppelmann, Tobias; Ben Bouallegue, Zied; Theis, Susanne
2015-04-01
In the German research project EWeLiNE, Deutscher Wetterdienst (DWD) and the Fraunhofer Institute for Wind Energy and Energy System Technology (IWES) are collaborating with three German Transmission System Operators (TSO) in order to provide the TSOs with improved probabilistic power forecasts. Probabilistic power forecasts are derived from probabilistic weather forecasts, themselves derived from ensemble prediction systems (EPS). Since the considered raw ensemble wind forecasts suffer from underdispersiveness and bias, calibration methods are developed for the correction of the model bias and the ensemble spread bias. The overall aim is to improve the ensemble forecasts such that the uncertainty of the possible weather development is depicted by the ensemble spread from the first forecast hours. Additionally, the ensemble members after calibration should remain physically consistent scenarios. We focus on probabilistic hourly wind forecasts with a horizon of 21 h delivered by the convection-permitting high-resolution ensemble system COSMO-DE-EPS, which became operational in 2012 at DWD. The ensemble consists of 20 ensemble members driven by four different global models. The model area includes the whole of Germany and parts of Central Europe with a horizontal resolution of 2.8 km and a vertical resolution of 50 model levels. For verification we use wind mast measurements at around 100 m height, which corresponds to the hub height of the wind energy plants in wind farms within the model area. Calibration of the ensemble forecasts can be performed by different statistical methods applied to the raw ensemble output. Here, we explore local bivariate Ensemble Model Output Statistics at individual sites and quantile regression with different predictors. Applying different methods, we already show an improvement of ensemble wind forecasts from COSMO-DE-EPS for energy applications. In addition, an ensemble copula coupling approach transfers the time-dependencies of the raw ensemble to the calibrated ensemble. The calibrated wind forecasts are evaluated first with univariate probabilistic scores and additionally with diagnostics of wind ramps in order to assess the time-consistency of the calibrated ensemble members.
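As a toy illustration of calibrating an underdispersive wind ensemble (not the bivariate EMOS or quantile regression used in the project), the sketch below removes the mean bias and inflates the spread around the ensemble mean using a training period, preserving member ordering in the spirit of ensemble copula coupling; all data are synthetic.

# Toy calibration of an underdispersive wind ensemble: remove the mean bias
# and inflate the spread around the ensemble mean, with factors estimated
# from a training period. Synthetic data; illustration only.
import numpy as np

rng = np.random.default_rng(9)
n_days, n_members = 300, 20

truth = 8.0 + 2.5 * rng.standard_normal(n_days)                 # 100 m wind (m/s)
ens_mean_err = 1.0 + 1.5 * rng.standard_normal(n_days)          # biased, noisy mean
raw = (truth + ens_mean_err)[:, None] \
      + 0.4 * rng.standard_normal((n_days, n_members))          # too little spread

train, test = slice(0, 200), slice(200, None)
bias = (raw[train].mean(axis=1) - truth[train]).mean()
rmse = np.sqrt(np.mean((raw[train].mean(axis=1) - bias - truth[train]) ** 2))
spread = raw[train].std(axis=1).mean()
inflation = rmse / spread                                        # spread correction factor

mean_cal = raw[test].mean(axis=1, keepdims=True) - bias
calibrated = mean_cal + inflation * (raw[test] - raw[test].mean(axis=1, keepdims=True))

cover_raw = np.mean((truth[test] >= raw[test].min(1)) & (truth[test] <= raw[test].max(1)))
cover_cal = np.mean((truth[test] >= calibrated.min(1)) & (truth[test] <= calibrated.max(1)))
print(f"bias {bias:.2f} m/s, spread inflation x{inflation:.1f}")
print(f"ensemble-range coverage: raw {cover_raw:.2f}, calibrated {cover_cal:.2f}")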
DOT National Transportation Integrated Search
2015-12-01
This study adopts a dwelling unit level of analysis and considers a probabilistic choice set generation approach for residential choice modeling. In doing so, we accommodate the fact that housing choices involve both characteristics of the dwelling u...
Research Analysis on MOOC Course Dropout and Retention Rates
ERIC Educational Resources Information Center
Gomez-Zermeno, Marcela Gerogina; Aleman de La Garza, Lorena
2016-01-01
This research's objective was to identify the terminal efficiency of the Massive Online Open Course "Educational Innovation with Open Resources" offered by a Mexican private university. A quantitative methodology was used, combining descriptive statistics and probabilistic models to analyze the levels of retention, completion, and…
Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components
NASA Technical Reports Server (NTRS)
1999-01-01
Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainty or randomness also occurs in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
Processing of probabilistic information in weight perception and motor prediction.
Trampenau, Leif; van Eimeren, Thilo; Kuhtz-Buschbeck, Johann
2017-02-01
We studied the effects of probabilistic cues, i.e., of information of limited certainty, in the context of an action task (GL: grip-lift) and of a perceptual task (WP: weight perception). Normal subjects (n = 22) saw four different probabilistic visual cues, each of which announced the likely weight of an object. In the GL task, the object was grasped and lifted with a pinch grip, and the peak force rates indicated that the grip and load forces were scaled predictively according to the probabilistic information. The WP task provided the expected heaviness related to each probabilistic cue; the participants gradually adjusted the object's weight until its heaviness matched the expected weight for a given cue. Subjects were randomly assigned to two groups: one started with the GL task and the other one with the WP task. The four different probabilistic cues influenced weight adjustments in the WP task and peak force rates in the GL task in a similar manner. The interpretation and utilization of the probabilistic information was critically influenced by the initial task. Participants who started with the WP task classified the four probabilistic cues into four distinct categories and applied these categories to the subsequent GL task. On the other hand, participants who started with the GL task applied three distinct categories to the four cues and retained this classification in the following WP task. The initial strategy, once established, determined how the probabilistic information was interpreted and implemented.
Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A; Weinberger, Daniel R.
2010-01-01
Background While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502
Prescribing smoked cannabis for chronic noncancer pain: preliminary recommendations.
Kahan, Meldon; Srivastava, Anita; Spithoff, Sheryl; Bromley, Lisa
2014-12-01
To offer preliminary guidance on prescribing smoked cannabis for chronic pain before the release of formal guidelines. We reviewed the literature on the analgesic effectiveness of smoked cannabis and the harms of medical and recreational cannabis use. We developed recommendations on indications, contraindications, precautions, and dosing of smoked cannabis, and categorized the recommendations based on levels of evidence. Evidence is mostly level II (well conducted observational studies) and III (expert opinion). Smoked cannabis might be indicated for patients with severe neuropathic pain conditions who have not responded to adequate trials of pharmaceutical cannabinoids and standard analgesics (level II evidence). Smoked cannabis is contraindicated in patients who are 25 years of age or younger (level II evidence); who have a current, past, or strong family history of psychosis (level II evidence); who have a current or past cannabis use disorder (level III evidence); who have a current substance use disorder (level III evidence); who have cardiovascular or respiratory disease (level III evidence); or who are pregnant or planning to become pregnant (level II evidence). It should be used with caution in patients who smoke tobacco (level II evidence), who are at increased risk of cardiovascular disease (level III evidence), who have anxiety or mood disorders (level II evidence), or who are taking higher doses of opioids or benzodiazepines (level III evidence). Cannabis users should be advised not to drive for at least 3 to 4 hours after smoking, for at least 6 hours after oral ingestion, and for at least 8 hours if they experience a subjective "high" (level II evidence). The maximum recommended dose is 1 inhalation 4 times per day (approximately 400 mg per day) of dried cannabis containing 9% delta-9-tetrahydrocannabinol (level III evidence). Physicians should avoid referring patients to "cannabinoid" clinics (level III evidence). Future guidelines should be based on systematic review of the literature on the safety and effectiveness of smoked cannabis. Further research is needed on the effectiveness and long-term safety of smoked cannabis compared with pharmaceutical cannabinoids, opioids, and other standard analgesics. Copyright© the College of Family Physicians of Canada.
Prescribing smoked cannabis for chronic noncancer pain
Kahan, Meldon; Srivastava, Anita; Spithoff, Sheryl; Bromley, Lisa
2014-01-01
Objective To offer preliminary guidance on prescribing smoked cannabis for chronic pain before the release of formal guidelines. Quality of evidence We reviewed the literature on the analgesic effectiveness of smoked cannabis and the harms of medical and recreational cannabis use. We developed recommendations on indications, contraindications, precautions, and dosing of smoked cannabis, and categorized the recommendations based on levels of evidence. Evidence is mostly level II (well conducted observational studies) and III (expert opinion). Main message Smoked cannabis might be indicated for patients with severe neuropathic pain conditions who have not responded to adequate trials of pharmaceutical cannabinoids and standard analgesics (level II evidence). Smoked cannabis is contraindicated in patients who are 25 years of age or younger (level II evidence); who have a current, past, or strong family history of psychosis (level II evidence); who have a current or past cannabis use disorder (level III evidence); who have a current substance use disorder (level III evidence); who have cardiovascular or respiratory disease (level III evidence); or who are pregnant or planning to become pregnant (level II evidence). It should be used with caution in patients who smoke tobacco (level II evidence), who are at increased risk of cardiovascular disease (level III evidence), who have anxiety or mood disorders (level II evidence), or who are taking higher doses of opioids or benzodiazepines (level III evidence). Cannabis users should be advised not to drive for at least 3 to 4 hours after smoking, for at least 6 hours after oral ingestion, and for at least 8 hours if they experience a subjective “high” (level II evidence). The maximum recommended dose is 1 inhalation 4 times per day (approximately 400 mg per day) of dried cannabis containing 9% delta-9-tetrahydrocannabinol (level III evidence). Physicians should avoid referring patients to “cannabinoid” clinics (level III evidence). Conclusion Future guidelines should be based on systematic review of the literature on the safety and effectiveness of smoked cannabis. Further research is needed on the effectiveness and long-term safety of smoked cannabis compared with pharmaceutical cannabinoids, opioids, and other standard analgesics. PMID:25500598
Singh, Amit Pal; Dixit, Garima; Kumar, Amit; Mishra, Seema; Kumar, Navin; Dixit, Sameer; Singh, Pradyumna Kumar; Dwivedi, Sanjay; Trivedi, Prabodh Kumar; Pandey, Vivek; Dhankher, Om Prakash; Norton, Gareth J; Chakrabarty, Debasis; Tripathi, Rudra Deo
2017-06-01
Nitric oxide (NO) and salicylic acid (SA) are important signaling molecules in plant systems. In the present study, both NO and SA showed a protective role against arsenite (As(III)) stress in rice plants when supplied exogenously. The application of NO and SA alleviated the negative impact of As(III) on plant growth. Nitric oxide supplementation to As(III)-treated plants greatly decreased arsenic (As) accumulation in the roots as well as the shoots/roots translocation factor. Arsenite exposure decreased the endogenous levels of NO and SA in plants. Exogenous supplementation of SA enhanced not only the endogenous level of SA but also the level of NO, through enhanced nitrate reductase (NR) activity, whether As(III) was present or not. Exogenously supplied NO decreased the NR activity and the level of endogenous NO. Arsenic accumulation was positively correlated with the expression level of OsLsi1, a transporter responsible for As(III) uptake. The endogenous levels of NO and SA were positively correlated with each other whether As(III) was present or not. This close relationship indicates that NO and SA work in harmony to modulate the signaling response in As(III)-stressed plants. Copyright © 2017. Published by Elsevier Masson SAS.
Probabilistic finite elements for fracture mechanics
NASA Technical Reports Server (NTRS)
Besterfield, Glen
1988-01-01
The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near-crack-tip singular strain embedded in the element is used. Probabilistic descriptions of the response, such as the expectation, covariance, and correlation of the stress intensity factors, are calculated for random load, random material, and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.
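The element formulation itself is not reproduced in this record, but the flavor of propagating random load, material, and crack length through a fracture criterion can be shown with a small Monte Carlo sketch in Python. The stress intensity model K = Y*sigma*sqrt(pi*a) and every distribution below are illustrative assumptions, not values from the report.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Illustrative random inputs (assumed distributions, not from the report)
    sigma = rng.normal(200.0, 20.0, n)              # remote stress, MPa
    a = rng.lognormal(np.log(0.004), 0.25, n)       # crack length, m
    K_Ic = rng.normal(60.0, 5.0, n)                 # fracture toughness, MPa*sqrt(m)
    Y = 1.12                                        # geometry factor (assumed constant)

    K = Y * sigma * np.sqrt(np.pi * a)              # stress intensity factor

    print("E[K]   =", K.mean())
    print("Var[K] =", K.var())
    print("corr(K, sigma) =", np.corrcoef(K, sigma)[0, 1])
    print("P(fracture) = P(K > K_Ic) =", (K > K_Ic).mean())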
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components
NASA Technical Reports Server (NTRS)
1991-01-01
Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.
A Hough Transform Global Probabilistic Approach to Multiple-Subject Diffusion MRI Tractography
2010-04-01
In this work, a global probabilistic fiber tracking approach inspired by the voting procedure provided by the Hough transform is introduced.
The cost-effectiveness of etanercept in patients with severe ankylosing spondylitis in the UK.
Ara, R M; Reynolds, A V; Conway, P
2007-08-01
To examine the costs and benefits associated with long-term etanercept (ETN) treatment in patients with severe ankylosing spondylitis (AS) in the UK in accordance with the BSR guidelines. A mathematical model was constructed to estimate the costs and benefits associated with ETN plus non-steroidal anti-inflammatory drugs (NSAIDs) compared with NSAIDs alone. Individual patient data from Phase III RCTs were used to inform the proportion and magnitude of initial response to treatment and changes in health-related quality of life. A retrospective costing exercise on patients attending a UK secondary care rheumatology unit was used to inform disease costs. Published evidence on long-term disease progression was extrapolated over a 25-yr horizon. Uncertainty was examined using probabilistic sensitivity analyses. Over a 25-yr horizon, ETN plus NSAIDs gave 1.58 more QALYs at an additional cost of 35,978 pounds when compared with NSAID treatment alone. This equates to a central estimate of 22,700 pounds per QALY. The incremental costs per QALY over shorter time periods were 27,600 pounds, 23,600 pounds and 22,600 pounds at 2, 5 and 15 yrs, respectively. Using a 25-yr horizon, 93% of results from the probabilistic analyses fall below a threshold of 25,000 pounds per QALY. This study demonstrates the potential cost-effectiveness of ETN plus NSAIDs compared with NSAIDs alone in patients with severe AS treated according to the BSR guidelines in the UK.
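A minimal sketch of the kind of probabilistic sensitivity analysis summarized above: draw incremental costs and QALYs from assumed distributions, form the cost per QALY, and report the share of simulations falling below a willingness-to-pay threshold. The distributions and parameter values are placeholders, not the study's model inputs.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000

    # Assumed sampling distributions for incremental outcomes (illustrative only)
    inc_cost = rng.normal(35_978.0, 4_000.0, n)                    # incremental cost, GBP
    inc_qaly = rng.gamma(shape=25.0, scale=1.58 / 25.0, size=n)    # incremental QALYs

    icer = inc_cost / inc_qaly
    threshold = 25_000.0  # willingness to pay, GBP per QALY

    print("mean cost per QALY:", icer.mean())
    # Cost-effective when incremental net benefit is positive at the threshold
    print("share below threshold:", (inc_cost < threshold * inc_qaly).mean())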
Dehesh, Katayoon; Tai, Heeyoung; Edwards, Patricia; Byrne, James; Jaworski, Jan G.
2001-01-01
A cDNA coding for 3-ketoacyl-acyl-carrier protein (ACP) synthase III (KAS III) from spinach (Spinacia oleracea; So KAS III) was used to isolate two closely related KAS III clones (Ch KAS III-1 and Ch KAS III-2) from Cuphea hookeriana. Both Ch KAS IIIs are expressed constitutively in all tissues examined. An increase in the levels of 16:0 was observed in tobacco (Nicotiana tabacum, WT-SR) leaves overexpressing So KAS III when under the control of the cauliflower mosaic virus-35S promoter and in Arabidopsis and rapeseed (Brassica napus) seeds overexpressing either of the Ch KAS IIIs driven by napin. These data indicate that this enzyme has a universal role in fatty acid biosynthesis, irrespective of the plant species from which it is derived or the tissue in which it is expressed. The transgenic rapeseed seeds also contained lower levels of oil as compared with the wild-type levels. In addition, the rate of lipid synthesis in transgenic rapeseed seeds was notably slower than that of the wild-type seeds. The results of the measurements of the levels of the acyl-ACP intermediates as well as any changes in levels of other fatty acid synthase enzymes suggest that malonyl-ACP, the carbon donor utilized by all the 3- ketoacyl-ACP synthases, is limiting in the transgenic plants. This further suggests that malonyl-coenzyme A is a potential limiting factor impacting the final oil content as well as further extension of 16:0. PMID:11161065
Dehesh, K; Tai, H; Edwards, P; Byrne, J; Jaworski, J G
2001-02-01
A cDNA coding for 3-ketoacyl-acyl-carrier protein (ACP) synthase III (KAS III) from spinach (Spinacia oleracea; So KAS III) was used to isolate two closely related KAS III clones (Ch KAS III-1 and Ch KAS III-2) from Cuphea hookeriana. Both Ch KAS IIIs are expressed constitutively in all tissues examined. An increase in the levels of 16:0 was observed in tobacco (Nicotiana tabacum, WT-SR) leaves overexpressing So KAS III when under the control of the cauliflower mosaic virus-35S promoter and in Arabidopsis and rapeseed (Brassica napus) seeds overexpressing either of the Ch KAS IIIs driven by napin. These data indicate that this enzyme has a universal role in fatty acid biosynthesis, irrespective of the plant species from which it is derived or the tissue in which it is expressed. The transgenic rapeseed seeds also contained lower levels of oil as compared with the wild-type levels. In addition, the rate of lipid synthesis in transgenic rapeseed seeds was notably slower than that of the wild-type seeds. The results of the measurements of the levels of the acyl-ACP intermediates as well as any changes in levels of other fatty acid synthase enzymes suggest that malonyl-ACP, the carbon donor utilized by all the 3- ketoacyl-ACP synthases, is limiting in the transgenic plants. This further suggests that malonyl-coenzyme A is a potential limiting factor impacting the final oil content as well as further extension of 16:0.
Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Jett, Timothy R.; Predmore, Roamer E.; Zaretsky, Erin V.
2007-01-01
A probabilistic analysis, using the 2-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that can cause loss of bearing preload. These tests established life and reliability data for both shuttle flight and ground operation. Test data were used to estimate the failure rate and reliability as a function of the number of shuttle missions flown. The Weibull analysis of the test data for a 2-bearing shaft assembly in each body flap actuator established a reliability level of 99.6 percent for a life of 12 missions. A probabilistic system analysis for four shuttles, each of which has four actuators, predicts a single bearing failure in one actuator of one shuttle after 22 missions (a total of 88 missions for a 4-shuttle fleet). This prediction is comparable with actual shuttle flight history in which a single actuator bearing was found to have failed by wear at 20 missions.
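The report's figures come from a Weibull-Johnson fit to the test data; as a hedged illustration only, the snippet below evaluates a 2-parameter Weibull reliability model for a single bearing, a 2-bearing shaft assembly, and a 16-actuator fleet, with the shape and characteristic life chosen arbitrarily and the fleet treated as a simple series system.

    import numpy as np

    def weibull_reliability(t, beta, eta):
        """Survival probability of one bearing at life t (missions)."""
        return np.exp(-(t / eta) ** beta)

    beta, eta = 2.0, 200.0      # assumed shape and characteristic life (missions)
    t = 12.0                    # mission life of interest

    r_bearing = weibull_reliability(t, beta, eta)
    r_shaft = r_bearing ** 2            # 2-bearing shaft assembly treated as series
    r_fleet = r_shaft ** (4 * 4)        # 4 actuators per shuttle, 4 shuttles (series, for illustration)

    print("single-bearing reliability at 12 missions:", r_bearing)
    print("2-bearing shaft assembly reliability:", r_shaft)
    print("fleet-level reliability (16 actuators):", r_fleet)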
NASA Astrophysics Data System (ADS)
Gözükırmızı, Coşar; Kırkın, Melike Ebru
2017-01-01
Probabilistic evolution theory (PREVTH) provides a powerful framework for the solution of initial value problems of explicit ordinary differential equation sets with second degree multinomial right hand side functions. The use of the recursion between squarified telescope matrices provides the opportunity to obtain accurate results without much effort. Convergence may be considered as one of the drawbacks of PREVTH. It is related to many factors: the initial values and the coefficients in the right hand side functions are the most apparent ones. If a space extension is utilized before PREVTH, the convergence of PREVTH may also be affected by how the space extension is performed. There are works about implementations related to probabilistic evolution and how to improve the convergence by methods like analytic continuation. These works were written before squarification was introduced. Since recursion between squarified telescope matrices has given us the opportunity to obtain results corresponding to relatively higher truncation levels, it is important to obtain and analyze results related to certain problems in different areas of engineering. This manuscript may be considered to be in a series of papers and conference proceedings which serves for this purpose.
A spatio-temporal model for probabilistic seismic hazard zonation of Tehran
NASA Astrophysics Data System (ADS)
Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza
2013-08-01
A precondition for all disaster management steps, building damage prediction, and construction code developments is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
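As a rough illustration of how the quoted exceedance probabilities relate to annual rates, and of how rates from independent source types could be aggregated under a Poisson assumption (a simplification of the article's treatment of time-dependent sources), consider the sketch below; the per-source rates are invented.

    import math

    def annual_rate_from_poe(poe, years=50.0):
        """Annual exceedance rate implied by a probability of exceedance in `years`
        under a Poisson (time-independent) occurrence model."""
        return -math.log(1.0 - poe) / years

    for poe in (0.10, 0.05, 0.02):
        rate = annual_rate_from_poe(poe)
        print(f"{poe:.0%} in 50 yr -> annual rate {rate:.5f}/yr, return period {1/rate:.0f} yr")

    # Aggregating independent sources: total exceedance rate at a given PGA level
    rate_area_sources = 0.0015    # assumed, from time-independent area sources
    rate_fault_sources = 0.0006   # assumed, from time-dependent fault sources
    total = rate_area_sources + rate_fault_sources
    print("combined 50-yr exceedance probability:", 1.0 - math.exp(-total * 50.0))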
Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Jett, Timothy R.; Predmore, Roamer E.; Zaretsky, Erwin V.
2008-01-01
A probabilistic analysis, using the 2-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that can cause loss of bearing preload. These tests established life and reliability data for both shuttle flight and ground operation. Test data were used to estimate the failure rate and reliability as a function of the number of shuttle missions flown. The Weibull analysis of the test data for the four actuators on one shuttle, each with a 2-bearing shaft assembly, established a reliability level of 96.9 percent for a life of 12 missions. A probabilistic system analysis for four shuttles, each of which has four actuators, predicts a single bearing failure in one actuator of one shuttle after 22 missions (a total of 88 missions for a 4-shuttle fleet). This prediction is comparable with actual shuttle flight history in which a single actuator bearing was found to have failed by wear at 20 missions.
Risk assessment for construction projects of transport infrastructure objects
NASA Astrophysics Data System (ADS)
Titarenko, Boris
2017-10-01
The paper analyzes and compares different methods of risk assessment for construction projects involving transport infrastructure objects. The management of such projects demands the application of special probabilistic methods because of the large level of uncertainty in their implementation. Risk management in these projects requires the use of probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that yield reliable risk assessments in projects. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures allows a quantitative assessment of the main risk indicators of projects when solving the tasks of managing innovation-investment projects. Calculating the damage from the onset of a risky event can be done by any competent specialist, whereas assessing the probability of occurrence of a risky event requires special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.
Composite load spectra for select space propulsion structural components
NASA Technical Reports Server (NTRS)
Newell, J. F.; Kurth, R. E.; Ho, H.
1986-01-01
A multiyear program is being performed with the objective of developing generic load models, with multiple levels of progressive sophistication, to simulate the composite (combined) load spectra that are induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen (LOX) posts. Progress in the first year's effort includes completion of a sufficient portion of each task -- probabilistic models, code development, validation, and an initial operational code. From its inception, this code has had an expert-system philosophy that can be extended throughout the program and in the future. The initial operational code is applicable only to turbine-blade-type loadings. The probabilistic model included in the operational code has fitting routines for loads that utilize a modified Discrete Probabilistic Distribution termed RASCAL, a barrier-crossing method, and a Monte Carlo method. An initial load model, developed by Battelle, is currently used for the slowly varying duty-cycle type of loading. The intent is to use the model and related codes essentially in their current form for all loads that are based on measured or calculated data that follow a slowly varying profile.
NASA Technical Reports Server (NTRS)
Cruse, T. A.
1987-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.
1988-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
NASA Astrophysics Data System (ADS)
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Ibrahim, Shewkar E; Sayed, Tarek; Ismail, Karim
2012-11-01
Several earlier studies have noted the shortcomings with existing geometric design guides which provide deterministic standards. In these standards the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from the standards. To mitigate these shortcomings, probabilistic geometric design has been advocated where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a mechanism for risk measurement to evaluate the safety impact of deviations from design standards. This paper applies reliability analysis for optimizing the safety of highway cross-sections. The paper presents an original methodology to select a suitable combination of cross-section elements with restricted sight distance to result in reduced collisions and consistent risk levels. The purpose of this optimization method is to provide designers with a proactive approach to the design of cross-section elements in order to (i) minimize the risk associated with restricted sight distance, (ii) balance the risk across the two carriageways of the highway, and (iii) reduce the expected collision frequency. A case study involving nine cross-sections that are parts of two major highway developments in British Columbia, Canada, was presented. The results showed that an additional reduction in collisions can be realized by incorporating the reliability component, P(nc) (denoting the probability of non-compliance), in the optimization process. The proposed approach results in reduced and consistent risk levels for both travel directions in addition to further collision reductions. Copyright © 2012 Elsevier Ltd. All rights reserved.
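A minimal sketch of the reliability component described above: estimate the probability of non-compliance P(nc) as the probability that the available sight distance falls short of the required stopping sight distance, with both treated as random. The formula inputs, distributions, and numbers are assumptions for illustration, not the case-study values.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200_000

    # Required stopping sight distance (m): reaction distance + braking distance
    v = rng.normal(90.0, 8.0, n) / 3.6           # operating speed, m/s (assumed)
    t_r = rng.lognormal(np.log(1.5), 0.3, n)     # perception-reaction time, s (assumed)
    decel = rng.normal(3.4, 0.4, n)              # deceleration, m/s^2 (assumed)
    ssd_required = v * t_r + v**2 / (2.0 * decel)

    # Available sight distance limited by the cross-section geometry (assumed)
    ssd_available = rng.normal(160.0, 12.0, n)

    p_nc = (ssd_available < ssd_required).mean()
    print("P(non-compliance) =", p_nc)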
New Intensity Attenuation in Georgia
NASA Astrophysics Data System (ADS)
Tsereteli, N. S.; Varazanashvili, O.; Tibaldi, A.; Bonali, F.; Gogoladze, Z.; Kvavadze, N.; Kvedelidze, I.
2016-12-01
In seismic-prone zones, increasing urbanization and infrastructure in turn produce an increase in seismic risk that is mainly related to the level of seismic hazard itself, the seismic resistance of dwelling houses, and many other factors. The relevant objective of the present work is to improve the regional seismic hazard maps of Georgia by implementing state-of-the-art probabilistic seismic hazard assessment techniques and outputs from recent national and international collaborations. Seismic zoning is the identification of zones of similar levels of earthquake hazard. With reference to seismic zoning by ground motion assessment, the shaking intensity essentially depends on (i) regional seismicity, (ii) attenuation of ground motion with distance, and (iii) local site effects on ground motion. In the last decade, seismic hazard assessment has been presented in terms of Peak Ground Acceleration (PGA), Peak Ground Velocity (PGV), or other recorded parameters. However, the strong-motion dataset in Georgia is very limited. Furthermore, the vulnerability of buildings is still estimated in terms of intensity, and there is no information about the correlation between the distribution of recorded ground motion parameters and damage. Thus, macroseismic intensity is still a very important parameter for strong ground motion evaluation. In the present work, we calibrated intensity prediction equations (IPE) for the Georgian dataset based on about 78 reviewed earthquakes. Metadata for intensity (MSK-64 scale) were constrained, and prediction equations for various types of distance (epicentral and hypocentral distance, Joyner-Boore distance, closest distance to the fault rupture plane) were calibrated. Relations between intensity and PGA values were derived. For this, we used a hybrid-empirical ground motion equation derived for Georgia and ran scenario earthquakes for events with macroseismic data.
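The calibrated coefficients are not given in the abstract; purely to illustrate how an intensity prediction equation can be calibrated, the sketch below fits a commonly used functional form, I = c0 + c1*M + c2*log10(R) + c3*R, by least squares to synthetic data. The functional form and the synthetic catalogue are assumptions, not the Georgian dataset.

    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic "observations": magnitude M, hypocentral distance R (km), intensity I
    n = 300
    M = rng.uniform(4.0, 7.0, n)
    R = rng.uniform(5.0, 150.0, n)
    true = 1.0 + 1.2 * M - 2.5 * np.log10(R) - 0.002 * R
    I = true + rng.normal(0.0, 0.5, n)

    # Design matrix for I = c0 + c1*M + c2*log10(R) + c3*R
    X = np.column_stack([np.ones(n), M, np.log10(R), R])
    coeffs, *_ = np.linalg.lstsq(X, I, rcond=None)
    print("calibrated coefficients c0..c3:", coeffs)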
Quantitative risk assessment of foods containing peanut advisory labeling.
Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L
2013-12-01
Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate. Copyright © 2013 Elsevier Ltd. All rights reserved.
Word-level language modeling for P300 spellers based on discriminative graphical models
NASA Astrophysics Data System (ADS)
Delgado Saa, Jaime F.; de Pesters, Adriana; McFarland, Dennis; Çetin, Müjdat
2015-04-01
Objective. In this work we propose a probabilistic graphical model framework that uses language priors at the level of words as a mechanism to increase the performance of P300-based spellers. Approach. This paper is concerned with brain-computer interfaces based on P300 spellers. Motivated by P300 spelling scenarios involving communication based on a limited vocabulary, we propose a probabilistic graphical model framework and an associated classification algorithm that uses learned statistical models of language at the level of words. Exploiting such high-level contextual information helps reduce the error rate of the speller. Main results. Our experimental results demonstrate that the proposed approach offers several advantages over existing methods. Most importantly, it increases the classification accuracy while reducing the number of times the letters need to be flashed, increasing the communication rate of the system. Significance. The proposed approach models all the variables in the P300 speller in a unified framework and has the capability to correct errors in previous letters in a word, given the data for the current one. The structure of the model we propose allows the use of efficient inference algorithms, which in turn makes it possible to use this approach in real-time applications.
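The paper's discriminative graphical model is richer than this, but the core idea of a word-level language prior can be sketched with a simple Bayesian update: combine per-letter likelihoods from a P300 classifier with a prior over a limited vocabulary and decode the most probable word. The vocabulary, prior, and likelihood values below are invented.

    import numpy as np

    vocab = ["CALL", "COLD", "CELL", "TALL"]
    prior = np.array([0.4, 0.2, 0.3, 0.1])        # word-level language prior (assumed)

    # Per-position letter likelihoods from a hypothetical P300 classifier:
    # letter_like[i] maps candidate letters at position i to P(EEG_i | letter)
    letter_like = [
        {"C": 0.6, "T": 0.4},
        {"A": 0.5, "O": 0.3, "E": 0.2},
        {"L": 1.0},
        {"L": 0.7, "D": 0.3},
    ]

    def word_likelihood(word):
        p = 1.0
        for i, ch in enumerate(word):
            p *= letter_like[i].get(ch, 1e-6)   # small floor for unmodeled letters
        return p

    post = prior * np.array([word_likelihood(w) for w in vocab])
    post /= post.sum()
    for w, p in zip(vocab, post):
        print(w, round(float(p), 3))
    print("decoded word:", vocab[int(np.argmax(post))])

Because the posterior is computed over whole words, evidence from later letter flashes can overturn an earlier letter decision, which is the error-correction behavior described in the abstract.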
Wambui, Joseph M; Karuri, Edward G; Ojiambo, Julia A; Njage, Patrick M K
2017-01-01
Epidemiological studies show a definite connection between areas of high aflatoxin content and a high occurrence of human hepatocellular carcinoma (HCC). Hepatitis B virus in individuals further increases the risk of HCC. The two risk factors are prevalent in rural Kenya and continuously predispose the rural populations to HCC. A quantitative cancer risk assessment therefore quantified the levels at which potential pre- and postharvest interventions reduce the HCC risk attributable to consumption of contaminated maize and groundnuts. The assessment applied a probabilistic model to derive probability distributions of HCC cases and percentage reductions levels of the risk from secondary data. Contaminated maize and groundnuts contributed to 1,847 ± 514 and 158 ± 52 HCC cases per annum, respectively. The total contribution of both foods to the risk was additive as it resulted in 2,000 ± 518 cases per annum. Consumption and contamination levels contributed significantly to the risk whereby lower age groups were most affected. Nonetheless, pre- and postharvest interventions might reduce the risk by 23.0-83.4% and 4.8-95.1%, respectively. Therefore, chronic exposure to aflatoxins increases the HCC risk in rural Kenya, but a significant reduction of the risk can be achieved by applying specific pre- and postharvest interventions.
NASA Astrophysics Data System (ADS)
Panzera, Francesco; Lombardo, Giuseppe; Rigano, Rosaria
2010-05-01
The seismic hazard assessment (SHA) can be performed using either Deterministic or Probabilistic approaches. In present study a probabilistic analysis was carried out for the Catania and Siracusa towns using two different procedures: the 'site' (Albarello and Mucciarelli, 2002) and the 'seismotectonic' (Cornell 1968; Esteva, 1967) methodologies. The SASHA code (D'Amico and Albarello, 2007) was used to calculate seismic hazard through the 'site' approach, whereas the CRISIS2007 code (Ordaz et al., 2007) was adopted in the Esteva-Cornell procedure. According to current international conventions for PSHA (SSHAC, 1997), a logic tree approach was followed to consider and reduce the epistemic uncertainties, for both seismotectonic and site methods. The code SASHA handles the intensity data taking into account the macroseismic information of past earthquakes. CRISIS2007 code needs, as input elements, a seismic catalogue tested for completeness, a seismogenetic zonation and ground motion predicting equations. Data concerning the characterization of regional seismic sources and ground motion attenuation properties were taken from the literature. Special care was devoted to define source zone models, taking into account the most recent studies on regional seismotectonic features and, in particular, the possibility of considering the Malta escarpment as a potential source. The combined use of the above mentioned approaches allowed us to obtain useful elements to define the site seismic hazard in Catania and Siracusa. The results point out that the choice of the probabilistic model plays a fundamental role. It is indeed observed that when the site intensity data are used, the town of Catania shows hazard values higher than the ones found for Siracusa, for each considered return period. On the contrary, when the Esteva-Cornell method is used, Siracusa urban area shows higher hazard than Catania, for return periods greater than one hundred years. The higher hazard observed, through the site approach, for Catania area can be interpreted in terms of greater damage historically observed at this town and its smaller distance from the seismogenic structures. On the other hand, the higher level of hazard found for Siracusa, throughout the Esteva-Cornell approach, could be a consequence of the features of such method which spreads out the intensities over a wide area. However, in SHA the use of a combined approach is recommended for a mutual validation of obtained results and any choice between the two approaches is strictly linked to the knowledge of the local seismotectonic features. References Albarello D. and Mucciarelli M.; 2002: Seismic hazard estimates using ill?defined macroseismic data at site. Pure Appl. Geophys., 159, 1289?1304. Cornell C.A.; 1968: Engineering seismic risk analysis. Bull. Seism. Soc. Am., 58(5), 1583-1606. D'Amico V. and Albarello D.; 2007: Codice per il calcolo della pericolosità sismica da dati di sito (freeware). Progetto DPC-INGV S1, http://esse1.mi.ingv.it/d12.html Esteva L.; 1967: Criterios para la construcción de espectros para diseño sísmico. Proceedings of XII Jornadas Sudamericanas de Ingeniería Estructural y III Simposio Panamericano de Estructuras, Caracas, 1967. Published later in Boletín del Instituto de Materiales y Modelos Estructurales, Universidad Central de Venezuela, No. 19. Ordaz M., Aguilar A. and Arboleda J.; 2007: CRISIS2007, Program for computing seismic hazard. Version 5.4, Mexico City: UNAM. 
SSHAC (Senior Seismic Hazard Analysis Committee); 1997: Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and use of experts. NUREG/CR-6372.
Relationships between host viremia and vector susceptibility for arboviruses.
Lord, Cynthia C; Rutledge, C Roxanne; Tabachnick, Walter J
2006-05-01
Using a threshold model where a minimum level of host viremia is necessary to infect vectors affects our assessment of the relative importance of different host species in the transmission and spread of these pathogens. Other models may be more accurate descriptions of the relationship between host viremia and vector infection. Under the threshold model, the intensity and duration of the viremia above the threshold level is critical in determining the potential numbers of infected mosquitoes. A probabilistic model relating host viremia to the probability distribution of virions in the mosquito bloodmeal shows that the threshold model will underestimate the significance of hosts with low viremias. A probabilistic model that includes avian mortality shows that the maximum number of mosquitoes is infected by feeding on hosts whose viremia peaks just below the lethal level. The relationship between host viremia and vector infection is complex, and there is little experimental information to determine the most accurate model for different arthropod-vector-host systems. Until there is more information, the ability to distinguish the relative importance of different hosts in infecting vectors will remain problematic. Relying on assumptions with little support may result in erroneous conclusions about the importance of different hosts.
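A hedged sketch of the two model families contrasted above: a hard threshold on host viremia versus a probabilistic model in which the number of virions ingested in a bloodmeal is Poisson-distributed, so that even low-titer hosts infect some vectors. The titer grid, bloodmeal volume, and the one-virion-infects assumption are illustrative only.

    import math

    def p_infect_threshold(titer, threshold=10_000.0):
        """Threshold model: vector is infected only if host titer exceeds a cutoff."""
        return 1.0 if titer >= threshold else 0.0

    def p_infect_poisson(titer, bloodmeal_ml=2e-6):
        """Probabilistic model: virions per bloodmeal ~ Poisson(titer * volume);
        assume (for illustration) that one or more ingested virions infects."""
        mean_virions = titer * bloodmeal_ml
        return 1.0 - math.exp(-mean_virions)

    for titer in (1e3, 1e4, 1e5, 1e6, 1e7):   # virions per mL of host blood
        print(f"titer {titer:8.0e}  threshold: {p_infect_threshold(titer):.2f}"
              f"  poisson: {p_infect_poisson(titer):.3f}")

The comparison makes the abstract's point visible: the threshold model assigns zero infection probability to low-viremia hosts, while the Poisson model assigns them a small but nonzero contribution.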
Relationships Between Host Viremia and Vector Susceptibility for Arboviruses
Lord, Cynthia C.; Rutledge, C. Roxanne; Tabachnick, Walter J.
2010-01-01
Using a threshold model where a minimum level of host viremia is necessary to infect vectors affects our assessment of the relative importance of different host species in the transmission and spread of these pathogens. Other models may be more accurate descriptions of the relationship between host viremia and vector infection. Under the threshold model, the intensity and duration of the viremia above the threshold level is critical in determining the potential numbers of infected mosquitoes. A probabilistic model relating host viremia to the probability distribution of virions in the mosquito bloodmeal shows that the threshold model will underestimate the significance of hosts with low viremias. A probabilistic model that includes avian mortality shows that the maximum number of mosquitoes is infected by feeding on hosts whose viremia peaks just below the lethal level. The relationship between host viremia and vector infection is complex, and there is little experimental information to determine the most accurate model for different arthropod–vector–host systems. Until there is more information, the ability to distinguish the relative importance of different hosts in infecting vectors will remain problematic. Relying on assumptions with little support may result in erroneous conclusions about the importance of different hosts. PMID:16739425
Cai, Shanqing; Tourville, Jason A.; Beal, Deryk S.; Perkell, Joseph S.; Guenther, Frank H.; Ghosh, Satrajit S.
2013-01-01
Deficits in brain white matter have been a main focus of recent neuroimaging studies on stuttering. However, no prior study has examined brain connectivity on the global level of the cerebral cortex in persons who stutter (PWS). In the current study, we analyzed the results from probabilistic tractography between regions comprising the cortical speech network. An anatomical parcellation scheme was used to define 28 speech production-related ROIs in each hemisphere. We used network-based statistic (NBS) and graph theory to analyze the connectivity patterns obtained from tractography. At the network level, the probabilistic corticocortical connectivity from the PWS group was significantly weaker than that from persons with fluent speech (PFS). NBS analysis revealed significant components in the bilateral speech networks with negative correlations with stuttering severity. To facilitate comparison with previous studies, we also performed tract-based spatial statistics (TBSS) and regional fractional anisotropy (FA) averaging. Results from tractography, TBSS and regional FA averaging jointly highlight the importance of several regions in the left peri-Rolandic sensorimotor and premotor areas, most notably the left ventral premotor cortex (vPMC) and middle primary motor cortex, in the neuroanatomical basis of stuttering. PMID:24611042
Cai, Shanqing; Tourville, Jason A; Beal, Deryk S; Perkell, Joseph S; Guenther, Frank H; Ghosh, Satrajit S
2014-01-01
Deficits in brain white matter have been a main focus of recent neuroimaging studies on stuttering. However, no prior study has examined brain connectivity on the global level of the cerebral cortex in persons who stutter (PWS). In the current study, we analyzed the results from probabilistic tractography between regions comprising the cortical speech network. An anatomical parcellation scheme was used to define 28 speech production-related ROIs in each hemisphere. We used network-based statistic (NBS) and graph theory to analyze the connectivity patterns obtained from tractography. At the network level, the probabilistic corticocortical connectivity from the PWS group was significantly weaker than that from persons with fluent speech (PFS). NBS analysis revealed significant components in the bilateral speech networks with negative correlations with stuttering severity. To facilitate comparison with previous studies, we also performed tract-based spatial statistics (TBSS) and regional fractional anisotropy (FA) averaging. Results from tractography, TBSS and regional FA averaging jointly highlight the importance of several regions in the left peri-Rolandic sensorimotor and premotor areas, most notably the left ventral premotor cortex (vPMC) and middle primary motor cortex, in the neuroanatomical basis of stuttering.
Analog-Based Postprocessing of Navigation-Related Hydrological Ensemble Forecasts
NASA Astrophysics Data System (ADS)
Hemri, S.; Klein, B.
2017-11-01
Inland waterway transport benefits from probabilistic forecasts of water levels as they allow the ship load to be optimized and, hence, transport costs to be minimized. State-of-the-art probabilistic hydrologic ensemble forecasts inherit biases and dispersion errors from the atmospheric ensemble forecasts they are driven with. The use of statistical postprocessing techniques like ensemble model output statistics (EMOS) allows for a reduction of these systematic errors by fitting a statistical model based on training data. In this study, training periods for EMOS are selected based on forecast analogs, i.e., historical forecasts that are similar to the forecast to be verified. Due to the strong autocorrelation of water levels, forecast analogs have to be selected based on entire forecast hydrographs in order to guarantee similar hydrograph shapes. Custom-tailored measures of similarity for forecast hydrographs comprise hydrological series distance (SD), the hydrological matching algorithm (HMA), and dynamic time warping (DTW). Verification against observations reveals that EMOS forecasts for water level at three gauges along the river Rhine with training periods selected based on SD, HMA, and DTW compare favorably with reference EMOS forecasts, which are based on either seasonal training periods or on training periods obtained by dividing the hydrological forecast trajectories into runoff regimes.
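As an illustration of one of the similarity measures named above, the snippet implements plain dynamic time warping between forecast hydrographs; the historical forecasts with the smallest DTW distance to the current forecast would then form the EMOS training set. This is generic textbook DTW, not the operational implementation used in the study.

    import numpy as np

    def dtw_distance(a, b):
        """Dynamic time warping distance between two 1-D series (absolute-error cost)."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    current = np.array([2.1, 2.4, 3.0, 3.6, 3.4, 3.0])           # forecast hydrograph (m)
    archive = {
        "2015-06-03": np.array([2.0, 2.3, 2.9, 3.5, 3.3, 2.9]),
        "2016-01-17": np.array([3.8, 3.6, 3.2, 2.8, 2.5, 2.3]),
    }
    ranked = sorted(archive, key=lambda k: dtw_distance(current, archive[k]))
    print("best analogs first:", ranked)

The double loop is O(n*m); for long hydrographs a banded or library implementation would be preferable.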
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.
Probabilistic structural analysis of aerospace components using NESSUS
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.
1988-01-01
Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis is conducted assuming different failure models.
Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona
2016-01-01
Abstract Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
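To make the weight calculation concrete, here is a toy Fellegi-Sunter-style sketch: per-field agreement weights log2(m/u) are summed and converted to a posterior match probability with Bayes theorem. The m- and u-probabilities and the prior are invented for illustration.

    import math

    # Assumed m-probabilities (P(agree | true match)) and u-probabilities
    # (P(agree | non-match)) for each identifier field.
    fields = {
        "surname":       {"m": 0.95, "u": 0.01},
        "date_of_birth": {"m": 0.97, "u": 0.005},
        "postcode":      {"m": 0.90, "u": 0.05},
    }
    prior_match = 0.001   # assumed prior probability that a random pair is a match

    def match_weight(agreements):
        """Sum of log2 likelihood ratios: agree -> m/u, disagree -> (1-m)/(1-u)."""
        w = 0.0
        for f, agrees in agreements.items():
            m, u = fields[f]["m"], fields[f]["u"]
            w += math.log2(m / u) if agrees else math.log2((1 - m) / (1 - u))
        return w

    def posterior(weight):
        """Posterior match probability via Bayes theorem on the total likelihood ratio."""
        lr = 2.0 ** weight
        odds = lr * prior_match / (1.0 - prior_match)
        return odds / (1.0 + odds)

    pair = {"surname": True, "date_of_birth": True, "postcode": False}
    w = match_weight(pair)
    print("total match weight:", round(w, 2))
    print("posterior P(match):", round(posterior(w), 4))

In practice, pairs above an upper weight threshold are accepted as links, pairs below a lower threshold are rejected, and the band in between is reviewed manually.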
Probabilistic Structural Analysis Theory Development
NASA Technical Reports Server (NTRS)
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
A systematic risk management approach employed on the CloudSat project
NASA Technical Reports Server (NTRS)
Basilio, R. R.; Plourde, K. S.; Lam, T.
2000-01-01
The CloudSat Project has developed a simplified approach for fault tree analysis and probabilistic risk assessment. A system-level fault tree has been constructed to identify credible fault scenarios and failure modes leading up to a potential failure to meet the nominal mission success criteria.
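A hedged sketch of how a system-level fault tree can be quantified once credible fault scenarios are identified: basic-event probabilities are propagated through AND/OR gates under an independence assumption. The tree structure and probabilities are invented and do not describe the CloudSat fault tree.

    # Gate evaluation for independent basic events
    def and_gate(*probs):
        p = 1.0
        for q in probs:
            p *= q
        return p

    def or_gate(*probs):
        p_none = 1.0
        for q in probs:
            p_none *= (1.0 - q)
        return 1.0 - p_none

    # Illustrative basic-event probabilities (per mission, assumed)
    p_sensor_fail = 1e-3
    p_backup_fail = 5e-3
    p_sw_fault = 2e-4
    p_ground_outage = 1e-4

    # Top event: failure to meet the nominal mission success criteria
    p_measurement_loss = and_gate(p_sensor_fail, p_backup_fail)  # both units must fail
    p_top = or_gate(p_measurement_loss, p_sw_fault, p_ground_outage)
    print("P(top event) =", p_top)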
Monitoring wetlands at the ecoregion level provides information for resource managers beyond the site scale and can guide regionwide agency priorities for management and restoration. Our study was a component of the U.S. Environmental Protection Agency’s (EPA) 2002 Environmental ...
Statistics without Tears: Complex Statistics with Simple Arithmetic
ERIC Educational Resources Information Center
Smith, Brian
2011-01-01
One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
Arcella, D; Soggiu, M E; Leclercq, C
2003-10-01
For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to get a more realistic view of exposure to food chemicals, risk managers have become more interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and cyclamate from soft drinks by Italian teenagers with the use of the software, and to evaluate the impact of the inclusion/exclusion of indicators of market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table-top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the concentrations of intense sweeteners in their sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.
Corso, Phaedra S.; Ingels, Justin B.; Kogan, Steven M.; Foster, E. Michael; Chen, Yi-Fu; Brody, Gene H.
2013-01-01
Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95% confidence interval) incremental difference was $2149 ($397, $3901). With the probabilistic sensitivity analysis approach, the incremental difference was $2583 ($778, $4346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention. PMID:23299559
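As a minimal illustration of the probabilistic sensitivity analysis alternative described above, each cost input is assigned a distribution informed by the observed (incomplete) data, and the incremental cost is summarized with percentile intervals. The distribution choices and parameter values are placeholders, not the trial data.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 10_000

    # Assumed input distributions for per-component program costs (USD)
    staff_time = rng.gamma(shape=20.0, scale=60.0, size=n)       # intervention arm
    materials = rng.gamma(shape=10.0, scale=40.0, size=n)
    facilitation = rng.normal(900.0, 150.0, size=n)
    control_cost = rng.gamma(shape=15.0, scale=40.0, size=n)     # comparison arm

    incremental = staff_time + materials + facilitation - control_cost
    mean = incremental.mean()
    lo, hi = np.percentile(incremental, [2.5, 97.5])
    print(f"incremental cost: {mean:.0f} (95% interval {lo:.0f} to {hi:.0f})")

The width of the resulting interval reflects the full input variability rather than only the variability reproducible from imputed records, which is the contrast drawn in the abstract.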
Corso, Phaedra S; Ingels, Justin B; Kogan, Steven M; Foster, E Michael; Chen, Yi-Fu; Brody, Gene H
2013-10-01
Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95 % confidence interval) incremental difference was $2,149 ($397, $3,901). With the probabilistic sensitivity analysis approach, the incremental difference was $2,583 ($778, $4,346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention.
NASA Astrophysics Data System (ADS)
Mishra, H.; Karmakar, S.; Kumar, R.
2016-12-01
Risk assessment does not remain simple when it involves multiple uncertain variables. Uncertainties in risk assessment result mainly from (1) a lack of knowledge of the input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach has been proposed for the simultaneous treatment of random and non-random uncertainties associated with the input parameters of a health risk model. LandSim 2.5, a landfill simulator, has been used to simulate the activities of the Turbhe landfill (Navi Mumbai, India) for various time horizons. Further, the LandSim-simulated concentrations of six heavy metals in groundwater have been used in the health risk model. The water intake, exposure duration, exposure frequency, bioavailability, and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are considered probabilistic variables. An identical alpha-cut and reliability level are considered for the fuzzy and probabilistic variables, respectively, and uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the summation of the hazard quotients for the heavy metals Co, Cu, Mn, Ni, Zn, and Fe, has been quantified for the male and female populations; the HI was found to be high (HI > 1) for all the considered time horizons, which clearly indicates the possibility of adverse health effects for the population residing near the Turbhe landfill.
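A simplified sketch of the hybrid idea: at a chosen alpha-cut the fuzzy exposure factors reduce to intervals, Monte Carlo sampling handles the probabilistic variables (concentrations and body weight), and the hazard index is reported as an interval of exceedance probabilities. The hazard-quotient form follows the usual chronic-intake expression; all numbers, distributions, and fuzzy bounds are assumptions.

    import numpy as np

    rng = np.random.default_rng(5)
    n = 50_000

    # Probabilistic variables (assumed distributions)
    conc = {"Mn": rng.lognormal(np.log(0.30), 0.4, n),    # mg/L in groundwater
            "Ni": rng.lognormal(np.log(0.05), 0.5, n)}
    body_weight = rng.normal(60.0, 8.0, n)                # kg

    # Fuzzy variables at a given alpha-cut -> (lower, upper) intervals (assumed)
    intake_rate = (1.5, 2.5)      # L/day
    ef_ed_over_at = (0.8, 1.0)    # exposure frequency * duration / averaging time

    rfd = {"Mn": 0.024, "Ni": 0.020}   # reference doses, mg/kg/day (assumed)

    def hazard_index(ir, frac):
        """HI = sum over metals of (C * IR * frac) / (BW * RfD)."""
        hq = {m: conc[m] * ir * frac / (body_weight * rfd[m]) for m in conc}
        return sum(hq.values())

    hi_low = hazard_index(intake_rate[0], ef_ed_over_at[0])
    hi_high = hazard_index(intake_rate[1], ef_ed_over_at[1])
    print("P(HI > 1), lower bound of interval:", (hi_low > 1).mean())
    print("P(HI > 1), upper bound of interval:", (hi_high > 1).mean())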
Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J
2015-12-15
An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries. Copyright © 2015 Elsevier B.V. All rights reserved.
Zulkifley, Mohd Asyraf; Rawlinson, David; Moran, Bill
2012-01-01
In video analytics, robust observation detection is very important, especially for tracking implementations, as the content of the videos varies a lot. In contrast to the image processing field, the problems of blurring, moderate deformation, low-illumination surroundings, illumination change and homogeneous texture are normally encountered in video analytics. Patch-Based Observation Detection (PBOD) is developed to improve detection robustness in complex scenes by fusing both feature- and template-based recognition methods. While we believe that feature-based detectors are more distinctive, matching between frames is best achieved by a collection of points, as in template-based detectors. Two methods of PBOD—the deterministic and probabilistic approaches—have been tested to find the best mode of detection. Both algorithms start by building comparison vectors at each detected point of interest. The vectors are matched to build candidate patches based on their respective coordinates. For the deterministic method, patch matching is done in a 2-level test where threshold-based position and size smoothing are applied to the patch with the highest correlation value. For the second approach, patch matching is done probabilistically by modelling the histograms of the patches with Poisson distributions for both the RGB and HSV colour models. Then, maximum likelihood is applied for position smoothing while a Bayesian approach is applied for size smoothing. The results showed that probabilistic PBOD outperforms the deterministic approach, with an average distance error of 10.03% compared with 21.03%. This algorithm is best implemented as a complement to other, simpler detection methods due to its heavy processing requirements. PMID:23202226
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
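The stress-strength reliability calculation implied above (fit distributions to load and strength scatter, then integrate numerically) can be sketched as follows; the lognormal load and Weibull strength choices are assumptions for illustration, not the distributions fitted in the report.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Assumed distributions: applied stress ~ lognormal, material strength ~ Weibull.
stress = stats.lognorm(s=0.25, scale=300.0)        # MPa
strength = stats.weibull_min(c=10.0, scale=500.0)  # MPa

# P(failure) = P(strength < stress) = integral of F_strength(x) * f_stress(x) dx
pf, _ = quad(lambda x: strength.cdf(x) * stress.pdf(x), 0, np.inf)
print(f"probability of failure ~ {pf:.2e}, reliability ~ {1 - pf:.6f}")
```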
Use of model calibration to achieve high accuracy in analysis of computer networks
Frogner, Bjorn; Guarro, Sergio; Scharf, Guy
2004-05-11
A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.
ZERO: probabilistic routing for deploy and forget Wireless Sensor Networks.
Vilajosana, Xavier; Llosa, Jordi; Pacho, Jose Carlos; Vilajosana, Ignasi; Juan, Angel A; Vicario, Jose Lopez; Morell, Antoni
2010-01-01
As Wireless Sensor Networks are adopted by industry and agriculture for large-scale, unattended deployments, the need for reliable and energy-conserving protocols becomes critical. Energy-conservation efforts at the Physical and Link layers are largely ignored by routing protocols, which concentrate on maintaining reliability and throughput. Gradient-based routing protocols route data through the most reliable links, aiming to ensure 99% packet delivery. However, they suffer from the so-called "hot spot" problem: the most reliable routes deplete their energy quickly, partitioning the network and reducing the monitored area. To cope with this "hot spot" problem we propose ZERO, a combined approach at the Network and Link layers that increases network lifespan while preserving reliability levels by means of probabilistic load-balancing techniques.
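A hedged sketch of the probabilistic load-balancing idea: instead of always forwarding over the single most reliable link, a next hop is drawn with probability weighted by both link reliability and residual energy, spreading load away from "hot spot" nodes. The function and field names are illustrative, not part of the ZERO specification.

```python
import random

def choose_next_hop(neighbors, alpha=0.5):
    """neighbors: list of dicts with 'id', 'link_reliability' and 'residual_energy',
    both in (0, 1]. alpha trades link reliability against remaining energy."""
    weights = [n["link_reliability"] ** alpha * n["residual_energy"] ** (1 - alpha)
               for n in neighbors]
    return random.choices(neighbors, weights=weights, k=1)[0]["id"]

# toy usage: node B is slightly less reliable but has far more energy left,
# so it receives roughly two thirds of the traffic instead of burning out node A
nbrs = [{"id": "A", "link_reliability": 0.99, "residual_energy": 0.2},
        {"id": "B", "link_reliability": 0.90, "residual_energy": 0.9}]
print([choose_next_hop(nbrs) for _ in range(5)])
```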
Jang, Cheng-Shin
2008-03-15
This work probabilistically explored a safe utilization ratio (UR) of groundwater in fish ponds located in blackfoot disease hyperendemic areas with respect to the regulation of arsenic (As) concentrations. Sequential indicator simulation was used to reproduce As concentrations in groundwater and to propagate their uncertainty. Corresponding URs of groundwater were obtained from a mass-balance relationship between the reproduced As concentrations in groundwater and the As regulation for farmed fish ponds. Three levels were adopted to evaluate the UR: UR ≥ 0.5, 0.5 > UR ≥ 0.1, and UR < 0.1. A high probability of the UR ≥ 0.5 level occurs in the northern and southern regions, where groundwater can serve as a major water source. A high probability of the 0.5 > UR ≥ 0.1 level is mainly distributed in the central-coastal, central-eastern and southeastern regions, where groundwater should be considered only a subordinate water source. Where available, extra surface water should be given priority for meeting the aquacultural needs of the regions with a high probability of the UR ≥ 0.5 and 0.5 > UR ≥ 0.1 levels. In the regions with a high probability of the UR < 0.1 level, in the central-coastal and southwestern regions, groundwater utilization should be reduced substantially or even prohibited completely to avoid adverse effects on human health.
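The exact mass-balance relationship is not spelled out in the abstract; purely for illustration, the sketch below assumes a simple two-source dilution form in which groundwater (simulated As concentration c_gw) is blended with surface water (c_sw) subject to a regulatory limit c_reg, and the UR is the largest admissible groundwater fraction.

```python
def max_utilization_ratio(c_gw, c_sw, c_reg):
    """Largest groundwater fraction UR such that UR*c_gw + (1-UR)*c_sw <= c_reg.
    Assumed simple dilution mass balance; concentrations in consistent units."""
    if c_gw <= c_reg:
        return 1.0   # groundwater alone already meets the limit
    if c_sw >= c_reg:
        return 0.0   # even pure surface water violates the limit
    return (c_reg - c_sw) / (c_gw - c_sw)

# toy usage with a few sequential-simulation realizations of groundwater As (mg/L)
realizations = [0.08, 0.25, 0.60]
urs = [max_utilization_ratio(c, c_sw=0.002, c_reg=0.05) for c in realizations]
print(urs)  # classify each realization into UR >= 0.5, 0.5 > UR >= 0.1, UR < 0.1
```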
Khetarpal, Sumeet A; Zeng, Xuemei; Millar, John S; Vitali, Cecilia; Somasundara, Amritha Varshini Hanasoge; Zanoni, Paolo; Landro, James A; Barucci, Nicole; Zavadoski, William J; Sun, Zhiyuan; de Haard, Hans; Toth, Ildikó V; Peloso, Gina M; Natarajan, Pradeep; Cuchel, Marina; Lund-Katz, Sissel; Phillips, Michael C; Tall, Alan R; Kathiresan, Sekar; DaSilva-Jardine, Paul; Yates, Nathan A; Rader, Daniel J
2017-09-01
Recent large-scale genetic sequencing efforts have identified rare coding variants in genes in the triglyceride-rich lipoprotein (TRL) clearance pathway that are protective against coronary heart disease (CHD), independently of LDL cholesterol (LDL-C) levels. Insight into the mechanisms of protection of these variants may facilitate the development of new therapies for lowering TRL levels. The gene APOC3 encodes apoC-III, a critical inhibitor of triglyceride (TG) lipolysis and remnant TRL clearance. Here we report a detailed interrogation of the mechanism of TRL lowering by the APOC3 Ala43Thr (A43T) variant, the only missense (rather than protein-truncating) variant in APOC3 reported to be TG lowering and protective against CHD. We found that both human APOC3 A43T heterozygotes and mice expressing human APOC3 A43T display markedly reduced circulating apoC-III levels. In mice, this reduction is due to impaired binding of A43T apoC-III to lipoproteins and accelerated renal catabolism of free apoC-III. Moreover, the reduced content of apoC-III in TRLs resulted in accelerated clearance of circulating TRLs. On the basis of this protective mechanism, we developed a monoclonal antibody targeting lipoprotein-bound human apoC-III that promotes circulating apoC-III clearance in mice expressing human APOC3 and enhances TRL catabolism in vivo. These data reveal the molecular mechanism by which a missense variant in APOC3 causes reduced circulating TG levels and, hence, protects from CHD. This protective mechanism has the potential to be exploited as a new therapeutic approach to reduce apoC-III levels and circulating TRL burden.
Khetarpal, Sumeet A; Zeng, Xuemei; Millar, John S; Vitali, Cecilia; Somasundara, Amritha Varshini Hanasoge; Zanoni, Paolo; Landro, James A; Barucci, Nicole; Zavadoski, William J; Sun, Zhiyuan; de Haard, Hans; Toth, Ildikó V; Peloso, Gina M; Natarajan, Pradeep; Cuchel, Marina; Lund-Katz, Sissel; Phillips, Michael C; Tall, Alan R; Kathiresan, Sekar; DaSilva-Jardine, Paul; Yates, Nathan A; Rader, Daniel J
2017-01-01
Recent large-scale genetic sequencing efforts have identified rare coding variants in genes in the triglyceride-rich lipoprotein (TRL) clearance pathway that are protective against coronary heart disease (CHD), independently of LDL cholesterol (LDL-C) levels [1]. Insight into the mechanisms of protection of these variants may facilitate the development of new therapies for lowering TRL levels. The gene APOC3 encodes apoC-III, a critical inhibitor of triglyceride (TG) lipolysis and remnant TRL clearance [2]. Here we report a detailed interrogation of the mechanism of TRL lowering by the APOC3 Ala43Thr (A43T) variant, the only missense (rather than protein-truncating) variant in APOC3 reported to be TG lowering and protective against CHD [3-5]. We found that both human APOC3 A43T heterozygotes and mice expressing human APOC3 A43T display markedly reduced circulating apoC-III levels. In mice, this reduction is due to impaired binding of A43T apoC-III to lipoproteins and accelerated renal catabolism of free apoC-III. Moreover, the reduced content of apoC-III in TRLs resulted in accelerated clearance of circulating TRLs. On the basis of this protective mechanism, we developed a monoclonal antibody targeting lipoprotein-bound human apoC-III that promotes circulating apoC-III clearance in mice expressing human APOC3 and enhances TRL catabolism in vivo. These data reveal the molecular mechanism by which a missense variant in APOC3 causes reduced circulating TG levels and, hence, protects from CHD. This protective mechanism has the potential to be exploited as a new therapeutic approach to reduce apoC-III levels and circulating TRL burden. PMID:28825717
NASA Technical Reports Server (NTRS)
Townsend, John S.; Peck, Jeff; Ayala, Samuel
2000-01-01
NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress, is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.
Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material property and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) predicted the failure load and the buckling load, (2) performed coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.
The game of making decisions under uncertainty: How sure must one be?
NASA Astrophysics Data System (ADS)
Werner, Micha; Verkade, Jan; Wetterhall, Fredrik; van Andel, Schalk-Jan; Ramos, Maria-Helena
2016-04-01
Probabilistic hydrometeorological forecasting is now widely accepted to be more skillful than deterministic forecasting, and is increasingly being integrated into operational practice. Provided they are reliable and unbiased, probabilistic forecasts have the advantage that they give the decision maker not only the forecast value, but also the uncertainty associated with that prediction. Though that information provides more insight, it does leave the forecaster/decision maker with the challenge of deciding at what probability of threshold exceedance the decision to act should be taken. According to the cost-loss theory, that probability should be related to the impact of the threshold being exceeded. However, it is not entirely clear how easy it is for decision makers to follow that rule, even when the impact of a threshold being exceeded and the actions to choose from are known. To continue the tradition in the "Ensemble Hydrometeorological Forecast" session, we will address the challenge of making decisions based on probabilistic forecasts through a game to be played with the audience. We will explore how the decisions made differ depending on the known impacts of the forecasted events. Participants will be divided into a number of groups with differing levels of impact, and will be faced with a number of forecast situations. They will be asked to make decisions and record the consequence of those decisions. A discussion of the differences in the decisions made will be presented at the end of the game, with a fuller analysis later posted on the HEPEX web site blog (www.hepex.org).
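The cost-loss rule mentioned above can be made concrete: if protective action costs C and an unprotected event causes loss L, acting is worthwhile whenever the forecast probability p exceeds C/L. A minimal illustration follows; the numbers are invented for the game-like setting, not taken from the session.

```python
def should_act(p_exceed, cost, loss):
    """Cost-loss decision rule: act when the expected avoided loss exceeds the cost,
    i.e. when p_exceed > cost / loss."""
    return p_exceed > cost / loss

# a high-impact user (loss 100, cost 10) acts once p > 0.10,
# while a low-impact user (loss 20, cost 10) waits until p > 0.50
for loss in (100, 20):
    print(loss, [p for p in (0.05, 0.2, 0.6) if should_act(p, cost=10, loss=loss)])
```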
Validation analysis of probabilistic models of dietary exposure to food additives.
Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J
2003-10-01
The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the percentage of brands or the percentage of eating occasions within a food group that contained the additive. Since the three model components each assumed two possible modes of input, the validity of eight (2³) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of the full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
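A minimal Monte Carlo sketch of the kind of model validated above: per-person intake is simulated as a sum over food groups of consumption, multiplied by an indicator that the additive is present, multiplied by a concentration draw. The distribution choices and parameter values are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_intakes(n, food_groups):
    """food_groups: list of tuples (mean_log_intake_g, sd_log_intake_g,
    p_additive_present, mean_log_conc_mg_per_g, sd_log_conc)."""
    total = np.zeros(n)
    for mu_i, sd_i, p_present, mu_c, sd_c in food_groups:
        intake_g = rng.lognormal(mu_i, sd_i, n)   # food intake
        present = rng.random(n) < p_present       # % brands / eating occasions
        conc = rng.lognormal(mu_c, sd_c, n)       # additive concentration
        total += intake_g * present * conc        # mg of additive
    return total

groups = [(4.0, 0.5, 0.3, -4.0, 0.6), (3.0, 0.7, 0.6, -3.5, 0.5)]
intakes = simulate_intakes(100_000, groups)
print("97.5th percentile intake (mg/day):", np.percentile(intakes, 97.5))
```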
Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair.
Lee, Young-Joo; Kim, Robin E; Suh, Wonho; Park, Kiwon
2017-04-24
Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed.
Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair
Lee, Young-Joo; Kim, Robin E.; Suh, Wonho; Park, Kiwon
2017-01-01
Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed. PMID:28441768
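A hedged sketch of the prior-updating idea behind both versions of this abstract: a prior distribution over fatigue life is re-weighted when an inspection finds no crack at time t_insp, using a simple probability-of-detection model. The lognormal prior and the detection likelihood below are illustrative assumptions, not the paper's system reliability formulation.

```python
import numpy as np

rng = np.random.default_rng(2)

# prior samples of fatigue life (years), e.g. from a varying-amplitude damage model
prior_life = rng.lognormal(mean=np.log(60.0), sigma=0.4, size=200_000)

def likelihood_no_detection(life, t_insp, pod=0.9):
    """Likelihood of 'no crack detected' at t_insp: if failure would occur before
    t_insp, a crack is assumed present and is missed with probability (1 - pod)."""
    return np.where(life > t_insp, 1.0, 1.0 - pod)

t_insp = 40.0
weights = likelihood_no_detection(prior_life, t_insp)
weights /= weights.sum()

# probability that the bridge survives a 75-year target life, before and after updating
target = 75.0
prior_p_survive = np.mean(prior_life > target)
posterior_p_survive = np.sum(weights * (prior_life > target))
print(prior_p_survive, posterior_p_survive)
```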
Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce
2018-03-01
The levels, sources and quantitative probabilistic health risks for polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng g⁻¹. The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic risk assessments for both carcinogenic and non-carcinogenic risks posed by PAHs in soils, with adult farmers as the receptors of concern, were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risk (TCR) for the agricultural soils has a 45% probability of exceeding the acceptable threshold value (10⁻⁶), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of TCR, while inhalation intake is negligible. The three PAHs with the highest contributions to TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and the exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities. Copyright © 2017 Elsevier B.V. All rights reserved.
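The Monte Carlo risk calculation described above can be sketched with the standard incremental-lifetime-cancer-risk form for the oral (soil ingestion) pathway; the exposure parameter values and distributions below are illustrative assumptions, not the ones used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# assumed inputs for an adult farmer, oral (soil ingestion) pathway
conc_bap_eq = rng.lognormal(np.log(500), 0.8, n)   # ng/g BaP-equivalent in soil
ingestion   = rng.triangular(50, 100, 200, n)      # mg soil/day
ef          = rng.triangular(180, 250, 350, n)     # exposure frequency, days/yr
ed          = rng.normal(30, 8, n).clip(1, 70)     # exposure duration, yr
bw          = rng.normal(60, 10, n).clip(40, 100)  # body weight, kg
at, csf     = 70 * 365, 7.3                        # averaging time (days), oral slope factor

# ILCR = C * IR * EF * ED * CSF / (BW * AT); convert ng/g -> mg/kg and mg soil -> kg soil
ilcr = (conc_bap_eq * 1e-3) * (ingestion * 1e-6) * ef * ed * csf / (bw * at)
print("P(ILCR > 1e-6):", np.mean(ilcr > 1e-6))
```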
Rajan, Prashant V; Qudsi, Rameez A; Dyer, George S M; Losina, Elena
2018-02-07
There is no consensus on the optimal fixation method for patients who require a surgical procedure for distal radial fractures. We used cost-effectiveness analyses to determine which of 3 modalities offers the best value: closed reduction and percutaneous pinning, open reduction and internal fixation, or external fixation. We developed a Markov model that projected short-term and long-term health benefits and costs in patients undergoing a surgical procedure for a distal radial fracture. Simulations began at the patient age of 50 years and were run over the patient's lifetime. The analysis was conducted from health-care payer and societal perspectives. We estimated transition probabilities and quality-of-life values from the literature and determined costs from Medicare reimbursement schedules in 2016 U.S. dollars. Suboptimal postoperative outcomes were determined by rates of reduction loss (4% for closed reduction and percutaneous pinning, 1% for open reduction and internal fixation, and 11% for external fixation) and rates of orthopaedic complications. Procedural costs were $7,638 for closed reduction and percutaneous pinning, $10,170 for open reduction and internal fixation, and $9,886 for external fixation. Outputs were total costs and quality-adjusted life-years (QALYs), discounted at 3% per year. We considered willingness-to-pay thresholds of $50,000 and $100,000. We conducted deterministic and probabilistic sensitivity analyses to evaluate the impact of data uncertainty. From the health-care payer perspective, closed reduction and percutaneous pinning dominated (i.e., produced greater QALYs at lower costs than) open reduction and internal fixation and dominated external fixation. From the societal perspective, the incremental cost-effectiveness ratio for closed reduction and percutaneous pinning compared with open reduction and internal fixation was $21,058 per QALY and external fixation was dominated. In probabilistic sensitivity analysis, open reduction and internal fixation was cost-effective roughly 50% of the time compared with roughly 45% for closed reduction and percutaneous pinning. When considering data uncertainty, there is only a 5% to 10% difference in the frequency of probability combinations that find open reduction and internal fixation to be more cost-effective. The current degree of uncertainty in the data produces difficulty in distinguishing either strategy as being more cost-effective overall and thus it may be left to surgeon and patient shared decision-making. Economic Level III. See Instructions for Authors for a complete description of levels of evidence.
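The core comparison in such analyses reduces to an incremental cost-effectiveness ratio judged against a willingness-to-pay threshold. A minimal sketch follows, using invented placeholder cost/QALY pairs; the paper's full Markov model is not reproduced here.

```python
def icer(strategy_a, strategy_b):
    """Incremental cost-effectiveness ratio of A vs. B: (delta cost) / (delta QALY).
    Each strategy is a (cost_usd, qalys) tuple."""
    d_cost = strategy_a[0] - strategy_b[0]
    d_qaly = strategy_a[1] - strategy_b[1]
    if d_cost <= 0 and d_qaly >= 0:
        return "A dominates B"
    if d_cost >= 0 and d_qaly <= 0:
        return "B dominates A"
    return d_cost / d_qaly

# hypothetical lifetime outputs (NOT the paper's numbers)
crpp = (30_000, 14.10)   # closed reduction and percutaneous pinning
orif = (33_000, 14.24)   # open reduction and internal fixation
print(icer(orif, crpp), "per QALY, compared against a $50,000-$100,000 willingness-to-pay")
```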
Probabilistic Prediction of Lifetimes of Ceramic Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.
2006-01-01
ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
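Ceramic reliability evaluation of this kind rests on Weibull strength statistics; a hedged two-parameter Weibull sketch of the failure probability of a uniformly stressed component is shown below. The actual software additionally handles multiaxial stresses, size scaling, slow crack growth and transient loading, none of which is modelled here, and the parameter values are assumptions.

```python
import numpy as np

def weibull_failure_probability(stress_mpa, sigma_0=400.0, m=12.0):
    """P_f = 1 - exp(-(sigma / sigma_0)^m) for a uniformly stressed unit volume.
    sigma_0 (characteristic strength) and m (Weibull modulus) are assumed values."""
    return 1.0 - np.exp(-(np.asarray(stress_mpa) / sigma_0) ** m)

# failure probability rises steeply with applied stress
print(weibull_failure_probability([200, 300, 400]))
```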
PCEMCAN - Probabilistic Ceramic Matrix Composites Analyzer: User's Guide, Version 1.0
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Mital, Subodh K.; Murthy, Pappu L. N.
1998-01-01
PCEMCAN (Probabilistic CEramic Matrix Composites ANalyzer) is an integrated computer code developed at NASA Lewis Research Center that simulates uncertainties associated with the constituent properties, manufacturing process, and geometric parameters of fiber-reinforced ceramic matrix composites and quantifies their random thermomechanical behavior. The PCEMCAN code can perform deterministic as well as probabilistic analyses to predict thermomechanical properties. This user's guide details the step-by-step procedure to create the input file and to update/modify the material properties database required to run the PCEMCAN computer code. An overview of the geometric conventions, micromechanical unit cell, nonlinear constitutive relationship and probabilistic simulation methodology is also provided in the manual. Fast probability integration as well as Monte Carlo simulation methods are available for the uncertainty simulation. The various options available in the code to simulate probabilistic material properties and to quantify the sensitivity of the primitive random variables are described. Deterministic as well as probabilistic results are illustrated using demonstration problems. For a detailed theoretical description of the deterministic and probabilistic analyses, the user is referred to the companion documents "Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composite Behavior," NASA TP-3602, 1996, and "Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites," NASA TM 4766, June 1997.
Zhang, Dandan; Gao, Baojiao; Li, Yanbin
2017-08-01
Using molecular design and polymer reactions, two types of bidentate Schiff base ligands, salicylaldehyde-aniline (SAN) and salicylaldehyde-cyclohexylamine (SCA), were synchronously synthesized and bonded onto the side chain of polysulfone (PSF), giving two bidentate Schiff base ligand-functionalized PSFs, PSF-SAN and PSF-SCA, referred to as macromolecular ligands. Following coordination reactions between the macromolecular ligands and Eu(III) and Tb(III) ions (the reaction occurred between the bonded ligands SAN or SCA and the lanthanide ion), two series of luminescent polymer-rare earth complexes, PSF-SAN-Eu(III) and PSF-SCA-Tb(III), were obtained. The two macromolecular ligands were fully characterized by Fourier transform infrared (FTIR), ¹H NMR and UV absorption spectroscopy, and the prepared complexes were also characterized by FTIR, UV absorption spectroscopy and thermogravimetric analysis. On this basis, the photoluminescence properties of these complexes and the relationships between their structure and luminescence were investigated in depth. The results show that the bonded bidentate Schiff base ligands, SAN and SCA, can effectively sensitize the fluorescence emission of Eu(III) and Tb(III) ions, respectively. The PSF-SAN-Eu(III) series complexes, namely the binary complex PSF-(SAN)₃-Eu(III) and the ternary complex PSF-(SAN)₃-Eu(III)-(Phen)₁ (Phen is the small-molecule ligand 1,10-phenanthroline), produce strong red luminescence, suggesting that the triplet state energy level of SAN is lower and well matched with the resonant energy level of the Eu(III) ion. By contrast, the PSF-SCA-Tb(III) series complexes, namely the binary complex PSF-(SCA)₃-Tb(III) and the ternary complex PSF-(SCA)₃-Tb(III)-(Phen)₁, display strong green luminescence, suggesting that the triplet state energy level of SCA is higher and well matched with the resonant energy level of Tb(III). Copyright © 2017 John Wiley & Sons, Ltd.
Bolster, Eline A M; Dallmeijer, Annet J; de Wolf, G Sander; Versteegt, Marieke; Schie, Petra E M van
2017-05-01
To determine the test-retest reliability and construct validity of a novel 6-Minute Racerunner Test (6MRT) in children and youth with cerebral palsy (CP) classified as Gross Motor Function Classification System (GMFCS) levels III and IV. The racerunner is a step-propelled tricycle. The participants were 38 children and youth with CP (mean age 11 y 2 m, SD 3 y 7 m; GMFCS III, n = 19; IV, n = 19). Racerunner capability was determined as the distance covered during the 6MRT on three occasions. The intraclass correlation coefficient (ICC), standard error of measurement (SEM), and smallest detectable differences (SDD) were calculated to assess test-retest reliability. The ICCs for tests 2 and 3 were 0.89 (SDD 37%; 147 m) for children in level III and 0.91 (SDD 52%; 118 m) for children in level IV. When the average of two separate test occasions was used, the SDDs were reduced to 26% (104 m; level III) and 37% (118 m; level IV). For tests 1 to 3, the mean distance covered increased from 345 m (SD 148 m) to 413 m (SD 137 m) for children in level III, and from 193 m (SD 100 m) to 239 m (SD 148 m) for children in level IV. Results suggest high test-retest reliability. However, large SDDs indicate that a single 6MRT measurement is only useful for individual evaluation when large improvements are expected, or when taking the average of two tests. The 6MRT discriminated the distance covered between children and youth in levels III and IV, supporting construct validity.
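For context, the smallest detectable difference is commonly derived from the ICC via the standard error of measurement (SEM = SD·√(1−ICC), SDD = 1.96·√2·SEM). The snippet below applies these standard formulas to approximate level III numbers as an illustration; the formulas are assumed here rather than quoted from the paper.

```python
import math

def sdd_from_icc(sd, icc):
    """Smallest detectable difference from the between-subject SD and test-retest ICC."""
    sem = sd * math.sqrt(1.0 - icc)
    return 1.96 * math.sqrt(2.0) * sem

# level III illustration: SD of roughly 140 m and ICC = 0.89 give an SDD on the order of 130 m
print(round(sdd_from_icc(sd=140.0, icc=0.89)))
```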
Fast algorithm for probabilistic bone edge detection (FAPBED)
NASA Astrophysics Data System (ADS)
Scepanovic, Danilo; Kirshtein, Joshua; Jain, Ameet K.; Taylor, Russell H.
2005-04-01
The registration of preoperative CT to intra-operative reality systems is a crucial step in Computer Assisted Orthopedic Surgery (CAOS). The intra-operative sensors include 3D digitizers, fiducials, X-rays and Ultrasound (US). FAPBED is designed to process CT volumes for registration to tracked US data. Tracked US is advantageous because it is real time, noninvasive, and non-ionizing, but it is also known to have inherent inaccuracies which create the need to develop a framework that is robust to various uncertainties, and can be useful in US-CT registration. Furthermore, conventional registration methods depend on accurate and absolute segmentation. Our proposed probabilistic framework addresses the segmentation-registration duality, wherein exact segmentation is not a prerequisite to achieve accurate registration. In this paper, we develop a method for fast and automatic probabilistic bone surface (edge) detection in CT images. Various features that influence the likelihood of the surface at each spatial coordinate are combined using a simple probabilistic framework, which strikes a fair balance between a high-level understanding of features in an image and the low-level number crunching of standard image processing techniques. The algorithm evaluates different features for detecting the probability of a bone surface at each voxel, and compounds the results of these methods to yield a final, low-noise, probability map of bone surfaces in the volume. Such a probability map can then be used in conjunction with a similar map from tracked intra-operative US to achieve accurate registration. Eight sample pelvic CT scans were used to extract feature parameters and validate the final probability maps. An un-optimized fully automatic Matlab code runs in five minutes per CT volume on average, and was validated by comparison against hand-segmented gold standards. The mean probability assigned to nonzero surface points was 0.8, while nonzero non-surface points had a mean value of 0.38 indicating clear identification of surface points on average. The segmentation was also sufficiently crisp, with a full width at half maximum (FWHM) value of 1.51 voxels.
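A hedged sketch of the kind of feature fusion described: several per-voxel feature scores (for example gradient magnitude and an intensity-band score) are each mapped to [0, 1] and combined into a single bone-surface probability map. The weighted-average combination and feature names are illustrative assumptions, not FAPBED's exact formulation.

```python
import numpy as np

def fuse_probability_maps(feature_maps, weights=None):
    """feature_maps: list of arrays in [0, 1], one per feature, each the shape of the
    CT volume. Returns a weighted average as the combined surface-probability map."""
    stack = np.stack([np.clip(f, 0.0, 1.0) for f in feature_maps])
    if weights is None:
        weights = np.ones(len(feature_maps)) / len(feature_maps)
    weights = np.asarray(weights, dtype=float)[:, None, None, None]
    return (weights * stack).sum(axis=0)

# toy 3-D volume with two feature maps
rng = np.random.default_rng(4)
vol_shape = (4, 4, 4)
grad_score = rng.random(vol_shape)
intensity_score = rng.random(vol_shape)
p_surface = fuse_probability_maps([grad_score, intensity_score], weights=[0.7, 0.3])
print(p_surface.max())
```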
Relative Gains, Losses, and Reference Points in Probabilistic Choice in Rats
Marshall, Andrew T.; Kirkpatrick, Kimberly
2015-01-01
Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior. PMID:25658448
Qamar, Arman; Khetarpal, Sumeet A; Khera, Amit V; Qasim, Atif; Rader, Daniel J; Reilly, Muredach P
2015-08-01
Triglyceride-rich lipoproteins have emerged as causal risk factors for developing coronary heart disease independent of low-density lipoprotein cholesterol levels. Apolipoprotein C-III (ApoC-III) modulates triglyceride-rich lipoprotein metabolism through inhibition of lipoprotein lipase and hepatic uptake of triglyceride-rich lipoproteins. Mutations causing loss-of-function of ApoC-III lower triglycerides and reduce coronary heart disease risk, suggestive of a causal role for ApoC-III. Little data exist about the relationship of ApoC-III, triglycerides, and atherosclerosis in patients with type 2 diabetes mellitus (T2DM). Here, we examined the relationships between plasma ApoC-III, triglycerides, and coronary artery calcification in patients with T2DM. Plasma ApoC-III levels were measured in a cross-sectional study of 1422 subjects with T2DM but without clinically manifest coronary heart disease. ApoC-III levels were positively associated with total cholesterol (Spearman r=0.36), triglycerides (r=0.59), low-density lipoprotein cholesterol (r=0.16), fasting glucose (r=0.16), and glycosylated hemoglobin (r=0.12; P<0.0001 for all). In age, sex, and race-adjusted analysis, ApoC-III levels were positively associated with coronary artery calcification (Tobit regression ratio, 1.78; 95% confidence interval, 1.27-2.50 per SD increase in ApoC-III; P<0.001). As expected for an intermediate mediator, these findings were attenuated when adjusted for both triglycerides (Tobit regression ratio, 1.43; 95% confidence interval, 0.94-2.18; P=0.086) and separately for very low-density lipoprotein cholesterol (Tobit regression ratio, 1.14; 95% confidence interval, 0.75-1.71; P=0.53). In persons with T2DM, increased plasma ApoC-III is associated with higher triglycerides, less favorable cardiometabolic phenotypes, and higher coronary artery calcification, a measure of subclinical atherosclerosis. Therapeutic inhibition of ApoC-III may thus be a novel strategy for reducing plasma triglyceride-rich lipoproteins and cardiovascular risk in T2DM. © 2015 American Heart Association, Inc.
Visualizing Uncertainty for Probabilistic Weather Forecasting based on Reforecast Analogs
NASA Astrophysics Data System (ADS)
Pelorosso, Leandro; Diehl, Alexandra; Matković, Krešimir; Delrieux, Claudio; Ruiz, Juan; Gröeller, M. Eduard; Bruckner, Stefan
2016-04-01
Numerical weather forecasts are prone to uncertainty coming from inaccuracies in the initial and boundary conditions and lack of precision in numerical models. Ensembles of forecasts partially address these problems by considering several runs of the numerical model. Each forecast is generated with different initial and boundary conditions and different model configurations [GR05]. The ensembles can be expressed as probabilistic forecasts, which have proven to be very effective in decision-making processes [DE06]. The ensemble of forecasts represents only some of the possible future atmospheric states, usually underestimating the degree of uncertainty in the predictions [KAL03, PH06]. Hamill and Whitaker [HW06] introduced the "Reforecast Analog Regression" (RAR) technique to overcome the limitations of ensemble forecasting. This technique produces probabilistic predictions based on the analysis of historical forecasts and observations. Visual analytics provides tools for processing, visualizing, and exploring data to get new insights and discover hidden information patterns in an interactive exchange between the user and the application [KMS08]. In this work, we introduce Albero, a visual analytics solution for probabilistic weather forecasting based on the RAR technique. Albero targets at least two different types of users: "forecasters", who are meteorologists working in operational weather forecasting, and "researchers", who work on the construction of numerical prediction models. Albero is an efficient tool for analyzing precipitation forecasts, allowing forecasters to make and communicate quick decisions. Our solution facilitates the analysis of a set of probabilistic forecasts, associated statistical data, observations and uncertainty. A dashboard with small-multiples of probabilistic forecasts allows the forecasters to analyze at a glance the distribution of probabilities as a function of time, space, and magnitude. It provides the user with a more accurate measure of forecast uncertainty that could result in better decision-making. It offers different levels of abstraction to help with the recalibration of the RAR method. It also has an inspection tool that displays the selected analogs, their observations and statistical data. It gives the users access to inner parts of the method, unveiling hidden information. References [GR05] GNEITING T., RAFTERY A. E.: Weather forecasting with ensemble methods. Science 310, 5746, 248-249, 2005. [KAL03] KALNAY E.: Atmospheric modeling, data assimilation and predictability. Cambridge University Press, 2003. [PH06] PALMER T., HAGEDORN R.: Predictability of weather and climate. Cambridge University Press, 2006. [HW06] HAMILL T. M., WHITAKER J. S.: Probabilistic quantitative precipitation forecasts based on reforecast analogs: Theory and application. Monthly Weather Review 134, 11, 3209-3229, 2006. [DE06] DEITRICK S., EDSALL R.: The influence of uncertainty visualization on decision making: An empirical evaluation. Springer, 2006. [KMS08] KEIM D. A., MANSMANN F., SCHNEIDEWIND J., THOMAS J., ZIEGLER H.: Visual analytics: Scope and challenges. Springer, 2008.
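The reforecast-analog idea can be sketched compactly: find the past forecasts most similar to today's forecast and build the predictive probability from their verifying observations. The nearest-neighbour distance and threshold-exceedance form below is an illustrative simplification of Hamill and Whitaker's method, with invented toy data.

```python
import numpy as np

def analog_probability(current_fcst, past_fcsts, past_obs, k=25, threshold=1.0):
    """Probability that the observation exceeds `threshold`, estimated from the
    verifying observations of the k historical forecasts closest to the current one."""
    dists = np.linalg.norm(past_fcsts - current_fcst, axis=1)
    nearest = np.argsort(dists)[:k]
    return float(np.mean(past_obs[nearest] > threshold))

# toy reforecast archive: forecast feature vectors and verifying precipitation (mm)
rng = np.random.default_rng(5)
archive_fcsts = rng.normal(size=(5000, 3))
archive_obs = np.maximum(0.0, archive_fcsts[:, 0] * 2.0 + rng.normal(scale=1.0, size=5000))
print(analog_probability(np.array([1.2, 0.0, -0.3]), archive_fcsts, archive_obs))
```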
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting during October 6-8 at the Best Western Sterling Inn, Sterling Heights (Detroit), Michigan is co-sponsored by US Army Tank-automotive & Armaments Command (TACOM). The meeting will provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members including members with national/international standing, the mission of the G-11's Probabilistic Methods Committee is to "enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries by better, faster, greener, smarter, affordable and reliable product development."
A Markov Chain Approach to Probabilistic Swarm Guidance
NASA Technical Reports Server (NTRS)
Acikmese, Behcet; Bayard, David S.
2012-01-01
This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
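A minimal sketch of the Markov-chain guidance idea: each agent, knowing only its current bin, samples its next bin from the row of a stochastic matrix M whose stationary distribution equals the desired swarm density. The Metropolis-Hastings construction of M below is one standard way to obtain such a matrix and is an assumption for illustration, not the paper's specific synthesis algorithm.

```python
import numpy as np

rng = np.random.default_rng(6)

def metropolis_chain(desired, adjacency):
    """Build a row-stochastic transition matrix whose stationary distribution is
    `desired`, moving only between adjacent bins (Metropolis-Hastings on a graph)."""
    n = len(desired)
    M = np.zeros((n, n))
    for i in range(n):
        nbrs = np.flatnonzero(adjacency[i])
        d_i = len(nbrs)
        for j in nbrs:
            d_j = np.count_nonzero(adjacency[j])
            M[i, j] = min(1.0, desired[j] * d_i / (desired[i] * d_j)) / d_i
        M[i, i] = 1.0 - M[i].sum()   # self-loop absorbs rejected proposals
    return M

desired = np.array([0.1, 0.2, 0.3, 0.4])
adjacency = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]])
M = metropolis_chain(desired, adjacency)

# propagate independent agents; the empirical density converges to `desired`
bins = rng.integers(0, 4, size=2000)
for _ in range(100):
    bins = np.array([rng.choice(4, p=M[b]) for b in bins])
print(np.bincount(bins, minlength=4) / bins.size)
```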
Lentz, Erika E.; Stippa, Sawyer R.; Thieler, E. Robert; Plant, Nathaniel G.; Gesch, Dean B.; Horton, Radley M.
2014-02-13
The U.S. Geological Survey is examining effects of future sea-level rise on the coastal landscape from Maine to Virginia by producing spatially explicit, probabilistic predictions using sea-level projections, vertical land movement rates (due to isostacy), elevation data, and land-cover data. Sea-level-rise scenarios used as model inputs are generated by using multiple sources of information, including Coupled Model Intercomparison Project Phase 5 models following representative concentration pathways 4.5 and 8.5 in the Intergovernmental Panel on Climate Change Fifth Assessment Report. A Bayesian network is used to develop a predictive coastal response model that integrates the sea-level, elevation, and land-cover data with assigned probabilities that account for interactions with coastal geomorphology as well as the corresponding ecological and societal systems it supports. The effects of sea-level rise are presented as (1) level of landscape submergence and (2) coastal response type characterized as either static (that is, inundation) or dynamic (that is, landform or landscape change). Results are produced at a spatial scale of 30 meters for four decades (the 2020s, 2030s, 2050s, and 2080s). The probabilistic predictions can be applied to landscape management decisions based on sea-level-rise effects as well as on assessments of the prediction uncertainty and need for improved data or fundamental understanding. This report describes the methods used to produce predictions, including information on input datasets; the modeling approach; model outputs; data-quality-control procedures; and information on how to access the data and metadata online.
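A hedged, pure-Python illustration of how a Bayesian network of this kind turns discretized inputs (sea-level-rise scenario, elevation, land cover) into a probability of a dynamic versus static coastal response; the states and conditional probabilities below are invented placeholders, not values from the USGS model.

```python
# P(response = "dynamic" | slr, elevation, landcover) from a hand-specified
# conditional probability table; real applications learn or elicit these entries.
CPT = {
    ("high", "low",  "beach"):     0.85,
    ("high", "low",  "developed"): 0.35,
    ("high", "high", "beach"):     0.40,
    ("low",  "low",  "beach"):     0.55,
}

def p_dynamic(slr, elevation, landcover, default=0.5):
    return CPT.get((slr, elevation, landcover), default)

def marginal_p_dynamic(slr_dist, elevation, landcover):
    """Marginalize over an uncertain sea-level-rise scenario (e.g. scenario weights)."""
    return sum(p * p_dynamic(s, elevation, landcover) for s, p in slr_dist.items())

print(marginal_p_dynamic({"low": 0.4, "high": 0.6}, "low", "beach"))
```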
NASA Technical Reports Server (NTRS)
Lentz, Erika E.; Stippa, Sawyer R.; Thieler, E. Robert; Plant, Nathaniel G.; Gesch, Dean B.; Horton, Radley M.
2015-01-01
The U.S. Geological Survey is examining effects of future sea-level rise on the coastal landscape from Maine to Virginia by producing spatially explicit, probabilistic predictions using sea-level projections, vertical land movement rates (due to isostacy), elevation data, and land-cover data. Sea-level-rise scenarios used as model inputs are generated by using multiple sources of information, including Coupled Model Intercomparison Project Phase 5 models following representative concentration pathways 4.5 and 8.5 in the Intergovernmental Panel on Climate Change Fifth Assessment Report. A Bayesian network is used to develop a predictive coastal response model that integrates the sea-level, elevation, and land-cover data with assigned probabilities that account for interactions with coastal geomorphology as well as the corresponding ecological and societal systems it supports. The effects of sea-level rise are presented as (1) level of landscape submergence and (2) coastal response type characterized as either static (that is, inundation) or dynamic (that is, landform or landscape change). Results are produced at a spatial scale of 30 meters for four decades (the 2020s, 2030s, 2050s, and 2080s). The probabilistic predictions can be applied to landscape management decisions based on sea-level-rise effects as well as on assessments of the prediction uncertainty and need for improved data or fundamental understanding. This report describes the methods used to produce predictions, including information on input datasets; the modeling approach; model outputs; data-quality-control procedures; and information on how to access the data and metadata online.
Yoshimura, Etsuro; Kohdr, Hicham; Mori, Satoshi; Hider, Robert C
2011-08-01
The phytosiderophores, mugineic acid (MA) and epi-hydroxymugineic acid (HMA), together with a related compound, nicotianamine (NA), were investigated for their ability to bind Al(III). Potentiometric titration analysis demonstrated that MA and HMA bind Al(III), in contrast to NA which does not under normal physiological conditions. With MA and HMA, in addition to the Al complex (AlL), the protonated (AlLH) and deprotonated (AlLH(-1)) complexes were identified from an analysis of titration curves, where L denotes the phytosiderophore form in which all the carboxylate functions are ionized. The equilibrium formation constants of the Al(III) phytosiderophore complexes are much smaller than those of the corresponding Fe(III) complexes. The higher selectivity of phytosiderophores for Fe(III) over Al(III) facilitates Fe(III) acquisition in alkaline conditions where free Al(III) levels are higher than free Fe(III) levels.
A look-ahead probabilistic contingency analysis framework incorporating smart sampling techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Etingov, Pavel V.; Ren, Huiying
2016-07-18
This paper describes a framework for incorporating smart sampling techniques in a probabilistic look-ahead contingency analysis application. The predictive probabilistic contingency analysis helps reflect the impact of uncertainties caused by variable generation and load on potential violations of transmission limits.
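A toy version of the probabilistic contingency screening described: sample uncertain wind and load, recompute a (here, fictitious linear) post-contingency line flow, and report the probability of exceeding the transmission limit. The flow model and numbers are placeholders; production tools use full power-flow solutions and variance-reduction ("smart") sampling.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

wind_mw = rng.weibull(2.0, n) * 120.0     # uncertain wind injection
load_mw = rng.normal(480.0, 35.0, n)      # uncertain load forecast

# fictitious post-contingency line flow as a linear function of injections
line_flow = 0.6 * load_mw - 0.4 * wind_mw + 25.0
limit_mw = 320.0
print("P(limit violated):", np.mean(line_flow > limit_mw))
```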
Error Discounting in Probabilistic Category Learning
ERIC Educational Resources Information Center
Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.
2011-01-01
The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…
DOT National Transportation Integrated Search
2009-10-13
This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...
NASA Astrophysics Data System (ADS)
Fei, Cheng-Wei; Bai, Guang-Chen
2014-12-01
To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distribution collaborative probabilistic design method based on support vector regression (SR), called DCSRM, is proposed by integrating the distribution collaborative response surface method with a support vector regression model. The mathematical model of DCSRM is established and the probabilistic design idea behind DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is carried out to verify the proposed DCSRM. The analysis results reveal that an optimal static blade-tip clearance of the HPT is obtained for the BTRRC design, improving the performance and reliability of the aeroengine. A comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way to perform reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.
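The surrogate-assisted probabilistic design idea (a regression model standing in for the expensive clearance simulation during Monte Carlo analysis) can be sketched with scikit-learn's SVR. The quadratic "true" clearance function, parameter ranges and threshold below are invented stand-ins for the turbine model, not values from the paper.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(8)

def clearance_model(x):
    """Placeholder for the expensive blade-tip clearance simulation (2 inputs)."""
    return 1.5 - 0.3 * x[:, 0] + 0.1 * x[:, 0] ** 2 + 0.2 * x[:, 1]

# a small design of experiments to train the surrogate
x_train = rng.uniform(-1.0, 1.0, size=(60, 2))
y_train = clearance_model(x_train)
surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(x_train, y_train)

# cheap Monte Carlo through the surrogate for the probabilistic analysis
x_mc = rng.normal(0.0, 0.3, size=(100_000, 2))
clearance = surrogate.predict(x_mc)
print("P(clearance < 1.2 mm):", np.mean(clearance < 1.2))
```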
Superposition-Based Analysis of First-Order Probabilistic Timed Automata
NASA Astrophysics Data System (ADS)
Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph
This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.
[Diagnostic values of serum type III procollagen N-terminal peptide in type IV gastric cancer].
Akazawa, S; Fujiki, T; Kanda, Y; Kumai, R; Yoshida, S
1985-04-01
Since increased synthesis of collagen has been demonstrated in tissue of type IV gastric cancer, we attempted to distinguish type IV gastric cancer from other cancers by measuring serum levels of type III procollagen N-terminal peptide (type III-N-peptide). Mean serum levels in type IV gastric cancer patients without metastasis were found to be elevated above normal values and tended to be higher than those in type I, II and III gastric cancer patients without metastasis. Highly positive rates were found in patients with liver diseases including hepatoma, and in colon cancer, biliary tract cancer, and esophageal cancer patients with liver, lung or bone metastasis, but only 2 out of 14 of these cancer patients without such metastasis showed positive serum levels of type III-N-peptide. Positive cases in patients with type IV gastric cancer were obtained not only in the group with clinical stage IV but also in the groups with clinical stages II and III. In addition, high serum levels of type III-N-peptide in patients with type IV gastric cancer were seen not only in the cases with liver, lung or bone metastasis but also in cases with disseminated peritoneal metastasis alone. These results suggest that if the serum level of type III-N-peptide is elevated above normal values, type IV gastric cancer should be suspected after ruling out liver diseases, myelofibrosis and liver, lung or bone metastasis.
Lang, Carrie L; Simon, Diane; Kilgore, Jane
The American College of Surgeons Committee on Trauma revised the Resources for Optimal Care of the Injured Patient to include criteria for trauma centers to participate in a risk-adjusted benchmarking system. The Trauma Quality Improvement Program is currently the risk-adjusted benchmarking program sponsored by the American College of Surgeons, and participation will be required of all trauma centers in early 2017. Prior to this, there were no risk-adjusted programs for Level III verified trauma centers. The Ohio Society of Trauma Nurse Leaders is a collaborative group made up of trauma program managers, coordinators, and other trauma leaders who meet 6 times a year. Within this group, a Level III Subcommittee was formed, initially to provide a place for the Level III centers to discuss issues specific to them. When the new requirement regarding risk adjustment became official, the subcommittee agreed to begin reporting simple data points, with the aim of risk adjusting in the future.
As ecological risk assessments (ERA) move beyond organism-based determinations towards probabilistic population-level assessments, model complexity must be evaluated against the goals of the assessment, the information available to parameterize components with minimal dependence ...
Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. This presentation will describe the development of EPA’s screening-level, probabilistic SHEDS-Li...
Probabilistic approach to long range planning of manpower
NASA Technical Reports Server (NTRS)
Lejk, R. A.
1967-01-01
Publication presents a total long-range planning model for project-oriented organizations. The total model consists of planning systems which originate (1) at the project level and consolidate into an overall plan, and (2) from a budgetary ceiling and allocate to the individual projects. Analysis of (1) and (2) is provided for management decision making.
ERIC Educational Resources Information Center
Abayomi, Kobi; Pizarro, Gonzalo
2013-01-01
We offer a straightforward framework for measurement of progress, across many dimensions, using cross-national social indices, which we classify as linear combinations of multivariate country level data onto a univariate score. We suggest a Bayesian approach which yields probabilistic (confidence type) intervals for the point estimates of country…
Resonance lines and energy levels of Cs III, Ba IV, and La V
NASA Technical Reports Server (NTRS)
Epstein, G. L.; Reader, J.
1976-01-01
Spectra of Cs III, Ba IV, and La V were photographed in a low-voltage sliding spark on a 10.7 m normal-incidence vacuum spectrograph. These ions are isoelectronic with neutral iodine and display a halogen-like energy level structure. Detailed isoelectronic comparisons, level transition diagrams, and tabular data on the transitions of the ions and percentage compositions of Cs III configurations are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knowlton, Robert G.; Cochran, John Russell; Arnold, Bill Walter
2007-01-01
Sandia National Laboratories and the Institute of Nuclear Energy Research, Taiwan have collaborated in a technology transfer program related to low-level radioactive waste (LLW) disposal in Taiwan. Phase I of this program included regulatory analysis of LLW final disposal, development of LLW disposal performance assessment capabilities, and preliminary performance assessments of two potential disposal sites. Performance objectives were based on regulations in Taiwan and comparisons to those in the United States. Probabilistic performance assessment models were constructed based on limited site data using software including GoldSim, BLT-MS, FEHM, and HELP. These software codes provided the probabilistic framework, container degradation, waste-form leaching, groundwater flow, radionuclide transport, and cover infiltration simulation capabilities in the performance assessment. Preliminary performance assessment analyses were conducted for a near-surface disposal system and a mined cavern disposal system at two representative sites in Taiwan. Results of example calculations indicate peak simulated concentrations to a receptor within a few hundred years of LLW disposal, primarily from highly soluble, non-sorbing radionuclides.
Bullock, Theodore Holmes
1970-01-01
The prevalent probabilistic view is virtually untestable; it remains a plausible belief. The cases usually cited cannot be taken as evidence for it. Several grounds for this conclusion are developed. Three issues are distinguished in an attempt to clarify a murky debate: (a) the utility of probabilistic methods in data reduction, (b) the value of models that assume indeterminacy, and (c) the validity of the inference that the nervous system is largely indeterministic at the neuronal level. No exception is taken to the first two; the second is a private heuristic question. The third is the issue to which the assertion in the first two sentences is addressed. Of the two kinds of uncertainty, statistical mechanical (= practical unpredictability) as in a gas, and Heisenbergian indeterminacy, the first certainly exists, the second is moot at the neuronal level. It would contribute to discussion to recognize that neurons perform with a degree of reliability. Although unreliability is difficult to establish, to say nothing of measure, evidence is increasing greatly that some neurons have a high degree of reliability in both connections and activity. An example is given from sternarchine electric fish. PMID:5462670
Probabilistic representation of gene regulatory networks.
Mao, Linyong; Resat, Haluk
2004-09-22
Recent experiments have established unambiguously that biological systems can have significant cell-to-cell variations in gene expression levels even in isogenic populations. Computational approaches to studying gene expression in cellular systems should capture such biological variations for a more realistic representation. In this paper, we present a new fully probabilistic approach to the modeling of gene regulatory networks that allows for fluctuations in the gene expression levels. The new algorithm uses a very simple representation for the genes, and accounts for the repression or induction of the genes and for the biological variations among isogenic populations simultaneously. Because of its simplicity, the introduced algorithm is a very promising approach to modeling large-scale gene regulatory networks. We have tested the new algorithm on a recently bioengineered synthetic gene network library. The good agreement between the computed and the experimental results for this library of networks, and additional tests, demonstrate that the new algorithm is robust and very successful in explaining the experimental data. The simulation software is available upon request. Supplementary material will be made available on the OUP server.
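The abstract does not spell out the algorithm's update rules, but the flavor of a fully probabilistic gene network model can be sketched as follows: a toy two-gene network in which each production and degradation step is a random event, so isogenic "cells" accumulate different expression levels. The topology, production probabilities, and degradation rate below are all invented for illustration; this is a rough sketch of the general idea, not the authors' method.

import numpy as np

rng = np.random.default_rng(0)

# Toy two-gene network: gene B is repressed by gene A (topology invented).
# Expression evolves by probabilistic production/degradation steps, so
# isogenic "cells" drift apart even with identical parameters.
n_cells, n_steps = 200, 500
p_on_A = 0.10          # chance per step that A produces one transcript
deg = 0.02             # chance per step that any given transcript degrades

A = np.zeros(n_cells, dtype=int)
B = np.zeros(n_cells, dtype=int)
for _ in range(n_steps):
    A += rng.random(n_cells) < p_on_A
    # B's production probability drops as A accumulates (repression).
    p_on_B = 0.10 / (1.0 + 0.5 * A)
    B += rng.random(n_cells) < p_on_B
    A -= rng.binomial(A, deg)      # stochastic degradation
    B -= rng.binomial(B, deg)

print(f"A: mean {A.mean():.1f}, CV {A.std()/A.mean():.2f}")
print(f"B: mean {B.mean():.1f}, CV {B.std()/B.mean():.2f}")

The coefficients of variation printed at the end are the kind of cell-to-cell variability the abstract argues a realistic model must reproduce.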
Alteration of Antithrombin III and D-dimer Levels in Clinically Localized Prostate Cancer
Ko, Dong Woo; Park, Juhyun; Kim, In Sung; Doo, Seung Hwan; Yoon, Cheol Yong; Park, Hongzoo; Lee, Won Ki; Kim, Dae Sung; Jeong, Seong Jin; Byun, Seok-Soo; Lee, Sang Eun
2010-01-01
Purpose We performed a comparative analysis of the plasma levels of antithrombin (AT) III, plasminogen, fibrinogen, and D-dimer among patients with and without clinically localized prostate cancer to investigate the clinical significance of the coagulation profile in prostate cancer. Materials and Methods A prospective study was performed in which plasma levels of AT III, plasminogen, fibrinogen, and D-dimer were assessed in patients before they underwent prostate biopsy. According to the results of the biopsy, the patients were categorized into the cancer group or the control group. Levels of the four coagulation factors were then compared between the cancer and control groups. Also, levels of the four coagulation factors were correlated with tumor stage and grade in the cancer group. Results The cancer group had significantly lower levels of AT III activity and higher plasma D-dimer levels than did the control group (p=0.007 and p=0.018, respectively). Within the cancer group, no significant differences were observed in the levels of AT III, plasminogen, fibrinogen, or D-dimer between those with a pathological Gleason score of ≥7 and those with lower scores. Regarding pathologic stage of prostate cancer, the subjects with organ-confined disease and those with extraprostatic extension of a tumor demonstrated no significant differences in the preoperative levels of the four coagulation factors analyzed. Conclusions Our results suggest that plasma levels of AT III and D-dimer are altered in patients with prostate cancer. Further study is needed to elucidate the underlying mechanism and clinical significance of such a phenomenon among patients with clinically localized prostate cancer. PMID:20414406
NASA Astrophysics Data System (ADS)
Manceau, Jean-Charles; Loschetter, Annick; Rohmer, Jérémy; Le Cozannet, Gonéri; de Lary, Louis; Le Guénan, Thomas; Hnottavange-Telleen, Ken
2017-04-01
In a context of high degree of uncertainty, when very few data are available, experts are commonly requested to provide their opinions on input parameters of risk assessment models. Not only might each expert express a certain degree of uncertainty on his/her own statements, but the set of information collected from the pool of experts introduces an additional level of uncertainty. It is indeed very unlikely that all experts agree on exactly the same data, especially regarding parameters needed for natural risk assessments. In some cases, their opinions may differ only slightly (e.g. the most plausible value for a parameter is similar for different experts, and they only disagree on the level of uncertainties that taint the said value) while in other cases they may express incompatible opinions for the same parameter. Dealing with these different kinds of uncertainties remains a challenge for assessing geological hazards and/or risks. Extra-probabilistic approaches (such as the Dempster-Shafer theory or the possibility theory) have been shown to offer promising solutions for representing parameters on which the knowledge is limited. This is the case, for instance, when the available information prevents an expert from identifying a unique probability law to picture the total uncertainty. Moreover, such approaches are known to be particularly flexible when it comes to aggregating several and potentially conflicting opinions. We therefore propose to discuss the opportunity of applying these new theories for managing the uncertainties on parameters elicited by experts, by comparison with the application of more classical probability approaches. The discussion is based on two different examples. The first example deals with the estimation of the injected CO2 plume extent in a reservoir in the context of CO2 geological storage. This estimation requires information on the effective porosity of the reservoir, which has been estimated by 14 different experts. The Dempster-Shafer theory has been used to represent and aggregate these pieces of information. The results of different aggregation rules as well as those of a classical probabilistic approach are compared with the purpose of highlighting the elements each of them could provide to the decision-maker (Manceau et al., 2016). The second example focuses on projections of future sea-level rise. Based on IPCC's constraints on the projection quantiles, and on the scientific community consensus level on the physical limits to future sea-level rise, a possibility distribution of the projections by 2100 under the RCP 8.5 scenario has been established. This possibility distribution has been confronted with a set of previously published probabilistic sea-level projections, with a focus on their ability to explore high ranges of sea-level rise (Le Cozannet et al., 2016). These two examples are complementary in the sense that they allow us to address various aspects of the problem (e.g. representation of different types of information, conflict among experts, and source dependence). Moreover, we believe that the issues faced during these two exercises can be generalized to many risk/hazard assessment situations. References Manceau, JC., Loschetter, A., Rohmer, J., de Lary, L., Le Guénan, T., Hnottavange-Telleen, K. (2016). Dealing with uncertainty on parameters elicited from a pool of experts for CCS risk assessment. Congrès λμ 20 (St-Malo, France). Le Cozannet G., Manceau JC., Rohmer, J. (2016).
Bounding probabilistic sea-level rise projections within the framework of the possibility theory. Accepted in Environmental Research Letters.
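The abstract names the Dempster-Shafer theory as one way to represent and aggregate the experts' porosity estimates but gives no formulas. As a hedged illustration of the core operation, the sketch below applies Dempster's rule of combination to two hypothetical expert mass assignments over coarse porosity classes; the class labels, the masses, and the restriction to two experts are invented and are not taken from the study.

from itertools import product

# Frame of discernment: coarse porosity classes (hypothetical labels).
LOW, MID, HIGH = "low", "mid", "high"
THETA = frozenset({LOW, MID, HIGH})

# Basic belief assignments from two hypothetical experts.
# Keys are focal sets (subsets of THETA); values are masses summing to 1.
expert_a = {frozenset({LOW}): 0.5, frozenset({LOW, MID}): 0.3, THETA: 0.2}
expert_b = {frozenset({MID}): 0.4, frozenset({MID, HIGH}): 0.4, THETA: 0.2}

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule (conflict renormalized)."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}, conflict

fused, k = dempster_combine(expert_a, expert_b)
print(f"conflict mass K = {k:.3f}")
for focal, mass in sorted(fused.items(), key=lambda kv: -kv[1]):
    print(sorted(focal), round(mass, 3))

Alternative aggregation rules differ mainly in how the conflict mass K is redistributed, which is presumably the kind of comparison of rules the abstract refers to.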
Durrieu-Gaillard, Stéphanie; Dumay-Odelot, Hélène; Boldina, Galina; Tourasse, Nicolas J; Allard, Delphine; André, Fabrice; Macari, Françoise; Choquet, Armelle; Lagarde, Pauline; Drutel, Guillaume; Leste-Lasserre, Thierry; Petitet, Marion; Lesluyes, Tom; Lartigue-Faustin, Lydia; Dupuy, Jean-William; Chibon, Frédéric; Roeder, Robert G; Joubert, Dominique; Vagner, Stéphan; Teichmann, Martin
2018-01-01
RNA polymerase (Pol) III transcribes small untranslated RNAs that are essential for cellular homeostasis and growth. Its activity is regulated by inactivation of tumor suppressor proteins and overexpression of the oncogene c-MYC, but the concerted action of these tumor-promoting factors on Pol III transcription has not yet been assessed. In order to comprehensively analyse the regulation of Pol III transcription during tumorigenesis, we employ a model system that relies on the expression of five genetic elements to achieve cellular transformation. Expression of these elements in six distinct transformation intermediate cell lines leads to the inactivation of TP53, RB1, and protein phosphatase 2A, as well as the activation of RAS and the protection of telomeres by TERT, thereby leading to full tumoral transformation of IMR90 fibroblasts. Transformation is accompanied by moderately enhanced levels of a subset of Pol III-transcribed RNAs (7SK; MRP; H1). In addition, mRNA and/or protein levels of several Pol III subunits and transcription factors are upregulated, including increased protein levels of TFIIIB and TFIIIC subunits, of SNAPC1 and of Pol III subunits. Strikingly, the expression of POLR3G and of SNAPC1 is strongly enhanced during transformation in this cellular transformation model. Collectively, our data indicate that increased expression of several components of the Pol III transcription system accompanied by a 2-fold increase in steady state levels of a subset of Pol III RNAs is sufficient for sustaining tumor formation.
Probabilistic 3D data fusion for multiresolution surface generation
NASA Technical Reports Server (NTRS)
Manduchi, R.; Johnson, A. E.
2002-01-01
In this paper we present an algorithm for adaptive resolution integration of 3D data collected from multiple distributed sensors. The input to the algorithm is a set of 3D surface points and associated sensor models. Using a probabilistic rule, a surface probability function is generated that represents the probability that a particular volume of space contains the surface. The surface probability function is represented using an octree data structure; regions of space with samples of large covariance are stored at a coarser level than regions of space containing samples with smaller covariance. The algorithm outputs an adaptive resolution surface generated by connecting points that lie on the ridge of surface probability with triangles scaled to match the local discretization of space given by the algorithm. We present results from 3D data generated by scanning lidar and structure from motion.
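The abstract describes accumulating a surface probability over an octree from 3D points and their sensor covariances. The sketch below illustrates the idea in a deliberately simplified form, assuming a fixed-resolution voxel grid instead of an octree, isotropic Gaussian sensor noise, and an invented "at least one sample falls in the voxel" combination rule; none of these choices are claimed to match the paper's exact probabilistic rule, and all numbers are synthetic.

import numpy as np

# Hypothetical noisy 3D surface samples (x, y, z) with per-sample std devs;
# the points lie near the plane z = 0.5 (values invented for illustration).
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0, 1, 200),
                       rng.uniform(0, 1, 200),
                       0.5 + rng.normal(0, 0.02, 200)])
sigmas = rng.uniform(0.01, 0.05, len(pts))        # sensor-model std dev per point

# Fixed-resolution voxel grid over the unit cube (the paper adapts resolution
# with an octree; a uniform grid keeps the sketch short).
n = 16
edges = np.linspace(0.0, 1.0, n + 1)
centers = 0.5 * (edges[:-1] + edges[1:])
cx, cy, cz = np.meshgrid(centers, centers, centers, indexing="ij")
voxels = np.stack([cx, cy, cz], axis=-1)          # (n, n, n, 3) voxel centers

# Per-voxel probability that at least one sample's true surface point falls
# in the voxel, using an isotropic Gaussian sensor model per sample.
cell = 1.0 / n
p_not = np.ones((n, n, n))
for p, s in zip(pts, sigmas):
    d2 = np.sum((voxels - p) ** 2, axis=-1)
    # crude Gaussian probability mass inside a voxel of side `cell`
    p_i = np.minimum(1.0, (cell ** 3) *
                     np.exp(-0.5 * d2 / s ** 2) / ((2 * np.pi) ** 1.5 * s ** 3))
    p_not *= 1.0 - p_i
surface_prob = 1.0 - p_not

k = np.unravel_index(np.argmax(surface_prob), surface_prob.shape)
print("most likely surface voxel:", k, "P =", round(float(surface_prob[k]), 3))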
Composite load spectra for select space propulsion structural components
NASA Technical Reports Server (NTRS)
Newell, J. F.; Kurth, R. E.; Ho, H.
1991-01-01
The objective of this program is to develop generic load models with multiple levels of progressive sophistication to simulate the composite (combined) load spectra that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen posts and system ducting. The first approach will consist of using state of the art probabilistic methods to describe the individual loading conditions and combinations of these loading conditions to synthesize the composite load spectra simulation. The second approach will consist of developing coupled models for composite load spectra simulation which combine the deterministic models for composite load dynamic, acoustic, high pressure, and high rotational speed, etc., load simulation using statistically varying coefficients. These coefficients will then be determined using advanced probabilistic simulation methods with and without strategically selected experimental data.
Then we all fall down: fall mortality by trauma center level.
Roubik, Daniel; Cook, Alan D; Ward, Jeanette G; Chapple, Kristina M; Teperman, Sheldon; Stone, Melvin E; Gross, Brian; Moore, Forrest O
2017-09-01
Ground-level falls (GLFs) are the predominant mechanism of injury in US trauma centers and accompany a spectrum of comorbidities, injury severity, and physiologic derangement. Trauma center levels define tiers of capability to treat injured patients. We hypothesized that risk-adjusted observed-to-expected mortality (O:E) by trauma center level would evaluate the degree to which need for care was met by provision of care. This retrospective cohort study used National Trauma Data Bank files for 2007-2014. Trauma center level was defined as American College of Surgeons (ACS) level I/II, ACS III/IV, State I/II, and State III/IV for within-group homogeneity. Risk-adjusted expected mortality was estimated using hierarchical, multivariable regression techniques. Analysis of 812,053 patients' data revealed the proportion of GLF in the National Trauma Data Bank increased 8.7% (14.1%-22.8%) over the 8 y studied. Mortality was 4.21% overall with a three-fold increase for those aged 60 y and older versus younger than 60 y (4.93% versus 1.46%, P < 0.001). O:E was lowest for ACS III/IV (0.973, 95% CI: 0.971-0.975) and highest for State III/IV (1.043, 95% CI: 1.041-1.044). Risk-adjusted outcomes can be measured and meaningfully compared among groups of trauma centers. Differential O:E for ACS III/IV and State III/IV centers suggests that factors beyond case mix alone influence outcomes for GLF patients. More work is needed to optimize trauma care for GLF patients across the spectrum of trauma center capability. Copyright © 2017 Elsevier Inc. All rights reserved.
Topics in Probabilistic Judgment Aggregation
ERIC Educational Resources Information Center
Wang, Guanchun
2011-01-01
This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…
Cognitive Development Effects of Teaching Probabilistic Decision Making to Middle School Students
ERIC Educational Resources Information Center
Mjelde, James W.; Litzenberg, Kerry K.; Lindner, James R.
2011-01-01
This study investigated the comprehension and effectiveness of teaching formal, probabilistic decision-making skills to middle school students. Two specific objectives were to determine (1) if middle school students can comprehend a probabilistic decision-making approach, and (2) if exposure to the modeling approaches improves middle school…
Generative Topic Modeling in Image Data Mining and Bioinformatics Studies
ERIC Educational Resources Information Center
Chen, Xin
2012-01-01
Probabilistic topic models have been developed for applications in various domains such as text mining, information retrieval, computer vision, and bioinformatics. In this thesis, we focus on developing novel probabilistic topic models for image mining and bioinformatics studies. Specifically, a probabilistic topic-connection (PTC) model…
Probabilistic Cue Combination: Less Is More
ERIC Educational Resources Information Center
Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen
2013-01-01
Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…
Is Probabilistic Evidence a Source of Knowledge?
ERIC Educational Resources Information Center
Friedman, Ori; Turri, John
2015-01-01
We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…
Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification
NASA Technical Reports Server (NTRS)
Townsend, John S.; Peck, J.; Ayala, S.
1999-01-01
NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.
Reliability and risk assessment of structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1991-01-01
Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.
Fully probabilistic control design in an adaptive critic framework.
Herzallah, Randa; Kárný, Miroslav
2011-12-01
An optimal stochastic controller pushes the closed-loop behavior as close as possible to the desired one. The fully probabilistic design (FPD) uses a probabilistic description of the desired closed loop and minimizes the Kullback-Leibler divergence of the closed-loop description to the desired one. Practical exploitation of the fully probabilistic design control theory continues to be hindered by the computational complexities involved in numerically solving the associated stochastic dynamic programming problem; in particular, very hard multivariate integration and an approximate interpolation of the involved multivariate functions. This paper proposes a new fully probabilistic control algorithm that uses the adaptive critic methods to circumvent the need for explicitly evaluating the optimal value function, thereby dramatically reducing computational requirements. This is a main contribution of this paper. Copyright © 2011 Elsevier Ltd. All rights reserved.
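Fully probabilistic design scores a controller by the Kullback-Leibler divergence of the closed-loop description from a desired one. The toy sketch below illustrates only that scoring idea on an invented three-state, two-action Markov chain, using a brute-force search over state-independent randomized policies; the actual FPD solves a stochastic dynamic program (replaced here by the search), and the paper's adaptive-critic approximation is not reproduced.

import numpy as np
from scipy.stats import entropy   # entropy(p, q) computes KL(p || q)

# Toy discrete system: 3 states, 2 actions; rows of T[a] are P(x' | x, a).
T = np.array([[[0.8, 0.2, 0.0],
               [0.1, 0.8, 0.1],
               [0.0, 0.3, 0.7]],
              [[0.5, 0.5, 0.0],
               [0.0, 0.5, 0.5],
               [0.0, 0.1, 0.9]]])
x0 = np.array([1.0, 0.0, 0.0])          # initial state distribution
desired = np.array([0.1, 0.2, 0.7])     # desired closed-loop state distribution

# Brute-force search over state-independent randomized policies pi = P(a=1);
# the full FPD optimizes state-dependent randomized rules analytically.
best = None
for pi in np.linspace(0.0, 1.0, 101):
    policy = np.array([1.0 - pi, pi])                 # P(a)
    closed_loop = np.einsum("a,axy->xy", policy, T)   # P(x' | x) under the policy
    p = x0.copy()
    for _ in range(5):                                # propagate a few steps
        p = p @ closed_loop
    kl = entropy(p, desired)                          # divergence from the target
    if best is None or kl < best[0]:
        best = (kl, pi)

print(f"best P(a=1) = {best[1]:.2f}, KL to desired = {best[0]:.4f}")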
NASA Astrophysics Data System (ADS)
Naseri Kouzehgarani, Asal
2009-12-01
Most models of aircraft trajectories are non-linear and stochastic in nature, and their internal parameters are often poorly defined. The ability to model, simulate and analyze realistic air traffic management conflict detection scenarios in a scalable, composable, multi-aircraft fashion is an extremely difficult endeavor. Accurate techniques for aircraft mode detection are critical in order to enable the precise projection of aircraft conflicts, and for the enactment of altitude separation resolution strategies. Conflict detection is an inherently probabilistic endeavor; our ability to detect conflicts in a timely and accurate manner over a fixed time horizon is traded off against the increased human workload created by false alarms---that is, situations that would not develop into an actual conflict, or would resolve naturally in the appropriate time horizon, thereby introducing a measure of probabilistic uncertainty in any decision aid fashioned to assist air traffic controllers. The interaction of the continuous dynamics of the aircraft, used for prediction purposes, with the discrete conflict detection logic gives rise to the hybrid nature of the overall system. The introduction of the probabilistic element, common to decision alerting and aiding devices, places the conflict detection and resolution problem in the domain of probabilistic hybrid phenomena. A hidden Markov model (HMM) has two stochastic components: a finite-state Markov chain and a finite set of output probability distributions. In other words, it is an unobservable (hidden) stochastic process that can only be observed through another set of stochastic processes that generate the sequence of observations. The problem of self separation in distributed air traffic management reduces to the ability of aircraft to communicate state information to neighboring aircraft, as well as to model the evolution of aircraft trajectories between communications, in the presence of probabilistic uncertain dynamics as well as partially observable and uncertain data. We introduce the Hybrid Hidden Markov Modeling (HHMM) formalism to enable the prediction of the stochastic aircraft states (and thus, potential conflicts), by combining elements of the probabilistic timed input output automaton and the partially observable Markov decision process frameworks, along with the novel addition of a Markovian scheduler to remove the non-deterministic elements arising from the enabling of several actions simultaneously. Comparisons of aircraft in level, climbing/descending and turning flight are performed, and unknown flight track data is evaluated probabilistically against the tuned model in order to assess the effectiveness of the model in detecting the switch between multiple flight modes for a given aircraft. This also allows for the generation of a probabilistic distribution over the execution traces of the hybrid hidden Markov model, which then enables the prediction of the states of aircraft based on partially observable and uncertain data. Based on the composition properties of the HHMM, we study a decentralized air traffic system where aircraft are moving along streams and can perform cruise, acceleration, climb and turn maneuvers. We develop a common decentralized policy for conflict avoidance with spatially distributed agents (aircraft in the sky) and assure its safety properties via correctness proofs.
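The hidden Markov backbone of the proposed hybrid model can be illustrated with the standard forward (filtering) recursion: given a sequence of quantized observations, compute the posterior probability of each hidden flight mode at every step. The transition matrix, emission matrix, and observation symbols below are invented for illustration, and the hybrid extensions described in the abstract (continuous dynamics, Markovian scheduler, composition) are not included.

import numpy as np

modes = ["level", "climb/descend", "turn"]          # hidden states
A = np.array([[0.90, 0.05, 0.05],                   # P(mode_t | mode_{t-1})
              [0.10, 0.85, 0.05],
              [0.10, 0.05, 0.85]])
# Observations: quantized vertical-rate/heading-rate symbols 0..2 (invented).
B = np.array([[0.80, 0.10, 0.10],                   # P(obs | mode)
              [0.15, 0.80, 0.05],
              [0.15, 0.05, 0.80]])
pi0 = np.array([0.6, 0.2, 0.2])                     # prior over modes

def forward_filter(obs):
    """Return per-step posteriors P(mode_t | obs_1..t) via the normalized forward pass."""
    alpha = pi0 * B[:, obs[0]]
    alpha /= alpha.sum()
    posts = [alpha]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]               # predict, then update
        alpha /= alpha.sum()
        posts.append(alpha)
    return np.array(posts)

track = [0, 0, 1, 1, 1, 2, 2, 0]                    # hypothetical observation sequence
for t, post in enumerate(forward_filter(track)):
    print(t, modes[int(np.argmax(post))], np.round(post, 2))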
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.; Schutte, Jeff S.
2016-01-01
This report documents work done by the Aerospace Systems Design Lab (ASDL) at the Georgia Institute of Technology, Daniel Guggenheim School of Aerospace Engineering for the National Aeronautics and Space Administration, Aeronautics Research Mission Directorate, Integrated System Research Program, Environmentally Responsible Aviation (ERA) Project. This report was prepared under contract NNL12AA12C, "Application of Deterministic and Probabilistic System Design Methods and Enhancement of Conceptual Design Tools for ERA Project". The research within this report addressed the Environmentally Responsible Aviation (ERA) project goal stated in the NRA solicitation "to advance vehicle concepts and technologies that can simultaneously reduce fuel burn, noise, and emissions." To identify technology and vehicle solutions that simultaneously meet these three metrics requires the use of system-level analysis with the appropriate level of fidelity to quantify feasibility, benefits and degradations, and associated risk. In order to perform the system level analysis, the Environmental Design Space (EDS) [Kirby 2008, Schutte 2012a] environment developed by ASDL was used to model both conventional and unconventional configurations as well as to assess technologies from the ERA and N+2 timeframe portfolios. A well-established system design approach was used to perform aircraft conceptual design studies, including technology trade studies to identify technology portfolios capable of accomplishing the ERA project goal and to obtain accurate tradeoffs between performance, noise, and emissions. The ERA goal, shown in Figure 1, is to simultaneously achieve the N+2 benefits of a cumulative noise margin of 42 EPNdB relative to stage 4, a 75 percent reduction in LTO NOx emissions relative to CAEP 6 and a 50 percent reduction in fuel burn relative to the 2005 best in class aircraft. There were five research tasks associated with this research: 1) identify technology collectors, 2) model technology collectors in EDS, 3) model and assess ERA technologies, 4) LTO and cruise emission prediction, and 5) probabilistic analysis of technology collectors and portfolios.
Song, Min-Ae; Marian, Catalin; Brasky, Theodore M; Reisinger, Sarah; Djordjevic, Mirjana; Shields, Peter G
2016-03-14
Use of smokeless tobacco products (STPs) is associated with oral cavity cancer and other health risks. Comprehensive analysis for chemical composition and toxicity is needed to compare conventional and newer STPs with lower tobacco-specific nitrosamines (TSNAs) yields. Seven conventional and 12 low-TSNA moist snuff products purchased in the U.S., Sweden, and South Africa were analyzed for 18 chemical constituents (International Agency for Research on Cancer classified carcinogens), pH, nicotine, and free nicotine. Chemicals were compared in each product using the Wilcoxon rank-sum test and principal component analysis (PCA). Conventional compared to low-TSNA moist snuff products had higher ammonia, benzo[a]pyrene, cadmium, nickel, nicotine, nitrate, and TSNAs and had lower arsenic in dry weight content and per mg nicotine. Lead and chromium were significantly higher in low-TSNA moist snuff products. PCA showed a clear difference for constituents between conventional and low-TSNA moist snuff products. Differences among products were reduced when considered on a per mg nicotine basis. As one way to contextualize differences in constituent levels, probabilistic lifetime cancer risk was estimated for chemicals included in The University of California's carcinogenic potency database (CPDB). Estimated probabilistic cancer risks were 3.77-fold or 3-fold higher in conventional compared to low-TSNA moist snuff products under dry weight or under per mg nicotine content, respectively. In vitro testing for the STPs indicated low level toxicity and no substantial differences. The comprehensive chemical characterization of both conventional and low-TSNA moist snuff products from this study provides a broader assessment of understanding differences in carcinogenic potential of the products. In addition, the high levels and probabilistic cancer risk estimates for certain chemical constituents of smokeless tobacco products will further inform regulatory decision makers and aid them in their efforts to reduce carcinogen exposure in smokeless tobacco products. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Nowak, W.; Enzenhoefer, R.; Bunk, T.
2013-12-01
Wellhead protection zones are commonly delineated via advective travel time analysis without considering any aspects of model uncertainty. In the past decade, research efforts produced quantifiable risk-based safety margins for protection zones. They are based on well vulnerability criteria (e.g., travel times, exposure times, peak concentrations) cast into a probabilistic setting, i.e., they consider model and parameter uncertainty. Practitioners still refrain from applying these new techniques for mainly three reasons. (1) They fear the possibly cost-intensive additional areal demand of probabilistic safety margins, (2) probabilistic approaches are allegedly complex, not readily available, and consume huge computing resources, and (3) uncertainty bounds are fuzzy, whereas final decisions are binary. The primary goal of this study is to show that these reservations are unjustified. We present a straightforward and computationally affordable framework based on a novel combination of well-known tools (e.g., MODFLOW, PEST, Monte Carlo). This framework provides risk-informed decision support for robust and transparent wellhead delineation under uncertainty. Thus, probabilistic risk-informed wellhead protection is possible with methods readily available for practitioners. As vivid proof of concept, we illustrate our key points on a pumped karstic well catchment, located in Germany. In the case study, we show that reliability levels can be increased by re-allocating the existing delineated area at no increase in delineated area. This is achieved by simply swapping delineated low-risk areas against previously non-delineated high-risk areas. Also, we show that further improvements may often be available at only low additional delineation area. Depending on the context, increases or reductions of delineated area directly translate to costs and benefits, if the land is priced, or if land owners need to be compensated for land use restrictions.
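Stripped of the MODFLOW/PEST machinery, the decision step the abstract argues is affordable can be sketched as follows: given Monte Carlo realizations of advective travel time from each candidate land parcel to the well, compute the probability that the travel time falls below a protection criterion and delineate the parcels whose risk exceeds the tolerated residual. The travel-time distributions, the 50-day criterion, and the 95% reliability target below are invented placeholders, not values from the case study.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Monte Carlo ensemble: travel times (days) from each of 6 land
# parcels to the well, one row per realization (e.g., from a calibrated
# groundwater model run with sampled parameter sets).
n_real, n_parcels = 500, 6
median_tt = np.array([20.0, 45.0, 60.0, 90.0, 150.0, 400.0])   # invented medians
travel_time = median_tt * rng.lognormal(mean=0.0, sigma=0.6, size=(n_real, n_parcels))

threshold_days = 50.0          # protection criterion (e.g., a 50-day travel time)
target_reliability = 0.95      # delineate so that P(missed fast path) <= 5 %

# Per-parcel probability that a release there reaches the well within the threshold.
p_fast = (travel_time < threshold_days).mean(axis=0)

# Risk-informed delineation: include every parcel whose exceedance probability
# is above the tolerated residual risk.
delineate = p_fast > (1.0 - target_reliability)
for i, (p, d) in enumerate(zip(p_fast, delineate)):
    print(f"parcel {i}: P(travel time < {threshold_days:.0f} d) = {p:.2f}"
          f" -> {'protect' if d else 'exclude'}")

Swapping low-risk delineated parcels for high-risk non-delineated ones, as the abstract describes, amounts to re-ranking parcels by p_fast at fixed total area.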
Groundwater Remediation using Bayesian Information-Gap Decision Theory
NASA Astrophysics Data System (ADS)
O'Malley, D.; Vesselinov, V. V.
2016-12-01
Probabilistic analyses of groundwater remediation scenarios frequently fail because the probability of an adverse, unanticipated event occurring is often high. In general, models of flow and transport in contaminated aquifers are always simpler than reality. Further, when a probabilistic analysis is performed, probability distributions are usually chosen more for convenience than correctness. The Bayesian Information-Gap Decision Theory (BIGDT) was designed to mitigate the shortcomings of the models and probabilistic decision analyses by leveraging a non-probabilistic decision theory - information-gap decision theory. BIGDT considers possible models that have not been explicitly enumerated and does not require us to commit to a particular probability distribution for model and remediation-design parameters. Both the set of possible models and the set of possible probability distributions grow as the degree of uncertainty increases. The fundamental question that BIGDT asks is "How large can these sets be before a particular decision results in an undesirable outcome?". The decision that allows these sets to be the largest is considered to be the best option. In this way, BIGDT enables robust decision-support for groundwater remediation problems. Here we apply BIGDT to a representative groundwater remediation scenario where different options for hydraulic containment and pump & treat are being considered. BIGDT requires many model runs, and for complex models high-performance computing resources are needed. These analyses are carried out on synthetic problems, but are applicable to real-world problems such as LANL site contaminations. BIGDT is implemented in Julia (a high-level, high-performance dynamic programming language for technical computing) and is part of the MADS framework (http://mads.lanl.gov/ and https://github.com/madsjulia/Mads.jl).
Probabilistic topic modeling for the analysis and classification of genomic sequences
2015-01-01
Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well defined region of the whole genome. Recently, alignment-free techniques are gaining more importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequence clustering and classification is proposed. The method is based on k-mer representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied on DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and the Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method reaches very similar results to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra short sequences and it exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
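A minimal version of the k-mer/topic-model pipeline the paper describes can be put together with scikit-learn: count fixed-length k-mers per sequence, fit LDA, and use the per-sequence topic proportions as features for a downstream classifier. The toy sequences and the choice of k = 4 and two topics below are invented for illustration; the study trained on thousands of 16S barcode sequences and compared against the RDP classifier and an SVM.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy DNA "documents" (invented); the study used >7000 16S barcode sequences.
seqs = [
    "ACGTACGTACGGTCAACGTTAGC",
    "ACGTACGGTCAACGTTAGCCGTA",
    "TTGACCTTGACCGGAATTGACCA",
    "GGAATTGACCATTGACCTTGACC",
]

k = 4  # k-mer length; the paper uses fixed-length k-mers
vectorizer = CountVectorizer(analyzer="char", ngram_range=(k, k), lowercase=False)
X = vectorizer.fit_transform(seqs)          # sequence-by-k-mer count matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)                # per-sequence topic proportions

for s, t in zip(seqs, theta):
    print(s[:12] + "...", [round(x, 2) for x in t])
# The topic-proportion vectors `theta` would then feed a classifier
# (the paper compares against the RDP classifier and an SVM).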
Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria
2012-01-01
This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
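The combination step, turning local fracture probabilities into a whole-ribcage injury probability, can be sketched as a Poisson-binomial calculation, assuming independent fracture events and using "three or more fractured ribs" as a stand-in severity criterion. The per-rib probabilities below are invented and this is not the THUMS post-processing used in the study.

import numpy as np

# Hypothetical per-rib fracture probabilities, e.g. P(peak strain > ultimate
# strain) under an age-adjusted ultimate-strain distribution (values invented).
p_rib = np.array([0.05, 0.10, 0.30, 0.40, 0.35, 0.20, 0.10, 0.05,
                  0.04, 0.08, 0.25, 0.35, 0.30, 0.15, 0.08, 0.04])

def num_fracture_distribution(p):
    """Poisson-binomial distribution of the number of fractured ribs,
    assuming independent fracture events (a simplifying assumption)."""
    dist = np.array([1.0])                        # starts with P(0 fractures) = 1
    for pi in p:
        dist = np.convolve(dist, [1.0 - pi, pi])  # add one Bernoulli rib at a time
    return dist

dist = num_fracture_distribution(p_rib)
expected = float(np.dot(np.arange(len(dist)), dist))
p_3plus = float(dist[3:].sum())                   # >= 3 fractures as a severity proxy

print(f"expected number of fractures: {expected:.2f}")
print(f"P(>= 3 rib fractures): {p_3plus:.3f}")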
NASA Technical Reports Server (NTRS)
Packard, Michael H.
2002-01-01
Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used for taking test data related to particular failure modes and calculating failure rate distributions of electronic and electromechanical components. How can these individual failure time distributions of structural, electronic and electromechanical component failure modes be effectively combined into a top level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic model of a turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools (Quantitative Risk Assessment Software (QRAS) etc.). Hypothetical PSA results for a number of structural components along with mitigation factors that would restrict the failure mode from propagating to a Loss of Mission (LOM) failure were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to the overall Mission Success (MS) is also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes and upgrades, the application of different maintenance intervals, inclusion of new sensor detection of faults and other upgrades were evaluated in determining overall turbine engine reliability.
Eliciting probabilistic expectations: Collaborations between psychologists and economists
Bruine de Bruin, Wändi
2017-01-01
We describe two collaborations in which psychologists and economists provided essential support on foundational projects in major research programs. One project involved eliciting adolescents’ expectations regarding significant future life events affecting their psychological and economic development. The second project involved eliciting consumers’ expectations regarding inflation, a potentially vital input to their investment, saving, and purchasing decisions. In each project, we sought questions with the precision needed for economic modeling and the simplicity needed for lay respondents. We identify four conditions that, we believe, promoted our ability to sustain these transdisciplinary collaborations and coproduce the research: (i) having a shared research goal, which neither discipline could achieve on its own; (ii) finding common ground in shared methodology, which met each discipline’s essential evidentiary conditions, but without insisting on its culturally acquired tastes; (iii) sharing the effort throughout, with common language and sense of ownership; and (iv) gaining mutual benefit from both the research process and its products. PMID:28270610
Eliciting probabilistic expectations: Collaborations between psychologists and economists.
Bruine de Bruin, Wändi; Fischhoff, Baruch
2017-03-28
We describe two collaborations in which psychologists and economists provided essential support on foundational projects in major research programs. One project involved eliciting adolescents' expectations regarding significant future life events affecting their psychological and economic development. The second project involved eliciting consumers' expectations regarding inflation, a potentially vital input to their investment, saving, and purchasing decisions. In each project, we sought questions with the precision needed for economic modeling and the simplicity needed for lay respondents. We identify four conditions that, we believe, promoted our ability to sustain these transdisciplinary collaborations and coproduce the research: (i) having a shared research goal, which neither discipline could achieve on its own; (ii) finding common ground in shared methodology, which met each discipline's essential evidentiary conditions, but without insisting on its culturally acquired tastes; (iii) sharing the effort throughout, with common language and sense of ownership; and (iv) gaining mutual benefit from both the research process and its products.
SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin
A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide the confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.
Probabilistic seismic hazard zonation for the Cuban building code update
NASA Astrophysics Data System (ADS)
Garcia, J.; Llanes-Buron, C.
2013-05-01
A probabilistic seismic hazard assessment has been performed in response to a revision and update of the Cuban building code (NC-46-99) for earthquake-resistant building construction. The hazard assessment has been done according to the standard probabilistic approach (Cornell, 1968), importing the procedures adopted by other nations dealing with the problem of revising and updating their national building codes. Problems of earthquake catalogue treatment, attenuation of peak and spectral ground acceleration, as well as seismic source definition have been rigorously analyzed, and a logic-tree approach was used to represent the inevitable uncertainties encountered through the whole seismic hazard estimation process. The seismic zonation proposed here consists of a map reflecting the behaviour of the spectral acceleration values for short (0.2 s) and long (1.0 s) periods on rock conditions with a 1642-year return period, which is considered the maximum credible earthquake (ASCE 07-05). In addition, three other design levels are proposed (severe earthquake, with an 808-year return period; ordinary earthquake, with a 475-year return period; and minimum earthquake, with a 225-year return period). The seismic zonation proposed here fulfils international standards (IBC-ICC) as well as world trends in this field.
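The standard Cornell-type ingredients behind such a zonation can be sketched for a single source: a truncated Gutenberg-Richter magnitude distribution, a lognormal ground-motion model, and numerical integration to get the annual rate of exceedance, from which the acceleration at a chosen return period is interpolated. Every number below (activity rate, b-value, toy attenuation coefficients, distance) is invented, and the logic tree and spectral-period treatment of the study are not included.

import numpy as np
from scipy.stats import norm

# One hypothetical areal source (all numbers invented for illustration).
nu = 0.2            # annual rate of events with M >= Mmin
b = 1.0             # Gutenberg-Richter b-value
Mmin, Mmax = 5.0, 7.5
R_km = 30.0         # representative source-to-site distance

def gr_pdf(m, beta=b * np.log(10.0)):
    """Truncated exponential (Gutenberg-Richter) magnitude density."""
    return beta * np.exp(-beta * (m - Mmin)) / (1.0 - np.exp(-beta * (Mmax - Mmin)))

def ln_pga_median(m, r):
    """Toy attenuation relation: median ln PGA (g) as a function of M and distance."""
    return -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0)

sigma_ln = 0.6      # aleatory variability of the ground-motion model

def annual_exceedance_rate(a):
    """lambda(PGA > a) = nu * integral over M of P(PGA > a | m, R) f(m) dm."""
    m = np.linspace(Mmin, Mmax, 400)
    p_exceed = norm.sf((np.log(a) - ln_pga_median(m, R_km)) / sigma_ln)
    return nu * np.trapz(p_exceed * gr_pdf(m), m)

accels = np.logspace(-2, 0, 60)                       # 0.01 g to 1 g
rates = np.array([annual_exceedance_rate(a) for a in accels])

# PGA with a 475-year return period, read off the hazard curve by interpolation.
pga_475 = float(np.interp(1.0 / 475.0, rates[::-1], accels[::-1]))
print(f"PGA(475-yr return period) ~ {pga_475:.3f} g")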
NASA Technical Reports Server (NTRS)
Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha
2012-01-01
Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.
The Role of Working Memory in the Probabilistic Inference of Future Sensory Events.
Cashdollar, Nathan; Ruhnau, Philipp; Weisz, Nathan; Hasson, Uri
2017-05-01
The ability to represent the emerging regularity of sensory information from the external environment has been thought to allow one to probabilistically infer future sensory occurrences and thus optimize behavior. However, the underlying neural implementation of this process is still not comprehensively understood. Through a convergence of behavioral and neurophysiological evidence, we establish that the probabilistic inference of future events is critically linked to people's ability to maintain the recent past in working memory. Magnetoencephalography recordings demonstrated that when visual stimuli occurring over an extended time series had a greater statistical regularity, individuals with higher working-memory capacity (WMC) displayed enhanced slow-wave neural oscillations in the θ frequency band (4-8 Hz.) prior to, but not during stimulus appearance. This prestimulus neural activity was specifically linked to contexts where information could be anticipated and influenced the preferential sensory processing for this visual information after its appearance. A separate behavioral study demonstrated that this process intrinsically emerges during continuous perception and underpins a realistic advantage for efficient behavioral responses. In this way, WMC optimizes the anticipation of higher level semantic concepts expected to occur in the near future. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Incorporating Duration Information in Activity Recognition
NASA Astrophysics Data System (ADS)
Chaurasia, Priyanka; Scotney, Bryan; McClean, Sally; Zhang, Shuai; Nugent, Chris
Activity recognition has become a key issue in smart home environments. The problem involves learning high level activities from low level sensor data. Activity recognition can depend on several variables; one such variable is duration of engagement with sensorised items or duration of intervals between sensor activations that can provide useful information about personal behaviour. In this paper a probabilistic learning algorithm is proposed that incorporates episode, time and duration information to determine inhabitant identity and the activity being undertaken from low level sensor data. Our results verify that incorporating duration information consistently improves the accuracy.
Chen, Yi; Pouillot, Régis; S Burall, Laurel; Strain, Errol A; Van Doren, Jane M; De Jesus, Antonio J; Laasri, Anna; Wang, Hua; Ali, Laila; Tatavarthy, Aparna; Zhang, Guodong; Hu, Lijun; Day, James; Sheth, Ishani; Kang, Jihun; Sahu, Surasri; Srinivasan, Devayani; Brown, Eric W; Parish, Mickey; Zink, Donald L; Datta, Atin R; Hammack, Thomas S; Macarisin, Dumitru
2017-01-16
A precise and accurate method for enumeration of low levels of Listeria monocytogenes in foods is critical to a variety of studies. In this study, a paired comparison of most probable number (MPN) and direct plating enumeration of L. monocytogenes was conducted on a total of 1730 outbreak-associated ice cream samples that were naturally contaminated with low levels of L. monocytogenes. MPN was performed on all 1730 samples. Direct plating was performed on all samples using the RAPID'L.mono (RLM) agar (1600 samples) and agar Listeria Ottaviani and Agosti (ALOA; 130 samples). Probabilistic analysis with a Bayesian inference model was used to compare paired direct plating and MPN estimates of L. monocytogenes in ice cream samples because assumptions implicit in ordinary least squares (OLS) linear regression analyses were not met for such a comparison. The probabilistic analysis revealed good agreement between the MPN and direct plating estimates, and this agreement showed that the MPN schemes and direct plating schemes using ALOA or RLM evaluated in the present study were suitable for enumerating low levels of L. monocytogenes in these ice cream samples. The statistical analysis further revealed that OLS linear regression analyses of direct plating and MPN data did introduce bias that incorrectly characterized systematic differences between estimates from the two methods. Published by Elsevier B.V.
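The MPN side of the comparison rests on a simple likelihood: assuming Poisson-distributed cells, each tube at dilution i is positive with probability 1 - exp(-c * m_i), and the MPN is the concentration c that maximizes the likelihood of the observed positive/negative pattern. The sketch below computes that maximum-likelihood estimate for an invented 3-dilution design; it is not the Bayesian paired-comparison model used in the paper.

import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical 3-dilution MPN design: sample mass per tube (g), tubes per
# dilution, and number of positive tubes observed (values invented).
mass_g   = np.array([1.0, 0.1, 0.01])
n_tubes  = np.array([5, 5, 5])
positive = np.array([4, 1, 0])

def neg_log_likelihood(log_conc):
    """Negative log-likelihood of the tube pattern for concentration exp(log_conc)
    CFU/g, assuming Poisson-distributed cells in each tube."""
    c = np.exp(log_conc)
    p_pos = 1.0 - np.exp(-c * mass_g)            # P(tube positive) at each dilution
    p_pos = np.clip(p_pos, 1e-12, 1 - 1e-12)
    ll = (positive * np.log(p_pos)
          + (n_tubes - positive) * np.log(1.0 - p_pos)).sum()
    return -ll

res = minimize_scalar(neg_log_likelihood, bounds=(-8, 8), method="bounded")
print(f"MPN ~ {np.exp(res.x):.2f} CFU/g")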
Brown, Meredith; Kuperberg, Gina R.
2015-01-01
Language and thought dysfunction are central to the schizophrenia syndrome. They are evident in the major symptoms of psychosis itself, particularly as disorganized language output (positive thought disorder) and auditory verbal hallucinations (AVHs), and they also manifest as abnormalities in both high-level semantic and contextual processing and low-level perception. However, the literatures characterizing these abnormalities have largely been separate and have sometimes provided mutually exclusive accounts of aberrant language in schizophrenia. In this review, we propose that recent generative probabilistic frameworks of language processing can provide crucial insights that link these four lines of research. We first outline neural and cognitive evidence that real-time language comprehension and production normally involve internal generative circuits that propagate probabilistic predictions to perceptual cortices — predictions that are incrementally updated based on prediction error signals as new inputs are encountered. We then explain how disruptions to these circuits may compromise communicative abilities in schizophrenia by reducing the efficiency and robustness of both high-level language processing and low-level speech perception. We also argue that such disruptions may contribute to the phenomenology of thought-disordered speech and false perceptual inferences in the language system (i.e., AVHs). This perspective suggests a number of productive avenues for future research that may elucidate not only the mechanisms of language abnormalities in schizophrenia, but also promising directions for cognitive rehabilitation. PMID:26640435
Dinov, Martin; Leech, Robert
2017-01-01
Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses. PMID:29163110
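Fuzzy C-means, the probabilistic analog of K-means used here, can be written in a few lines of NumPy: alternate between computing fuzzy centroids and updating soft memberships until convergence. The sketch below runs on synthetic stand-ins for GFP-peak topographies; it omits the polarity handling and MLP-labeling steps of the actual microstate pipeline, and the data dimensions are invented.

import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """Minimal Fuzzy C-means: X is (n_samples, n_features); returns
    (centroids, membership matrix U of shape (n_samples, c))."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))          # random soft assignments
    for _ in range(n_iter):
        W = U ** m
        centroids = (W.T @ X) / W.sum(axis=0)[:, None]  # fuzzy-weighted means
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2.0 / (m - 1.0)))          # standard FCM membership update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.max(np.abs(U_new - U)) < tol:
            U = U_new
            break
        U = U_new
    return centroids, U

# Synthetic stand-in for GFP-peak topographies: two clusters in 8 "channels".
rng = np.random.default_rng(1)
maps = np.vstack([rng.normal(+1.0, 0.3, (50, 8)),
                  rng.normal(-1.0, 0.3, (50, 8))])
centroids, U = fuzzy_c_means(maps, c=2)
print("soft labels of first 3 maps:\n", np.round(U[:3], 2))

The rows of U are the per-map membership probabilities that, in the paper, are used as soft training targets for the multi-layer perceptrons.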
Apolipoprotein C-III in triglyceride-rich lipoprotein metabolism.
Ramms, Bastian; Gordts, Philip L S M
2018-06-01
Apolipoprotein (apo) C-III is a key player in triglyceride-rich lipoprotein metabolism and strongly associated with elevated plasma triglyceride levels. Several new studies added important insights on apoC-III and its physiological function confirming its promise as a valid therapeutic target. APOC3 is expressed in liver and intestine and regulates triglyceride-rich lipoprotein (TRL) catabolism and anabolism. The transcriptional regulation in both organs requires different regulatory elements. Clinical and preclinical studies established that apoC-III raises plasma triglyceride levels predominantly by inhibiting hepatic TRL clearance. Mechanistic insights into missense variants indicate accelerated renal clearance of apoC-III variants resulting in enhanced TRL catabolism. In contrast, an APOC3 gain-of-function variant enhances de novo lipogenesis and hepatic TRL production. Multiple studies confirmed the correlation between increased apoC-III levels and cardiovascular disease. This has opened up new therapeutic avenues allowing targeting of specific apoC-III properties in triglyceride metabolism. Novel in vivo models and APOC3 missense variants revealed unique mechanisms by which apoC-III inhibits TRL catabolism. Clinical trials with Volanesorsen, an APOC3 antisense oligonucleotide, report very promising lipid-lowering outcomes. However, future studies will need to address if acute apoC-III lowering will have the same clinical benefits as a life-long reduction.
Subcortical structure segmentation using probabilistic atlas priors
NASA Astrophysics Data System (ADS)
Gouttard, Sylvain; Styner, Martin; Joshi, Sarang; Smith, Rachel G.; Cody Hazlett, Heather; Gerig, Guido
2007-03-01
The segmentation of the subcortical structures of the brain is required for many forms of quantitative neuroanatomic analysis. The volumetric and shape parameters of structures such as lateral ventricles, putamen, caudate, hippocampus, pallidus and amygdala are employed to characterize a disease or its evolution. This paper presents a fully automatic segmentation of these structures via non-rigid registration of a probabilistic atlas prior, alongside a comprehensive validation. Our approach is based on an unbiased diffeomorphic atlas with probabilistic spatial priors built from a training set of MR images with corresponding manual segmentations. The atlas building computes an average image along with transformation fields mapping each training case to the average image. These transformation fields are applied to the manually segmented structures of each case in order to obtain a probabilistic map on the atlas. When applying the atlas for automatic structural segmentation, an MR image is first intensity inhomogeneity corrected, skull stripped and intensity calibrated to the atlas. Then the atlas image is registered to the subject image using an affine followed by a deformable registration matching the gray-level intensity. Finally, the registration transformation is applied to the probabilistic maps of each structure, which are then thresholded at 0.5 probability. Using manual segmentations for comparison, measures of volumetric differences show high correlation with our results. Furthermore, the Dice coefficient, which quantifies the volumetric overlap, is higher than 62% for all structures and is close to 80% for the basal ganglia. The intraclass correlation coefficient computed on these same datasets shows a good inter-method correlation of the volumetric measurements. Using a dataset of a single patient scanned 10 times on 5 different scanners, reliability is shown with a coefficient of variation of less than 2 percent over the whole dataset. Overall, these validation and reliability studies show that our method accurately and reliably segments almost all structures. Only the hippocampus and amygdala segmentations exhibit relatively low correlation with the manual segmentation in at least one of the validation studies, whereas they still show appropriate Dice overlap coefficients.
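The final labeling and validation steps lend themselves to a short sketch: threshold a registered probabilistic map at 0.5 and compare the result against a manual segmentation with the Dice coefficient. The arrays below are synthetic stand-ins, not data from the paper.

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy 3D probabilistic map for one structure, already warped into subject space (synthetic).
rng = np.random.default_rng(0)
prob_map = np.clip(rng.normal(0.4, 0.3, size=(32, 32, 32)), 0, 1)

auto_seg = prob_map >= 0.5                                          # threshold at 0.5 probability
manual_seg = (prob_map + rng.normal(0, 0.1, prob_map.shape)) >= 0.5  # stand-in for a manual rater

print(f"Dice coefficient: {dice(auto_seg, manual_seg):.2f}")
```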
Trastuzumab in early stage breast cancer: a cost-effectiveness analysis for Belgium.
Neyt, Mattias; Huybrechts, Michel; Hulstaert, Frank; Vrijens, France; Ramaekers, Dirk
2008-08-01
Although trastuzumab is traditionally used in metastatic breast cancer treatment, studies reported on the efficacy and safety of trastuzumab in the adjuvant setting for the treatment of early stage breast cancer in HER2+ tumors. We estimated the cost-effectiveness and budget impact of reimbursing trastuzumab in this indication from a payer's perspective. We constructed a health economic model. Long-term consequences of preventing patients from progressing to metastatic breast cancer and side effects such as congestive heart failure were taken into account. Uncertainty was handled by applying probabilistic modeling and through probabilistic sensitivity analyses. In the HERA scenario, applying an arbitrary threshold of €30,000 per life-year gained, early stage breast cancer treatment with trastuzumab is cost-effective for 9 out of 15 analyzed subgroups (according to age and stage). In contrast, treatment according to the FinHer scenario is cost-effective in 14 subgroups. Furthermore, the FinHer regimen is most often cost saving, with an average incremental cost of €668, -€1,045, and -€6,869 for stage I, II and III breast cancer patients, respectively, whereas the HERA regimen is never cost saving due to the higher initial treatment costs. The model shows better cost-effectiveness for the 9-week initial treatment (FinHer) compared to no trastuzumab treatment than for the 1-year post-chemotherapy treatment (HERA). Both from a medical and an economic point of view, the 9-week initial treatment regimen with trastuzumab shows promising results and justifies the initiation of a large comparative trial with a 1-year regimen.
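As a rough illustration of the kind of calculation behind such results, the sketch below computes an incremental cost-effectiveness ratio and the probability of cost-effectiveness at a €30,000-per-life-year threshold in a Monte Carlo probabilistic sensitivity analysis. All distributions and numbers are invented placeholders, not the inputs of the Belgian model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # probabilistic sensitivity analysis draws

# Hypothetical parameter distributions (placeholders, not the study's inputs)
cost_trastuzumab = rng.normal(30_000, 3_000, n)   # incremental treatment cost (EUR)
cost_offsets     = rng.normal(28_000, 5_000, n)   # avoided metastatic-care costs (EUR)
ly_gained        = rng.gamma(4.0, 0.25, n)        # incremental life-years gained

inc_cost = cost_trastuzumab - cost_offsets
icer = inc_cost.mean() / ly_gained.mean()         # incremental cost-effectiveness ratio

threshold = 30_000  # EUR per life-year gained
# Positive net monetary benefit <=> cost-effective at this threshold
prob_ce = np.mean(threshold * ly_gained - inc_cost > 0)

print(f"Mean incremental cost: EUR {inc_cost.mean():,.0f}")
print(f"ICER: EUR {icer:,.0f} per life-year gained")
print(f"P(cost-effective at EUR 30,000/LY): {prob_ce:.2f}")
```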
NASA Astrophysics Data System (ADS)
Constantinescu, Robert; Robertson, Richard; Lindsay, Jan M.; Tonini, Roberto; Sandri, Laura; Rouwet, Dmitri; Smith, Patrick; Stewart, Roderick
2016-11-01
We report on the first "real-time" application of the BET_UNREST (Bayesian Event Tree for Volcanic Unrest) probabilistic model, during a VUELCO Simulation Exercise carried out on the island of Dominica, Lesser Antilles, in May 2015. Dominica has a concentration of nine potentially active volcanic centers; frequent volcanic earthquake swarms at shallow depths, intense geothermal activity, and recent phreatic explosions (1997) indicate the region is still active. The exercise scenario was developed in secret by a team of scientists from The University of the West Indies (Trinidad and Tobago) and University of Auckland (New Zealand). The simulated unrest activity was provided to the exercise's Scientific Team in three "phases" through exercise injects comprising processed monitoring data. We applied the newly created BET_UNREST model through its software implementation PyBetUnrest, to estimate the probabilities of having (i) unrest of (ii) magmatic, hydrothermal or tectonic origin, which may or may not lead to (iii) an eruption. The probabilities obtained for each simulated phase raised controversy and intense deliberations among the members of the Scientific Team. The results were often considered to be "too high" and were not included in any of the reports presented to ODM (Office for Disaster Management), revealing interesting crisis communication challenges. We concluded that the PyBetUnrest application itself was successful and brought the tool one step closer to a full implementation. However, as with any newly proposed method, it needs more testing, and, to enable its use in the future, we make a series of recommendations for future applications.
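The event-tree logic behind such tools can be illustrated by chaining conditional probabilities along the unrest / origin / eruption nodes; the numbers below are arbitrary placeholders, not the probabilities produced during the Dominica exercise or by PyBetUnrest.

```python
# Minimal event-tree sketch (placeholder probabilities, not exercise outputs).
p_unrest = 0.60                      # node 1: probability of unrest in the forecast window
p_origin_given_unrest = {            # node 2: origin of unrest, conditional on unrest
    "magmatic": 0.30,
    "hydrothermal": 0.55,
    "tectonic": 0.15,
}
p_eruption_given_origin = {          # node 3: eruption, conditional on origin
    "magmatic": 0.25,
    "hydrothermal": 0.05,
    "tectonic": 0.01,
}

p_eruption = p_unrest * sum(
    p_origin_given_unrest[o] * p_eruption_given_origin[o]
    for o in p_origin_given_unrest
)
print(f"P(eruption) = {p_eruption:.3f}")
```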
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lumen, A, E-mail: Annie.Lumen@fda.hhs.gov
The risk of ubiquitous perchlorate exposure and the dose-response on thyroid hormone levels in pregnant women in the United States (U.S.) have yet to be characterized. In the current work, we integrated a previously developed perchlorate submodel into a recently developed population-based pregnancy model to predict reductions in maternal serum free thyroxine (fT4) levels for late-gestation pregnant women in the U.S. Our findings indicated no significant difference in geometric mean estimates of fT4 when perchlorate exposure from food only was compared to no perchlorate exposure. The reduction in maternal fT4 levels reached statistical significance when an added contribution from drinking water (i.e., 15 μg/L, 20 μg/L, or 24.5 μg/L) was assumed in addition to the 90th percentile of food intake for pregnant women (0.198 μg/kg/day). We determined that a daily intake of 0.45 to 0.50 μg/kg/day of perchlorate was necessary to produce results that were significantly different than those obtained from no perchlorate exposure. Adjusting for this food intake dose, the relative source contribution of perchlorate from drinking water (or other non-dietary sources) was estimated to range from 0.25–0.3 μg/kg/day. Assuming a drinking water intake rate of 0.033 L/kg/day, the drinking water concentration allowance for perchlorate equates to 7.6–9.2 μg/L. In summary, we have demonstrated the utility of a probabilistic biologically-based dose-response model for perchlorate risk assessment in a sensitive life-stage at a population level; however, there is a need for continued monitoring in regions of the U.S. where perchlorate exposure may be higher. - Highlights: • Probabilistic risk assessment for perchlorate in U.S. pregnant women was conducted. • No significant change in maternal fT4 predicted due to perchlorate from food alone. • Drinking water concentration allowance for perchlorate estimated as 7.6–9.2 μg/L.
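The final step of the abstract is simple arithmetic: dividing the relative source contribution allotted to drinking water by the assumed water intake rate gives the concentration allowance. The sketch below reproduces that calculation; small rounding differences relative to the reported 7.6–9.2 μg/L range are expected.

```python
# Drinking-water concentration allowance = relative source contribution / intake rate
intake_rate = 0.033                 # L/kg/day, drinking-water intake assumed in the abstract
rsc_low, rsc_high = 0.25, 0.30      # ug/kg/day attributed to drinking water (non-dietary sources)

low = rsc_low / intake_rate
high = rsc_high / intake_rate
print(f"Concentration allowance: {low:.1f} - {high:.1f} ug/L")   # roughly 7.6 - 9.1 ug/L
```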
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...
Judging Words by Their Covers and the Company They Keep: Probabilistic Cues Support Word Learning
ERIC Educational Resources Information Center
Lany, Jill
2014-01-01
Statistical learning may be central to lexical and grammatical development. The phonological and distributional properties of words provide probabilistic cues to their grammatical and semantic properties. Infants can capitalize on such probabilistic cues to learn grammatical patterns in listening tasks. However, infants often struggle to learn…
Fully probabilistic control for stochastic nonlinear control systems with input dependent noise.
Herzallah, Randa
2015-03-01
Robust controllers for nonlinear stochastic systems with functional uncertainties can be consistently designed using probabilistic control methods. In this paper, a generalised probabilistic controller design is presented for minimising the Kullback-Leibler divergence between the actual joint probability density function (pdf) of the closed-loop control system and an ideal joint pdf, emphasising how uncertainty can be systematically incorporated in the absence of reliable system models. To achieve this objective all probabilistic models of the system are estimated from process data using mixture density networks (MDNs) where all the parameters of the estimated pdfs are taken to be state and control input dependent. Based on this dependency of the density parameters on the input values, explicit formulations for the construction of optimal generalised probabilistic controllers are obtained through the techniques of dynamic programming and adaptive critic methods. Using the proposed generalised probabilistic controller, the conditional joint pdfs can be made to follow the ideal ones. A simulation example is used to demonstrate the implementation of the algorithm and encouraging results are obtained. Copyright © 2014 Elsevier Ltd. All rights reserved.
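The objective being minimised is the Kullback-Leibler divergence between the actual and ideal closed-loop densities. For two univariate Gaussians this divergence has a closed form, sketched below; the means and standard deviations are arbitrary, and the full controller design (MDN estimation, dynamic programming, adaptive critics) is not reproduced here.

```python
import numpy as np

def kl_gaussian(mu0, s0, mu1, s1):
    """KL( N(mu0, s0^2) || N(mu1, s1^2) ) in nats."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2.0 * s1**2) - 0.5

# Achieved closed-loop output density vs. an ideal (target) density; illustrative values only.
actual = (0.8, 0.6)   # mean, std of the achieved output pdf
ideal  = (1.0, 0.3)   # mean, std of the desired output pdf
print(f"KL(actual || ideal) = {kl_gaussian(*actual, *ideal):.3f} nats")
```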
Global/local methods for probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Wu, Y.-T.
1993-01-01
A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations with a local more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.
Global/local methods for probabilistic structural analysis
NASA Astrophysics Data System (ADS)
Millwater, H. R.; Wu, Y.-T.
1993-04-01
A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations with a local more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.
Probabilistic framework for assessing the ice sheet contribution to sea level change.
Little, Christopher M; Urban, Nathan M; Oppenheimer, Michael
2013-02-26
Previous sea level rise (SLR) assessments have excluded the potential for dynamic ice loss over much of Greenland and Antarctica, and recently proposed "upper bounds" on Antarctica's 21st-century SLR contribution are derived principally from regions where present-day mass loss is concentrated (basin 15, or B15, drained largely by Pine Island, Thwaites, and Smith glaciers). Here, we present a probabilistic framework for assessing the ice sheet contribution to sea level change that explicitly accounts for mass balance uncertainty over an entire ice sheet. Applying this framework to Antarctica, we find that ongoing mass imbalances in non-B15 basins give an SLR contribution by 2100 that: (i) is comparable to projected changes in B15 discharge and Antarctica's surface mass balance, and (ii) varies widely depending on the subset of basins and observational dataset used in projections. Increases in discharge uncertainty, or decreases in the exceedance probability used to define an upper bound, increase the fractional contribution of non-B15 basins; even weak spatial correlations in future discharge growth rates markedly enhance this sensitivity. Although these projections rely on poorly constrained statistical parameters, they may be updated with observations and/or models at many spatial scales, facilitating a more comprehensive account of uncertainty that, if implemented, will improve future assessments.
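One way to see how even weak spatial correlation in basin discharge widens the upper tail of the aggregate contribution is a small Monte Carlo aggregation over basins; the basin count, per-basin statistics and correlation values below are illustrative, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_basins, n_draws = 10, 50_000

mean_contrib = np.full(n_basins, 1.0)   # cm of SLR by 2100 per basin (illustrative)
sd_contrib = np.full(n_basins, 0.8)     # uncertainty per basin (illustrative)

def total_slr(rho):
    """Sample correlated basin contributions and return the summed SLR draws."""
    corr = np.eye(n_basins) + rho * (1 - np.eye(n_basins))          # equicorrelation matrix
    cov = sd_contrib[:, None] * sd_contrib[None, :] * corr
    draws = rng.multivariate_normal(mean_contrib, cov, size=n_draws)
    return draws.sum(axis=1)

for rho in (0.0, 0.3):
    s = total_slr(rho)
    print(f"rho={rho}: mean={s.mean():.1f} cm, 95th percentile={np.percentile(s, 95):.1f} cm")
```

The mean is unchanged by correlation, but the 95th percentile (an "upper bound" defined by an exceedance probability) grows as rho increases, which is the sensitivity the abstract describes.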
Software architecture of the III/FBI segment of the FBI's integrated automated identification system
NASA Astrophysics Data System (ADS)
Booker, Brian T.
1997-02-01
This paper will describe the software architecture of the Interstate Identification Index (III/FBI) Segment of the FBI's Integrated Automated Fingerprint Identification System (IAFIS). IAFIS is currently under development, with deployment to begin in 1998. III/FBI will provide the repository of criminal history and photographs for criminal subjects, as well as identification data for military and civilian federal employees. Services provided by III/FBI include maintenance of the criminal and civil data, subject search of the criminal and civil data, and response generation services for IAFIS. III/FBI software will be comprised of both COTS and an estimated 250,000 lines of developed C code. This paper will describe the following: (1) the high-level requirements of the III/FBI software; (2) the decomposition of the III/FBI software into Computer Software Configuration Items (CSCIs); (3) the top-level design of the III/FBI CSCIs; and (4) the relationships among the developed CSCIs and the COTS products that will comprise the III/FBI software.
Zeng, Jia; Hannenhalli, Sridhar
2013-01-01
Gene duplication, followed by functional evolution of duplicate genes, is a primary engine of evolutionary innovation. In turn, gene expression evolution is a critical component of overall functional evolution of paralogs. Inferring evolutionary history of gene expression among paralogs is therefore a problem of considerable interest. It also represents significant challenges. The standard approaches of evolutionary reconstruction assume that at an internal node of the duplication tree, the two duplicates evolve independently. However, because of various selection pressures, functional evolution of the two paralogs may be coupled. The coupling of paralog evolution corresponds to three major fates of gene duplicates: subfunctionalization (SF), conserved function (CF) or neofunctionalization (NF). Quantitative analysis of these fates is of great interest and clearly influences evolutionary inference of expression. These two interrelated problems of inferring gene expression and evolutionary fates of gene duplicates have not been studied together previously and motivate the present study. Here we propose a novel probabilistic framework and algorithm to simultaneously infer (i) ancestral gene expression and (ii) the likely fate (SF, NF, CF) at each duplication event during the evolution of a gene family. Using tissue-specific gene expression data, we develop a nonparametric belief propagation (NBP) algorithm to predict the ancestral expression level as a proxy for function, and describe a novel probabilistic model that relates the predicted and known expression levels to the possible evolutionary fates. We validate our model using simulation and then apply it to a genome-wide set of gene duplicates in human. Our results suggest that SF tends to be more frequent at the earlier stage of gene family expansion, while NF occurs more frequently later on.
Wang, Yan; Deng, Lei; Caballero-Guzman, Alejandro; Nowack, Bernd
2016-12-01
Nano iron oxide particles are beneficial to our daily lives through their use in paints, construction materials, biomedical imaging and other industrial fields. However, little is known about the possible risks associated with the current exposure level of engineered nano iron oxides (nano-FeOX) to organisms in the environment. The goal of this study was to predict the release of nano-FeOX to the environment and assess their risks for surface waters in the EU and Switzerland. The material flows of nano-FeOX to technical compartments (waste incineration and waste water treatment plants) and to the environment were calculated with a probabilistic modeling approach. The mean value of the predicted environmental concentrations (PECs) of nano-FeOX in surface waters in the EU for a worst-case scenario (no particle sedimentation) was estimated to be 28 ng/l. Using a probabilistic species sensitivity distribution, the predicted no-effect concentration (PNEC) was determined from ecotoxicological data. The risk characterization ratio, calculated by dividing the PEC by PNEC values, was used to characterize the risks. The mean risk characterization ratio was predicted to be several orders of magnitude smaller than 1 (1.4 × 10⁻⁴). Therefore, this modeling effort indicates that only a very limited risk is posed by the current release level of nano-FeOX to organisms in surface waters. However, the hazards of nano-FeOX to organisms in other ecosystems (such as sediment) need to be assessed to determine the overall risk of these particles to the environment.
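The risk characterization step amounts to dividing probabilistic PEC draws by probabilistic PNEC draws. The sketch below uses made-up lognormal distributions loosely centred on the reported values (mean PEC ≈ 28 ng/l, mean RCR on the order of 10⁻⁴), purely to illustrate the calculation; they are not the study's fitted distributions.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Illustrative lognormal distributions (assumed, not the study's fitted distributions)
pec = rng.lognormal(mean=np.log(28), sigma=0.5, size=n)          # ng/L in surface water
pnec = rng.lognormal(mean=np.log(200_000), sigma=0.8, size=n)    # ng/L, from a species sensitivity distribution

rcr = pec / pnec                                                 # risk characterization ratio
print(f"mean RCR = {rcr.mean():.1e}")
print(f"P(RCR > 1) = {np.mean(rcr > 1):.1e}")
```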
Villalpando, Salvador; García-Guerra, Armando; Ramírez-Silva, Claudia Ivonne; Mejía-Rodríguez, Fabiola; Matute, Guadalupe; Shamah-Levy, Teresa; Rivera, Juan A
2003-01-01
To describe the epidemiology of iron, zinc and iodide deficiencies in a probabilistic sample of Mexican women and children and explore their association with some dietary and socio-demographic variables. We carried out in 1999 an epidemiological description of iron (percent transferrin saturation, PTS, < 16%), serum zinc (< 65 ug/dl) and iodide (< 50 ug/l urine) deficiencies in a probabilistic sample of 1,363 Mexican children under 12 years and of 731 women of child-bearing age. Serum iron, Total Iron Binding Capacity (TIBC) and zinc were measured by atomic absorption spectrometry, and urinary iodide by a colorimetric method. Logistic regression models explored determinants for such micromineral deficiencies. Iron deficiency was higher (67%) in infants < 2 years of age. Prevalence declined (34-39%) at school age. The prevalence for iron deficiency in women was 40%. Zinc deficiency was higher in infants < 2 years of age (34%) than in school-age children (19-24%). Prevalence in women was 30%, with no rural/urban difference. In women the likelihood of iron deficiency decreased as socioeconomic level (SEL) improved (p = 0.04) and increased with the intake of cereals (p = 0.01). The likelihood of low serum zinc levels was greater in women and children of low SEL (p < 0.02 and p = 0.001). Iodide deficiency was negligible in both children and women. The data show a high prevalence of iron deficiency, especially in infants 12 to 24 months of age. It is suggested that in older children and women 12 to 49 years of age, iron bioavailability is low. The prevalence of zinc deficiency was also very high. The English version of this paper is also available at: http://www.insp.mx/salud/index.html.
Bayesian wavelet PCA methodology for turbomachinery damage diagnosis under uncertainty
NASA Astrophysics Data System (ADS)
Xu, Shengli; Jiang, Xiaomo; Huang, Jinzhi; Yang, Shuhua; Wang, Xiaofang
2016-12-01
Centrifugal compressor often suffers various defects such as impeller cracking, resulting in forced outage of the total plant. Damage diagnostics and condition monitoring of such a turbomachinery system has become an increasingly important and powerful tool to prevent potential failure in components and reduce unplanned forced outage and further maintenance costs, while improving reliability, availability and maintainability of a turbomachinery system. This paper presents a probabilistic signal processing methodology for damage diagnostics using multiple time history data collected from different locations of a turbomachine, considering data uncertainty and multivariate correlation. The proposed methodology is based on the integration of three advanced state-of-the-art data mining techniques: discrete wavelet packet transform, Bayesian hypothesis testing, and probabilistic principal component analysis. The multiresolution wavelet analysis approach is employed to decompose a time series signal into different levels of wavelet coefficients. These coefficients represent multiple time-frequency resolutions of a signal. Bayesian hypothesis testing is then applied to each level of wavelet coefficient to remove possible imperfections. The ratio of posterior odds Bayesian approach provides a direct means to assess whether there is imperfection in the decomposed coefficients, thus avoiding over-denoising. Power spectral density estimated by the Welch method is utilized to evaluate the effectiveness of Bayesian wavelet cleansing method. Furthermore, the probabilistic principal component analysis approach is developed to reduce dimensionality of multiple time series and to address multivariate correlation and data uncertainty for damage diagnostics. The proposed methodology and generalized framework is demonstrated with a set of sensor data collected from a real-world centrifugal compressor with impeller cracks, through both time series and contour analyses of vibration signal and principal components.
Physical limits on ground motion at Yucca Mountain
Andrews, D.J.; Hanks, T.C.; Whitney, J.W.
2007-01-01
Physical limits on possible maximum ground motion at Yucca Mountain, Nevada, the designated site of a high-level radioactive waste repository, are set by the shear stress available in the seismogenic depth of the crust and by limits on stress change that can propagate through the medium. We find in dynamic deterministic 2D calculations that maximum possible horizontal peak ground velocity (PGV) at the underground repository site is 3.6 m/sec, which is smaller than the mean PGV predicted by the probabilistic seismic hazard analysis (PSHA) at annual exceedance probabilities less than 10⁻⁶ per year. The physical limit on vertical PGV, 5.7 m/sec, arises from supershear rupture and is larger than that from the PSHA down to 10⁻⁸ per year. In addition to these physical limits, we also calculate the maximum ground motion subject to the constraint of known fault slip at the surface, as inferred from paleoseismic studies. Using a published probabilistic fault displacement hazard curve, these calculations provide a probabilistic hazard curve for horizontal PGV that is lower than that from the PSHA. In all cases the maximum ground motion at the repository site is found by maximizing constructive interference of signals from the rupture front, for physically realizable rupture velocity, from all parts of the fault. Vertical PGV is maximized for ruptures propagating near the P-wave speed, and horizontal PGV is maximized for ruptures propagating near the Rayleigh-wave speed. Yielding in shear with a Mohr-Coulomb yield condition reduces ground motion only a modest amount in events with supershear rupture velocity, because ground motion consists primarily of P waves in that case. The possibility of compaction of the porous unsaturated tuffs at the higher ground-motion levels is another attenuating mechanism that needs to be investigated.
[Five years after the Spanish neonatal resuscitation survey. Are we improving?].
Iriondo, M; Izquierdo, M; Salguero, E; Aguayo, J; Vento, M; Thió, M
2016-05-01
An analysis is presented of delivery room (DR) neonatal resuscitation practices in Spanish hospitals. A questionnaire was sent by e-mail to all hospitals attending deliveries in Spain. A total of 180 questionnaires were sent, of which 155 were fully completed (86%). Less than half (71, 46%) were level I or II hospitals, while 84 were level III hospitals (54%). In almost three-quarters (74.2%) of the centres, parents and medical staff were involved in the decision on whether to start resuscitation or withdraw it. A qualified resuscitation team (at least two members) was available in 80% of the participant centres (63.9% level I-II and 94.0% level III, P<.001). Neonatal resuscitation courses were held in 90.3% of the centres. The availability of gas blenders, pulse oximeters, manual ventilators, and plastic wraps was higher in level III hospitals. Plastic wraps for pre-term hypothermia prevention were used in 63.9% of the centres (40.8% level I-II and 83.3% level III, P<.001). Term newborn resuscitation was started on room air in 89.7% of the centres. A manual ventilator (T-piece) was the device used in most cases when ventilation was required (42.3% level I-II and 78.6% level III, P<.001). Early CPAP in preterm infants was applied in 91.7% of the tertiary hospitals. In the last 5 years some practices have improved, such as neonatal resuscitation training, pulse oximeter use, or early CPAP support. There is an improvement in some practices of neonatal resuscitation. Significant differences have been found as regards the equipment or practices in the DR, when comparing hospitals of different levels of care. Copyright © 2015 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.
NASA Technical Reports Server (NTRS)
Boyce, L.
1992-01-01
A probabilistic general material strength degradation model has been developed for structural components of aerospace propulsion systems subjected to diverse random effects. The model has been implemented in two FORTRAN programs, PROMISS (Probabilistic Material Strength Simulator) and PROMISC (Probabilistic Material Strength Calibrator). PROMISS calculates the random lifetime strength of an aerospace propulsion component due to as many as eighteen diverse random effects. Results are presented in the form of probability density functions and cumulative distribution functions of lifetime strength. PROMISC calibrates the model by calculating the values of empirical material constants.
Probabilistic machine learning and artificial intelligence.
Ghahramani, Zoubin
2015-05-28
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.
1990-01-01
An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.
Probabilistic machine learning and artificial intelligence
NASA Astrophysics Data System (ADS)
Ghahramani, Zoubin
2015-05-01
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
ERIC Educational Resources Information Center
Kolodny, Oren; Lotem, Arnon; Edelman, Shimon
2015-01-01
We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given…
Critical voids in exposure data and models lead risk assessors to rely on conservative assumptions. Risk assessors and managers need improved tools beyond the screening level analysis to address aggregate exposures to pesticides as required by the Food Quality Protection Act o...
Marbles: A Means of Introducing Students to Scattering Concepts
ERIC Educational Resources Information Center
Bender, K. M.; Westphal, P. S.; Ramsier, R. D.
2008-01-01
The purpose of this activity is to introduce students to concepts of short-range and long-range scattering, and engage them in using indirect measurements and probabilistic models. The activity uses simple and readily available apparatus, and can be adapted for use with secondary level students as well as those in general physics courses or…
ERIC Educational Resources Information Center
de Vries, Meinou H.; Ulte, Catrin; Zwitserlood, Pienie; Szymanski, Barbara; Knecht, Stefan
2010-01-01
Recently, an increasing number of studies have suggested a role for the basal ganglia and related dopamine inputs in procedural learning, specifically when learning occurs through trial-by-trial feedback (Shohamy, Myers, Kalanithi, & Gluck. (2008). "Basal ganglia and dopamine contributions to probabilistic category learning." "Neuroscience and…
Near-term probabilistic forecast of significant wildfire events for the Western United States
Haiganoush K. Preisler; Karin L. Riley; Crystal S. Stonesifer; Dave E. Calkin; Matt Jolly
2016-01-01
Fire danger and potential for large fires in the United States (US) is currently indicated via several forecasted qualitative indices. However, landscape-level quantitative forecasts of the probability of a large fire are currently lacking. In this study, we present a framework for forecasting large fire occurrence - an extreme value event - and evaluating...
ERIC Educational Resources Information Center
Fazio, Frank; Moser, Gene W.
A probabilistic model (see SE 013 578) describing information processing during the cognitive tasks of recall and problem solving was tested, refined, and developed by testing graduate students on a number of tasks which combined oral, written, and overt "input" and "output" modes in several ways. In a verbal chain one subject…
Integrated Risk-Informed Decision-Making for an ALMR PRISM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muhlheim, Michael David; Belles, Randy; Denning, Richard S.
Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. The generation of decisions requires a structured, coherent process, or a decision-making process. The overall objective for this work is that the generalized framework is adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory. These allow decisions to incorporate the health of components into the decision-making process. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module. The deterministic multi-attribute decision-making framework uses various sensor data (e.g., reactor outlet temperature, steam generator drum level) and calculates its position within the challenge state, its trajectory, and its margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. The metrics that are evaluated are based on reactor trip set points. The integration of the deterministic calculations using multi-physics analyses and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. Thus, the thermal-hydraulic analyses provide validation of the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies, and developing a user interface to mimic display panels at a modern nuclear power plant.
Probabilistic Simulation of Multi-Scale Composite Behavior
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2012-01-01
A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented in the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.
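The essence of the approach, propagating scatter in constituent (fiber and matrix) properties to a higher-scale response, can be sketched with a Monte Carlo loop and a simple rule-of-mixtures model standing in for the micromechanics used by PICAN/IPACS. All property distributions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Constituent (primitive) variables with scatter; illustrative values for a carbon/epoxy ply
E_fiber  = rng.normal(230e9, 10e9, n)    # fiber modulus, Pa
E_matrix = rng.normal(3.5e9, 0.3e9, n)   # matrix modulus, Pa
V_fiber  = rng.normal(0.60, 0.03, n)     # fiber volume fraction

# Simple rule of mixtures for the longitudinal ply modulus (stand-in for full micromechanics)
E_11 = V_fiber * E_fiber + (1.0 - V_fiber) * E_matrix

print(f"mean E11 = {E_11.mean()/1e9:.1f} GPa, c.o.v. = {E_11.std()/E_11.mean():.3f}")
# Crude probabilistic sensitivity: correlation of each primitive variable with the response
for name, x in [("E_fiber", E_fiber), ("E_matrix", E_matrix), ("V_fiber", V_fiber)]:
    print(f"corr({name}, E11) = {np.corrcoef(x, E_11)[0, 1]:.2f}")
```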
Sorace, Lorenzo; Sangregorio, Claudio; Figuerola, Albert; Benelli, Cristiano; Gatteschi, Dante
2009-01-01
We report here a detailed single-crystal EPR and magnetic study of a homologous series of complexes of the type Ln-M (Ln = La(III), Ce(III); M = Fe(III), Co(III)). We were able to obtain a detailed picture of the low-lying levels of Ce(III) and Fe(III) centres through the combined use of single-crystal EPR and magnetic susceptibility data. We show that classical ligand field theory can be of great help in rationalising the energies of the low-lying levels of both the transition-metal and rare-earth ions. The combined analysis of single-crystal EPR and magnetic data of the coupled system Ce-Fe confirmed the great complexity of the interactions involving rare-earth elements. With little uncertainty, it turned out clearly that the description of the interaction involving the lowest lying spin levels requires the introduction of the isotropic, anisotropic and antisymmetric terms.
Barnard, Patrick; van Ormondt, Maarten; Erikson, Li H.; Eshleman, Jodi; Hapke, Cheryl J.; Ruggiero, Peter; Adams, Peter; Foxgrover, Amy C.
2014-01-01
The Coastal Storm Modeling System (CoSMoS) applies a predominantly deterministic framework to make detailed predictions (meter scale) of storm-induced coastal flooding, erosion, and cliff failures over large geographic scales (100s of kilometers). CoSMoS was developed for hindcast studies, operational applications (i.e., nowcasts and multiday forecasts), and future climate scenarios (i.e., sea-level rise + storms) to provide emergency responders and coastal planners with critical storm hazards information that may be used to increase public safety, mitigate physical damages, and more effectively manage and allocate resources within complex coastal settings. The prototype system, developed for the California coast, uses the global WAVEWATCH III wave model, the TOPEX/Poseidon satellite altimetry-based global tide model, and atmospheric-forcing data from either the US National Weather Service (operational mode) or Global Climate Models (future climate mode), to determine regional wave and water-level boundary conditions. These physical processes are dynamically downscaled using a series of nested Delft3D-WAVE (SWAN) and Delft3D-FLOW (FLOW) models and linked at the coast to tightly spaced XBeach (eXtreme Beach) cross-shore profile models and a Bayesian probabilistic cliff failure model. Hindcast testing demonstrates that, despite uncertainties in preexisting beach morphology over the ~500 km alongshore extent of the pilot study area, CoSMoS effectively identifies discrete sections of the coast (100s of meters) that are vulnerable to coastal hazards under a range of current and future oceanographic forcing conditions, and is therefore an effective tool for operational and future climate scenario planning.
A Bayesian network to predict coastal vulnerability to sea level rise
Gutierrez, B.T.; Plant, N.G.; Thieler, E.R.
2011-01-01
Sea level rise during the 21st century will have a wide range of effects on coastal environments, human development, and infrastructure in coastal areas. The broad range of complex factors influencing coastal systems contributes to large uncertainties in predicting long-term sea level rise impacts. Here we explore and demonstrate the capabilities of a Bayesian network (BN) to predict long-term shoreline change associated with sea level rise and make quantitative assessments of prediction uncertainty. A BN is used to define relationships between driving forces, geologic constraints, and coastal response for the U.S. Atlantic coast that include observations of local rates of relative sea level rise, wave height, tide range, geomorphic classification, coastal slope, and shoreline change rate. The BN is used to make probabilistic predictions of shoreline retreat in response to different future sea level rise rates. Results demonstrate that the probability of shoreline retreat increases with higher rates of sea level rise. Where more specific information is included, the probability of shoreline change increases in a number of cases, indicating more confident predictions. A hindcast evaluation of the BN indicates that the network correctly predicts 71% of the cases. Evaluation of the results using Brier skill and log likelihood ratio scores indicates that the network provides shoreline change predictions that are better than the prior probability. Shoreline change outcomes indicating stability (-1 to 1 m/yr) were not well predicted. We find that BNs can assimilate important factors contributing to coastal change in response to sea level rise and can make quantitative, probabilistic predictions that can be applied to coastal management decisions. Copyright © 2011 by the American Geophysical Union.
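The hindcast evaluation reported above (percent correct, Brier skill relative to the prior) can be sketched as follows; the outcomes and predicted probabilities are synthetic, not the network's outputs.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200

# Synthetic hindcast: 1 = shoreline retreat observed, with BN-style predicted probabilities
obs = rng.binomial(1, 0.6, n)
p_bn = np.clip(0.6 + 0.3 * (obs - 0.5) + rng.normal(0, 0.15, n), 0.01, 0.99)  # a skillful forecast
p_prior = np.full(n, obs.mean())        # climatological / prior probability

def brier(p, o):
    return np.mean((p - o) ** 2)

bss = 1.0 - brier(p_bn, obs) / brier(p_prior, obs)   # Brier skill score relative to the prior
pct_correct = np.mean((p_bn >= 0.5) == (obs == 1))
print(f"percent correct = {100 * pct_correct:.0f}%, Brier skill score vs prior = {bss:.2f}")
```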
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
Probabilistic Geoacoustic Inversion in Complex Environments
2015-09-30
Jan Dettmer, School of Earth and Ocean Sciences, University of Victoria, Victoria BC ... long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must ... project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is ...
Correlation of the SAGE III on ISS Thermal Models in Thermal Desktop
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.; Davis, Warren T.; Liles, Kaitlin, A. K.; McLeod, Shawn C.
2017-01-01
The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III was launched on February 19, 2017 and mounted to the International Space Station (ISS) to begin its three-year mission. A detailed thermal model of the SAGE III payload, which consists of multiple subsystems, has been developed in Thermal Desktop (TD). Correlation of the thermal model is important since the payload will be expected to survive a three-year mission on ISS under varying thermal environments. Three major thermal vacuum (TVAC) tests were completed during the development of the SAGE III Instrument Payload (IP); two subsystem-level tests and a payload-level test. Additionally, a characterization TVAC test was performed in order to verify performance of a system of heater plates that was designed to allow the IP to achieve the required temperatures during payload-level testing; model correlation was performed for this test configuration as well as those including the SAGE III flight hardware. This document presents the methods that were used to correlate the SAGE III models to TVAC at the subsystem and IP level, including the approach for modeling the parts of the payload in the thermal chamber, generating pre-test predictions, and making adjustments to the model to align predictions with temperatures observed during testing. Model correlation quality will be presented and discussed, and lessons learned during the correlation process will be shared.
Model for Cumulative Solar Heavy Ion Energy and LET Spectra
NASA Technical Reports Server (NTRS)
Xapsos, Mike; Barth, Janet; Stauffer, Craig; Jordan, Tom; Mewaldt, Richard
2007-01-01
A probabilistic model of cumulative solar heavy ion energy and linear energy transfer (LET) spectra is developed for spacecraft design applications. Spectra are given as a function of confidence level, mission time period during solar maximum and shielding thickness. It is shown that long-term solar heavy ion fluxes exceed galactic cosmic ray fluxes during solar maximum for shielding levels of interest. Cumulative solar heavy ion fluences should therefore be accounted for in single event effects rate calculations and in the planning of space missions.
Uric acid levels in plasma and urine in rats chronically exposed to inorganic As (III) and As(V).
Jauge, P; Del-Razo, L M
1985-07-01
The effect of inorganic arsenic (III) and arsenic (V) on renal excretion and plasma levels of uric acid was examined in rats. Oral administration of 1200 micrograms As/kg/day for 6 weeks diminished uric acid levels in plasma by 67.1% and 26.5% of control after the administration of As(III) and As(V), respectively. Renal excretion of uric acid was significantly reduced during the first 3 weeks following As (III) administration, with a subsequent increase to approach control values at the end of the treatment. When As(V) was administered, the diminution in renal excretion was significant at 6 weeks.
Probabilistic liquefaction triggering based on the cone penetration test
Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.
2005-01-01
Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.
Scientific assessment of accuracy, skill and reliability of ocean probabilistic forecast products.
NASA Astrophysics Data System (ADS)
Wei, M.; Rowley, C. D.; Barron, C. N.; Hogan, P. J.
2016-02-01
As ocean operational centers are increasingly adopting and generating probabilistic forecast products for their customers, with valuable forecast uncertainties, how to assess and measure these complicated probabilistic forecast products objectively is challenging. The first challenge is how to deal with the huge amount of data from the ensemble forecasts. The second is how to describe the scientific quality of probabilistic products. In fact, probabilistic forecast accuracy, skill, reliability and resolution are different attributes of a forecast system. We briefly introduce some of the fundamental metrics such as the Reliability Diagram, Reliability, Resolution, Brier Score (BS), Brier Skill Score (BSS), Ranked Probability Score (RPS), Ranked Probability Skill Score (RPSS), Continuous Ranked Probability Score (CRPS), and Continuous Ranked Probability Skill Score (CRPSS). The values and significance of these metrics are demonstrated for forecasts from the US Navy's regional ensemble system with different ensemble members. The advantages and differences of these metrics are studied and clarified.
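Of the metrics listed, the CRPS is perhaps the least familiar; for an ensemble forecast it can be estimated directly from the members, as in the sketch below (synthetic ensemble and observation, not Navy data).

```python
import numpy as np

def crps_ensemble(members, obs):
    """CRPS estimate for a single forecast from ensemble members:
    mean|X - y| - 0.5 * mean|X - X'| (standard ensemble estimator)."""
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = 0.5 * np.abs(members[:, None] - members[None, :]).mean()
    return term1 - term2

rng = np.random.default_rng(11)
ensemble = rng.normal(20.0, 1.5, size=32)   # e.g., a 32-member SST forecast in degC (synthetic)
observation = 21.2
print(f"CRPS = {crps_ensemble(ensemble, observation):.3f} degC")
```

Averaging this score over many forecast-observation pairs, and comparing against a reference forecast, yields the CRPSS in the same way the BSS is built from the BS.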
Probabilistic thinking and death anxiety: a terror management based study.
Hayslip, Bert; Schuler, Eric R; Page, Kyle S; Carver, Kellye S
2014-01-01
Terror Management Theory has been utilized to understand how death can change behavioral outcomes and social dynamics. One area that is not well researched is why individuals willingly engage in risky behavior that could accelerate their mortality. One method of distancing a potential life threatening outcome when engaging in risky behaviors is through stacking probability in favor of the event not occurring, termed probabilistic thinking. The present study examines the creation and psychometric properties of the Probabilistic Thinking scale in a sample of young, middle aged, and older adults (n = 472). The scale demonstrated adequate internal consistency reliability for each of the four subscales, excellent overall internal consistency, and good construct validity regarding relationships with measures of death anxiety. Reliable age and gender effects in probabilistic thinking were also observed. The relationship of probabilistic thinking as part of a cultural buffer against death anxiety is discussed, as well as its implications for Terror Management research.
Fuzzy-probabilistic model for risk assessment of radioactive material railway transportation.
Avramenko, M; Bolyatko, V; Kosterev, V
2005-01-01
Transportation of radioactive materials is obviously accompanied by a certain risk. A model for risk assessment of emergency situations and terrorist attacks may be useful for choosing possible routes and for comparing the various defence strategies. In particular, risk assessment is crucial for safe transportation of excess weapons-grade plutonium arising from the removal of plutonium from military employment. A fuzzy-probabilistic model for risk assessment of railway transportation has been developed, taking into account the different natures of the risk-affecting parameters (some probabilistic, others not probabilistic but fuzzy). Fuzzy set theory methods as well as standard methods of probability theory have been used for quantitative risk assessment. Information-preserving transformations are applied to realise the correct aggregation of probabilistic and fuzzy parameters. Estimations have also been made of the inhalation doses resulting from possible accidents during plutonium transportation. The obtained data show the scale of possible consequences that may arise from plutonium transportation accidents.
Do probabilistic forecasts lead to better decisions?
NASA Astrophysics Data System (ADS)
Ramos, M. H.; van Andel, S. J.; Pappenberger, F.
2012-12-01
The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started paying attention to ways of communicating the probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real-time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss if indeed we make better decisions on the basis of probabilistic forecasts.
Do probabilistic forecasts lead to better decisions?
NASA Astrophysics Data System (ADS)
Ramos, M. H.; van Andel, S. J.; Pappenberger, F.
2013-06-01
The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.
NASA Astrophysics Data System (ADS)
Smith, L. A.
2012-04-01
There is, at present, no attractive foundation for quantitative probabilistic decision support in the face of model inadequacy, or given ambiguity (deep uncertainty) regarding the relative likelihood of various outcomes, known or unknown. True model error arguably precludes the extraction of objective probabilities from an ensemble of model runs drawn from an available (inadequate) model class, while the acknowledgement of incomplete understanding precludes the justified use of (if not the very formation of) an individual's subjective probabilities. An alternative approach based on Sustainable Odds is proposed and investigated. Sustainable Odds differ from "fair odds" (and are easily distinguished from any claim implying well-defined probabilities) in that the probabilities implied by sustainable odds, summed over all outcomes, are expected to exceed one. Traditionally, a person's fair odds are found by identifying the probability level at which one would happily accept either side of a bet; thus the probabilities implied by fair odds always sum to one. Knowing that one has incomplete information and perhaps even erroneous beliefs, there is no compelling reason a rational agent should accept the constraint implied by "fair odds" in any bet. Rather, a rational agent might insist on longer odds both on the event and against the event in order to account for acknowledged ignorance. Let probabilistic odds denote any set of odds for which the implied probabilities sum to one; once model error is acknowledged, can one rationally demand non-probabilistic odds? The danger of using fair odds (or probabilities) in decision making is illustrated by considering the risk of ruin to which a cooperative insurance scheme using probabilistic odds is exposed. Cases where knowing merely that the insurer's model is imperfect, and nothing else, is sufficient to place bets which drive the insurer to an unexpectedly early ruin are presented. Methodologies which allow the insurer to avoid this early ruin are explored; those which prevent early ruin are said to provide "sustainable odds", and it is suggested that these must be non-probabilistic. The aim here is not for the insurance cooperative to make a profit in the long run (or to form a book in any one round) but rather to increase the chance that the cooperative will not go bust, merely breaking even in the long run and thereby continuing to provide a service. In the perfect model scenario, with complete knowledge of all uncertainties and unlimited computational resources, fair odds may prove to be sustainable. The implications these results hold in the case of games against nature, which is perhaps a more relevant context for decision makers concerned with geophysical systems, are discussed. The claim that acknowledged model error makes fair (probabilistic) odds an irrational aim is considered, as are the challenges of working within the framework of sustainable (but non-probabilistic) odds.
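The ruin argument can be made concrete with a stylized simulation: an insurer prices claims from a slightly wrong model, either at fair odds or at longer-than-fair ("sustainable") odds whose implied probabilities sum to more than one. All numbers below are illustrative and do not come from the abstract's experiments.

```python
import numpy as np

rng = np.random.default_rng(2024)

def ruin_probability(premium_factor, p_true=0.012, p_model=0.010,
                     payout=100.0, n_policies=500, n_rounds=200,
                     start_capital=500.0, n_sims=2000):
    """Fraction of simulations in which the insurer's capital goes negative.
    premium_factor = 1.0 -> fair odds under the (imperfect) model;
    premium_factor > 1.0 -> 'sustainable' odds (implied probabilities sum > 1)."""
    premium = premium_factor * p_model * payout
    ruined = 0
    for _ in range(n_sims):
        capital = start_capital
        for _ in range(n_rounds):
            claims = rng.binomial(n_policies, p_true)          # true claim frequency differs from the model
            capital += n_policies * premium - claims * payout
            if capital < 0:
                ruined += 1
                break
    return ruined / n_sims

for factor in (1.0, 1.3):
    print(f"premium factor {factor}: ruin probability = {ruin_probability(factor):.2f}")
```

With the model underestimating the true claim probability, fair-odds pricing drifts toward ruin, while the padded premium (non-probabilistic in the sense above) keeps the cooperative solvent in most runs.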
‘Particle genetics’: treating every cell as unique
Yvert, Gaël
2014-01-01
Genotype-phenotype relations are usually inferred from a deterministic point of view. For example, quantitative trait loci (QTL), which describe regions of the genome associated with a particular phenotype, are based on a mean trait difference between genotype categories. However, living systems comprise huge numbers of cells (the ‘particles’ of biology). Each cell can exhibit substantial phenotypic individuality, which can have dramatic consequences at the organismal level. Now, with technology capable of interrogating individual cells, it is time to consider how genotypes shape the probability laws of single cell traits. The possibility of mapping single cell probabilistic trait loci (PTL), which link genomic regions to probabilities of cellular traits, is a promising step in this direction. This approach requires thinking about phenotypes in probabilistic terms, a concept that statistical physicists have been applying to particles for a century. Here, I describe PTL and discuss their potential to enlarge our understanding of genotype-phenotype relations. PMID:24315431
Damage Tolerance and Reliability of Turbine Engine Components
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1999-01-01
This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
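As an illustration of the propagation step only (not the report's code or data), the following sketch samples hypothetical "primitive" variables with assumed scatter, maps them through a simple stress model, and reads off a structural-scale failure probability.

```python
"""Minimal sketch of propagating scatter in 'primitive' material variables to a
structural-scale reliability figure via Monte Carlo sampling.  The distributions
and the load model are invented for illustration only."""
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Primitive variables with assumed scatter ranges (hypothetical values).
strength = rng.normal(900.0, 60.0, n)                        # MPa, ultimate strength
thickness = rng.normal(5.0, 0.15, n)                         # mm, section thickness
load = rng.lognormal(mean=np.log(60.0), sigma=0.25, size=n)  # kN, applied load

# Propagate to the structural scale: stress in a simple rectangular section.
width_mm = 20.0
stress = load * 1e3 / (width_mm * thickness)   # kN -> N, then N/mm^2 = MPa

# Damage-tolerance style output: probability that demand exceeds capacity.
p_fail = np.mean(stress > strength)
print(f"estimated failure probability: {p_fail:.2e}")
print(f"reliability: {1.0 - p_fail:.6f}")
```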
Galtier, N; Boursot, P
2000-03-01
A new, model-based method was devised to locate nucleotide changes in a given phylogenetic tree. For each site, the posterior probability of any possible change in each branch of the tree is computed. This probabilistic method is a valuable alternative to the maximum parsimony method when base composition is skewed (i.e., different from 25% A, 25% C, 25% G, 25% T): computer simulations showed that parsimony misses more rare --> common than common --> rare changes, resulting in biased inferred change matrices, whereas the new method appeared unbiased. The probabilistic method was applied to the analysis of the mutation and substitution processes in the mitochondrial control region of mouse. Distinct change patterns were found at the polymorphism (within species) and divergence (between species) levels, rejecting the hypothesis of a neutral evolution of base composition in mitochondrial DNA.
Research on response spectrum of dam based on scenario earthquake
NASA Astrophysics Data System (ADS)
Zhang, Xiaoliang; Zhang, Yushan
2017-10-01
Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. Firstly, the potential seismic source that contributes most to the hazard at the site is identified on the basis of the results of probabilistic seismic hazard analysis (PSHA). Secondly, the magnitude and epicentral distance of the scenario earthquake are calculated from the main faults and historical earthquakes of that potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum obtained from the scenario-earthquake method is lower than the probability-consistent response spectrum obtained from the PSHA method. The analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and the structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods; it is easier to accept and provides a basis for the seismic design of hydraulic engineering structures.
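The final step, mapping the scenario magnitude and distance to a spectrum through an attenuation relation, can be sketched as follows; the functional form is generic and the coefficients are placeholders, not coefficients from any published NGA model.

```python
"""Illustrative sketch only: evaluating a scenario spectrum from a magnitude /
distance pair with a generic ground-motion model of the form
ln(SA) = b1 + b2*(M - 6) - b3*ln(sqrt(R^2 + h^2)).  The coefficients below are
placeholders, not coefficients of any published NGA relation."""
import math

# Hypothetical scenario identified from PSHA deaggregation.
magnitude = 6.5          # moment magnitude
distance_km = 20.0       # epicentral distance

# Placeholder coefficients per spectral period: {period_s: (b1, b2, b3, h_km)}.
COEFFS = {
    0.1: (1.2, 0.90, 1.10, 6.0),
    0.2: (1.4, 0.95, 1.05, 6.0),
    0.5: (0.9, 1.00, 1.00, 5.0),
    1.0: (0.3, 1.05, 0.95, 5.0),
    2.0: (-0.4, 1.10, 0.90, 5.0),
}

def spectral_acceleration(m, r, b1, b2, b3, h):
    """Median SA (g) from the generic attenuation form above."""
    return math.exp(b1 + b2 * (m - 6.0) - b3 * math.log(math.hypot(r, h)))

print("T (s)   SA (g)")
for period, (b1, b2, b3, h) in sorted(COEFFS.items()):
    sa = spectral_acceleration(magnitude, distance_km, b1, b2, b3, h)
    print(f"{period:4.1f}   {sa:6.3f}")
```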
A Tutorial in Bayesian Potential Outcomes Mediation Analysis.
Miočević, Milica; Gonzalez, Oscar; Valente, Matthew J; MacKinnon, David P
2018-01-01
Statistical mediation analysis is used to investigate intermediate variables in the relation between independent and dependent variables. Causal interpretation of mediation analyses is challenging because randomization of subjects to levels of the independent variable does not rule out the possibility of unmeasured confounders of the mediator to outcome relation. Furthermore, commonly used frequentist methods for mediation analysis compute the probability of the data given the null hypothesis, which is not the probability of a hypothesis given the data as in Bayesian analysis. Under certain assumptions, applying the potential outcomes framework to mediation analysis allows for the computation of causal effects, and statistical mediation in the Bayesian framework gives indirect effects probabilistic interpretations. This tutorial combines causal inference and Bayesian methods for mediation analysis so the indirect and direct effects have both causal and probabilistic interpretations. Steps in Bayesian causal mediation analysis are shown in the application to an empirical example.
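A minimal sketch of the Bayesian ingredient alone, not the tutorial's code or its potential-outcomes machinery, is shown below: with simulated data and a standard noninformative prior, posterior draws of the two regression coefficients give the indirect effect a probabilistic (credible-interval) interpretation.

```python
"""Minimal sketch: posterior of the indirect effect a*b in a single-mediator
model, using conjugate Bayesian linear regression under the reference prior
p(beta, sigma^2) proportional to 1/sigma^2.  Data are simulated; true a = 0.5,
true b = 0.7.  Not the tutorial's code."""
import numpy as np

rng = np.random.default_rng(42)
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(scale=1.0, size=n)             # mediator model, true a = 0.5
y = 0.7 * m + 0.3 * x + rng.normal(scale=1.0, size=n)   # outcome model, true b = 0.7

def posterior_coef_samples(X, y, n_draws, rng):
    """Draw regression coefficients from the posterior under the reference prior."""
    n_obs, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y
    resid = y - X @ beta_hat
    rss = resid @ resid
    draws = np.empty((n_draws, k))
    for i in range(n_draws):
        sigma2 = rss / rng.chisquare(n_obs - k)          # scaled inverse chi-square draw
        draws[i] = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
    return draws

n_draws = 5000
Xm = np.column_stack([np.ones(n), x])        # intercept + x      -> coefficient 'a' on x
Xy = np.column_stack([np.ones(n), m, x])     # intercept + m + x  -> coefficient 'b' on m
a_draws = posterior_coef_samples(Xm, m, n_draws, rng)[:, 1]
b_draws = posterior_coef_samples(Xy, y, n_draws, rng)[:, 1]

indirect = a_draws * b_draws                 # posterior of the indirect effect a*b
lo, hi = np.percentile(indirect, [2.5, 97.5])
print(f"posterior mean indirect effect: {indirect.mean():.3f}")
print(f"95% credible interval: [{lo:.3f}, {hi:.3f}]")
```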
Probabilistic performance-based design for high performance control systems
NASA Astrophysics Data System (ADS)
Micheli, Laura; Cao, Liang; Gong, Yongqiang; Cancelli, Alessandro; Laflamme, Simon; Alipour, Alice
2017-04-01
High performance control systems (HPCS) are advanced damping systems capable of high damping performance over a wide frequency bandwidth, ideal for mitigation of multi-hazards. They include active, semi-active, and hybrid damping systems. However, HPCS are more expensive than typical passive mitigation systems, rely on power and hardware (e.g., sensors, actuators) to operate, and require maintenance. In this paper, a life cycle cost analysis (LCA) approach is proposed to estimate the economic benefit of these systems over the entire life of the structure. The novelty resides in embedding the life cycle cost analysis in a performance-based design (PBD) framework tailored to multi-level wind hazards. This yields a probabilistic performance-based design approach for HPCS. Numerical simulations are conducted on a building located in Boston, MA. LCAs are conducted for passive control systems and HPCS, and the concept of controller robustness is demonstrated. Results highlight the promise of the proposed performance-based design procedure.
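The core of such a comparison can be reduced to a toy expected life-cycle cost calculation; the sketch below uses invented costs, hazard probabilities, and losses purely to illustrate the bookkeeping, not the paper's wind-hazard model.

```python
"""Toy life-cycle cost comparison (illustration only, not the paper's model):
expected cost = initial cost + discounted expected annual losses, where the
annual loss expectation sums over discrete wind-hazard levels.  All costs,
probabilities, and loss values below are hypothetical."""

def expected_life_cycle_cost(initial_cost, hazard_levels, lifetime_years, discount_rate):
    """hazard_levels: list of (annual exceedance probability, expected loss given event)."""
    annual_expected_loss = sum(p * loss for p, loss in hazard_levels)
    # Present-value factor for a constant annual expected loss.
    pv_factor = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, lifetime_years + 1))
    return initial_cost + annual_expected_loss * pv_factor

# Hypothetical systems: a passive damper vs. a high performance control system.
passive = expected_life_cycle_cost(
    initial_cost=1.0e6,
    hazard_levels=[(0.1, 2.0e5), (0.02, 1.2e6), (0.002, 5.0e6)],
    lifetime_years=50, discount_rate=0.03)
hpcs = expected_life_cycle_cost(
    initial_cost=2.5e6,                                            # higher up-front cost
    hazard_levels=[(0.1, 0.5e5), (0.02, 3.0e5), (0.002, 1.5e6)],   # better mitigation
    lifetime_years=50, discount_rate=0.03)

print(f"passive system expected life-cycle cost: {passive:,.0f}")
print(f"HPCS expected life-cycle cost:           {hpcs:,.0f}")
```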
Ge, Yichen; Gong, Zhihong; Olson, James R; Xu, Peilin; Buck, Michael J; Ren, Xuefeng
2013-10-04
Inorganic arsenic (iAs) and its highly toxic metabolite, monomethylarsonous acid (MMA(III)), are able to induce malignant transformation of human cells. Chronic exposure to these chemicals is associated with an increased risk of developing multiple cancers in humans. However, the mechanisms contributing to iAs/MMA(III)-induced cell malignant transformation and carcinogenesis are not fully elucidated. We recently showed that iAs/MMA(III) exposure of human cells led to a decreased level of histone acetylation globally, which was associated with an increased sensitivity to arsenic cytotoxicity. The current study demonstrates that prolonged exposure to low-level MMA(III) in human urothelial cells significantly increased the expression and activity of histone deacetylases (HDACs), with an associated reduction of histone acetylation levels both globally and at specific lysines. Administration of the HDAC inhibitor suberoylanilide hydroxamic acid (SAHA) at 4 weeks after the initial MMA(III) treatment inhibited the MMA(III)-mediated up-regulation of the expression and activities of HDACs, leading to increased histone acetylation and prevention of MMA(III)-induced malignant transformation. These new findings suggest that histone acetylation dysregulation may be a key mechanism in MMA(III)-induced malignant transformation and carcinogenesis, and that HDAC inhibitors could be targeted to prevent or treat iAs-related cancers. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen
2014-05-01
Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city has grown rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, of both fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, in a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics, the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city from gauge data alone. In addition, the available gauge data around Can Tho are too short for a meaningful frequency analysis. The detailed hazard mapping is performed with a 2D hydrodynamic model of Can Tho city. As the scenarios are derived in a Monte-Carlo framework, the final flood hazard maps are probabilistic, i.e. they show the median flood hazard along with uncertainty estimates for each defined level of probability of exceedance. For the pluvial flood hazard, a frequency analysis of the hourly rain gauge data of Can Tho is performed implementing a peak-over-threshold procedure. Based on this frequency analysis, synthetic rain storms are generated in a Monte-Carlo framework for the same probabilities of exceedance as in the fluvial flood hazard analysis. Probabilistic flood hazard maps were then generated with the same 2D hydrodynamic model for the city. In a last step, the fluvial and pluvial scenarios are combined assuming independence of the events. These scenarios were also transferred into hazard maps by the 2D hydrodynamic model, finally yielding combined fluvial-pluvial probabilistic flood hazard maps for Can Tho. The derived set of maps may be used for improved city planning or a flood risk analysis.
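For the pluvial branch, the peak-over-threshold step might look like the following sketch, which uses synthetic hourly rainfall rather than the Can Tho record and scipy's generalized Pareto fit.

```python
"""Sketch of the peaks-over-threshold step only (synthetic data, not the Can Tho
record): fit a generalized Pareto distribution to hourly-rain exceedances and
read off return levels for chosen return periods."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
years = 30
hours = years * 365 * 24
# Synthetic 'hourly rainfall': mostly dry, exponential-tailed wet hours (mm).
wet = rng.random(hours) < 0.05
rain = np.where(wet, rng.exponential(scale=4.0, size=hours), 0.0)

threshold = np.quantile(rain[rain > 0], 0.95)       # POT threshold on wet hours
exceedances = rain[rain > threshold] - threshold
# Fit a GPD to the exceedances, keeping the location fixed at zero.
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

events_per_year = exceedances.size / years
for return_period in (2, 10, 50, 100):
    # Quantile of the exceedance distribution corresponding to the return period.
    p = 1.0 - 1.0 / (return_period * events_per_year)
    level = threshold + stats.genpareto.ppf(p, shape, loc=0.0, scale=scale)
    print(f"{return_period:4d}-yr hourly rainfall: {level:6.1f} mm")
```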
Level III preventive medicine in a counterinsurgency environment.
Licina, Derek J
2008-01-01
As the Department of Defense moves forward to secure Baghdad, military forces are being strategically dispersed in very austere environments. These forces live and work side-by-side with their Iraqi counterparts in an effort to clear, hold, and reconstruct the city block by block, and further separate the insurgents from the general population. Level II preventive medicine (PM) personnel directly support these forces and keep them in the fight by reducing acute illness and disease and nonbattle injuries. Level III PM is performing the traditional PM mission of reducing both acute and chronic illness while conducting Deployment Occupational Environmental Health Surveillance and supporting Level II PM. However, the doctrinal basis of Level III allocation and priorities of core competencies have shifted. Are we meeting the need? This article attempts to answer the question based on experience as a Level III PM detachment commander in Baghdad, and provide recommendations for change across the spectrum of the Army's structure of doctrine, organizations, training, materiel, leadership, education, personnel, and facilities.
Level III and IV Ecoregions by State
Information and links to downloadable maps and datasets for Level III and IV ecoregions, listed by state. Ecoregions are areas of general similarity in the type, quality, and quantity of environmental resources.
Level III and IV Ecoregions by EPA Region
Information and downloadable maps and datasets for Level III and IV ecoregions, listed by EPA region. Ecoregions are areas of general similarity in the type, quality, and quantity of environmental resources.
Control of Walking Speed in Children With Cerebral Palsy.
Davids, Jon R; Cung, Nina Q; Chen, Suzy; Sison-Williamson, Mitell; Bagley, Anita M
2017-03-21
Children's ability to control the speed of gait is important for a wide range of activities. It is thought that the ability to increase the speed of gait for children with cerebral palsy (CP) is common. This study considered 3 hypotheses: (1) most ambulatory children with CP can increase gait speed, (2) the characteristics of free (self-selected) and fast walking are related to motor impairment level, and (3) the strategies used to increase gait speed are distinct among these levels. A retrospective review of time-distance parameters (TDPs) for 212 subjects with CP and 34 typically developing subjects walking at free and fast speeds was performed. Only children who could increase their gait speed above the minimal clinically important difference were defined as having a fast walk. Analysis of variance was used to compare TDPs of children with CP, among Gross Motor Function Classification System (GMFCS) levels, and children in the typically developing group. Eighty-five percent of the CP group (GMFCS I, II, III; 96%, 99%, and 34%, respectively) could increase gait speed on demand. At free speed, children at GMFCS I and II were significantly faster than children at GMFCS level III. At free speed, children at GMFCS I and II had significantly greater stride length than those at GMFCS level III. At free speed, children at GMFCS level III had significantly lower cadence than those at GMFCS I and II. There were no significant differences in cadence among GMFCS levels at fast speeds. There were no significant differences among GMFCS levels for percent change in any TDP between free and fast walking. Almost all children with CP at GMFCS levels I and II can control the speed of gait; however, only one-third at GMFCS level III have this ability. This study suggests that children at GMFCS level III can be divided into 2 groups based on their ability to control gait speed; however, the prognostic significance of such categorization remains to be determined. Diagnostic level II.
Structural reliability assessment capability in NESSUS
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.
1992-01-01
The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.
Structural reliability assessment capability in NESSUS
NASA Astrophysics Data System (ADS)
Millwater, H.; Wu, Y.-T.
1992-07-01
The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.
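The kind of quantity such a code computes can be illustrated with a crude Monte Carlo estimate of a component failure probability for a simple limit state g = resistance - load; the distributions below are assumptions for illustration, not NESSUS inputs or algorithms.

```python
"""Not NESSUS itself: a crude Monte Carlo estimate of the kind of quantity such
codes compute, the probability that a limit state g = resistance - load drops
below zero.  The distributions below are illustrative assumptions."""
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 1_000_000

resistance = rng.lognormal(mean=np.log(250.0), sigma=0.10, size=n)  # capacity, MPa
load_effect = rng.normal(loc=180.0, scale=25.0, size=n)             # demand, MPa

g = resistance - load_effect                  # limit state: failure when g < 0
p_fail = np.mean(g < 0.0)
beta = -norm.ppf(p_fail)                      # generalized reliability index
print(f"P(failure) ~ {p_fail:.2e},  reliability index beta ~ {beta:.2f}")
```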
Probabilistic Composite Design
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1997-01-01
Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and the fabrication process, through composite mechanics, to structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.
Probabilistic population projections with migration uncertainty
Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.
2016-01-01
We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
Application of Probabilistic Analysis to Aircraft Impact Dynamics
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.
2003-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as: geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.
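The response surface idea can be sketched generically: fit an inexpensive polynomial surrogate to a small number of "expensive" runs, then sample the surrogate to quantify uncertainty in the response. The stand-in simulator, inputs, and scatter below are invented for illustration and are not the paper's crash model.

```python
"""Generic response-surface sketch (not the paper's crash model): fit a
quadratic surrogate to a handful of 'expensive' simulation runs, then run the
cheap surrogate in a Monte Carlo loop to estimate the response distribution.
The 'simulator' below is a stand-in analytic function."""
import numpy as np

rng = np.random.default_rng(11)

def expensive_simulation(impact_velocity, floor_stiffness):
    """Stand-in for a nonlinear transient dynamic run (returns peak acceleration, g)."""
    return (0.04 * impact_velocity**2 + 8.0 / floor_stiffness
            + 0.5 * impact_velocity / floor_stiffness)

def quadratic_features(x1, x2):
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# Design of experiments: a small grid of runs of the expensive model.
v_pts, k_pts = np.meshgrid(np.linspace(8, 12, 4), np.linspace(0.8, 1.2, 4))
v_pts, k_pts = v_pts.ravel(), k_pts.ravel()
responses = expensive_simulation(v_pts, k_pts)
coef, *_ = np.linalg.lstsq(quadratic_features(v_pts, k_pts), responses, rcond=None)

# Monte Carlo on the surrogate with assumed input scatter (hypothetical values).
n = 100_000
v = rng.normal(10.0, 0.8, n)        # impact velocity, m/s
k = rng.normal(1.0, 0.08, n)        # normalized floor stiffness
peak_acc = quadratic_features(v, k) @ coef
print(f"mean peak acceleration: {peak_acc.mean():.2f} g")
print(f"99th percentile:        {np.percentile(peak_acc, 99):.2f} g")
```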
Emergence of spontaneous anticipatory hand movements in a probabilistic environment
Bruhn, Pernille
2013-01-01
In this article, we present a novel experimental approach to the study of anticipation in probabilistic cuing. We implemented a modified spatial cuing task in which participants made an anticipatory hand movement toward one of two probabilistic targets while the (x, y)-computer mouse coordinates of their hand movements were sampled. This approach allowed us to tap into anticipatory processes as they occurred, rather than just measuring their behavioral outcome through reaction time to the target. In different conditions, we varied the participants’ degree of certainty of the upcoming target position with probabilistic pre-cues. We found that participants initiated spontaneous anticipatory hand movements in all conditions, even when they had no information on the position of the upcoming target. However, participants’ hand position immediately before the target was affected by the degree of certainty concerning the target’s position. This modulation of anticipatory hand movements emerged rapidly in most participants as they encountered a constant probabilistic relation between a cue and an upcoming target position over the course of the experiment. Finally, we found individual differences in the way anticipatory behavior was modulated with an uncertain/neutral cue. Implications of these findings for probabilistic spatial cuing are discussed. PMID:23833694
Probabilistic graphs as a conceptual and computational tool in hydrology and water management
NASA Astrophysics Data System (ADS)
Schoups, Gerrit
2014-05-01
Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
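A minimal discrete example of the three components, representation as factors, probability calculus, and inference, is sketched below with an invented three-variable toy model (wet season, high streamflow, gauge alarm); it is not a real hydrological application.

```python
"""Small sketch of the 'graph + factors + inference' idea with a three-node
discrete toy model: wet season S, high streamflow Q, gauge alarm A.  The joint
is the product of local factors P(S) P(Q|S) P(A|Q); the posterior P(S|A=1)
follows from the sum and product rules.  All probabilities are illustrative."""
import numpy as np

# Local factors (rows index the conditioning variable).
p_s = np.array([0.7, 0.3])                    # P(S):      S=0 dry, S=1 wet
p_q_given_s = np.array([[0.9, 0.1],           # P(Q|S=0):  Q=0 low, Q=1 high
                        [0.4, 0.6]])          # P(Q|S=1)
p_a_given_q = np.array([[0.95, 0.05],         # P(A|Q=0):  A=0 no alarm, A=1 alarm
                        [0.20, 0.80]])        # P(A|Q=1)

# Joint distribution as the product of factors: P(S, Q, A) = P(S) P(Q|S) P(A|Q).
joint = p_s[:, None, None] * p_q_given_s[:, :, None] * p_a_given_q[None, :, :]

# Condition on the observation A = 1 and marginalize out Q (sum rule).
posterior_s = joint[:, :, 1].sum(axis=1)
posterior_s /= posterior_s.sum()
print(f"P(wet season | alarm) = {posterior_s[1]:.3f}")   # prior was 0.30
```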
Probabilistic Polling And Voting In The 2008 Presidential Election
Delavande, Adeline; Manski, Charles F.
2010-01-01
This article reports new empirical evidence on probabilistic polling, which asks persons to state in percent-chance terms the likelihood that they will vote and for whom. Before the 2008 presidential election, seven waves of probabilistic questions were administered biweekly to participants in the American Life Panel (ALP). Actual voting behavior was reported after the election. We find that responses to the verbal and probabilistic questions are well-aligned ordinally. Moreover, the probabilistic responses predict voting behavior beyond what is possible using verbal responses alone. The probabilistic responses have more predictive power in early August, and the verbal responses have more power in late October. However, throughout the sample period, one can predict voting behavior better using both types of responses than either one alone. Studying the longitudinal pattern of responses, we segment respondents into those who are consistently pro-Obama, consistently anti-Obama, and undecided/vacillators. Membership in the consistently pro- or anti-Obama group is an almost perfect predictor of actual voting behavior, while the undecided/vacillators group has more nuanced voting behavior. We find that treating the ALP as a panel improves predictive power: current and previous polling responses together provide more predictive power than do current responses alone. PMID:24683275
Probabilistic drug connectivity mapping
2014-01-01
Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351
Probabilistic numerics and uncertainty in computations
Hennig, Philipp; Osborne, Michael A.; Girolami, Mark
2015-01-01
We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
Probabilistic numerics and uncertainty in computations.
Hennig, Philipp; Osborne, Michael A; Girolami, Mark
2015-07-08
We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
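One of the simplest concrete instances is Bayesian quadrature, where a Gaussian-process prior on the integrand yields an integral estimate together with an uncertainty. The sketch below is an illustration under stated assumptions (RBF kernel, standard normal integration measure, hand-picked length-scale and nodes), not code from the paper.

```python
"""Toy Bayesian quadrature sketch (a simple probabilistic numerical method):
a Gaussian-process prior on the integrand turns the integral of f(x) against
the standard normal density into a Gaussian posterior, i.e. an estimate with
an uncertainty.  Kernel, length-scale, nodes, and test integrand are choices
made for this illustration."""
import numpy as np

def rbf(a, b, ell):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def bayesian_quadrature(f, nodes, ell=0.8, jitter=1e-10):
    """Posterior mean and standard deviation of the integral of f against N(0, 1)."""
    y = f(nodes)
    K = rbf(nodes, nodes, ell) + jitter * np.eye(nodes.size)
    # Closed-form kernel integrals against the standard normal measure.
    z = np.sqrt(ell**2 / (ell**2 + 1.0)) * np.exp(-nodes**2 / (2.0 * (ell**2 + 1.0)))
    c = np.sqrt(ell**2 / (ell**2 + 2.0))
    mean = z @ np.linalg.solve(K, y)
    var = max(c - z @ np.linalg.solve(K, z), 0.0)
    return mean, np.sqrt(var)

f = lambda x: np.exp(-x**2)                     # true integral against N(0,1) is 1/sqrt(3)
for n in (3, 5, 9):
    nodes = np.linspace(-3.0, 3.0, n)
    mean, sd = bayesian_quadrature(f, nodes)
    print(f"n={n}:  estimate = {mean:.5f} +/- {sd:.1e}   (truth {1/np.sqrt(3):.5f})")
```

As more nodes are added, the posterior standard deviation shrinks, which is exactly the "uncertainty in the calculation" the abstract argues numerical methods should report.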
NASA Astrophysics Data System (ADS)
Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang
2017-05-01
Distributed generation (DG) integrated into the network introduces harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. Moreover, because of the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after integration of a distributed photovoltaic (DPV) system under different weather conditions, namely sunny, cloudy, rainy and snowy days. The probability distribution function of the DPV output power in each typical weather condition was obtained via maximum likelihood parameter estimation. The Monte-Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different frequency orders as well as the total harmonic distortion (THD) in the typical weather conditions. The case study was based on the IEEE 33-bus system, and the resulting probabilistic distributions of harmonic voltage content and THD in the typical weather conditions were compared.
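The Monte-Carlo step can be sketched in a few lines; the weather mixture, inverter harmonic spectrum, and network impedances below are invented placeholders rather than the IEEE 33-bus data used in the study.

```python
"""Illustrative Monte Carlo sketch (not the paper's IEEE 33-bus study): sample a
weather-dependent PV output level, map it to harmonic current injections with a
crude linear model, and build the distribution of voltage THD at one bus.  All
impedances, harmonic spectra, and weather mixtures are invented."""
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

# PV output fraction drawn from a simple two-regime 'weather' mixture
# (clear vs. overcast), standing in for the fitted per-weather distributions.
clear = rng.random(n) < 0.6
pv_output = np.where(clear, rng.beta(8, 2, n), rng.beta(2, 6, n))   # p.u.

# Hypothetical inverter harmonic current spectrum (fraction of the fundamental).
harmonic_orders = np.array([5, 7, 11, 13])
current_fraction = np.array([0.035, 0.025, 0.012, 0.008])

# Crude bus model: harmonic voltage = harmonic current x harmonic impedance,
# with impedance growing linearly with harmonic order (placeholder model).
v1 = 1.0                                           # fundamental voltage, p.u.
z_per_order = 0.02                                 # p.u. impedance per harmonic order
vh = pv_output[:, None] * current_fraction[None, :] * (z_per_order * harmonic_orders)
thd = np.sqrt((vh**2).sum(axis=1)) / v1 * 100.0    # THD in percent

print(f"median THD: {np.median(thd):.3f} %")
print(f"95th percentile THD: {np.percentile(thd, 95):.3f} %")
```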
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on a fuzzy neural network regression (FR) model (termed DCFRM) is proposed, integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea behind DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated with the proposed method, considering fluid-structure interaction. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and the overall failure mode of the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. A comparison of methods shows that DCFRM improves the computing efficiency of probabilistic analysis for multi-failure structures while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby also enriches the theory and methods of mechanical reliability design.
Hard and Soft Constraints in Reliability-Based Design Optimization
NASA Technical Reports Server (NTRS)
Crespo, L.uis G.; Giesy, Daniel P.; Kenny, Sean P.
2006-01-01
This paper proposes a framework for the analysis and design optimization of models subject to parametric uncertainty where design requirements in the form of inequality constraints are present. Emphasis is given to uncertainty models prescribed by norm bounded perturbations from a nominal parameter value and by sets of componentwise bounded uncertain variables. These models, which often arise in engineering problems, allow for a sharp mathematical manipulation. Constraints can be implemented in the hard sense, i.e., constraints must be satisfied for all parameter realizations in the uncertainty model, and in the soft sense, i.e., constraints can be violated by some realizations of the uncertain parameter. In regard to hard constraints, this methodology allows (i) to determine if a hard constraint can be satisfied for a given uncertainty model and constraint structure, (ii) to generate conclusive, formally verifiable reliability assessments that allow for unprejudiced comparisons of competing design alternatives and (iii) to identify the critical combination of uncertain parameters leading to constraint violations. In regard to soft constraints, the methodology allows the designer (i) to use probabilistic uncertainty models, (ii) to calculate upper bounds to the probability of constraint violation, and (iii) to efficiently estimate failure probabilities via a hybrid method. This method integrates the upper bounds, for which closed form expressions are derived, along with conditional sampling. In addition, an l∞ formulation for the efficient manipulation of hyper-rectangular sets is also proposed.
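A much-simplified illustration of the hard/soft distinction (not the paper's bounds or hybrid sampling method) is given below: the worst case of a requirement function over a componentwise-bounded box is checked for the hard constraint, and a violation probability is estimated by sampling for the soft constraint.

```python
"""Simplified illustration (not the paper's hybrid method): for a design
requirement g(p) <= 0 under a componentwise-bounded uncertainty model, check
the hard-constraint worst case over the box, and estimate the soft-constraint
violation probability under a uniform probabilistic model on the same box.
The constraint function and bounds are invented."""
import itertools
import numpy as np

def g(p):
    """Example requirement function: negative means the requirement is met."""
    p1, p2 = p[..., 0], p[..., 1]
    return 0.8 * p1**2 + 1.2 * p2 - 1.0

lower = np.array([-1.0, -0.5])
upper = np.array([1.0, 0.6])

# Hard constraint: g is convex in p here, so its maximum over the box is
# attained at a vertex and checking the vertices is sufficient.
vertices = np.array(list(itertools.product(*zip(lower, upper))))
worst = g(vertices).max()
print(f"worst-case g over the box: {worst:+.3f}  "
      f"({'hard constraint violated' if worst > 0 else 'hard constraint satisfied'})")

# Soft constraint: probability of violation under a uniform distribution on the box.
rng = np.random.default_rng(9)
samples = rng.uniform(lower, upper, size=(200_000, 2))
p_violate = np.mean(g(samples) > 0.0)
print(f"estimated violation probability: {p_violate:.4f}")
```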
Donati, Maria Anna; Panno, Angelo; Chiesi, Francesca; Primi, Caterina
2014-01-01
This study tested the mediating role of probabilistic reasoning ability in the relationship between fluid intelligence and advantageous decision making among adolescents in explicit situations of risk--that is, in contexts in which information on the choice options (gains, losses, and probabilities) were explicitly presented at the beginning of the task. Participants were 282 adolescents attending high school (77% males, mean age = 17.3 years). We first measured fluid intelligence and probabilistic reasoning ability. Then, to measure decision making under explicit conditions of risk, participants performed the Game of Dice Task, in which they have to decide among different alternatives that are explicitly linked to a specific amount of gain or loss and have obvious winning probabilities that are stable over time. Analyses showed a significant positive indirect effect of fluid intelligence on advantageous decision making through probabilistic reasoning ability that acted as a mediator. Specifically, fluid intelligence may enhance ability to reason in probabilistic terms, which in turn increases the likelihood of advantageous choices when adolescents are confronted with an explicit decisional context. Findings show that in experimental paradigm settings, adolescents are able to make advantageous decisions using cognitive abilities when faced with decisions under explicit risky conditions. This study suggests that interventions designed to promote probabilistic reasoning, for example by incrementing the mathematical prerequisites necessary to reason in probabilistic terms, may have a positive effect on adolescents' decision-making abilities.
Li, Zhixi; Peck, Kyung K.; Brennan, Nicole P.; Jenabi, Mehrnaz; Hsu, Meier; Zhang, Zhigang; Holodny, Andrei I.; Young, Robert J.
2014-01-01
Purpose The purpose of this study was to compare the deterministic and probabilistic tracking methods of diffusion tensor white matter fiber tractography in patients with brain tumors. Materials and Methods We identified 29 patients with left brain tumors <2 cm from the arcuate fasciculus who underwent pre-operative language fMRI and DTI. The arcuate fasciculus was reconstructed using a deterministic Fiber Assignment by Continuous Tracking (FACT) algorithm and a probabilistic method based on an extended Monte Carlo Random Walk algorithm. Tracking was controlled using two ROIs corresponding to Broca’s and Wernicke’s areas. Tracts in tumor-affected hemispheres were examined for extension between Broca’s and Wernicke’s areas, anterior-posterior length and volume, and compared with the normal contralateral tracts. Results Probabilistic tracts displayed more complete anterior extension to Broca’s area than did FACT tracts on the tumor-affected and normal sides (p < 0.0001). The median length ratio for tumor: normal sides was greater for probabilistic tracts than FACT tracts (p < 0.0001). The median tract volume ratio for tumor: normal sides was also greater for probabilistic tracts than FACT tracts (p = 0.01). Conclusion Probabilistic tractography reconstructs the arcuate fasciculus more completely and performs better through areas of tumor and/or edema. The FACT algorithm tends to underestimate the anterior-most fibers of the arcuate fasciculus, which are crossed by primary motor fibers. PMID:25328583
About Essence of the Wave Function on Atomic Level and in Superconductors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikulov, A. V.
The wave function was proposed for the description of quantum phenomena on the atomic level. But it is now well known that quantum phenomena are observed not only on the atomic level, and the wave function is used for the description of macroscopic quantum phenomena, such as superconductivity. The essence of the wave function on the level of elementary particles was and is the subject of heated argument among the founders of quantum mechanics and other physicists. This essence seems clearer in superconductors. But the impossibility of a probabilistic interpretation of the wave function in this case results in an obvious contradiction between quantum principles and some fundamental principles of physics.
Pirillo, Angela; Catapano, Alberico Luigi
2015-05-01
A direct relationship between high plasma triglyceride (TG) levels and increased risk of cardiovascular disease has been shown in several studies. TG are present in the blood associated with different lipoprotein classes, including hepatically-derived very low density lipoproteins (VLDL) and intestinally-derived chylomicrons. Lipoprotein lipase (LPL) is a key enzyme that hydrolyzes TG, releasing free fatty acids that accumulate in peripheral tissues and remnant lipoproteins, which are then cleared by the liver. LPL activity is finely modulated by several cofactors, including apolipoprotein C-III (apoC-III), which acts as an LPL inhibitor. The key role of apoC-III has been established in several studies: animal models lacking the APOC3 gene exhibit reduced plasma TG levels, whereas overexpression of the APOC3 gene leads to increased TG levels. In humans, several mutations in the APOC3 gene have been identified that lead to lower apoC-III levels and are associated with reduced plasma TG levels. Recently, these mutations were found to be associated with a reduced risk of cardiovascular ischemia and coronary heart disease, thus confirming the negative role of apoC-III in TG metabolism and suggesting apoC-III as a possible therapeutic target for the management of hypertriglyceridemia.
Probabilistic properties of wavelets in kinetic surface roughening
NASA Astrophysics Data System (ADS)
Bershadskii, A.
2001-08-01
Using the data of a recent numerical simulation [M. Ahr and M. Biehl, Phys. Rev. E 62, 1773 (2000)] of homoepitaxial growth it is shown that the observed probability distribution of a wavelet based measure of the growing surface roughness is consistent with a stretched log-normal distribution and the corresponding branching dimension depends on the level of particle desorption.
The Dynamics of Scaling: A Memory-Based Anchor Model of Category Rating and Absolute Identification
ERIC Educational Resources Information Center
Petrov, Alexander A.; Anderson, John R.
2005-01-01
A memory-based scaling model--ANCHOR--is proposed and tested. The perceived magnitude of the target stimulus is compared with a set of anchors in memory. Anchor selection is probabilistic and sensitive to similarity, base-level strength, and recency. The winning anchor provides a reference point near the target and thereby converts the global…
NHANES subjects self-identified as “Asian, Pacific Islander, Native American, or multiracial” (A/P/N/M) have higher levels of blood organic mercury than other racial/ethnic groups; however, the reasons for this have been unclear. This research uses exposure modeling to determine ...