BTC method for evaluation of remaining strength and service life of bridge cables.
DOT National Transportation Integrated Search
2011-09-01
"This report presents the BTC method; a comprehensive state-of-the-art methodology for evaluation of remaining : strength and residual life of bridge cables. The BTC method is a probability-based, proprietary, patented, and peerreviewed : methodology...
NASA Technical Reports Server (NTRS)
Berg, M. D.; Kim, H. S.; Friendlich, M. A.; Perez, C. E.; Seidlick, C. M.; LaBel, K. A.
2011-01-01
We present SEU test and analysis of the Microsemi ProASIC3 FPGA. SEU Probability models are incorporated for device evaluation. Included is a comparison to the RTAXS FPGA illustrating the effectiveness of the overall testing methodology.
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering
NASA Astrophysics Data System (ADS)
Hynes-Griffin, M. E.; Buege, L. L.
1983-09-01
Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.
Probabilistic Risk Analysis of Run-up and Inundation in Hawaii due to Distant Tsunamis
NASA Astrophysics Data System (ADS)
Gica, E.; Teng, M. H.; Liu, P. L.
2004-12-01
Risk assessment of natural hazards usually includes two aspects, namely, the probability of the natural hazard occurring and the degree of damage it causes. Our current study focuses on the first aspect, i.e., the development and evaluation of a methodology that can predict the probability of coastal inundation due to distant tsunamis in the Pacific Basin. Calculating the probability of tsunami inundation would be a simple statistical problem if a sufficiently long record of field data on inundation were available. Unfortunately, such field data are very limited in the Pacific Basin because field measurement of inundation requires the physical presence of surveyors on site; in some areas, no field measurements have ever been conducted. Fortunately, more complete and reliable historical data on earthquakes in the Pacific Basin exist, partly because earthquakes can be measured remotely. There are also numerical simulation models, such as the Cornell COMCOT model, that can predict tsunami generation by an earthquake, propagation in the open ocean, and inundation of coastal land. Our objective is to develop a methodology that links the probability of earthquakes in the Pacific Basin with the inundation probability in a coastal area. The probabilistic methodology applied here involves the following steps: first, the Pacific Rim is divided into blocks of potential earthquake sources based on the past earthquake record and fault information. The COMCOT model is then used to predict the inundation at a distant coastal area due to a tsunami generated by an earthquake of a particular magnitude in each source block. This simulation generates a response relationship between the coastal inundation and an earthquake of a particular magnitude and location. Since the earthquake statistics are known for each block, summing the probability of all earthquakes in the Pacific Rim yields, through the response relationship, the probability of inundation in a coastal area. Although the idea of the statistical methodology applied here is not new, this study is the first to apply it to the probability of inundation caused by earthquake-generated distant tsunamis in the Pacific Basin. As a case study, the methodology is applied to predict the tsunami inundation risk in Hilo Bay, Hawaii. Since relatively more field data on tsunami inundation are available for Hilo Bay, this case study helps evaluate the applicability of the methodology for predicting tsunami inundation risk in the Pacific Basin. Detailed results will be presented at the AGU meeting.
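As a rough illustration of the summation step this abstract describes, the Python sketch below sums the annual rates of hypothetical Pacific Rim source blocks whose predicted inundation at a coastal site exceeds a threshold, then converts the total rate to an annual exceedance probability under a Poisson assumption. The block parameters, Gutenberg-Richter rates, and the stand-in response function are illustrative placeholders, not values from the study or from COMCOT.

```python
import numpy as np

# Hypothetical source blocks around the Pacific Rim: each has a Gutenberg-Richter
# a/b pair and a distance-dependent attenuation factor (all placeholders).
blocks = [
    {"name": "Alaska",  "a": 4.5, "b": 1.0, "atten": 0.8},
    {"name": "Chile",   "a": 4.8, "b": 1.0, "atten": 0.6},
    {"name": "Kuriles", "a": 4.3, "b": 0.9, "atten": 0.7},
]

magnitudes = np.arange(7.0, 9.3, 0.1)   # magnitude bins considered tsunamigenic
inundation_threshold = 2.0              # metres of inundation at the coastal site

def annual_rate(a, b, m, dm=0.1):
    """Annual rate of earthquakes in a magnitude bin (Gutenberg-Richter)."""
    return 10 ** (a - b * m) - 10 ** (a - b * (m + dm))

def response(block, m):
    """Stand-in for the COMCOT-derived response relationship:
    predicted inundation (m) at the site for a magnitude-m event in the block."""
    return block["atten"] * max(0.0, m - 7.5) ** 1.5

# Sum, over all blocks and magnitude bins, the annual rate of events whose
# predicted inundation exceeds the threshold at the site.
lam = sum(
    annual_rate(blk["a"], blk["b"], m)
    for blk in blocks
    for m in magnitudes
    if response(blk, m) > inundation_threshold
)

# Assuming Poisson occurrence, convert the total rate to an annual exceedance probability.
p_annual = 1.0 - np.exp(-lam)
print(f"annual rate = {lam:.4f}, annual exceedance probability = {p_annual:.4f}")
```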
A methodology for stochastic analysis of share prices as Markov chains with finite states.
Mettle, Felix Okoe; Quaye, Enoch Nii Boi; Laryea, Ravenhill Adjetey
2014-01-01
Price volatility makes stock investments risky, leaving investors in a critical position when decisions must be made under uncertainty. To improve investors' confidence in evaluating exchange markets without using time-series methodology, we specify equity price change as a stochastic process assumed to possess Markov dependency, with state transition probability matrices defined over the identified state space (i.e., decrease, stable, or increase). We established that the identified states communicate and that the chains are aperiodic and ergodic, thus possessing limiting distributions. We developed a methodology for determining the expected mean return time for stock price increases and also established criteria for improving investment decisions based on the highest transition probabilities, lowest mean return time, and highest limiting distributions. We further developed an R algorithm for running the introduced methodology. The established methodology is applied to selected equities from Ghana Stock Exchange weekly trading data.
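The core calculations named in this abstract, estimating a three-state transition matrix from a price series, finding the limiting distribution, and computing mean return times, can be sketched as follows. The abstract mentions an R implementation; the snippet below is an independent Python illustration on made-up weekly state data, not the authors' algorithm or Ghana Stock Exchange data.

```python
import numpy as np

# Weekly price changes mapped to the three states: 0 = decrease, 1 = stable, 2 = increase.
states = np.array([2, 2, 1, 0, 0, 2, 1, 2, 0, 2, 2, 1, 2, 0, 1, 2, 2, 0, 2, 1])

n = 3
counts = np.zeros((n, n))
for s, t in zip(states[:-1], states[1:]):
    counts[s, t] += 1
P = counts / counts.sum(axis=1, keepdims=True)   # transition probability matrix

# Limiting (stationary) distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# Expected mean return (recurrence) time for each state of an ergodic chain.
mean_return_time = 1.0 / pi

print("transition matrix:\n", np.round(P, 3))
print("limiting distribution:", np.round(pi, 3))
print("mean return time to 'increase':", round(mean_return_time[2], 2), "weeks")
```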
NASA Astrophysics Data System (ADS)
Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick
2016-06-01
Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so the ensuing spring PMF is a reasonable estimation. This is of particular importance in times of climate change (CC), since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology, with precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio used to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (the so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are of the same order of magnitude as those obtained with the traditional method and observed data, but are also found to depend strongly on the climate projection used and to show spatial variability.
NASA Astrophysics Data System (ADS)
Copping, A. E.; Blake, K.; Zdanski, L.
2011-12-01
As marine and hydrokinetic (MHK) energy development projects progress towards early deployments in the U.S., the process of determining the risks to aquatic animals, habitats, and ecosystem processes from these engineered systems continues to be a significant barrier to efficient siting and permitting. Understanding the risk of MHK installations requires that the two elements of risk - consequence and probability - be evaluated. However, standard risk assessment methodologies are not easily applied to MHK interactions with marine and riverine environments, as there are few data that describe the interaction of stressors (MHK devices, anchors, foundations, mooring lines, and power cables) and receptors (aquatic animals, habitats, and ecosystem processes). The number of possible combinations and permutations of stressors and receptors in MHK systems is large: there are many different technologies designed to harvest energy from the tides, waves, and flowing rivers; each device is planned for a specific waterbody that supports an endemic ecosystem of animals and habitats, tied together by specific physical and chemical processes. With few appropriate analogue industries in the oceans and rivers, little information on the effects of these technologies on the living world is available. Similarly, without robust data sets of interactions, mathematical probability models are difficult to apply. Pacific Northwest National Laboratory scientists are working with MHK developers, researchers, engineers, and regulators to rank the consequences of planned MHK projects on living systems, and are exploring alternative methodologies to estimate the probabilities of these encounters. This paper will present the results of ERES, the Environmental Risk Evaluation System, which has been used to rank consequences for major animal groups and habitats for five MHK projects that are in advanced stages of development and/or early commercial deployment. Probability analyses have been performed for high-priority stressor/receptor interactions where data are adaptable from other industries. In addition, a methodology for evaluating the probability of encounter, and therefore risk, to an endangered marine mammal from tidal turbine blades will be presented.
NASA Astrophysics Data System (ADS)
D'silva, Oneil; Kerrison, Roger
2013-09-01
A key feature for the increased utilization of space robotics is to automate extra-vehicular manned space activities and thus significantly reduce the potential for catastrophic hazards while simultaneously minimizing the overall costs associated with manned space. The principal scope of the paper is to evaluate the use of industry-standard probabilistic risk/safety assessment (PRA/PSA) methodologies and hazard risk frequency criteria as a hazard control. This paper illustrates the applicability of combining the selected probabilistic risk assessment methodology and hazard risk frequency criteria in order to apply the safety controls necessary to allow for the increased use of the Mobile Servicing System (MSS) robotic system on the International Space Station. This document considers factors such as component failure rate reliability, software reliability, periods of operation and dormancy, and fault tree analyses, and their effects on the probabilistic risk assessments. The paper concludes with suggestions for the incorporation of existing industry risk/safety plans to create an applicable safety process for future activities and programs.
Proposal of a method for evaluating tsunami risk using response-surface methodology
NASA Astrophysics Data System (ADS)
Fukutani, Y.
2017-12-01
Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g., Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are inefficient and their calculation cost is high, since they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we proposed a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the response variable. Subsequently, tsunami risk could be evaluated by conducting a Monte Carlo simulation, assuming that the occurrence of earthquakes follows a Poisson distribution, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a lognormal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a·x1 + b·x2 + c with a = 0.2615, b = 3.1763, and c = -1.1802. We assumed appropriate probability distributions for earthquake occurrence, inundation depth, and vulnerability. Based on these probability distributions, we conducted Monte Carlo simulations spanning 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, given that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation, without conducting additional tsunami numerical simulations.
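A hedged sketch of the Monte Carlo step described above, using the response-surface coefficients reported in the abstract. The earthquake rate, the distributions assumed for fault depth (x1) and slip (x2), and the lognormal fragility parameters are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Response surface reported in the abstract: inundation depth y (m) from
# fault depth x1 (km) and slip x2 (m).
a, b, c = 0.2615, 3.1763, -1.1802

n_years = 1_000_000
eq_rate = 1.0 / 500.0                          # illustrative annual rate of the scenario earthquake
x1 = rng.normal(10.0, 2.0, n_years)            # fault depth, km (illustrative distribution)
x2 = rng.lognormal(np.log(3.0), 0.4, n_years)  # slip, m (illustrative distribution)

# Lognormal fragility of the wood building (illustrative median capacity and beta).
median_capacity, beta = 2.0, 0.6
y = np.clip(a * x1 + b * x2 + c, 1e-6, None)   # inundation depth from the response surface
p_damage_given_eq = norm.cdf(np.log(y / median_capacity) / beta)

occurs = rng.poisson(eq_rate, n_years) > 0     # Poisson earthquake occurrence per year
damaged = occurs & (rng.random(n_years) < p_damage_given_eq)

print("P(damage | earthquake occurs):", damaged.sum() / max(occurs.sum(), 1))
print("annual damage probability    :", damaged.mean())
```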
Coalescence computations for large samples drawn from populations of time-varying sizes
Polanski, Andrzej; Szczesna, Agnieszka; Garbulowski, Mateusz; Kimmel, Marek
2017-01-01
We present new results concerning probability distributions of times in the coalescence tree and expected allele frequencies for coalescent with large sample size. The obtained results are based on computational methodologies, which involve combining coalescence time scale changes with techniques of integral transformations and using analytical formulae for infinite products. We show applications of the proposed methodologies for computing probability distributions of times in the coalescence tree and their limits, for evaluation of accuracy of approximate expressions for times in the coalescence tree and expected allele frequencies, and for analysis of large human mitochondrial DNA dataset. PMID:28170404
Methodology for back-contamination risk assessment for a Mars sample return mission
NASA Technical Reports Server (NTRS)
Merkhofer, M. W.; Quinn, D. J.
1977-01-01
The risk of back-contamination from Mars Surface Sample Return (MSSR) missions is assessed. The methodology is designed to provide an assessment of the probability that a given mission design and strategy will result in accidental release of Martian organisms acquired as a result of MSSR. This is accomplished through the construction of risk models describing the mission risk elements and their impact on back-contamination probability. A conceptual framework is presented for using the risk model to evaluate mission design decisions that require a trade-off between science and planetary protection considerations.
NASA Astrophysics Data System (ADS)
Dib, Alain; Kavvas, M. Levent
2018-03-01
The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
Probabilistic evaluation of uncertainties and risks in aerospace components
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.
1992-01-01
This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbopump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, and most probable damage path. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behavior of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed, and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations of the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes were also computed. A brief discussion is included on the future direction of probabilistic structural analysis.
An approach to evaluating reactive airborne wind shear systems
NASA Technical Reports Server (NTRS)
Gibson, Joseph P., Jr.
1992-01-01
An approach to evaluating reactive airborne windshear detection systems was developed to support a deployment study for future FAA ground-based windshear detection systems. The deployment study methodology assesses potential future safety enhancements beyond planned capabilities. The reactive airborne systems will be an integral part of planned windshear safety enhancements. The approach to evaluating reactive airborne systems involves separate analyses for landing and take-off scenarios. The analysis estimates the probability of effective warning, considering several factors including NASA energy-height loss characteristics, reactive alert timing, and a probability distribution for microburst strength.
Evaluating the risk of industrial espionage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bott, T.F.
1998-12-31
A methodology for estimating the relative probabilities of different compromise paths for protected information by insider and visitor intelligence collectors has been developed based on an event-tree analysis of the intelligence collection operation. The analyst identifies target information and ultimate users who might attempt to gain that information. The analyst then uses an event tree to develop a set of compromise paths. Probability models are developed for each of the compromise paths that use parameters based on expert judgment or historical data on security violations. The resulting probability estimates indicate the relative likelihood of different compromise paths and provide an input for security resource allocation. Application of the methodology is demonstrated using a national security example. A set of compromise paths and probability models specifically addressing this example espionage problem are developed. The probability models for hard-copy information compromise paths are quantified as an illustration of the results, using parametric values representative of historical data available in secure facilities, supplemented where necessary by expert judgment.
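A minimal sketch of the event-tree calculation the abstract describes: each compromise path's probability is the product of its branch probabilities, and the relative likelihoods indicate where to allocate security resources. The paths and numbers below are invented placeholders, not those of the national security example.

```python
import math

# Each compromise path is a sequence of branch events; its probability is the
# product of the (assumed independent) branch probabilities. Placeholder values.
paths = {
    "insider copies hard-copy document and removes it unchecked": [0.05, 0.30, 0.60],
    "insider photographs document and transmits the image file":  [0.05, 0.10, 0.40],
    "visitor observes protected material during an escorted tour": [0.20, 0.05, 0.50],
}

path_probs = {name: math.prod(branches) for name, branches in paths.items()}
total = sum(path_probs.values())

# Relative likelihoods indicate where security resources are best allocated.
for name, p in sorted(path_probs.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {p:.4f}  (relative likelihood {p / total:.2f})")
```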
NASA Technical Reports Server (NTRS)
Wiegmann, Douglas A.
2005-01-01
The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potentially hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested. One such approach uses expert judgments to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj of Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs received funding from NASA under the title "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The project reported here was funded separately but supported the existing Rutgers program.
The probability of transportation accidents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brobst, W.A.
1972-11-10
We examined the relative safety of different modes of transportation on a statistical basis rather than an emotional one. As we collected data and evaluated their applicability, we found that our own emotions came into play in judging which data would be useful and which data we should discard. We developed a simple data-analysis methodology that lends itself to similar evaluations of such questions. The author described that methodology and demonstrated its application to shipments of radioactive materials. 31 refs., 7 tabs.
2011-09-01
a quality evaluation with limited data, a model-based assessment must be... that affect system performance, a multistage approach to system validation, a modeling and experimental methodology for efficiently addressing a wide range...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.; Simonen, F.A.
1992-05-01
Probabilistic fracture mechanics analysis is a major element of the comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.
NASA Technical Reports Server (NTRS)
Anderson, Leif; Box, Neil; Carter-Journet, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael
2012-01-01
Purpose of presentation: (1) status update on the developing methodology to revise sub-system sparing targets; (2) description of how to incorporate uncertainty into spare assessments and why it is important to do so; (3) demonstration of hardware risk postures through PACT evaluation.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolving test issues. DOEPOD demands utilization of observance of occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so that multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
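The 90/95 criterion quoted above can be checked with an exact binomial (Clopper-Pearson) lower confidence bound. The sketch below is not the DOEPOD software itself, only an illustration of why a zero-miss demonstration needs at least 29 hits in 29 trials to meet 90/95.

```python
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    """Exact (Clopper-Pearson) lower confidence bound on POD from hit/miss data."""
    if hits == 0:
        return 0.0
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

# Classic zero-miss demonstration: 29 hits in 29 trials just clears 90/95.
for n in (28, 29, 45):
    lb = pod_lower_bound(n, n)
    print(f"{n}/{n} hits: 95% lower bound on POD = {lb:.4f} "
          f"-> {'meets' if lb >= 0.90 else 'fails'} 90/95")
```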
Development of a Probabilistic Assessment Methodology for Evaluation of Carbon Dioxide Storage
Burruss, Robert A.; Brennan, Sean T.; Freeman, P.A.; Merrill, Matthew D.; Ruppert, Leslie F.; Becker, Mark F.; Herkelrath, William N.; Kharaka, Yousif K.; Neuzil, Christopher E.; Swanson, Sharon M.; Cook, Troy A.; Klett, Timothy R.; Nelson, Philip H.; Schenk, Christopher J.
2009-01-01
This report describes a probabilistic assessment methodology developed by the U.S. Geological Survey (USGS) for evaluation of the resource potential for storage of carbon dioxide (CO2) in the subsurface of the United States as authorized by the Energy Independence and Security Act (Public Law 110-140, 2007). The methodology is based on USGS assessment methodologies for oil and gas resources created and refined over the last 30 years. The resource that is evaluated is the volume of pore space in the subsurface in the depth range of 3,000 to 13,000 feet that can be described within a geologically defined storage assessment unit consisting of a storage formation and an enclosing seal formation. Storage assessment units are divided into physical traps (PTs), which in most cases are oil and gas reservoirs, and the surrounding saline formation (SF), which encompasses the remainder of the storage formation. The storage resource is determined separately for these two types of storage. Monte Carlo simulation methods are used to calculate a distribution of the potential storage size for individual PTs and the SF. To estimate the aggregate storage resource of all PTs, a second Monte Carlo simulation step is used to sample the size and number of PTs. The probability of successful storage for individual PTs or the entire SF, defined in this methodology by the likelihood that the amount of CO2 stored will be greater than a prescribed minimum, is based on an estimate of the probability of containment using present-day geologic knowledge. The report concludes with a brief discussion of needed research data that could be used to refine assessment methodologies for CO2 sequestration.
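A rough sketch of the two-stage Monte Carlo aggregation described above: sample the number and sizes of physical traps (PTs), then add the saline formation (SF) storage to obtain a distribution of total storage resource. All distributions and parameter values are illustrative placeholders, not USGS assessment inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 50_000

# Illustrative (placeholder) distributions for a storage assessment unit.
n_pts = rng.integers(5, 40, n_trials)                   # number of PTs per trial

def sample_pt_sizes(k):
    """Storage size (Mt CO2) of k physical traps, lognormally distributed."""
    return rng.lognormal(np.log(2.0), 1.0, k)

sf_size = rng.lognormal(np.log(500.0), 0.8, n_trials)   # Mt CO2 in the SF

# Aggregate PT storage per trial by sampling both the number and the sizes of PTs,
# then add the SF storage to get the total storage resource for that trial.
pt_total = np.array([sample_pt_sizes(k).sum() for k in n_pts])
total = pt_total + sf_size

for q in (5, 50, 95):
    print(f"P{q:2d} total storage: {np.percentile(total, q):,.0f} Mt CO2")
```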
Evaluation methodologies for an advanced information processing system
NASA Technical Reports Server (NTRS)
Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.
1984-01-01
The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
Electrical cable utilization for wave energy converters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bull, Diana; Baca, Michael; Schenkman, Benjamin
2018-04-27
This paper investigates the suitability of sizing the electrical export cable based on the rating of the contributing WECs within a farm. These investigations have produced a new methodology to evaluate the probabilities associated with peak power values on an annual basis. It has been shown that the peaks in pneumatic power production will follow an exponential probability function for a linear model. A methodology to combine all the individual probability functions into an annual view has been demonstrated on pneumatic power production by a Backward Bent Duct Buoy (BBDB). These investigations have also resulted in a highly simplified and perfunctory model of installed cable cost as a function of voltage and conductor cross-section. This work solidifies the need to determine electrical export cable rating based on expected energy delivery as opposed to device rating, as small decreases in energy delivery can result in cost savings.
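The annual combination of exponential peak-power distributions described in this abstract can be sketched as follows. The sea-state means, peak counts, and candidate cable ratings are invented for illustration and are not BBDB results.

```python
import numpy as np

# Illustrative sea states for one WEC: mean of the exponential peak-power
# distribution (kW) and the expected number of power peaks per year in that state.
sea_states = [
    {"mean_peak_kw": 40.0,  "peaks_per_year": 200_000},
    {"mean_peak_kw": 90.0,  "peaks_per_year": 60_000},
    {"mean_peak_kw": 160.0, "peaks_per_year": 8_000},
]

def annual_exceedance(rating_kw):
    """Probability that at least one peak in a year exceeds the cable rating,
    combining the exponential peak distributions of all sea states."""
    log_p_none = 0.0
    for s in sea_states:
        p_exceed = np.exp(-rating_kw / s["mean_peak_kw"])        # P(single peak > rating)
        log_p_none += s["peaks_per_year"] * np.log1p(-p_exceed)  # log P(no exceedance)
    return 1.0 - np.exp(log_p_none)

for rating in (1000, 2000, 3000):
    print(f"cable rated {rating} kW: annual exceedance probability = "
          f"{annual_exceedance(rating):.3e}")
```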
NASA Astrophysics Data System (ADS)
Foufoula-Georgiou, E.
1989-05-01
A storm transposition approach is investigated as a possible tool for assessing the frequency of extreme precipitation depths, that is, depths with a return period much greater than 100 years. This paper focuses on estimation of the annual exceedance probability of extreme average precipitation depths over a catchment. The probabilistic storm transposition methodology is presented, and the several conceptual and methodological difficulties arising in this approach are identified. The method is implemented and partially evaluated by means of a semihypothetical example involving extreme midwestern storms and two hypothetical catchments (of 100 and 1000 mi2 (~260 and 2600 km2)) located in central Iowa. The results point out the need for further research to fully explore the potential of this approach as a tool for assessing the probabilities of rare storms, and eventually floods, a necessary element of risk-based analysis and design of large hydraulic structures.
[Inverse probability weighting (IPW) for evaluating and "correcting" selection bias].
Narduzzi, Silvia; Golini, Martina Nicole; Porta, Daniela; Stafoggia, Massimo; Forastiere, Francesco
2014-01-01
Inverse probability weighting (IPW) is a methodology developed to account for missingness and selection bias caused by non-random selection of observations, or by non-random lack of some information in a subgroup of the population. The aim here is to provide an overview of the IPW methodology and an application in a cohort study of the association between exposure to traffic-related air pollution (nitrogen dioxide, NO₂) and children's IQ at 7 years of age. The methodology allows the analysis to be corrected by weighting the observations by the probability of being selected. IPW is based on the assumption that individual information that can predict the probability of inclusion (non-missingness) is available for the entire study population, so that, after taking this information into account, inferences about the entire target population can be made from the non-missing observations alone. The procedure for the calculation is the following: first, the entire population under study is considered and the probability of non-missing information is estimated using a logistic regression model, where the response is non-missingness and the covariates are its possible predictors. The weight of each subject is given by the inverse of the predicted probability. The analysis is then performed only on the non-missing observations, using a weighted model. IPW is a technique that embeds the selection process in the estimation, but its effectiveness in "correcting" the selection bias depends on the availability of enough information, for the entire population, to predict the non-missingness probability. In the example presented, the IPW application showed that the effect of exposure to NO₂ on the verbal intelligence quotient of children is stronger than the effect obtained from an analysis performed without regard to the selection process.
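A compact illustration of the two-step IPW procedure spelled out in the abstract, on simulated data; the cohort, selection model, and outcome model below are invented, not the NO₂/IQ study data. scikit-learn is used for the logistic selection model and the weighted outcome regression.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(2)
n = 5_000

# Simulated cohort: exposure, a predictor of selection, and an outcome that is
# observed only for a non-random subset of subjects.
no2 = rng.normal(40, 10, n)                        # exposure (NO2)
ses = rng.normal(0, 1, n)                          # socio-economic status
iq  = 100 - 0.15 * no2 + 3.0 * ses + rng.normal(0, 8, n)
p_obs = 1 / (1 + np.exp(-(0.5 + 0.8 * ses - 0.02 * no2)))   # selection depends on both
observed = rng.random(n) < p_obs

# Step 1: on the full cohort, model the probability of being observed (non-missingness).
X_sel = np.column_stack([ses, no2])
p_hat = LogisticRegression().fit(X_sel, observed).predict_proba(X_sel)[:, 1]

# Step 2: analyse the observed subset only, weighting each subject by the inverse
# of its predicted selection probability.
w = 1.0 / p_hat[observed]
X_out = no2[observed].reshape(-1, 1)
naive = LinearRegression().fit(X_out, iq[observed])
ipw   = LinearRegression().fit(X_out, iq[observed], sample_weight=w)

print("naive NO2 coefficient:", round(naive.coef_[0], 3))
print("IPW   NO2 coefficient:", round(ipw.coef_[0], 3))
```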
Probability techniques for reliability analysis of composite materials
NASA Technical Reports Server (NTRS)
Wetherhold, Robert C.; Ucci, Anthony M.
1994-01-01
Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: first order, second moment FPI methods; second order, second moment FPI methods; simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and for the conservatism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
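For the simple Monte Carlo branch of the comparison described above, a minimal sketch of estimating P(g < 0) for a lamina limit state with random strength and stress follows. The distributions are illustrative, and the FPI and importance-sampling variants discussed in the report are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Illustrative lamina reliability problem: failure when the limit-state
# function g = strength - stress is negative; both are random variables.
strength = rng.weibull(15.0, n) * 1200.0      # MPa, Weibull-distributed strength
stress   = rng.normal(700.0, 90.0, n)         # MPa, applied stress

g = strength - stress
pf_simple = np.mean(g < 0.0)                  # simple Monte Carlo estimate of P(failure)
se = np.sqrt(pf_simple * (1 - pf_simple) / n) # standard error of the estimate

print(f"P(failure) = {pf_simple:.2e}  (std. error {se:.1e})")
print(f"reliability = {1 - pf_simple:.6f}")
```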
A methodology for the assessment of manned flight simulator fidelity
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; Malsbury, Terry N.
1989-01-01
A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2014-01-01
Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
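One of the methodologies named above, logistic regression on hit/miss data, can be sketched as follows with invented flaw sizes and outcomes. The snippet fits POD as a function of log flaw size and reads off a90, but it does not compute the confidence bounds that a credible POD demonstration also requires.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative hit/miss data: flaw sizes (mm) and detection outcomes (1 = hit).
size = np.array([0.5, 0.6, 0.8, 0.9, 1.0, 1.1, 1.3, 1.5, 1.7, 2.0,
                 2.2, 2.5, 2.8, 3.0, 3.3, 3.6, 4.0, 4.5, 5.0, 6.0])
hit  = np.array([0,   0,   0,   0,   1,   0,   1,   0,   1,   1,
                 1,   1,   0,   1,   1,   1,   1,   1,   1,   1])

# Logistic regression POD model in log flaw size: POD(a) = 1 / (1 + exp(-(b0 + b1 ln a))).
X = np.log(size).reshape(-1, 1)
model = LogisticRegression(C=1e6).fit(X, hit)   # large C ~ essentially unpenalized fit

a_grid = np.linspace(0.5, 6.0, 200)
pod = model.predict_proba(np.log(a_grid).reshape(-1, 1))[:, 1]

# Report a90: the smallest flaw size whose estimated POD reaches 0.90.
a90 = a_grid[np.argmax(pod >= 0.90)] if np.any(pod >= 0.90) else None
print("estimated a90 =", a90)
```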
Implementation of efficient trajectories for an ultrasonic scanner using chaotic maps
NASA Astrophysics Data System (ADS)
Almeda, A.; Baltazar, A.; Treesatayapun, C.; Mijarez, R.
2012-05-01
Typical ultrasonic methodologies for nondestructive scanning evaluation use systematic scanning paths. In many cases, this approach is time-inefficient and consumes unnecessary energy and computational power. Here, a methodology for scanning for defects that combines an ultrasonic pulse-echo scanning technique with chaotic trajectory generation is proposed. It is implemented on a Cartesian-coordinate robotic system developed in our lab. To cover the entire search area, a chaotic function and a proposed mirror mapping were incorporated. To improve detection probability, the proposed scanning methodology is complemented with a probabilistic approach to discontinuity detection. The developed methodology was found to be more efficient than traditional approaches for localizing and characterizing hidden flaws.
Van Pamel, Anton; Brett, Colin R; Lowe, Michael J S
2014-12-01
Improving the ultrasound inspection capability for coarse-grained metals remains of longstanding interest and is expected to become increasingly important for next-generation electricity power plants. Conventional ultrasonic A-, B-, and C-scans have been found to suffer from strong background noise caused by grain scattering, which can severely limit the detection of defects. However, in recent years, array probes and full matrix capture (FMC) imaging algorithms have unlocked exciting possibilities for improvements. To improve and compare these algorithms, we must rely on robust methodologies to quantify their performance. This article proposes such a methodology to evaluate the detection performance of imaging algorithms. For illustration, the methodology is applied to some example data using three FMC imaging algorithms; total focusing method (TFM), phase-coherent imaging (PCI), and decomposition of the time-reversal operator with multiple scattering filter (DORT MSF). However, it is important to note that this is solely to illustrate the methodology; this article does not attempt the broader investigation of different cases that would be needed to compare the performance of these algorithms in general. The methodology considers the statistics of detection, presenting the detection performance as probability of detection (POD) and probability of false alarm (PFA). A test sample of coarse-grained nickel super alloy, manufactured to represent materials used for future power plant components and containing some simple artificial defects, is used to illustrate the method on the candidate algorithms. The data are captured in pulse-echo mode using 64-element array probes at center frequencies of 1 and 5 MHz. In this particular case, it turns out that all three algorithms are shown to perform very similarly when comparing their flaw detection capabilities.
(177)Lu: DDEP Evaluation of the decay scheme for an emerging radiopharmaceutical.
Kellett, M A
2016-03-01
A new decay scheme evaluation for (177)Lu using the DDEP methodology is presented. Recent half-life measurements have been incorporated, as well as newly available γ-ray emission probabilities. For the first time, a thorough investigation has been made of the γ-ray multipolarities. The complete data tables and detailed evaluator comments are available through the DDEP website.
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua
2015-01-01
The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for un-segmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too short to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the approximate nature of the model and known limitations, means the applicability of UCERF3 should be evaluated on a case-by-case basis.
Probability calculations for three-part mineral resource assessments
Ellefsen, Karl J.
2017-06-27
Three-part mineral resource assessment is a methodology for predicting, in a specified geographic region, both the number of undiscovered mineral deposits and the amount of mineral resources in those deposits. These predictions are based on probability calculations that are performed with newly implemented computer software. Compared to the previous implementation, the new implementation includes new features for the probability calculations themselves and for checks of those calculations. The development of the new implementation led to a new understanding of the probability calculations, namely the assumptions inherent in them. Several assumptions strongly affect the mineral resource predictions, so it is crucial that they are checked during an assessment. The evaluation of the new implementation leads to new findings about the probability calculations, namely findings regarding the precision of the computations, the computation time, and the sensitivity of the calculation results to the inputs.
Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon
NASA Astrophysics Data System (ADS)
Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.
2015-12-01
Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0, and 10 meters). These maps integrate a variety of data sources, including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear-wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amount of displacement, and its associated probability, for each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions present a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
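A condensed sketch of the integration step described above: for each peak-ground-acceleration bin of a hazard curve, the probability of exceeding a displacement threshold is computed from a Newmark-type regression and weighted by the bin's annual probability. The hazard-curve values and yield acceleration are placeholders, and the Jibson (2007)-style regression coefficients shown should be verified against the original reference before any real use.

```python
import numpy as np
from scipy.stats import norm

# Illustrative PGA bins with annual occurrence probabilities, e.g. differenced
# from a hazard curve for one pixel (values are placeholders).
pga_bins = np.array([0.05, 0.10, 0.20, 0.30, 0.40, 0.60])   # g
p_annual = np.array([2e-2, 8e-3, 3e-3, 1e-3, 4e-4, 1e-4])   # P(PGA in bin) per year

ac = 0.12   # yield (critical) acceleration of the slope pixel, in g (illustrative)

def log_displacement_cm(pga, ac):
    """Newmark-type regression: median log10 displacement (cm) and its standard
    deviation as a function of ac/PGA (Jibson-style form; verify coefficients)."""
    ratio = np.clip(ac / pga, 1e-3, 0.999)
    median_log10 = 0.215 + np.log10((1 - ratio) ** 2.341 * ratio ** -1.438)
    return median_log10, 0.51

thresholds_m = [0.1, 0.3, 1.0, 10.0]
for D in thresholds_m:
    p_exceed = 0.0
    for pga, p in zip(pga_bins, p_annual):
        if pga <= ac:
            continue                      # no sliding if shaking is below the yield acceleration
        mu, sigma = log_displacement_cm(pga, ac)
        # P(displacement > D | PGA bin), displacement lognormal in base 10, in cm.
        p_d = 1.0 - norm.cdf((np.log10(D * 100.0) - mu) / sigma)
        p_exceed += p * p_d
    print(f"annual P(displacement > {D:>4} m) = {p_exceed:.2e}")
```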
A methodology for the transfer of probabilities between accident severity categories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlow, J. D.; Neuhauser, K. S.
A methodology has been developed which allows the accident probabilities associated with one accident-severity category scheme to be transferred to another severity category scheme. The methodology requires that the schemes use a common set of parameters to define the categories. The transfer of accident probabilities is based on the relationships between probability of occurrence and each of the parameters used to define the categories. Because of the lack of historical data describing accident environments in engineering terms, these relationships may be difficult to obtain directly for some parameters. Numerical models or experienced judgment are often needed to obtain the relationships. These relationships, even if they are not exact, allow the accident probability associated with any severity category to be distributed within that category in a manner consistent with accident experience, which in turn allows the accident probability to be appropriately transferred to a different category scheme.
NASA Astrophysics Data System (ADS)
Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.
Brittle materials today are being used, or considered, for a wide variety of high-tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing brittle material components to sustain repeated load without fracturing, while using the minimum amount of material, requires the use of a probabilistic design methodology. The NASA CARES/Life (Ceramics Analysis and Reliability Evaluation of Structures/Life) code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. This capability includes predicting the time-dependent failure probability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The developed methodology allows for changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This article gives an overview of the transient reliability methodology and describes how it is extended to account for proof testing. The CARES/Life code has been modified to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
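A heavily simplified sketch of Weibull-based reliability over a transient load history, in the spirit of (but not equivalent to) the CARES/Life calculation: per-step fast-fracture failure probabilities with temperature-dependent Weibull parameters are combined under two crude assumptions, while size effects, slow crack growth, and the proof-test truncation handled by the real code are omitted. All parameter values are placeholders.

```python
import numpy as np

def weibull_pf(stress_mpa, m, sigma0_mpa):
    """Fast-fracture failure probability for a two-parameter Weibull strength model."""
    return 1.0 - np.exp(-(stress_mpa / sigma0_mpa) ** m)

# Transient thermomechanical load history: peak stress (MPa) and the
# temperature-dependent Weibull modulus m and scale sigma0 at each step.
steps = [
    (180.0, 12.0, 550.0),
    (260.0, 10.0, 500.0),   # hotter step: lower m and sigma0
    (220.0, 11.0, 520.0),
]

pf_steps = [weibull_pf(s, m, s0) for s, m, s0 in steps]

# Two simple ways to combine the steps into a reliability for the whole transient:
# treating steps as independent exposures (a conservative upper estimate), or
# assuming perfectly correlated strengths so only the most severe step matters.
pf_independent = 1.0 - np.prod([1.0 - p for p in pf_steps])
pf_worst_step  = max(pf_steps)

print("per-step Pf:", [f"{p:.3e}" for p in pf_steps])
print(f"combined Pf (independent steps): {pf_independent:.3e}")
print(f"combined Pf (worst single step): {pf_worst_step:.3e}")
```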
2010-09-01
2010 DRDC Valcartier CR 2010-237. A probability of hit (PHit) methodology has been developed to characterize the overall performance ... the crew commander and gunner from their respective crew stations inside the vehicle ... CFB (Canadian Forces Base).
Probabilistic self-organizing maps for continuous data.
Lopez-Rubio, Ezequiel
2010-10-01
The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.
Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan
2017-02-01
Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussions on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of basal insulin-supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied by a trade-off between data availability, clinical plausibility, and statistical feasibility. Cox's proportional hazards model allows for the evaluation of the outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
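A minimal sketch of the kind of time-to-event analysis advocated above, using the lifelines package (assumed available) to fit a Cox proportional hazards model to an invented register extract; the variable names and data are hypothetical. As the abstract notes, the resulting hazard ratios remain interpretable even when missing mortality information prevents reliable prediction of outcome probabilities.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical register extract: follow-up time (years), event indicator
# (1 = basal insulin initiated), and two candidate predictors.
df = pd.DataFrame({
    "time":  [0.8, 2.1, 3.4, 1.2, 4.0, 2.7, 0.5, 3.9],
    "event": [1,   0,   1,   1,   0,   1,   0,   0],
    "hba1c": [8.9, 7.9, 7.4, 8.1, 6.8, 9.0, 7.5, 8.2],
    "age":   [61,  55,  70,  66,  59,  72,  50,  64],
})

cph = CoxPHFitter()
# delayed study entry (left truncation) would be handled via entry_col
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # outcome-specific hazard ratios for the predictors
```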
NASA Astrophysics Data System (ADS)
Maymandi, Nahal; Kerachian, Reza; Nikoo, Mohammad Reza
2018-03-01
This paper presents a new methodology for optimizing Water Quality Monitoring (WQM) networks of reservoirs and lakes using the concept of the value of information (VOI) and utilizing the results of a calibrated numerical water quality simulation model. With reference to value of information theory, the water quality at every checkpoint, with a specific prior probability, varies in time. After analyzing water quality samples taken from potential monitoring points, the posterior probabilities are updated using Bayes' theorem, and the VOI of the samples is calculated. In the next step, the stations with maximum VOI are selected as the optimal stations. This process is repeated for each sampling interval to obtain optimal monitoring network locations for each interval. The results of the proposed VOI-based methodology are compared with those obtained using an entropy-theoretic approach. Because the results of the two methodologies can be partially different, in the next step they are combined using a weighting method. Finally, the optimal sampling interval and locations of WQM stations are chosen using the Evidential Reasoning (ER) decision-making method. The efficiency and applicability of the methodology are evaluated using available water quantity and quality data of the Karkheh Reservoir in the southwestern part of Iran.
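A toy version of the VOI calculation for a single checkpoint with two states (polluted/clean): the posterior is obtained with Bayes' theorem, and the value of the sample is the reduction in expected loss it buys. The losses and likelihoods are placeholders, not values from the Karkheh case study.

```python
def value_of_information(prior, p_detect, p_false, loss_miss=100.0, loss_false=10.0):
    """Expected value of sampling at one checkpoint for a two-state water
    quality problem (polluted / clean). Losses and likelihoods are illustrative.

    prior    : prior probability that the checkpoint is polluted
    p_detect : P(sample indicates pollution | polluted)
    p_false  : P(sample indicates pollution | clean)
    """
    # expected loss of the best action taken with the prior only
    prior_loss = min(prior * loss_miss, (1 - prior) * loss_false)

    # marginal probability of a "polluted" sample result (Bayes denominator)
    p_pos = p_detect * prior + p_false * (1 - prior)
    post_pos = p_detect * prior / p_pos               # posterior after a positive sample
    post_neg = (1 - p_detect) * prior / (1 - p_pos)   # posterior after a negative sample

    # expected loss when the action is chosen after seeing the sample
    posterior_loss = (p_pos * min(post_pos * loss_miss, (1 - post_pos) * loss_false) +
                      (1 - p_pos) * min(post_neg * loss_miss, (1 - post_neg) * loss_false))
    return prior_loss - posterior_loss                # VOI = reduction in expected loss

print(value_of_information(prior=0.2, p_detect=0.9, p_false=0.1))
```

Ranking candidate monitoring points by this quantity, interval by interval, is the selection logic summarized in the abstract.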
Olea, R.A.; Houseknecht, D.W.; Garrity, C.P.; Cook, T.A.
2011-01-01
Shale gas is a form of continuous unconventional hydrocarbon accumulation whose resource estimation is unfeasible through the inference of pore volume. Under these circumstances, the usual approach is to base the assessment on well productivity through estimated ultimate recovery (EUR). Unconventional resource assessments that consider uncertainty are typically done by applying analytical procedures based on classical statistics theory that ignores geographical location, does not take into account spatial correlation, and assumes independence of EUR from other variables that may enter into the modeling. We formulate a new, more comprehensive approach based on sequential simulation to test methodologies known to be capable of more fully utilizing the data and overcoming unrealistic simplifications. Theoretical requirements demand modeling of EUR as areal density instead of well EUR. The new experimental methodology is illustrated by evaluating a gas play in the Woodford Shale in the Arkoma Basin of Oklahoma. Unlike previous assessments, we used net thickness and vitrinite reflectance as secondary variables correlated to cell EUR. In addition to the traditional probability distribution for undiscovered resources, the new methodology provides maps of EUR density and maps with the probability of reaching any given cell EUR, which are useful to visualize geographical variations in prospectivity.
Evaluation of fault-tolerant parallel-processor architectures over long space missions
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1989-01-01
The impact of a five-year space mission environment on fault-tolerant parallel processor architectures is examined. The target application is a Strategic Defense Initiative (SDI) satellite requiring 256 parallel processors to provide the computation throughput. The reliability requirements are that the system still be operational after five years with 0.99 probability and that the probability of system failure during one-half hour of full operation be less than 10^-7. The fault tolerance features an architecture must possess to meet these reliability requirements are presented, many potential architectures are briefly evaluated, and one candidate architecture, the Charles Stark Draper Laboratory's Fault-Tolerant Parallel Processor (FTPP), is evaluated in detail. A methodology for designing a preliminary system configuration to meet the reliability and performance requirements of the mission is then presented and demonstrated by designing an FTPP configuration.
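As a back-of-the-envelope check of such requirements, the sketch below computes the probability that a pool of spares covers all processor failures over the mission, assuming independent exponential failures and perfect fault detection and recovery. The failure rate and spare count are illustrative; a full FTPP analysis would model coverage and reconfiguration explicitly.

```python
import math
from scipy import stats

def mission_reliability(n_active=256, n_spares=16, lam_per_hour=1e-6,
                        mission_hours=5 * 365 * 24):
    """Probability that at most `n_spares` of the processors fail over the
    mission (illustrative simplification; numbers are placeholders)."""
    p_fail = 1.0 - math.exp(-lam_per_hour * mission_hours)   # per-processor failure prob.
    n_total = n_active + n_spares
    # system survives if the number of failures does not exceed the spare pool
    return stats.binom.cdf(n_spares, n_total, p_fail)

print(mission_reliability())   # compare against the 0.99 five-year requirement
```

With these illustrative numbers the 0.99 target is not met, which is exactly the kind of result that drives the configuration-design step described in the abstract.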
Dantan, Etienne; Combescure, Christophe; Lorent, Marine; Ashton-Chess, Joanna; Daguin, Pascal; Classe, Jean-Marc; Giral, Magali; Foucher, Yohann
2014-04-01
Predicting chronic disease evolution from a prognostic marker is a key field of research in clinical epidemiology. However, the prognostic capacity of a marker is not systematically evaluated using the appropriate methodology. We proposed the use of simple equations to calculate time-dependent sensitivity and specificity based on published survival curves, and other time-dependent indicators such as predictive values, likelihood ratios, and posttest probability ratios, to reappraise prognostic marker accuracy. The methodology is illustrated by back-calculating time-dependent indicators from published articles that present a marker as highly correlated with the time to event, conclude on the high prognostic capacity of the marker, and present the Kaplan-Meier survival curves. The tools necessary to run these direct and simple computations are available online at http://www.divat.fr/en/online-calculators/evalbiom. Our examples illustrate that published conclusions about prognostic marker accuracy may be overoptimistic, thus giving potential for major mistakes in therapeutic decisions. Our approach should help readers better evaluate clinical articles reporting on prognostic markers. Time-dependent sensitivity and specificity inform on the inherent prognostic capacity of a marker for a defined prognostic time. Time-dependent predictive values, likelihood ratios, and posttest probability ratios may additionally contribute to interpreting the marker's prognostic capacity. Copyright © 2014 Elsevier Inc. All rights reserved.
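The back-calculation can be sketched with cumulative ("event by time t") definitions of the indicators, using only the marker-positive fraction and the survival probabilities read off the published Kaplan-Meier curves. The exact equations used by the authors and the online calculator may differ in detail, and the numbers here are illustrative.

```python
def time_dependent_accuracy(s_pos, s_neg, p_pos):
    """Back-calculate cumulative time-dependent accuracy indicators from
    survival probabilities read off published Kaplan-Meier curves.

    s_pos : survival at time t in the marker-positive group
    s_neg : survival at time t in the marker-negative group
    p_pos : proportion of patients who are marker-positive
    """
    s_all = p_pos * s_pos + (1 - p_pos) * s_neg           # overall survival at t
    return {
        "sensitivity": p_pos * (1 - s_pos) / (1 - s_all),  # P(marker+ | event by t)
        "specificity": (1 - p_pos) * s_neg / s_all,        # P(marker- | event-free at t)
        "ppv": 1 - s_pos,                                  # P(event by t | marker+)
        "npv": s_neg,                                      # P(event-free at t | marker-)
    }

# Illustrative 5-year survival of 60% (marker+) and 90% (marker-), 30% marker-positive
print(time_dependent_accuracy(s_pos=0.60, s_neg=0.90, p_pos=0.30))
```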
Environmental probabilistic quantitative assessment methodologies
Crovelli, R.A.
1995-01-01
In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer.
van Dieten, H. E M; Bos, I.; van Tulder, M. W; Lems, W.; Dijkmans, B.; Boers, M.
2000-01-01
A systematic review on the cost effectiveness of prophylactic treatments of non-steroidal anti-inflammatory drug (NSAID) induced gastropathy in patients with osteoarthritis or rheumatoid arthritis was conducted. Two reviewers conducted the literature search and the review. Both full and partial economic evaluations published in English, Dutch, or German were included. The criteria list published in the textbook of Drummond was used to determine the quality of the economic evaluations. The methodological quality of three randomised controlled trials (RCTs) in which the economic evaluations obtained probability estimates of NSAID induced gastropathy and adverse events was assessed by a list of internal validity criteria. The conclusions were based on a rating system consisting of four levels of evidence. Ten economic evaluations were included; three were based on RCTs. All evaluations studied misoprostol as prophylactic treatment: in one evaluation misoprostol was studied as a fixed component in a combination with diclofenac (Arthrotec). All economic evaluations comprised analytical studies containing a decision tree. The three trials were of high methodological quality. Nine economic evaluations were considered high quality and one economic evaluation was considered of low methodological quality. There is strong evidence (level "A") that the use of misoprostol for the prevention of NSAID induced gastropathy is cost effective, and limited evidence (level "C") that the use of Arthrotec is cost effective. Although the levels of evidence used in this review are arbitrary, it is believed that a qualitative analysis is useful: quantitative analyses in this field are hampered by the heterogeneity of economic evaluations. Existing criteria to evaluate the methodological quality of economic evaluations may need refinement for use in systematic reviews. PMID:11005773
Generic Degraded Configuration Probability Analysis for DOE Codisposal Waste Package
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.F.A. Deng; M. Saglam; L.J. Gratton
2001-05-23
In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package, long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.
NASA Technical Reports Server (NTRS)
Motyka, P.
1983-01-01
A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27-state model is defined based upon a candidate redundancy management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate; both the gyro and accelerometer failure rates together; false alarms; the probabilities of failure detection, failure isolation, and damage effects; and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
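A scaled-down illustration of the Markov approach: a three-state model of a dual-redundant unit in which a failure is detected and isolated with a given coverage probability, solved with the matrix exponential of the generator. The 27-state RSDIMU model follows the same construction with more states and transitions; the rates and coverage below are invented.

```python
import numpy as np
from scipy.linalg import expm

def reliability(t_hours, lam=1e-4, coverage=0.98):
    """Three-state Markov sketch of a dual-redundant sensor unit:
    state 0 = both channels good, 1 = one detected/isolated failure,
    2 = system failed. `coverage` is the probability that a failure is
    correctly detected and isolated (illustrative values only)."""
    Q = np.array([
        [-2 * lam, 2 * lam * coverage, 2 * lam * (1 - coverage)],
        [0.0,      -lam,               lam],
        [0.0,      0.0,                0.0],
    ])
    p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t_hours)  # state probabilities at t
    return 1.0 - p[2]          # probability the system is still operational

print(reliability(10.0), reliability(1000.0))
```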
NASA Astrophysics Data System (ADS)
Koshinchanov, Georgy; Dimitrov, Dobri
2008-11-01
The characteristics of rainfall intensity are important for many purposes, including the design of sewage and drainage systems, the tuning of flood warning procedures, etc. Those estimates are usually statistical estimates of the intensity of precipitation realized over a certain period of time (e.g. 5, 10 min, etc.) with different return periods (e.g. 20, 100 years, etc.). The traditional approach to evaluating the mentioned precipitation intensities is to process the pluviometer records and fit probability distributions to samples of intensities valid for certain locations or regions. Those estimates then become part of the state regulations to be used for various economic activities. Two problems occur using this approach: 1. Due to various factors the climate conditions change, and the precipitation intensity estimates need regular updates; 2. As the extremes of the probability distribution are of particular importance for practice, the methodology of distribution fitting needs specific attention to those parts of the distribution. The aim of this paper is to review the existing methodologies for processing intensive rainfalls and to refresh some of the statistical estimates for the studied areas. The methodologies used in Bulgaria for analyzing intensive rainfalls and producing the relevant statistical estimates are: the method of the maximum intensity, used in the National Institute of Meteorology and Hydrology to process and decode the pluviometer records, followed by distribution fitting for each precipitation duration period; the same as above, but with separate modeling of the probability distribution for the middle and high probability quantiles; a method similar to the first one, but with an intensity threshold of 0.36 mm/min; another method proposed by the Russian hydrologist G. A. Aleksiev for regionalization of estimates over a territory, improved and adapted for Bulgaria by S. Gerasimov; and a method considering only the intensive rainfalls (if any) during the day with the maximal annual daily precipitation total for a given year. Conclusions are drawn on the relevance and adequacy of the applied methods.
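A common way to produce such intensity-return period estimates, shown here only as a generic sketch rather than any of the Bulgarian methods listed above, is to fit an extreme-value distribution to annual maxima and read off the quantiles for the required return periods; the data are invented.

```python
import numpy as np
from scipy import stats

# Hypothetical annual maxima of 10-minute rainfall intensity (mm/min)
annual_max = np.array([0.62, 0.48, 0.91, 0.55, 1.20, 0.73, 0.66, 0.84,
                       0.58, 1.05, 0.70, 0.95, 0.61, 0.79, 0.88])

# Fit a GEV distribution to the annual maxima
shape, loc, scale = stats.genextreme.fit(annual_max)

# Intensity with a given return period T is the (1 - 1/T) quantile
for T in (20, 100):
    print(f"{T}-year intensity:",
          stats.genextreme.ppf(1 - 1 / T, shape, loc, scale), "mm/min")
```

Fitting the upper quantiles carefully (for example with a threshold approach, as in the third Bulgarian method) addresses the second problem noted in the abstract.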
De-identification Methods for Open Health Data: The Case of the Heritage Health Prize Claims Dataset
Arbuckle, Luk; Koru, Gunes; Eze, Benjamin; Gaudette, Lisa; Neri, Emilio; Rose, Sean; Howard, Jeremy; Gluck, Jonathan
2012-01-01
Background There are many benefits to open datasets. However, privacy concerns have hampered the widespread creation of open health data. There is a dearth of documented methods and case studies for the creation of public-use health data. We describe a new methodology for creating a longitudinal public health dataset in the context of the Heritage Health Prize (HHP). The HHP is a global data mining competition to predict, by using claims data, the number of days patients will be hospitalized in a subsequent year. The winner will be the team or individual with the most accurate model past a threshold accuracy, and will receive a US $3 million cash prize. HHP began on April 4, 2011, and ends on April 3, 2013. Objective To de-identify the claims data used in the HHP competition and ensure that it meets the requirements in the US Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule. Methods We defined a threshold risk consistent with the HIPAA Privacy Rule Safe Harbor standard for disclosing the competition dataset. Three plausible re-identification attacks that can be executed on these data were identified. For each attack the re-identification probability was evaluated. If it was deemed too high then a new de-identification algorithm was applied to reduce the risk to an acceptable level. We performed an actual evaluation of re-identification risk using simulated attacks and matching experiments to confirm the results of the de-identification and to test sensitivity to assumptions. The main metric used to evaluate re-identification risk was the probability that a record in the HHP data can be re-identified given an attempted attack. Results An evaluation of the de-identified dataset estimated that the probability of re-identifying an individual was .0084, below the .05 probability threshold specified for the competition. The risk was robust to violations of our initial assumptions. Conclusions It was possible to ensure that the probability of re-identification for a large longitudinal dataset was acceptably low when it was released for a global user community in support of an analytics competition. This is an example of, and methodology for, achieving open data principles for longitudinal health data. PMID:22370452
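The record-level risk metric can be sketched as follows: records are grouped into equivalence classes on the generalized quasi-identifiers, and the re-identification probability of a record under a prosecutor-style attack is taken as one over its class size. The quasi-identifiers and data below are hypothetical, and the actual HHP evaluation used more elaborate attack models and matching experiments.

```python
import pandas as pd

# Hypothetical quasi-identifiers after generalization (age band, sex, 3-digit ZIP)
df = pd.DataFrame({
    "age_band": ["30-39", "30-39", "30-39", "40-49", "40-49", "40-49", "40-49"],
    "sex":      ["F",     "F",     "M",     "M",     "M",     "F",     "F"],
    "zip3":     ["100",   "100",   "100",   "112",   "112",   "112",   "112"],
})

# Size of each equivalence class (records sharing the same quasi-identifiers)
k = df.groupby(["age_band", "sex", "zip3"])["zip3"].transform("size")

# Re-identification probability per record = 1 / equivalence class size
risk = 1.0 / k
print("average risk:", risk.mean(), "maximum risk:", risk.max())
```

Comparing the average (or maximum) risk against the chosen threshold, and re-generalizing until it falls below that threshold, is the iterative loop described in the Methods.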
Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe
2011-04-08
HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model retention times of compounds of interest. Then, the prediction accuracy and the optimal separation robustness, including the uncertainty study, were evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were automatically read. Peak detection and peak matching were carried out with a previously developed methodology using independent component analysis published by B. Debrus et al. in 2009. The present successful applications strengthen the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Arnone, E.; Noto, L. V.; Dialynas, Y. G.; Caracciolo, D.; Bras, R. L.
2015-12-01
This work presents the capabilities of the tRIBS-VEGGIE-Landslide model in two versions: one developed within a probabilistic framework and one coupled with a root cohesion module. The probabilistic model treats geotechnical and soil retention curve parameters as random variables across the basin and estimates theoretical probability distributions of slope stability and of the associated "factor of safety" commonly used to describe the occurrence of shallow landslides. The derived distributions are used to obtain the spatio-temporal dynamics of the probability of failure, conditioned on soil moisture dynamics at each watershed location. The framework has been tested in the Luquillo Experimental Forest (Puerto Rico), where shallow landslides are common. In particular, the methodology was used to evaluate how the spatial and temporal patterns of precipitation, whose variability is significant over the basin, affect the distribution of the probability of failure. Another version of the model accounts for the additional cohesion exerted by vegetation roots. The approach uses the Fiber Bundle Model (FBM) framework, which allows for the evaluation of root strength as a function of the stress-strain relationships of bundles of fibers. The model requires knowledge of the root architecture to evaluate the additional reinforcement from each root diameter class. The root architecture is represented with a branching topology model based on Leonardo's rule. The methodology has been tested on a simple case study to explore the role of both hydrological and mechanical root effects. Results demonstrate that the effects of root water uptake can at times be more significant than the mechanical reinforcement, and that the additional resistance provided by roots depends heavily on the vegetation root structure and length.
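A stripped-down probabilistic stability calculation in the spirit of the above, for a single grid cell: an infinite-slope factor of safety with root cohesion added to soil cohesion, evaluated by Monte Carlo over assumed parameter distributions to give a probability of failure. The distributions and geometry are illustrative, not the tRIBS-VEGGIE-Landslide implementation.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Illustrative parameter distributions for one cell (not site-calibrated)
phi    = np.radians(rng.normal(30.0, 3.0, N))   # friction angle (rad)
c_soil = rng.lognormal(np.log(4.0), 0.4, N)     # soil cohesion (kPa)
c_root = rng.lognormal(np.log(2.0), 0.6, N)     # added root cohesion (kPa)
m      = rng.uniform(0.2, 1.0, N)               # relative saturation of the soil column

gamma, gamma_w = 18.0, 9.81                     # unit weights (kN/m^3)
z, beta = 1.5, np.radians(35.0)                 # soil depth (m), slope angle

# Infinite-slope factor of safety with root reinforcement added to cohesion
num = (c_soil + c_root) + (gamma * z - m * gamma_w * z) * np.cos(beta)**2 * np.tan(phi)
den = gamma * z * np.sin(beta) * np.cos(beta)
fs = num / den

print("P(FS < 1) =", np.mean(fs < 1.0))
```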
A methodology for estimating risks associated with landslides of contaminated soil into rivers.
Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars
2014-02-15
Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslide in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load within one year is also high. Copyright © 2013 Elsevier B.V. All rights reserved.
Five-year-olds do not show ambiguity aversion in a risk and ambiguity task with physical objects.
Li, Rosa; Roberts, Rachel C; Huettel, Scott A; Brannon, Elizabeth M
2017-07-01
Ambiguity aversion arises when a decision maker prefers risky gambles with known probabilities over equivalent ambiguous gambles with unknown probabilities. This phenomenon has been consistently observed in adults across a large body of empirical work. Evaluating ambiguity aversion in young children, however, has posed methodological challenges because probabilistic representations appropriate for adults might not be understood by young children. Here, we established a novel method for representing risk and ambiguity with physical objects that overcomes previous methodological limitations and allows us to measure ambiguity aversion in young children. We found that individual 5-year-olds exhibited consistent choice preferences and, as a group, exhibited no ambiguity aversion in a task that evokes ambiguity aversion in adults. Across individuals, 5-year-olds exhibited greater variance in ambiguity preferences compared with adults tested under similar conditions. This suggests that ambiguity aversion is absent during early childhood and emerges over the course of development. Copyright © 2017 Elsevier Inc. All rights reserved.
Methodology for Collision Risk Assessment of an Airspace Flow Corridor Concept
NASA Astrophysics Data System (ADS)
Zhang, Yimin
This dissertation presents a methodology to estimate the collision risk associated with a future air-transportation concept called the flow corridor. The flow corridor is a Next Generation Air Transportation System (NextGen) concept to reduce congestion and increase throughput in en-route airspace. The flow corridor has the potential to increase throughput by reducing the controller workload required to manage aircraft outside the corridor and by reducing separation of aircraft within the corridor. The analysis in this dissertation is a starting point for the safety analysis required by the Federal Aviation Administration (FAA) to eventually approve and implement the corridor concept. This dissertation develops a hybrid risk analysis methodology that combines Monte Carlo simulation with dynamic event tree analysis. The analysis captures the unique characteristics of the flow corridor concept, including self-separation within the corridor, lane change maneuvers, speed adjustments, and the automated separation assurance system. Monte Carlo simulation is used to model the movement of aircraft in the flow corridor and to identify precursor events that might lead to a collision. Since these precursor events are not rare, standard Monte Carlo simulation can be used to estimate these occurrence rates. Dynamic event trees are then used to model the subsequent series of events that may lead to collision. When two aircraft are on course for a near-mid-air collision (NMAC), the on-board automated separation assurance system provides a series of safety layers to prevent the impending NMAC or collision. Dynamic event trees are used to evaluate the potential failures of these layers in order to estimate the rare-event collision probabilities. The results show that the throughput can be increased by reducing separation to 2 nautical miles while maintaining the current level of safety. A sensitivity analysis shows that the most critical parameters in the model related to the overall collision probability are the minimum separation, the probability that both flights fail to respond to the traffic collision avoidance system, the probability that an NMAC results in a collision, the failure probability of the automatic dependent surveillance-broadcast (ADS-B In) receiver, and the conflict detection probability.
Probabilistic evaluation of fuselage-type composite structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1992-01-01
A methodology is developed to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, natural frequencies, displacements, stress/strain etc., which are the consequences of the random variation (scatter) of the primitive (independent random) variables in the constituent, ply, laminate and structural levels. This methodology is implemented in the IPACS (Integrated Probabilistic Assessment of Composite Structures) computer code. A fuselage-type composite structure is analyzed to demonstrate the code's capability. The probability distribution functions of the buckling loads, natural frequency, displacement, strain and stress are computed. The sensitivity of each primitive (independent random) variable to a given structural response is also identified from the analyses.
An operational system of fire danger rating over Mediterranean Europe
NASA Astrophysics Data System (ADS)
Pinto, Miguel M.; DaCamara, Carlos C.; Trigo, Isabel F.; Trigo, Ricardo M.
2017-04-01
A methodology is presented to assess fire danger based on the probability of exceedance of prescribed thresholds of daily released energy. The procedure is developed and tested over Mediterranean Europe, defined by the latitude circles of 35 and 45°N and the meridians of 10°W and 27.5°E, for the period 2010-2016. The procedure involves estimating so-called static and daily probabilities of exceedance. For a given point, the static probability is estimated by the ratio of the number of daily fire occurrences releasing energy above a given threshold to the total number of occurrences inside a cell centred at the point. The daily probability of exceedance, which takes meteorological factors into account by means of the Canadian Fire Weather Index (FWI), is in turn estimated based on a Generalized Pareto distribution with the static probability and FWI as covariates of the scale parameter. The rationale of the procedure is that small fires, assessed by the static probability, have a weak dependence on weather, whereas larger fires strongly depend on concurrent meteorological conditions. It is shown that the observed frequencies of exceedance over the study area for the period 2010-2016 match the probabilities estimated with the developed models for static and daily exceedance. Some (small) variability is however found between years, suggesting that refinements can be made in future work by using a larger sample to further increase the robustness of the method. The developed methodology has the advantage of evaluating fire danger with the same criteria over the whole study area, making it a good basis for harmonizing fire danger forecasts and forest management studies. Research was performed within the framework of the EUMETSAT Satellite Application Facility for Land Surface Analysis (LSA SAF). Part of the methods developed and results obtained form the basis of a platform supported by The Navigator Company, which currently provides fire meteorological danger information for Portugal to a wide range of users.
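One simplified reading of the daily probability model can be sketched as a static probability multiplied by a Generalized Pareto tail whose scale parameter grows with FWI. The functional form of the covariate link and all coefficients below are assumptions for illustration, not the fitted LSA SAF model.

```python
import numpy as np
from scipy import stats

def daily_prob_exceed(energy_threshold, fwi, p_static, u=50.0,
                      xi=0.3, a=1.0, b=0.05):
    """Daily probability that released fire energy exceeds `energy_threshold`:
    a static probability of exceeding the modelling threshold `u`, times a
    Generalized Pareto tail whose scale depends on the Fire Weather Index.
    u, xi, a and b are illustrative placeholders.
    """
    scale = np.exp(a + b * fwi)                                 # FWI-dependent scale
    tail = stats.genpareto.sf(energy_threshold - u, c=xi, scale=scale)
    return p_static * tail

for fwi in (5, 25, 60):
    print(fwi, daily_prob_exceed(energy_threshold=500.0, fwi=fwi, p_static=0.02))
```

The increase of the exceedance probability with FWI reflects the abstract's rationale that large fires, unlike small ones, depend strongly on concurrent weather.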
Code of Federal Regulations, 2010 CFR
2010-10-01
... by ACF statistical staff from the Adoption and Foster Care Analysis and Reporting System (AFCARS... primary review utilizing probability sampling methodologies. Usually, the chosen methodology will be simple random sampling, but other probability samples may be utilized, when necessary and appropriate. (3...
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is demonstrated by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function in truly representing the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
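For orientation, the FORM calculation reduces to a closed form when the performance function is linear in independent normal variables; the response-surface step in the paper is what supplies such an explicit function from the numerical simulations. The resistance and load numbers below are illustrative, not values from the Sumela wedge.

```python
from math import sqrt
from scipy.stats import norm

def form_linear(mu_r, sigma_r, mu_s, sigma_s):
    """First-order reliability for the linear limit state g = R - S with
    independent normal resistance R and load S (the closed-form special case;
    a fitted response surface replaces g in the general procedure)."""
    beta = (mu_r - mu_s) / sqrt(sigma_r**2 + sigma_s**2)  # Hasofer-Lind index
    return beta, norm.cdf(-beta)                          # reliability index, P_f

# Illustrative wedge: resisting vs. driving forces (units arbitrary)
beta, pf = form_linear(mu_r=1200.0, sigma_r=200.0, mu_s=800.0, sigma_s=150.0)
print(f"beta = {beta:.2f}, probability of failure = {pf:.3e}")
```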
NASA Technical Reports Server (NTRS)
Temple, Enoch C.
1994-01-01
The space industry has developed many composite materials that have high durability in proportion to their weights. Many of these materials have a likelihood for flaws that is higher than in traditional metals. There are also coverings (such as paint) that develop flaws that may adversely affect the performance of the system in which they are used. Therefore there is a need to monitor the soundness of composite structures. To meet this monitoring need, many nondestructive evaluation (NDE) systems have been developed. An NDE system is designed to detect material flaws and make flaw measurements without destroying the inspected item. Also, the detection operation is expected to be performed in a rapid manner in a field or production environment. Some of the most recent video-based NDE methodologies are shearography, holography, thermography, and video image correlation.
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon
2015-01-01
This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
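A toy calculation of how SDQ can lower the false-positive probability of a voting abort trigger: each sensor spuriously exceeds its limit either through a random false reading or through corruption that SDQ fails to screen out, and the trigger fires on a k-of-n vote. The architecture and numbers are hypothetical, not the SLS abort trigger design.

```python
from scipy import stats

def trigger_false_positive(n=3, k=2, p_sensor_fp=1e-4, p_corrupt=1e-3, p_sdq=0.9):
    """Illustrative false-positive probability of a k-of-n abort trigger.
    A sensor contributes a spurious exceedance if it gives a random false
    reading or is corrupted and NOT disqualified by SDQ (hypothetical numbers)."""
    p_bad = p_sensor_fp + p_corrupt * (1.0 - p_sdq)
    # false positive if at least k of the n sensors exceed spuriously
    return stats.binom.sf(k - 1, n, p_bad)

print("without SDQ:", trigger_false_positive(p_sdq=0.0))
print("with SDQ   :", trigger_false_positive(p_sdq=0.9))
```

The same structure, with the complementary error (a qualified sensor missing a real abort condition), gives the corresponding false-negative bound.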
NASA Astrophysics Data System (ADS)
Guler Yigitoglu, Askin
In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension, as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs on plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data, which means that aging effects do not reflect the true state of a specific plant in a realistic manner. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, as well as considering effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: the development of a methodology for the incorporation of aging models of passive SSCs into a reactor simulation environment to provide a framework for evaluation of their risk contribution in both dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe welds and steam generator tubes in commercial nuclear power plants. In the proposed methodology, a multi-state physics-based model is selected to represent the aging process. The model is modified via a sojourn-time approach to reflect the operational and maintenance history dependence of the transition rates. Thermal-hydraulic parameters of the model are calculated via the reactor simulation environment, and uncertainties associated with both parameters and the models are assessed via a two-loop Monte Carlo approach (Latin hypercube sampling) to propagate input probability distributions through the physical model. The effort documented in this thesis towards this overall objective consists of: i) defining a process for selecting critical passive components and related aging mechanisms, ii) aging model selection, iii) calculating the probability that aging would cause the component to fail, iv) uncertainty/sensitivity analyses, v) procedure development for modifying an existing PRA to accommodate consideration of passive component failures, and vi) including the calculated failure probability in the modified PRA. The proposed methodology is applied to pressurizer surge line pipe weld aging and steam generator tube degradation in pressurized water reactors.
Evaluation of the potential carcinogenicity of benzotrichloride (97-07-7). Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-06-01
Benzotrichloride is a probable human carcinogen, classified as weight-of-evidence Group B1 under the EPA Guidelines for Carcinogen Risk Assessment. Evidence on potential carcinogenicity from animal studies is Sufficient, and the evidence from human studies is Limited. The potency factor (F) for benzotrichloride is estimated to be 58.0 (mg/kg/day)^-1, placing it in potency group 2 according to the CAG's methodology for evaluating potential carcinogens. Combining the weight-of-evidence group and the potency group, benzotrichloride is assigned a MEDIUM hazard ranking.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
Evaluation of Individuals With Pulmonary Nodules: When Is It Lung Cancer?
Donington, Jessica; Lynch, William R.; Mazzone, Peter J.; Midthun, David E.; Naidich, David P.; Wiener, Renda Soylemez
2013-01-01
Objectives: The objective of this article is to update previous evidence-based recommendations for evaluation and management of individuals with solid pulmonary nodules and to generate new recommendations for those with nonsolid nodules. Methods: We updated prior literature reviews, synthesized evidence, and formulated recommendations by using the methods described in the “Methodology for Development of Guidelines for Lung Cancer” in the American College of Chest Physicians Lung Cancer Guidelines, 3rd ed. Results: We formulated recommendations for evaluating solid pulmonary nodules that measure > 8 mm in diameter, solid nodules that measure ≤ 8 mm in diameter, and subsolid nodules. The recommendations stress the value of assessing the probability of malignancy, the utility of imaging tests, the need to weigh the benefits and harms of different management strategies (nonsurgical biopsy, surgical resection, and surveillance with chest CT imaging), and the importance of eliciting patient preferences. Conclusions: Individuals with pulmonary nodules should be evaluated and managed by estimating the probability of malignancy, performing imaging tests to better characterize the lesions, evaluating the risks associated with various management alternatives, and eliciting their preferences for management. PMID:23649456
Evaluation of ultrasonic array imaging algorithms for inspection of a coarse grained material
NASA Astrophysics Data System (ADS)
Van Pamel, A.; Lowe, M. J. S.; Brett, C. R.
2014-02-01
Improving the ultrasound inspection capability for coarse-grained metals remains of longstanding interest to industry and the NDE research community and is expected to become increasingly important for next-generation power plants. A test sample of coarse-grained Inconel 625, which is representative of future power plant components, has been manufactured to test the detectability of different inspection techniques. Conventional ultrasonic A, B, and C-scans showed the sample to be extraordinarily difficult to inspect due to its scattering behaviour. However, in recent years, array probes and Full Matrix Capture (FMC) imaging algorithms, which extract the maximum amount of information possible, have unlocked exciting possibilities for improvements. This article proposes a robust methodology to evaluate the detection performance of imaging algorithms, applying it to three FMC imaging algorithms: the Total Focusing Method (TFM), Phase Coherent Imaging (PCI), and Decomposition of the Time Reversal Operator with Multiple Scattering (DORT MSF). The methodology considers the statistics of detection, presenting the detection performance as Probability of Detection (POD) and Probability of False Alarm (PFA). The data is captured in pulse-echo mode using 64-element array probes at centre frequencies of 1 MHz and 5 MHz. All three algorithms are shown to perform very similarly when comparing their flaw detection capabilities on this particular case.
NASA Astrophysics Data System (ADS)
Zobin, V. M.; Cruz-Bravo, A. A.; Ventura-Ramírez, F.
2010-06-01
A macroseismic methodology of seismic risk microzonation in a low-rise city based on the vulnerability of residential buildings is proposed and applied to Colima city, Mexico. The seismic risk microzonation for Colima consists of two elements: the mapping of residential blocks according to their vulnerability level and the calculation of an expert-opinion-based damage probability matrix (DPM) for a given level of earthquake intensity and a given type of residential block. The specified exposure time to the seismic risk for this zonation is equal to the interval between two destructive earthquakes. The damage probability matrices were calculated for three types of urban buildings and five types of residential blocks in Colima. It was shown that only 9% of 1409 residential blocks are able to resist Modified Mercalli (MM) intensity VII and VIII earthquakes without significant damage. The proposed DPM-2007 is in good accordance with the experimental damage curves based on the macroseismic evaluation of 3332 residential buildings in Colima that was carried out after the 21 January 2003 intensity MM VII earthquake. This methodology and the calculated DPM-2007 curves may also be applied to seismic risk microzonation for many low-rise cities in Latin America, Asia, and Africa.
The added value of thorough economic evaluation of telemedicine networks.
Le Goff-Pronost, Myriam; Sicotte, Claude
2010-02-01
This paper proposes a thorough framework for the economic evaluation of telemedicine networks. A standard cost analysis methodology was used as the initial base, similar to the evaluation method currently being applied to telemedicine, and to which we suggest adding subsequent stages that enhance the scope and sophistication of the analytical methodology. We completed the methodology with a longitudinal and stakeholder analysis, followed by the calculation of a break-even threshold, a calculation of the economic outcome based on net present value (NPV), an estimate of the social gain through external effects, and an assessment of the probability of social benefits. In order to illustrate the advantages, constraints and limitations of the proposed framework, we tested it in a paediatric cardiology tele-expertise network. The results demonstrate that the project threshold was not reached after the 4 years of the study. Also, the calculation of the project's NPV remained negative. However, the additional analytical steps of the proposed framework allowed us to highlight alternatives that can make this service economically viable. These included: use over an extended period of time, extending the network to other telemedicine specialties, or including it in the services offered by other community hospitals. In sum, the results presented here demonstrate the usefulness of an economic evaluation framework as a way of offering decision makers the tools they need to make comprehensive evaluations of telemedicine networks.
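The net-present-value stage of the framework is straightforward to reproduce. The sketch below discounts hypothetical yearly net cash flows for a tele-expertise network; as in the case study, the result can come out negative over a short horizon even when longer use or a broader scope of services would turn it positive.

```python
def npv(cash_flows, rate):
    """Net present value of yearly net cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical network: initial investment, then four years of net savings
flows = [-120_000, 20_000, 25_000, 30_000, 30_000]
print(f"NPV over 4 years: {npv(flows, rate=0.05):,.0f}")
```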
NASA Technical Reports Server (NTRS)
1978-01-01
Analytical and quantitative economic techniques are applied to the evaluation of the economic benefits of a wide range of substances for space bioprocessing. On the basis of expected clinical applications, as well as the size of the patient population that could be affected by the clinical applications, eight substances are recommended for further benefit evaluation. Results show that a transitional probability methodology can be used to model at least one clinical application for each of these substances. In each recommended case, the disease and its therapy are sufficiently well understood and documented, and the statistical data are available to operate the model and produce estimates of the impact of new therapy systems on the cost of treatment, morbidity, and mortality. Utilizing the morbidity and mortality information produced by the model, a standard economic technique called the Value of Human Capital is used to estimate the social welfare benefits that could be attributable to the new therapy systems.
Sadowski, Lukasz
2013-01-01
In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly, the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. Reinforced concrete slab specimens, 200 mm thick and 750 mm × 750 mm in plan, were investigated. The potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.
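A minimal sketch of combining the two measurements into a qualitative assessment at one grid point. The half-cell bands follow the commonly cited ASTM C876 ranges (vs. a Cu/CuSO4 reference electrode) and the resistivity bands are typical guideline values; the paper's own combination rules and thresholds may differ.

```python
def corrosion_assessment(e_corr_mv, resistivity_kohm_cm):
    """Classify one grid point from half-cell potential (mV vs Cu/CuSO4) and
    concrete resistivity (kOhm*cm). Thresholds are commonly cited guideline
    values, used here only for illustration."""
    if e_corr_mv > -200:
        potential_class = "low corrosion probability (<10%)"
    elif e_corr_mv >= -350:
        potential_class = "uncertain"
    else:
        potential_class = "high corrosion probability (>90%)"

    if resistivity_kohm_cm > 20:
        resistivity_class = "low corrosion rate likely"
    elif resistivity_kohm_cm >= 10:
        resistivity_class = "moderate corrosion rate likely"
    else:
        resistivity_class = "high corrosion rate likely"

    return potential_class, resistivity_class

# One grid point of the slab specimen (hypothetical readings)
print(corrosion_assessment(e_corr_mv=-380, resistivity_kohm_cm=8.5))
```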
Venable, J M; Ma, Q L; Ginter, P M; Duncan, W J
1993-01-01
Scenario analysis is a strategic planning technique used to describe and evaluate an organization's external environment. A methodology for conducting scenario analysis using the Jefferson County Department of Health and the national, State, and county issues confronting it is outlined. Key health care and organizational issues were identified using published sources, focus groups, questionnaires, and personal interviews. The most important of these issues were selected by asking health department managers to evaluate the issues according to their probability of occurrence and likely impact on the health department. The high-probability, high-impact issues formed the basis for developing scenario logics that constitute the story line holding the scenario together. The results were a set of plausible scenarios that aided in strategic planning, encouraged strategic thinking among managers, eliminated or reduced surprise about environmental changes, and improved managerial discussion and communication. PMID:8265754
Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations
NASA Astrophysics Data System (ADS)
Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.
2014-02-01
The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.
2015-12-01
Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and implemented it as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service-based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the new methodology as web services and incorporated the system into the Cloud. We have also developed a provenance management system for CMDA where CMDA service semantics modeling, service search and recommendation, and service execution history management are designed and implemented.
Methodologies For A Physically Based Rockfall Hazard Assessment
NASA Astrophysics Data System (ADS)
Agliardi, F.; Crosta, G. B.; Guzzetti, F.; Marian, M.
Rockfall hazard assessment is an important land planning tool in alpine areas, where settlements progressively expand across rockfall prone areas, raising the vulnerability of the elements at risk, the worth of potential losses and the restoration costs. Nevertheless, hazard definition is not simple to achieve in practice and sound, physically based assessment methodologies are still missing. In addition, the high mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities for which runout is minimal. When coping with rockfalls, hazard assessment involves complex definitions for "occurrence probability" and "intensity". The local occurrence probability must derive from the combination of the triggering probability (related to the geomechanical susceptibility of rock masses to fail) and the transit or impact probability at a given location (related to the motion of falling blocks). The intensity (or magnitude) of a rockfall is a complex function of the mass, velocity and fly height of the involved blocks that can be defined in many different ways depending on the adopted physical description and "destructiveness" criterion. This work is an attempt to evaluate rockfall hazard using the results of numerical modelling performed by an original 3D rockfall simulation program. This is based on a kinematic algorithm and allows the spatially distributed simulation of rockfall motions on a three-dimensional topography described by a DTM. The code provides raster maps portraying the maximum frequency of transit, velocity and height of blocks at each model cell, easily combined in a GIS in order to produce physically based rockfall hazard maps. The results of some three-dimensional rockfall models, performed at both regional and local scale in areas where rockfall related problems are well known, have been used to assess rockfall hazard, by adopting an objective approach based on three-dimensional matrixes providing a positional "hazard index". Different hazard maps have been obtained by combining and classifying the variables in different ways. The performance of the different hazard maps has been evaluated on the basis of past rockfall events and compared to the results of existing methodologies. The sensitivity of the hazard index with respect to the included variables and their combinations is discussed in order to constrain assessment criteria that are as objective as possible.
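A minimal sketch of how per-cell raster outputs (transit frequency, velocity, fly height) might be reclassified and combined into a positional hazard index is given below. The class breaks and the combination rule are hypothetical illustrations, not those of the cited 3D simulation code.

```python
import numpy as np

def classify(raster, breaks):
    """Reclassify a raster into classes 1..len(breaks)+1 using ascending break values."""
    return np.digitize(raster, breaks) + 1

# Hypothetical per-cell model outputs on a 4x4 grid.
rng = np.random.default_rng(0)
freq   = rng.poisson(3, (4, 4)).astype(float)   # transits per cell
vel    = rng.uniform(0, 25, (4, 4))             # m/s
height = rng.uniform(0, 6, (4, 4))              # m

f_cls = classify(freq,   [1, 5])     # 1 = rare, 2 = occasional, 3 = frequent
v_cls = classify(vel,    [8, 16])    # kinetic-energy proxy classes
h_cls = classify(height, [1.5, 3])   # fly-height classes

# Hypothetical positional hazard index: product of the three classes, rescaled to 1..3.
hazard = np.ceil(f_cls * v_cls * h_cls / 9).clip(1, 3)
print(hazard)
```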
Assessment of the probability of contaminating Mars
NASA Technical Reports Server (NTRS)
Judd, B. R.; North, D. W.; Pezier, J. P.
1974-01-01
A new methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, on the lethality of release and transport mechanisms, and on other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
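A simple Poisson-type combination over several release mechanisms illustrates the kind of calculation involved: with an expected viable release N_i and a growth probability P_g,i for each mechanism, P_c = 1 - exp(-sum_i N_i P_g,i), which reduces to the product form for small values. This is an illustrative assumption, not the paper's exact model, and the mechanism names and numbers are placeholders.

```python
import math

# Hypothetical expected number of viable microbes released by each mechanism
# and the corresponding probability that a released microbe grows.
mechanisms = {
    "surface release after nominal landing": (2.0e2, 1.0e-7),
    "release from buried bio-burden":        (5.0e3, 1.0e-8),
    "release after off-nominal impact":      (1.0e5, 1.0e-9),
}

# Poisson-type combination: P_c = 1 - exp(-sum_i N_i * Pg_i).
expected_growing = sum(n * pg for n, pg in mechanisms.values())
p_contamination = 1.0 - math.exp(-expected_growing)

print(f"expected growing microbes: {expected_growing:.2e}")
print(f"P(contamination)         : {p_contamination:.2e}")
```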
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought
NASA Astrophysics Data System (ADS)
Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.
2017-10-01
Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in the past years. The approach of "event attribution" compares the occurrence-probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
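The central quantity in this type of event attribution is the probability ratio PR = p_factual / p_counterfactual, estimated from exceedance counts in the two ensembles. A minimal sketch with synthetic ensemble values and a simple percentile bootstrap (not the data or the specific methods of the study) follows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical summer precipitation anomalies (mm) in factual and counterfactual ensembles.
factual        = rng.normal(loc=-60, scale=40, size=500)
counterfactual = rng.normal(loc=-45, scale=40, size=500)
threshold = -120.0                      # hypothetical 2015-like drought threshold

def prob_ratio(f, c, thr):
    p1 = np.mean(f <= thr)              # exceedance probability, factual climate
    p0 = np.mean(c <= thr)              # exceedance probability, counterfactual climate
    return np.inf if p0 == 0 else p1 / p0

pr = prob_ratio(factual, counterfactual, threshold)

# Percentile bootstrap for a rough uncertainty range.
boot = [prob_ratio(rng.choice(factual, factual.size),
                   rng.choice(counterfactual, counterfactual.size), threshold)
        for _ in range(2000)]
lo, hi = np.percentile(boot, [5, 95])
print(f"PR = {pr:.2f}  (5-95% bootstrap: {lo:.2f}-{hi:.2f})")
```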
Abdulghani, Hamza Mohammad; Shaik, Shaffi Ahamed; Khamis, Nehal; Al-Drees, Abdulmajeed Abdulrahman; Irshad, Mohammad; Khalil, Mahmoud Salah; Alhaqwi, Ali Ibrahim; Isnani, Arthur
2014-04-01
Qualitative and quantitative evaluation of academic programs can enhance the development, effectiveness, and dissemination of comparative quality reports as well as quality improvement efforts. To evaluate the five research methodology workshops through assessing participants' satisfaction, knowledge and skills gain and impact on practices by the Kirkpatrick's evaluation model. The four level Kirkpatrick's model was applied for the evaluation. Training feedback questionnaires, pre and post tests, learner development plan reports and behavioral surveys were used to evaluate the effectiveness of the workshop programs. Of the 116 participants, 28 (24.1%) liked with appreciation, 62 (53.4%) liked with suggestions and 26 (22.4%) disliked the programs. Pre and post MCQs tests mean scores showed significant improvement of relevant basic knowledge and cognitive skills by 17.67% (p ≤ 0.005). Pre-and-post tests scores on workshops sub-topics also significantly improved for the manuscripts (p ≤ 0.031) and proposal writing (p ≤ 0.834). As for the impact, 56.9% of participants started research, and 6.9% published their studies. The results from participants' performance revealed an overall positive feedback and 79% of participant reported transfer of training skills at their workplace. The course outcomes achievement and suggestions given for improvements offer insight into the program which were encouraging and very useful. Encouraging "research culture" and work-based learning are probably the most powerful determinants for research promotion. These findings therefore encourage faculty development unit to continue its training and development in the research methodology aspects.
Warship Combat System Selection Methodology Based on Discrete Event Simulation
2010-09-01
Platform (from Spanish) PD Damage Probability xiv PHit Hit Probability PKill Kill Probability RSM Response Surface Model SAM Surface-Air Missile...such a large target allows an assumption that the probability of a hit ( PHit ) is one. This structure can be considered as a bridge; therefore, the
A risk-based multi-objective model for optimal placement of sensors in water distribution system
NASA Astrophysics Data System (ADS)
Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein
2018-02-01
In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for the optimal placement of sensors in a water distribution system (WDS). The model minimizes the risk caused by simultaneous multi-point contamination injection in the WDS using the CVaR approach. CVaR treats the uncertainties of contamination injection in the form of a probability distribution function and captures low-probability extreme events, i.e. the extreme losses that occur in the tail of the loss distribution. A four-objective optimization model based on the NSGA-II algorithm is developed to minimize the losses from contamination injection (through the CVaR of the affected population and of detection time) and also to minimize the two other main criteria of optimal sensor placement, the probability of undetected events and cost. Finally, to determine the best solution, the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a Multi Criteria Decision Making (MCDM) approach, is utilized to rank the alternatives on the trade-off curve among the objective functions. A sensitivity analysis is also carried out to investigate the importance of each criterion for the PROMETHEE results under three relative weighting scenarios. The effectiveness of the proposed methodology is examined by applying it to the Lamerd WDS in the southwestern part of Iran. PROMETHEE suggests 6 sensors whose distribution approximately covers all regions of the WDS. For the best solution, the optimal values of the CVaR of the affected population, the CVaR of detection time and the probability of undetected events are 17,055 persons, 31 min and 0.045%, respectively. The results obtained for the Lamerd WDS show the applicability of the CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection in order to evaluate extreme losses in a WDS.
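For readers unfamiliar with the risk measure used, a minimal sketch of computing the Conditional Value at Risk of a sample of losses is shown below; the simulated loss sample and the confidence level are hypothetical and unrelated to the Lamerd case study.

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """CVaR_alpha: mean of the losses at or above the alpha-quantile (VaR)."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

rng = np.random.default_rng(42)
# Hypothetical affected-population losses for simulated contamination events.
affected_population = rng.lognormal(mean=7.0, sigma=1.0, size=10_000)

print(f"VaR(95%)  = {np.quantile(affected_population, 0.95):,.0f} persons")
print(f"CVaR(95%) = {cvar(affected_population, 0.95):,.0f} persons")
```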
Bayesian Inference on Proportional Elections
Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio
2015-01-01
Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
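A rough sketch of the Monte Carlo idea, drawing vote shares from a Dirichlet posterior and checking whether a party wins at least one seat, is shown below. The poll counts are hypothetical, and a simplified D'Hondt-style largest-averages rule stands in for the actual Brazilian seat-distribution procedure.

```python
import numpy as np

def dhondt(shares, seats):
    """Simplified largest-averages (D'Hondt-style) allocation -- a stand-in rule."""
    alloc = np.zeros(len(shares), dtype=int)
    for _ in range(seats):
        alloc[np.argmax(shares / (alloc + 1))] += 1
    return alloc

rng = np.random.default_rng(7)
poll_counts = np.array([420, 310, 150, 80, 40])   # hypothetical poll results per party
seats, n_sim = 10, 20_000

got_seat = np.zeros(len(poll_counts))
for _ in range(n_sim):
    # Posterior draw of vote shares: Dirichlet with a flat prior over the polled counts.
    shares = rng.dirichlet(poll_counts + 1)
    got_seat += dhondt(shares, seats) > 0

for i, p in enumerate(got_seat / n_sim):
    print(f"party {i+1}: P(at least one seat) = {p:.3f}")
```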
Nielsen, Joseph; Tokuhiro, Akira; Hiromoto, Robert; ...
2015-11-13
Evaluation of the impacts of uncertainty and sensitivity in modeling presents a significant set of challenges, in particular to high fidelity modeling. Computational costs and validation of models create a need for cost effective decision making with regard to experiment design. Experiments designed to validate computation models can be used to reduce uncertainty in the physical model. In some cases, large uncertainty in a particular aspect of the model may or may not have a large impact on the final results. For example, modeling of a relief valve may result in large uncertainty, however, the actual effects on final peak clad temperature in a reactor transient may be small and the large uncertainty with respect to valve modeling may be considered acceptable. Additionally, the ability to determine the adequacy of a model and the validation supporting it should be considered within a risk informed framework. Low fidelity modeling with large uncertainty may be considered adequate if the uncertainty is considered acceptable with respect to risk. In other words, models that are used to evaluate the probability of failure should be evaluated more rigorously with the intent of increasing safety margin. Probabilistic risk assessment (PRA) techniques have traditionally been used to identify accident conditions and transients. Traditional classical event tree methods utilize analysts’ knowledge and experience to identify the important timing of events in coordination with thermal-hydraulic modeling. These methods lack the capability to evaluate complex dynamic systems. In these systems, time and energy scales associated with transient events may vary as a function of transition times and energies to arrive at a different physical state. Dynamic PRA (DPRA) methods provide a more rigorous analysis of complex dynamic systems. Unfortunately DPRA methods introduce issues associated with combinatorial explosion of states. This study presents a methodology to address combinatorial explosion using a Branch-and-Bound algorithm applied to Dynamic Event Trees (DET), which utilize LENDIT (L – Length, E – Energy, N – Number, D – Distribution, I – Information, and T – Time) as well as a set theory to describe system, state, resource, and response (S2R2) sets to create bounding functions for the DET. The optimization of the DET in identifying high probability failure branches is extended to create a Phenomenological Identification and Ranking Table (PIRT) methodology to evaluate modeling parameters important to safety of those failure branches that have a high probability of failure. The PIRT can then be used as a tool to identify and evaluate the need for experimental validation of models that have the potential to reduce risk. Finally, in order to demonstrate this methodology, a Boiling Water Reactor (BWR) Station Blackout (SBO) case study is presented.
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
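A common building block for aggregating several experts' judgments into a single probability distribution is a weighted linear opinion pool. The sketch below uses hypothetical triangular judgments and equal weights; it is not the report's specific ten-phase calibration and aggregation procedure.

```python
import numpy as np

x = np.linspace(0.0, 10.0, 1001)      # hypothetical uncertain design parameter

def triangular_pdf(x, low, mode, high):
    """Triangular density from an elicited (low, most likely, high) judgment."""
    pdf = np.zeros_like(x)
    up = (x >= low) & (x <= mode)
    dn = (x > mode) & (x <= high)
    pdf[up] = 2 * (x[up] - low) / ((high - low) * (mode - low))
    pdf[dn] = 2 * (high - x[dn]) / ((high - low) * (high - mode))
    return pdf

# Hypothetical elicited judgments from three experts.
experts = [(1.0, 3.0, 6.0), (2.0, 4.0, 7.0), (1.5, 5.0, 9.0)]
weights = np.array([1 / 3, 1 / 3, 1 / 3])       # calibration-based weights would go here

pooled = sum(w * triangular_pdf(x, *e) for w, e in zip(weights, experts))
pooled /= np.trapz(pooled, x)                   # renormalize the mixture

print("pooled mean :", np.trapz(x * pooled, x))
print("pooled 95th :", x[np.searchsorted(np.cumsum(pooled) * (x[1] - x[0]), 0.95)])
```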
Evidence Base Update for Psychosocial Treatments for Pediatric Obsessive-Compulsive Disorder
Freeman, Jennifer; Garcia, Abbe; Frank, Hannah; Benito, Kristen; Conelea, Christine; Walther, Michael; Edmunds, Julie
2013-01-01
Objective Pediatric Obsessive Compulsive Disorder (OCD) is a chronic and impairing condition that often persists into adulthood. Barrett and colleagues (2008), in this journal, provided a detailed review of evidence based psychosocial treatments for youth with OCD. The current review provides an evidence base update of the pediatric OCD psychosocial treatment literature with particular attention to advances in the field as well as to the methodological challenges inherent in evaluating such findings. Method Psychosocial treatment studies conducted since the last review are described and evaluated according to methodological rigor and evidence-based classification using the JCCAP evidence based treatment (EBT) evaluation criteria (Southam-Gerow and Prinstein, this issue). Results Findings from this review clearly converge in support of CBT as an effective and appropriate first line treatment for youth with OCD (either alone or in combination with medication). Although no treatment for pediatric OCD has yet been designated as “well established”, both individual and individual family based treatments have been shown to be “probably efficacious.” Conclusions Moderators and predictors of treatment outcome are discussed as are the areas where we have advanced the field and the areas where we have room to grow. The methodological and clinical challenges inherent in a review of the evidence base are reviewed. Finally, future research directions are outlined. PMID:23746138
Risk-Based Explosive Safety Analysis
2016-11-30
...safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the...
2013-01-01
In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly, the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential E_corr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures. PMID:23766706
On Applying the Prognostic Performance Metrics
NASA Technical Reports Server (NTRS)
Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai
2009-01-01
Prognostics performance evaluation has gained significant attention in the past few years. As prognostics technology matures and more sophisticated methods for prognostic uncertainty management are developed, a standardized methodology for performance evaluation becomes extremely important to guide improvement efforts in a constructive manner. This paper is in continuation of previous efforts where several new evaluation metrics tailored for prognostics were introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. Several shortcomings identified, while applying these metrics to a variety of real applications, are also summarized along with discussions that attempt to alleviate these problems. Further, these metrics have been enhanced to include the capability of incorporating probability distribution information from prognostic algorithms as opposed to evaluation based on point estimates only. Several methods have been suggested and guidelines have been provided to help choose one method over another based on probability distribution characteristics. These approaches also offer a convenient and intuitive visualization of algorithm performance with respect to some of these new metrics like prognostic horizon and alpha-lambda performance, and also quantify the corresponding performance while incorporating the uncertainty information.
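One way to use full prediction distributions rather than point estimates, in the spirit of the alpha-lambda idea, is to require that a minimum probability mass of the predicted remaining-useful-life distribution fall within alpha-bounds around the true value. The sketch below uses a hypothetical normal prediction and hypothetical alpha and beta thresholds; it is not the metric definition from the paper.

```python
import numpy as np
from scipy import stats

def alpha_lambda_pass(rul_pred_dist, true_rul, alpha=0.2, beta=0.5):
    """Pass if at least `beta` probability mass lies within +/- alpha*true_rul of the truth."""
    lo, hi = (1 - alpha) * true_rul, (1 + alpha) * true_rul
    mass = rul_pred_dist.cdf(hi) - rul_pred_dist.cdf(lo)
    return mass, mass >= beta

# Hypothetical remaining-useful-life prediction at one time instant (hours).
true_rul = 100.0
prediction = stats.norm(loc=92.0, scale=15.0)

mass, ok = alpha_lambda_pass(prediction, true_rul)
print(f"probability mass inside the alpha-bounds: {mass:.2f} -> {'pass' if ok else 'fail'}")
```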
Defender-Attacker Decision Tree Analysis to Combat Terrorism.
Garcia, Ryan J B; von Winterfeldt, Detlof
2016-12-01
We propose a methodology, called defender-attacker decision tree analysis, to evaluate defensive actions against terrorist attacks in a dynamic and hostile environment. Like most game-theoretic formulations of this problem, we assume that the defenders act rationally by maximizing their expected utility or minimizing their expected costs. However, we do not assume that attackers maximize their expected utilities. Instead, we encode the defender's limited knowledge about the attacker's motivations and capabilities as a conditional probability distribution over the attacker's decisions. We apply this methodology to the problem of defending against possible terrorist attacks on commercial airplanes, using one of three weapons: infrared-guided MANPADS (man-portable air defense systems), laser-guided MANPADS, or visually targeted RPGs (rocket propelled grenades). We also evaluate three countermeasures against these weapons: DIRCMs (directional infrared countermeasures), perimeter control around the airport, and hardening airplanes. The model includes deterrence effects, the effectiveness of the countermeasures, and the substitution of weapons and targets once a specific countermeasure is selected. It also includes a second stage of defensive decisions after an attack occurs. Key findings are: (1) due to the high cost of the countermeasures, not implementing countermeasures is the preferred defensive alternative for a large range of parameters; (2) if the probability of an attack and the associated consequences are large, a combination of DIRCMs and ground perimeter control are preferred over any single countermeasure. © 2016 Society for Risk Analysis.
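The evaluation logic, computing the expected cost of each defensive alternative given a conditional probability distribution over attacker weapon choices, can be sketched as follows. All probabilities, costs and consequences are hypothetical placeholders, not the study's inputs, and the full two-stage tree is collapsed to a single stage for brevity.

```python
# Hypothetical illustration of ranking defensive options by expected cost, given a
# conditional probability distribution over attacker weapon choices. Substitution
# effects are emulated by letting the weapon-choice distribution depend on the defense.

weapons = ["IR MANPADS", "laser MANPADS", "RPG"]

options = {
    "no countermeasure": {"cost": 0.0,
                          "p_weapon": {"IR MANPADS": 0.5, "laser MANPADS": 0.3, "RPG": 0.2},
                          "p_success": {"IR MANPADS": 0.4, "laser MANPADS": 0.4, "RPG": 0.2}},
    "DIRCM":             {"cost": 8e9,
                          "p_weapon": {"IR MANPADS": 0.2, "laser MANPADS": 0.4, "RPG": 0.4},
                          "p_success": {"IR MANPADS": 0.05, "laser MANPADS": 0.4, "RPG": 0.2}},
    "perimeter control": {"cost": 2e9,
                          "p_weapon": {"IR MANPADS": 0.5, "laser MANPADS": 0.3, "RPG": 0.2},
                          "p_success": {"IR MANPADS": 0.2, "laser MANPADS": 0.2, "RPG": 0.05}},
}

p_attack = 0.05                 # hypothetical annual attack probability
consequence = 5e10              # hypothetical loss from a successful attack

for name, d in options.items():
    p_loss = sum(d["p_weapon"][w] * d["p_success"][w] for w in weapons)
    expected_cost = d["cost"] + p_attack * p_loss * consequence
    print(f"{name:18s}: expected cost = {expected_cost:.3e}")
```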
Probabilistic Simulation of Stress Concentration in Composite Laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.
1994-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael
2017-09-01
The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability.
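A rough sketch of the modelling idea, fitting a two-component normal mixture to observed zone diameters and estimating the methodological categorization error rate around a breakpoint, is given below. The simulated diameters, breakpoint, methodological standard deviation and the use of scikit-learn are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Hypothetical inhibition zone diameters (mm): a resistant and a susceptible subpopulation.
diameters = np.concatenate([rng.normal(12, 2.0, 400), rng.normal(26, 2.5, 1600)])

# Describe the "true" diameter distribution with a two-component normal mixture.
gm = GaussianMixture(n_components=2, random_state=0).fit(diameters.reshape(-1, 1))
means = gm.means_.ravel()
sds = np.sqrt(gm.covariances_).ravel()
weights = gm.weights_

sd_method = 1.5          # hypothetical methodological (repeat-measurement) SD
cbp = 20.0               # hypothetical susceptible clinical breakpoint (mm)

# P(observed diameter falls on the wrong side of the breakpoint), integrating the
# methodological error over each true-diameter mixture component on a grid.
x = np.linspace(0, 45, 4001)
error = 0.0
for w, m, s in zip(weights, means, sds):
    true_pdf = w * stats.norm.pdf(x, m, s)
    wrong_side = np.where(x >= cbp,
                          stats.norm.cdf(cbp, x, sd_method),        # truly >= CBP read as below it
                          1 - stats.norm.cdf(cbp, x, sd_method))    # truly < CBP read as above it
    error += np.trapz(true_pdf * wrong_side, x)

print(f"estimated methodological categorization error rate: {error:.4f}")
```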
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumgartner, S.; Bieli, R.; Bergmann, U. C.
2012-07-01
An overview is given of existing CPR design criteria and the methods used in BWR reload analysis to evaluate the impact of channel bow on CPR margins. Potential weaknesses in today's methodologies are discussed. Westinghouse in collaboration with KKL and Axpo - operator and owner of the Leibstadt NPP - has developed an optimized CPR methodology based on a new criterion to protect against dryout during normal operation and with a more rigorous treatment of channel bow. The new steady-state criterion is expressed in terms of an upper limit of 0.01 for the dryout failure probability per year. This is considered a meaningful and appropriate criterion that can be directly related to the probabilistic criteria set-up for the analyses of Anticipated Operation Occurrences (AOOs) and accidents. In the Monte Carlo approach a statistical modeling of channel bow and an accurate evaluation of CPR response functions allow the associated CPR penalties to be included directly in the plant SLMCPR and OLMCPR in a best-estimate manner. In this way, the treatment of channel bow is equivalent to all other uncertainties affecting CPR. Emphasis is put on quantifying the statistical distribution of channel bow throughout the core using measurement data. The optimized CPR methodology has been implemented in the Westinghouse Monte Carlo code, McSLAP. The methodology improves the quality of dryout safety assessments by supplying more valuable information and better control of conservatisms in establishing operational limits for CPR. The methodology is demonstrated with application examples from the introduction at KKL. (authors)
Martinez, Marie-José; Durand, Benoit; Calavas, Didier; Ducrot, Christian
2010-06-01
Demonstrating disease freedom is becoming important in different fields including animal disease control. Most methods consider sampling only from a homogeneous population in which each animal has the same probability of becoming infected. In this paper, we propose a new methodology to calculate the probability of detecting the disease if it is present in a heterogeneous population of small size with potentially different risk groups, differences in risk being defined using relative risks. To calculate this probability, for each possible arrangement of the infected animals in the different groups, the probability that all the animals tested are test-negative given this arrangement is multiplied by the probability that this arrangement occurs. The probability formula is developed using the assumption of a perfect test and hypergeometric sampling for finite small-size populations. The methodology is applied to scrapie, a disease affecting small ruminants and characterized in sheep by a strong genetic susceptibility defining different risk groups. It illustrates that the genotypes of the tested animals heavily influence the confidence level of detecting scrapie. The results present the statistical power for substantiating disease freedom in a small heterogeneous population as a function of the design prevalence, the structure of the sample tested, the structure of the herd and the associated relative risks.
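A simulation-based sketch of the quantity of interest, the probability that at least one infected animal appears in the tested sample when infection risk differs between groups, is given below under the perfect-test assumption. The herd structure, relative risks, sample sizes and design prevalence are hypothetical, and the paper's exact enumeration formula is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical herd structure: group sizes, relative risks, and number tested per group.
group_size = np.array([40, 30, 10])        # e.g. genotype-based risk groups
rel_risk   = np.array([1.0, 3.0, 10.0])
n_tested   = np.array([5, 5, 5])
n_infected = 2                             # design prevalence: 2 infected animals in the herd

# Probability that a given infected animal sits in each group, proportional to size x RR.
p_group = group_size * rel_risk
p_group = p_group / p_group.sum()

n_sim, detected = 50_000, 0
for _ in range(n_sim):
    # Place the infected animals in groups, then draw the tested animals without replacement.
    infected_per_group = rng.multinomial(n_infected, p_group)
    hits = [rng.hypergeometric(k, n - k, t) if t > 0 else 0
            for k, n, t in zip(infected_per_group, group_size, n_tested)]
    detected += any(h > 0 for h in hits)   # perfect test: any sampled infected animal is found

print(f"P(detect >= 1 infected animal) ~= {detected / n_sim:.3f}")
```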
Common Mental Disorders among Occupational Groups: Contributions of the Latent Class Model
Martins Carvalho, Fernando; de Araújo, Tânia Maria
2016-01-01
Background. The Self-Reporting Questionnaire (SRQ-20) is widely used for evaluating common mental disorders. However, few studies have evaluated the measurement performance of the SRQ-20 in occupational groups. This study aimed to describe the manifestation patterns of common mental disorder symptoms among worker populations by using latent class analysis. Methods. Data derived from 9,959 Brazilian workers, obtained from four cross-sectional studies that used similar methodology among groups of informal workers, teachers, healthcare workers, and urban workers. Common mental disorders were measured using the SRQ-20. Latent class analysis was performed on each database separately. Results. Three classes of symptoms were confirmed in the occupational categories investigated. In all studies, Class I best met the criteria for suspicion of common mental disorders. Class II comprised workers with an intermediate probability of endorsing the anxiety, sadness, and decreased-energy items that characterize common mental disorders. Class III was composed of subgroups of workers with a low probability of responding positively to the screening questions for common mental disorders. Conclusions. Three patterns of common mental disorder symptoms were identified in the occupational groups investigated, ranging from distinctive features to low probabilities of occurrence. The SRQ-20 measurements showed stability in capturing nonpsychotic symptoms. PMID:27630999
A Method for the Evaluation of Thousands of Automated 3D Stem Cell Segmentations
Bajcsy, Peter; Simon, Mylene; Florczyk, Stephen; Simon, Carl G.; Juba, Derek; Brady, Mary
2016-01-01
There is no segmentation method that performs perfectly with any data set in comparison to human segmentation. Evaluation procedures for segmentation algorithms become critical for their selection. The problems associated with segmentation performance evaluations and visual verification of segmentation results are exaggerated when dealing with thousands of 3D image volumes because of the amount of computation and manual inputs needed. We address the problem of evaluating 3D segmentation performance when segmentation is applied to thousands of confocal microscopy images (z-stacks). Our approach is to incorporate experimental imaging and geometrical criteria, and map them into computationally efficient segmentation algorithms that can be applied to a very large number of z-stacks. This is an alternative approach to considering existing segmentation methods and evaluating most state-of-the-art algorithms. We designed a methodology for 3D segmentation performance characterization that consists of design, evaluation and verification steps. The characterization integrates manual inputs from projected surrogate “ground truth” of statistically representative samples and from visual inspection into the evaluation. The novelty of the methodology lies in (1) designing candidate segmentation algorithms by mapping imaging and geometrical criteria into algorithmic steps, and constructing plausible segmentation algorithms with respect to the order of algorithmic steps and their parameters, (2) evaluating segmentation accuracy using samples drawn from probability distribution estimates of candidate segmentations, and (3) minimizing human labor needed to create surrogate “truth” by approximating z-stack segmentations with 2D contours from three orthogonal z-stack projections and by developing visual verification tools. We demonstrate the methodology by applying it to a dataset of 1253 mesenchymal stem cells. The cells reside on 10 different types of biomaterial scaffolds, and are stained for actin and nucleus yielding 128 460 image frames (on average 125 cells/scaffold × 10 scaffold types × 2 stains × 51 frames/cell). After constructing and evaluating six candidates of 3D segmentation algorithms, the most accurate 3D segmentation algorithm achieved an average precision of 0.82 and an accuracy of 0.84 as measured by the Dice similarity index where values greater than 0.7 indicate a good spatial overlap. A probability of segmentation success was 0.85 based on visual verification, and a computation time was 42.3 h to process all z-stacks. While the most accurate segmentation technique was 4.2 times slower than the second most accurate algorithm, it consumed on average 9.65 times less memory per z-stack segmentation. PMID:26268699
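The Dice similarity index used as the accuracy measure is straightforward to compute for 3D binary masks. A minimal sketch with synthetic placeholder volumes (not the stem-cell data) follows.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity index for two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

# Hypothetical 3D segmentation vs. surrogate "ground truth" volumes (z, y, x).
rng = np.random.default_rng(5)
truth = rng.random((51, 64, 64)) > 0.6
auto  = truth ^ (rng.random(truth.shape) > 0.95)   # flip ~5% of voxels to mimic errors

print(f"Dice = {dice(auto, truth):.3f}   (values > 0.7 indicate good spatial overlap)")
```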
2003-04-01
"...action orientation". Tasks concerned pre-flight safety assessments for military combat aircraft and were performed by Army Cobra aviators. Dependent... evaluations are vital during future assessments of team performance and especially for modeling purposes, as the literature lacks empirical... a similar scale, and then assign probabilities to likelihoods for these in the future. Once completed, one can multiply expected feature values of...
Advancement in modern approaches to mineral production quality control
NASA Astrophysics Data System (ADS)
Freidina, EV; Botvinnik, AA; Dvornikova, AN
2017-02-01
The natural resource potential of mineral deposits is represented by three categories: upside, attainable and investment. A modern methodology for production quality control is proposed in this paper, and its tools, aimed at ensuring agreement between product quality and market requirements, are described. The costs of product quality compliance and non-compliance with consumer requirements are defined; the latter is suggested for use in evaluating the resource potential of mineral deposits at a given level of probability.
Analysis of the impact of safeguards criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mullen, M.F.; Reardon, P.T.
As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed; reasonably good approximate prediction equations can be developed using the methodology described here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-06-01
4-Chloro-o-toluidine hydrochloride is a probable human carcinogen, classified as weight-of-evidence Group B2 under the EPA Guidelines for Carcinogen Risk Assessment. Evidence on potential carcinogenicity from animal studies is Sufficient, and the evidence from human studies is No Data. The potency factor (F) for 4-chloro-o-toluidine hydrochloride is estimated to be 0.40 (mg/kg/day)^-1, placing it in potency group 3 according to the CAG's methodology for evaluating potential carcinogens. Combining the weight-of-evidence group and the potency group, 4-chloro-o-toluidine hydrochloride is assigned a LOW hazard ranking.
Improbable Outcomes: Infrequent or Extraordinary?
ERIC Educational Resources Information Center
Teigen, Karl Halvor; Juanchich, Marie; Riege, Anine H.
2013-01-01
Research on verbal probabilities has shown that "unlikely" or "improbable" events are believed to correspond to numerical probability values between 10% and 30%. However, building on a pragmatic approach of verbal probabilities and a new methodology, the present paper shows that unlikely outcomes are most often associated with outcomes that have a…
Cost benefit analysis of space communications technology: Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Holland, L. D.; Sassone, P. G.; Gallagher, J. J.; Robinette, S. L.; Vogler, F. H.; Zimmer, R. P.
1976-01-01
The questions of (1) whether or not NASA should support the further development of space communications technology and, if so, (2) which technologies should be given the highest priority for support are addressed. Insofar as the issues deal principally with resource allocation, an economics perspective is adopted. The resultant cost benefit methodology utilizes the net present value concept in three distinct analysis stages to evaluate and rank those technologies which pass a qualification test based upon probable (private sector) market failure. User-preference and technology state-of-the-art surveys were conducted (in 1975) to form a data base for the technology evaluation. The program encompassed near-future technologies in space communications earth stations and satellites, including the noncommunication subsystems of the satellite (station keeping, electrical power system, etc.). Results of the research program include confirmation of the applicability of the methodology as well as a list of space communications technologies ranked according to the estimated net present value of their support (development) by NASA.
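A minimal sketch of the net-present-value ranking step, with a hypothetical discount rate and hypothetical development costs and benefit streams (the technology names and figures are placeholders, not the study's estimates):

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, with cash_flows[0] occurring now (year 0)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical development costs (negative) and benefit streams ($M) for three technologies.
candidates = {
    "on-board switching":       [-40, 5, 10, 20, 30, 30],
    "high-power amplifier":     [-15, 2, 6, 10, 12, 12],
    "large deployable antenna": [-60, 0, 10, 25, 40, 45],
}

rate = 0.10   # hypothetical discount rate
ranked = sorted(candidates.items(), key=lambda kv: npv(rate, kv[1]), reverse=True)
for name, flows in ranked:
    print(f"{name:25s} NPV = {npv(rate, flows):7.1f} $M")
```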
Stefanov, V T
2000-01-01
A methodology is introduced for numerical evaluation, with any given accuracy, of the cumulative probabilities of the proportion of genome shared identical by descent (IBD) on chromosome segments by two individuals in a grandparent-type relationship. Programs are provided in the popular software package Maple for rapidly implementing such evaluations in the cases of grandchild-grandparent and great-grandchild-great-grandparent relationships. Our results can be used to identify chromosomal segments that may contain disease genes. Also, exact P values in significance testing for resemblance of either a grandparent with a grandchild or a great-grandparent with a great-grandchild can be calculated. The genomic continuum model, with Haldane's model for the crossover process, is assumed. This is the model that has been used recently in the genetics literature devoted to IBD calculations. Our methodology is based on viewing the model as a special exponential family and elaborating on recent research results for such families. PMID:11063711
Probabilistic simulation of stress concentration in composite laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, L.
1993-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties while probabilistic finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.
Multiple data sources improve DNA-based mark-recapture population estimates of grizzly bears.
Boulanger, John; Kendall, Katherine C; Stetz, Jeffrey B; Roon, David A; Waits, Lisette P; Paetkau, David
2008-04-01
A fundamental challenge to estimating population size with mark-recapture methods is heterogeneous capture probabilities and subsequent bias of population estimates. Confronting this problem usually requires substantial sampling effort that can be difficult to achieve for some species, such as carnivores. We developed a methodology that uses two data sources to deal with heterogeneity and applied this to DNA mark-recapture data from grizzly bears (Ursus arctos). We improved population estimates by incorporating additional DNA "captures" of grizzly bears obtained by collecting hair from unbaited bear rub trees concurrently with baited, grid-based, hair snag sampling. We consider a Lincoln-Petersen estimator with hair snag captures as the initial session and rub tree captures as the recapture session and develop an estimator in program MARK that treats hair snag and rub tree samples as successive sessions. Using empirical data from a large-scale project in the greater Glacier National Park, Montana, USA, area and simulation modeling we evaluate these methods and compare the results to hair-snag-only estimates. Empirical results indicate that, compared with hair-snag-only data, the joint hair-snag-rub-tree methods produce similar but more precise estimates if capture and recapture rates are reasonably high for both methods. Simulation results suggest that estimators are potentially affected by correlation of capture probabilities between sample types in the presence of heterogeneity. Overall, closed population Huggins-Pledger estimators showed the highest precision and were most robust to sparse data, heterogeneity, and capture probability correlation among sampling types. Results also indicate that these estimators can be used when a segment of the population has zero capture probability for one of the methods. We propose that this general methodology may be useful for other species in which mark-recapture data are available from multiple sources.
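The Lincoln-Petersen idea applied to the two data sources can be illustrated with the Chapman-corrected estimator; the capture counts below are hypothetical and are not the Glacier National Park data.

```python
def lincoln_petersen_chapman(n1, n2, m2):
    """Chapman-corrected Lincoln-Petersen estimate and its approximate variance.

    n1: individuals detected in session 1 (hair snags)
    n2: individuals detected in session 2 (rub trees)
    m2: individuals detected in both sessions
    """
    n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)) / ((m2 + 1) ** 2 * (m2 + 2))
    return n_hat, var

# Hypothetical grizzly bear capture counts.
n_hat, var = lincoln_petersen_chapman(n1=180, n2=120, m2=45)
se = var ** 0.5
print(f"N_hat = {n_hat:.0f}  (approx. 95% CI: {n_hat - 1.96*se:.0f}-{n_hat + 1.96*se:.0f})")
```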
Risk-based water resources planning: Incorporating probabilistic nonstationary climate uncertainties
NASA Astrophysics Data System (ADS)
Borgomeo, Edoardo; Hall, Jim W.; Fung, Fai; Watts, Glenn; Colquhoun, Keith; Lambert, Chris
2014-08-01
We present a risk-based approach for incorporating nonstationary probabilistic climate projections into long-term water resources planning. The proposed methodology uses nonstationary synthetic time series of future climates obtained via a stochastic weather generator based on the UK Climate Projections (UKCP09) to construct a probability distribution of the frequency of water shortages in the future. The UKCP09 projections extend well beyond the range of current hydrological variability, providing the basis for testing the robustness of water resources management plans to future climate-related uncertainties. The nonstationary nature of the projections combined with the stochastic simulation approach allows for extensive sampling of climatic variability conditioned on climate model outputs. The probability of exceeding planned frequencies of water shortages of varying severity (defined as Levels of Service for the water supply utility company) is used as a risk metric for water resources planning. Different sources of uncertainty, including demand-side uncertainties, are considered simultaneously and their impact on the risk metric is evaluated. Supply-side and demand-side management strategies can be compared based on how cost-effective they are at reducing risks to acceptable levels. A case study based on a water supply system in London (UK) is presented to illustrate the methodology. Results indicate an increase in the probability of exceeding the planned Levels of Service across the planning horizon. Under a 1% per annum population growth scenario, the probability of exceeding the planned Levels of Service is as high as 0.5 by 2040. The case study also illustrates how a combination of supply and demand management options may be required to reduce the risk of water shortages.
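The risk metric described, the probability of exceeding a planned frequency of water shortages across an ensemble of nonstationary synthetic futures, can be sketched as follows; the synthetic ensemble, trend and Level of Service value are placeholders, not UKCP09-based results.

```python
import numpy as np

rng = np.random.default_rng(8)

n_series, n_years = 1000, 25          # hypothetical ensemble of synthetic futures
planned_los = 0.10                    # planned Level of Service: at most 10% of years with restrictions

# Hypothetical per-year shortage probability; a weak trend mimics nonstationary climate/demand.
p_short = np.clip(0.06 + 0.003 * np.arange(n_years), 0, 1)
shortages = rng.random((n_series, n_years)) < p_short

observed_freq = shortages.mean(axis=1)              # shortage frequency in each future
risk = np.mean(observed_freq > planned_los)         # P(exceeding the planned LoS)
print(f"P(shortage frequency exceeds the planned LoS) = {risk:.2f}")
```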
Daikoku, Tatsuya
2018-01-01
Learning and knowledge of transitional probabilities in sequences such as music, called statistical learning and statistical knowledge, are considered implicit processes that occur without an intention to learn or awareness of what one knows. This implicit statistical knowledge can alternatively be expressed via an abstract medium such as musical melody, which suggests that this knowledge is reflected in melodies written by a composer. This study investigates how the statistics in music vary over a composer's lifetime. Transitional probabilities of the highest-pitch sequences in Ludwig van Beethoven's piano sonatas were calculated based on different hierarchical Markov models. Each interval pattern was ordered based on the sonata opus number. The transitional probabilities of sequential patterns that are universal in music gradually decreased, suggesting that time-course variations in the statistics of the music reflect time-course variations in the composer's statistical knowledge. This study sheds new light on novel methodologies that may be able to evaluate the time-course variation of a composer's implicit knowledge using musical scores.
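A minimal sketch of computing first-order transition probabilities for a pitch-interval sequence is shown below; the toy melody is a hypothetical placeholder, not Beethoven data, and only a single-order model is illustrated.

```python
from collections import Counter, defaultdict

# Hypothetical highest-pitch sequence (MIDI note numbers) standing in for a melody.
pitches = [64, 66, 67, 69, 67, 66, 64, 66, 67, 69, 71, 69, 67, 66, 64]
intervals = [b - a for a, b in zip(pitches, pitches[1:])]   # interval representation

# First-order Markov model: P(next interval | current interval).
pair_counts = Counter(zip(intervals, intervals[1:]))
context_counts = Counter(intervals[:-1])

transition = defaultdict(dict)
for (cur, nxt), c in pair_counts.items():
    transition[cur][nxt] = c / context_counts[cur]

for cur in sorted(transition):
    probs = ", ".join(f"{nxt:+d}: {p:.2f}" for nxt, p in sorted(transition[cur].items()))
    print(f"after interval {cur:+d} -> {probs}")
```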
NASA Astrophysics Data System (ADS)
Ando, T.; Kawasaki, A.; Koike, T.
2017-12-01
IPCC AR5 (2014) reported that rainfall in the middle latitudes of the Northern Hemisphere has been increasing since 1901, and a warmer climate is expected to increase the risk of floods. In contrast, world water demand is forecast to exceed sustainable supply by 40 percent by 2030. In order to avoid this expected water shortage, securing new water resources has become an utmost challenge. However, flood risk prevention and the securing of water resources are contradictory. To solve this problem, existing hydroelectric dams can be used not only as energy resources but also for flood control. In the case of Japan, however, hydroelectric dams bear no responsibility for flood control, and the benefits accrued by using them for flood control, namely through preliminary water release, have not been discussed. Therefore, our paper proposes a methodology for assessing those benefits. This methodology has three stages, as shown in Fig. 1. First, the RRI model is used to model flood events, taking account of the probability of rainfall. Second, flood damage is calculated using the assets in inundation areas multiplied by the inundation depths generated by the RRI model. Third, the losses stemming from preliminary water release are calculated and added to the flood damage to obtain the overall losses. The benefits can be evaluated by changing the volume of preliminary release. As a result, as shown in Fig. 2, the use of hydroelectric dams to control flooding creates benefits of 20 billion yen for a three-day-ahead prediction of the assumed maximum rainfall in the Oi River basin, Shizuoka Prefecture, Japan. As the third priority in the Sendai Framework for Disaster Risk Reduction 2015-2030, 'investing in disaster risk reduction for resilience - public and private investment in disaster risk prevention and reduction through structural and non-structural measures' was adopted. The accuracy of rainfall prediction is the key factor in maximizing the benefits. Therefore, if the 20 billion yen in benefits accrued by adopting this evaluation methodology are invested in improving rainfall prediction, the accuracy of the forecasts will increase and so will the benefits. This positive feedback loop will benefit society. The results of this study may stimulate further discussion on the role of hydroelectric dams in flood control.
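The benefit calculation described, comparing expected overall losses with and without preliminary release across rainfall events of different probabilities, can be sketched as follows; the damage figures, release losses and probabilities are hypothetical placeholders, not the Oi River results.

```python
# Hypothetical annual-exceedance-probability scenarios: flood damage (billion yen)
# without and with preliminary water release, and the loss stemming from the release
# (e.g. foregone hydropower) when a release is triggered for that event.
scenarios = [
    # (probability, damage_without, damage_with_release, release_loss)
    (0.10,    5.0,   4.0, 0.2),
    (0.02,   60.0,  35.0, 0.2),
    (0.005, 300.0, 220.0, 0.2),
]

# Benefit per event = damage avoided minus the release loss; take the expectation.
expected_benefit = sum(p * ((d0 - d1) - loss) for p, d0, d1, loss in scenarios)
print(f"expected annual benefit of preliminary release: {expected_benefit:.2f} billion yen")
```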
Dimitrov, S; Detroyer, A; Piroird, C; Gomes, C; Eilstein, J; Pauloin, T; Kuseva, C; Ivanova, H; Popova, I; Karakolev, Y; Ringeissen, S; Mekenyan, O
2016-12-01
When searching for alternative methods to animal testing, confidently rescaling an in vitro result to the corresponding in vivo classification is still a challenging problem. Although one of the most important factors affecting good correlation is sample characteristics, they are very rarely integrated into correlation studies. Usually, in these studies, it is implicitly assumed that both compared values are error-free numbers, which they are not. In this work, we propose a general methodology to analyze and integrate data variability, and thus confidence estimation, when rescaling from one test to another. The methodology is demonstrated through the case study of rescaling in vitro Direct Peptide Reactivity Assay (DPRA) reactivity to in vivo Local Lymph Node Assay (LLNA) skin sensitization potency classifications. As a first step, a comprehensive statistical analysis evaluating the reliability and variability of the LLNA and DPRA as such was performed. These results allowed us to link the concepts of gray zones and confidence probability, which in turn represents a new perspective for a more precise knowledge of the classification of chemicals within their in vivo OR in vitro test. Next, the novelty and practical value of our methodology, which introduces variability into the threshold optimization between the in vitro AND in vivo tests, reside in the fact that it attributes a confidence probability to the predicted classification. The methodology, classification and screening approach presented in this study are not restricted to skin sensitization. They could also be helpful for fate, toxicity and health hazard assessment, where plenty of in vitro and in chemico assays and/or QSAR models are available.
Functional-diversity indices can be driven by methodological choices and species richness.
Poos, Mark S; Walker, Steven C; Jackson, Donald A
2009-02-01
Functional diversity is an important concept in community ecology because it captures information on functional traits absent from measures of species diversity. One popular method of measuring functional diversity is the dendrogram-based method, FD. Calculating FD requires a variety of methodological choices, and it has been debated whether biological conclusions are sensitive to such choices. We studied the probability that conclusions regarding FD were sensitive to these choices, and whether patterns in sensitivity were related to the alpha and beta components of species richness. We developed a randomization procedure that iteratively assigned species to two assemblages, calculated FD, and estimated the probability that the identity of the assemblage with the higher FD varied across methods. We found evidence of sensitivity in all five communities we examined, with the probability of sensitivity ranging from 0 (no sensitivity) to 0.976 (almost completely sensitive). Variation in these probabilities was driven by differences in alpha diversity between assemblages and not by beta diversity. Importantly, FD was most sensitive when it was most useful (i.e., when differences in alpha diversity were low). We demonstrate that trends in functional-diversity analyses can be largely driven by methodological choices or species richness, rather than by functional trait information alone.
Risk assessment techniques with applicability in marine engineering
NASA Astrophysics Data System (ADS)
Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.
2015-11-01
Nowadays risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of a business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since risk must first be analyzed and evaluated before it can be managed. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and assessing them quantitatively; that is, risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are to understand the purpose of FTA, to understand and apply the rules of Boolean algebra, to analyse a simple system using FTA, and to review the advantages and disadvantages of FTA. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, so that the probability of the Top event can be evaluated. The steps of this analysis are: examination of the system from the top down, the use of symbols to represent events, the use of mathematical tools for the critical areas, and the use of fault tree logic diagrams to identify the causes of the Top event. Results: The study yields the critical areas, the fault tree logic diagrams and the probability of the Top event. These results can be used for risk assessment analyses.
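As a minimal illustration of the Boolean-algebra step (the gate structure and basic-event probabilities below are invented for the example, not taken from the paper), the Top event probability of a small fault tree with independent basic events combines AND gates (product of probabilities) and OR gates (complement of the product of complements):

```python
from math import prod

def p_and(*probs):
    """AND gate: all independent basic events must occur."""
    return prod(probs)

def p_or(*probs):
    """OR gate: at least one independent basic event occurs."""
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical tree: Top = (A AND B) OR C, with independent basic events.
p_a, p_b, p_c = 0.02, 0.05, 0.001
p_top = p_or(p_and(p_a, p_b), p_c)
print(f"P(Top event) = {p_top:.5f}")   # ~0.002
```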
O'Mahony, James F; Newall, Anthony T; van Rosmalen, Joost
2015-12-01
Time is an important aspect of health economic evaluation, as the timing and duration of clinical events, healthcare interventions and their consequences all affect estimated costs and effects. These issues should be reflected in the design of health economic models. This article considers three important aspects of time in modelling: (1) which cohorts to simulate and how far into the future to extend the analysis; (2) the simulation of time, including the difference between discrete-time and continuous-time models, cycle lengths, and converting rates and probabilities; and (3) discounting future costs and effects to their present values. We provide a methodological overview of these issues and make recommendations to help inform both the conduct of cost-effectiveness analyses and the interpretation of their results. For choosing which cohorts to simulate and how many, we suggest analysts carefully assess potential reasons for variation in cost effectiveness between cohorts and the feasibility of subgroup-specific recommendations. For the simulation of time, we recommend using short cycles or continuous-time models to avoid biases and the need for half-cycle corrections, and provide advice on the correct conversion of transition probabilities in state transition models. Finally, for discounting, analysts should not only follow current guidance and report how discounting was conducted, especially in the case of differential discounting, but also seek to develop an understanding of its rationale. Our overall recommendations are that analysts explicitly state and justify their modelling choices regarding time and consider how alternative choices may impact on results.
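For concreteness, the sketch below (illustrative only, not code from the article; the rate, cycle lengths, and discount rate are arbitrary) shows the standard conversions the authors discuss: turning a constant event rate into a per-cycle transition probability, rescaling that probability to a different cycle length via the underlying rate rather than by simple multiplication, and discounting a future cost to present value.

```python
import math

def rate_to_prob(rate_per_year, cycle_years):
    """Transition probability over one cycle, assuming a constant hazard rate."""
    return 1.0 - math.exp(-rate_per_year * cycle_years)

def rescale_prob(p, from_years, to_years):
    """Convert a probability defined over one cycle length to another
    (via the underlying constant rate, not by simple multiplication)."""
    rate = -math.log(1.0 - p) / from_years
    return 1.0 - math.exp(-rate * to_years)

def present_value(cost, years_ahead, discount_rate=0.035):
    """Discount a future cost to its present value."""
    return cost / (1.0 + discount_rate) ** years_ahead

p_annual = rate_to_prob(0.10, 1.0)             # annual probability from a 0.10/yr rate
p_monthly = rescale_prob(p_annual, 1.0, 1/12)  # correct monthly probability
print(f"annual p = {p_annual:.4f}, monthly p = {p_monthly:.5f}")
print(f"PV of 1000 in 10 years = {present_value(1000.0, 10):.1f}")
```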
The Animism Controversy Revisited: A Probability Analysis
ERIC Educational Resources Information Center
Smeets, Paul M.
1973-01-01
Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses is applied as an improved technique for the assessment of animism. (DP)
Bayesian probability of success for clinical trials using historical data
Ibrahim, Joseph G.; Chen, Ming-Hui; Lakshminarayanan, Mani; Liu, Guanghan F.; Heyse, Joseph F.
2015-01-01
Developing sophisticated statistical methods for go/no-go decisions is crucial for clinical trials, as planning phase III or phase IV trials is costly and time consuming. In this paper, we develop a novel Bayesian methodology for determining the probability of success of a treatment regimen on the basis of the current data of a given trial. We introduce a new criterion for calculating the probability of success that allows for inclusion of covariates as well as allowing for historical data based on the treatment regimen, and patient characteristics. A new class of prior distributions and covariate distributions is developed to achieve this goal. The methodology is quite general and can be used with univariate or multivariate continuous or discrete data, and it generalizes Chuang-Stein’s work. This methodology will be invaluable for informing the scientist on the likelihood of success of the compound, while including the information of covariates for patient characteristics in the trial population for planning future pre-market or post-market trials. PMID:25339499
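The paper develops its own criterion; as a loose, simplified sketch of the general idea (not the authors' method, and without covariates), probability of success can be approximated by Monte Carlo: draw the response rates from their posteriors given the current trial, simulate a future trial, and average the proportion of simulated trials that reach significance. The data, priors, and future sample size below are invented.

```python
# Simplified beta-binomial "assurance" sketch: probability that a future
# trial with n_future patients per arm would show a significant difference,
# averaging over the posterior of the response rates from the current trial.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Current-trial data (hypothetical): responders / patients per arm.
x_trt, n_trt = 24, 60
x_ctl, n_ctl = 15, 60

# Posterior draws under uniform Beta(1, 1) priors.
n_draws, n_future = 5000, 200
p_trt = rng.beta(1 + x_trt, 1 + n_trt - x_trt, n_draws)
p_ctl = rng.beta(1 + x_ctl, 1 + n_ctl - x_ctl, n_draws)

successes = 0
for pt, pc in zip(p_trt, p_ctl):
    yt = rng.binomial(n_future, pt)
    yc = rng.binomial(n_future, pc)
    # Two-proportion z-test on the simulated future trial.
    p_pool = (yt + yc) / (2 * n_future)
    se = np.sqrt(2 * p_pool * (1 - p_pool) / n_future)
    if se > 0 and (yt - yc) / n_future / se > stats.norm.ppf(0.975):
        successes += 1
print(f"Estimated probability of success: {successes / n_draws:.2f}")
```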
NASA Astrophysics Data System (ADS)
Skilling, John
2005-11-01
This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, and ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. 1. Logic and probability 2. Probability and inference 3. Probability and model selection 4. Prior probabilities 5. Probability and frequency 6. Probability and quantum mechanics 7. Probability and fundamentalism 8. Probability and deception 9. Prediction and truth
Probability theory versus simulation of petroleum potential in play analysis
Crovelli, R.A.
1987-01-01
An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
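As a toy illustration of the analytic idea (not Crovelli's actual formulation; the presence probability and amount distribution are invented), when the resource is the product of a presence indicator and a lognormal amount, the mean and standard deviation of the risked resource have closed forms that can be checked against a Monte Carlo simulation:

```python
import numpy as np

p = 0.3                      # probability the play contains hydrocarbons
mu, sigma = 2.0, 0.8         # lognormal parameters of the amount, if present

# Closed-form (conditional probability) moments of the risked resource R = I * A.
mean_a = np.exp(mu + 0.5 * sigma**2)
var_a = (np.exp(sigma**2) - 1.0) * np.exp(2 * mu + sigma**2)
mean_r = p * mean_a
var_r = p * var_a + p * (1 - p) * mean_a**2

# Monte Carlo check.
rng = np.random.default_rng(0)
present = rng.random(200_000) < p
amount = rng.lognormal(mu, sigma, 200_000)
r = present * amount
print(f"analytic : mean={mean_r:.3f}, sd={np.sqrt(var_r):.3f}")
print(f"simulated: mean={r.mean():.3f}, sd={r.std():.3f}")
```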
To P or Not to P: Backing Bayesian Statistics.
Buchinsky, Farrel J; Chadha, Neil K
2017-12-01
In biomedical research, it is imperative to differentiate chance variation from truth before we generalize what we see in a sample of subjects to the wider population. For decades, we have relied on null hypothesis significance testing, where we calculate P values for our data to decide whether to reject a null hypothesis. This methodology is subject to substantial misinterpretation and errant conclusions. Instead of working backward by calculating the probability of our data if the null hypothesis were true, Bayesian statistics allow us instead to work forward, calculating the probability of our hypothesis given the available data. This methodology gives us a mathematical means of incorporating our "prior probabilities" from previous study data (if any) to produce new "posterior probabilities." Bayesian statistics tell us how confidently we should believe what we believe. It is time to embrace and encourage their use in our otolaryngology research.
CProb: a computational tool for conducting conditional probability analysis.
Hollister, Jeffrey W; Walker, Henry A; Paul, John F
2008-01-01
Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
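A minimal sketch of the underlying computation (not the CProb code itself; the data and thresholds are synthetic) estimates the conditional probability of a degraded ecological response given that the stressor exceeds a candidate threshold, which is the quantity the Add-in tabulates and plots across thresholds:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic paired observations: stressor level and ecological condition score.
stressor = rng.gamma(2.0, 1.5, 500)
condition = 10.0 - 1.2 * stressor + rng.normal(0.0, 2.0, 500)
degraded = condition < 5.0          # example definition of "poor condition"

def conditional_prob(threshold):
    """P(degraded | stressor > threshold), estimated from the sample."""
    exceed = stressor > threshold
    return degraded[exceed].mean() if exceed.any() else np.nan

for t in [1.0, 2.0, 4.0, 6.0]:
    print(f"P(poor condition | stressor > {t:>3}) = {conditional_prob(t):.2f}")
```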
NASA Astrophysics Data System (ADS)
Beheshti Aval, Seyed Bahram; Kouhestani, Hamed Sadegh; Mottaghi, Lida
2017-07-01
This study investigates the efficiency of two types of rehabilitation methods on the basis of economic justification, which can support a rational choice between the retrofitting schemes. Among various rehabilitation methods, concentric chevron bracing (CCB) and cylindrical friction damper (CFD) were selected. The performance assessment procedure of the frames is divided into two distinct phases. First, the limit state probabilities of the structures before and after rehabilitation are investigated. In the second phase, the seismic risk of the structures, in terms of life safety and financial losses (the decision variables), is evaluated using the recently published FEMA P58 methodology. The results show that the proposed retrofitting methods improve the serviceability and life safety performance levels of steel and RC structures at different rates when subjected to earthquake loads. Moreover, these procedures reveal that financial losses are greatly decreased, and the reduction was more pronounced with CFD than with CCB. Although using both retrofitting methods reduced damage state probabilities, incorporating a site-specific seismic hazard curve to evaluate the mean annual occurrence frequency at the collapse prevention limit state produced unexpected results. In contrast to CFD, the collapse probability of the structures retrofitted with CCB increased when compared with the primary structures.
Wong, Carlos K H; Guo, Vivian Y W; Chen, Jing; Lam, Cindy L K
2016-11-01
Health-related quality of life is an important outcome measure in patients with colorectal cancer. Comparison with normative data has been increasingly undertaken to assess the additional impact of colorectal cancer on health-related quality of life. This review aimed to critically appraise the methodological details and reporting characteristics of comparative studies evaluating differences in health-related quality of life between patients and controls. A systematic search of English-language literature published between January 1985 and May 2014 was conducted through a database search of PubMed, Web of Science, Embase, and Medline. Comparative studies reporting health-related quality-of-life outcomes among patients who have colorectal cancer and controls were selected. Methodological and reporting quality per comparison study was evaluated based on an 11-item methodological checklist proposed by Efficace in 2003 and a set of criteria predetermined by reviewers. Thirty-one comparative studies involving >10,000 patients and >10,000 controls were included. Twenty-three studies (74.2%) originated from European countries, with the largest number from the Netherlands (n = 6). Twenty-eight studies (90.3%) compared the health-related quality of life of patients with normative data published elsewhere, whereas the remaining studies recruited a group of patients who had colorectal cancer and a group of control patients within the same studies. The European Organisation for Research and Treatment of Cancer Quality-of-Life Questionnaire Core 30 was the most extensively used instrument (n = 16; 51.6%). Eight studies (25.8%) were classified as "probably robust" for clinical decision making according to the Efficace standard methodological checklist. Our further quality assessment revealed a lack of reported score differences (61.3%), contemporary comparisons (36.7%), statistical significance testing (38.7%), and matching of control groups (58.1%), possibly leading to inappropriate control groups for fair comparisons. Meta-analysis of differences between the 2 groups was not available. In general, one-fourth of the comparative studies that evaluated health-related quality of life of patients who had colorectal cancer achieved high quality in reporting characteristics and methodological details. Future studies are encouraged to undertake health-related quality-of-life measurement and adhere to a methodological checklist in comparisons with controls.
Cook, D A
2006-04-01
Models that estimate the probability of death of intensive care unit patients can be used to stratify patients according to the severity of their condition and to control for casemix and severity of illness. These models have been used for risk adjustment in quality monitoring, administration, management and research and as an aid to clinical decision making. Models such as the Mortality Prediction Model family, SAPS II, APACHE II, APACHE III and the organ system failure models provide estimates of the probability of in-hospital death of ICU patients. This review examines methods to assess the performance of these models. The key attributes of a model are discrimination (the accuracy of the ranking in order of probability of death) and calibration (the extent to which the model's prediction of probability of death reflects the true risk of death). These attributes should be assessed in existing models that predict the probability of patient mortality, and in any subsequent model that is developed for the purposes of estimating these probabilities. The literature contains a range of approaches for assessment which are reviewed and a survey of the methodologies used in studies of intensive care mortality models is presented. The systematic approach used by Standards for Reporting Diagnostic Accuracy provides a framework to incorporate these theoretical considerations of model assessment and recommendations are made for evaluation and presentation of the performance of models that estimate the probability of death of intensive care patients.
A novel quantitative methodology for age evaluation of the human corneal endothelium
NASA Astrophysics Data System (ADS)
Rannou, Klervi; Thuret, Gilles; Gain, Philippe; Pinoli, Jean-Charles; Gavet, Yann
2017-03-01
The human corneal endothelium regulates corneal transparency. Its cells, which cannot regenerate after birth, form a tessellated mosaic of almost perfectly hexagonal cells during childhood, becoming progressively bigger and less ordered with aging. This study included 50 patients (in 10 decade groups) and 10 specular microscopy observations per patient. Five different criteria were measured on the manually segmented cells: the area and perimeter of the cells as well as reduced Minkowski functionals. All these criteria were used to assess the probability of age group membership. We demonstrated that the estimated age is close to the true age, although high variability was observed for patients between 30 and 70 years old.
NASA Astrophysics Data System (ADS)
Yusof, Muhammad Mat; Sulaiman, Tajularipin; Khalid, Ruzelan; Hamid, Mohamad Shukri Abdul; Mansor, Rosnalini
2014-12-01
In professional sporting events, rating competitors before the tournament starts is a well-known approach for distinguishing the favorite team from the weaker teams. Various methodologies are used to rate competitors. In this paper, we explore four ways to rate competitors: least squares rating, maximum likelihood strength ratio, standing points in a large round robin simulation, and previous league rank position. The tournament metric we use to evaluate the different rating approaches is the tournament outcome characteristics measure, defined as the probability that a particular team in the top 100q pre-tournament rank percentile progresses beyond round R, for all q and R. Based on the simulation results, we found that different rating approaches produce different effects on the teams. Our simulation shows that, of the eight teams participating in the standard-seeding knockout, Perak has the highest probability of winning a tournament that uses the least squares rating approach, PKNS has the highest probability of winning using the maximum likelihood strength ratio and the large round robin simulation approaches, while Perak has the highest probability of winning a tournament using the previous league season approach.
[A rapid methodology for drug intelligence using profiling of illicit heroin samples].
Zhang, Jianxin; Chen, Cunyi
2012-07-01
The aim of the paper was to evaluate the link between two heroin seizures using a descriptive method. The system involved the derivatization and gas chromatographic separation of samples, followed by fully automatic data analysis and transfer to a database. Comparisons used the square cosine function between two chromatograms treated as vectors. The method showed good discriminatory capability, and the probability of false positives was extremely low. In conclusion, this method proved to be efficient and reliable, and appears suitable for establishing links between illicit heroin samples.
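The comparison metric is described only briefly; a plausible reading (sketched below with made-up peak-area vectors) is the squared cosine between two chromatograms treated as vectors, which approaches 1 for proportional peak profiles and falls toward 0 for unrelated ones:

```python
import numpy as np

def square_cosine(x, y):
    """Squared cosine of the angle between two chromatogram peak-area vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.dot(x, y) ** 2 / (np.dot(x, x) * np.dot(y, y)))

# Hypothetical peak areas for target compounds in two heroin seizures.
sample_a = [120.0, 40.0, 15.0, 60.0, 5.0]
sample_b = [118.0, 42.0, 14.0, 58.0, 6.0]    # likely linked
sample_c = [30.0, 90.0, 70.0, 10.0, 25.0]    # likely unrelated

print(f"A vs B: {square_cosine(sample_a, sample_b):.3f}")
print(f"A vs C: {square_cosine(sample_a, sample_c):.3f}")
```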
NASA Astrophysics Data System (ADS)
Martino, P.
1980-12-01
A general methodology is presented for conducting an analysis of the various aspects of the hazards associated with the storage and transportation of liquefied natural gas (LNG) which should be considered during the planning stages of a typical LNG ship terminal. The procedure includes the performance of a hazards and system analysis of the proposed site, a probability analysis of accident scenarios and safety impacts, an analysis of the consequences of credible accidents such as tanker accidents, spills and fires, the assessment of risks and the design and evaluation of risk mitigation measures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seigler, R.S.; Luttrell, L.J.
Aircraft hazards were evaluated to determine the total annual probability of an aircraft crash occurring at any structure located on the US Department of Energy (DOE) reservation in Oak Ridge, Tennessee. This report documents the use of an accepted methodology for calculating the probability of an aircraft crash as applied to the three Oak Ridge plant sites including the adjoining facilities. Based on the data contained herein, the evaluation concluded that the probability of an aircraft crash occurrence at a single facility is generally considered "not credible" as defined in DOE/OR-901. Additionally, reevaluation of probabilities would be necessary if significant changes were made to local air traffic. The probability of an aircraft crash could increase as a result of the opening of any new airport or heliport in the vicinity; a greater volume of air traffic from McGhee Tyson airport in Knoxville, should the airport status change from feeder airport to hub airport; the rerouting of commercial and/or military flights at the McGhee Tyson airport; and finally, a change in direction or the addition of a federal airway. At one time, DOE planned to establish a zone of prohibited airspace over the Y-12 plant; if the plans are enacted in the future, the probability of an aircraft crash at the Y-12 plant could decrease. Pilots have since been voluntarily requested not to fly below 3000 feet over the Y-12 plant. Also, the Federal Aviation Administration plans to reroute air traffic in the spring of 1993 on federal airway V16. However, the section of V16 which traverses the three plant sites and five adjoining facilities will not be altered. If this plan is implemented, the air traffic over the Oak Ridge facilities would not be affected significantly, and the probability of an aircraft crash as determined herein would be unchanged.
Montserrat, A; Bosch, Ll; Kiser, M A; Poch, M; Corominas, Ll
2015-02-01
Using low-cost sensors, data can be collected on the occurrence and duration of overflows in each combined sewer overflow (CSO) structure in a combined sewer system (CSS). The collection and analysis of real data can be used to assess, improve, and maintain CSSs in order to reduce the number and impact of overflows. The objective of this study was to develop a methodology to evaluate the performance of CSSs using low-cost monitoring. This methodology includes (1) assessing the capacity of a CSS using overflow duration and rain volume data, (2) characterizing the performance of CSO structures with statistics, (3) evaluating the compliance of a CSS with government guidelines, and (4) generating decision tree models to provide support to managers for making decisions about system maintenance. The methodology is demonstrated with a case study of a CSS in La Garriga, Spain. The rain volume breaking point from which CSO structures started to overflow ranged from 0.6 mm to 2.8 mm. The structures with the best and worst performance in terms of overflow (overflow probability, order, duration and CSO ranking) were characterized. Most of the obtained decision trees to predict overflows from rain data had accuracies ranging from 70% to 83%. The results obtained from the proposed methodology can greatly support managers and engineers dealing with real-world problems, improvements, and maintenance of CSSs. Copyright © 2014 Elsevier B.V. All rights reserved.
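As an illustrative sketch of the decision-tree step (not the study's data or model; the rain/overflow records below are synthetic), a shallow classification tree can be trained to predict whether a CSO structure overflows from the rain volume of an event:

```python
# Sketch of the decision-tree step using synthetic rain/overflow records.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)

# Synthetic events: rain volume (mm) and whether the structure overflowed,
# assuming a breaking point near 2 mm blurred by noise.
rain = rng.gamma(2.0, 2.0, 400).reshape(-1, 1)
overflow = (rain[:, 0] + rng.normal(0.0, 1.0, 400) > 2.0).astype(int)

x_train, x_test, y_train, y_test = train_test_split(rain, overflow, random_state=0)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(x_train, y_train)
print(f"test accuracy: {tree.score(x_test, y_test):.2f}")
```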
Advancing Usability Evaluation through Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman
2005-07-01
This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
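A minimal sketch of the quantification idea, assuming a SPAR-H-style calculation (the nominal error probability, multipliers, and consequence weight below are invented, not values from the paper): a nominal human error probability is scaled by performance-shaping-factor multipliers assigned to violated usability heuristics, giving a usability error probability that can be combined with a consequence weight to prioritize issues.

```python
from math import prod

def usability_error_probability(nominal_hep, psf_multipliers):
    """Nominal error probability scaled by heuristic-derived PSF multipliers,
    capped at 1.0."""
    return min(1.0, nominal_hep * prod(psf_multipliers))

# Hypothetical issue: two violated heuristics act as degrading PSFs.
nominal_hep = 0.001
psfs = [5.0, 2.0]              # e.g. poor visibility of status, weak error prevention
uep = usability_error_probability(nominal_hep, psfs)
consequence_weight = 3         # from a hypothetical consequence matrix
print(f"UEP = {uep:.4f}, priority score = {uep * consequence_weight:.4f}")
```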
Yap, Christina; Pettitt, Andrew; Billingham, Lucinda
2013-07-03
As there are limited patients for chronic lymphocytic leukaemia trials, it is important that statistical methodologies in Phase II efficiently select regimens for subsequent evaluation in larger-scale Phase III trials. We propose the screened selection design (SSD), which is a practical multi-stage, randomised Phase II design for two experimental arms. Activity is first evaluated by applying Simon's two-stage design (1989) on each arm. If both are active, the play-the-winner selection strategy proposed by Simon, Wittes and Ellenberg (SWE) (1985) is applied to select the superior arm. A variant of the design, Modified SSD, also allows the arm with the higher response rates to be recommended only if its activity rate is greater by a clinically-relevant value. The operating characteristics are explored via a simulation study and compared to a Bayesian Selection approach. Simulations showed that with the proposed SSD, it is possible to retain the sample size as required in SWE and obtain similar probabilities of selecting the correct superior arm of at least 90%; with the additional attractive benefit of reducing the probability of selecting ineffective arms. This approach is comparable to a Bayesian Selection Strategy. The Modified SSD performs substantially better than the other designs in selecting neither arm if the underlying rates for both arms are desirable but equivalent, allowing for other factors to be considered in the decision making process. Though its probability of correctly selecting a superior arm might be reduced, it still performs reasonably well. It also reduces the probability of selecting an inferior arm. SSD provides an easy to implement randomised Phase II design that selects the most promising treatment that has shown sufficient evidence of activity, with available R codes to evaluate its operating characteristics.
López-Picazo Ferrer, J
2001-05-15
To determine the applicability of lot quality assurance sampling (LQAS) acceptance in the primary care service portfolio, comparing its results with those given by the classic evaluation. Compliance with the minimum technical norms (MTN) of the diabetic care service was evaluated with the classic methodology (confidence 95%, accuracy 5%, representativeness at area level, sample of 376 histories) and by LQAS (confidence 95%, power 80%, representativeness at primary care team (PCT) level, a lot defined by MTN and PCT, sample of 13 histories/PCT). Effort, information obtained and operational usefulness were assessed. 44 PCTs from the Murcia Primary Care Region. Classic methodology: compliance with MTN ranged between 91.1% (diagnosis, 95% CI, 84.2-94.0) and 30% (visceral repercussion, 95% CI, 25.4-34.6). Objectives were reached in three MTN (diagnosis, history and EKG). LQAS: no MTN was accepted in all the PCTs, "01-diagnosis" being the most accepted (42 PCT, 95.6%) and "07-Funduscopy" the least accepted (24 PCT, 55.6%). In 9 PCTs all were accepted (20.4%), and in 2 none were accepted (4.5%). Data were analysed with Pareto charts. The classic methodology offered accurate results, but did not identify which centres failed to comply (general focus). LQAS was preferable for evaluating MTN and probably coverage because: 1) it uses small samples, which foster internal quality-improvement initiatives; 2) it is easy and rapid to execute; 3) it identifies the PCTs and criteria where there is an opportunity for improvement (specific focus); and 4) it can be used operationally for monitoring.
Schultz, Elise V; Schultz, Christopher J; Carey, Lawrence D; Cecil, Daniel J; Bateman, Monte
2016-01-01
This study develops a fully automated lightning jump system encompassing objective storm tracking, Geostationary Lightning Mapper proxy data, and the lightning jump algorithm (LJA), which are important elements in the transition of the LJA concept from a research to an operational based algorithm. Storm cluster tracking is based on a product created from the combination of a radar parameter (vertically integrated liquid, VIL), and lightning information (flash rate density). Evaluations showed that the spatial scale of tracked features or storm clusters had a large impact on the lightning jump system performance, where increasing spatial scale size resulted in decreased dynamic range of the system's performance. This framework will also serve as a means to refine the LJA itself to enhance its operational applicability. Parameters within the system are isolated and the system's performance is evaluated with adjustments to parameter sensitivity. The system's performance is evaluated using the probability of detection (POD) and false alarm ratio (FAR) statistics. Of the algorithm parameters tested, sigma-level (metric of lightning jump strength) and flash rate threshold influenced the system's performance the most. Finally, verification methodologies are investigated. It is discovered that minor changes in verification methodology can dramatically impact the evaluation of the lightning jump system.
Pei, Yan
2015-01-01
We present and discuss philosophy and methodology of chaotic evolution that is theoretically supported by chaos theory. We introduce four chaotic systems, that is, logistic map, tent map, Gaussian map, and Hénon map, in a well-designed chaotic evolution algorithm framework to implement several chaotic evolution (CE) algorithms. By comparing our previous proposed CE algorithm with logistic map and two canonical differential evolution (DE) algorithms, we analyse and discuss optimization performance of CE algorithm. An investigation on the relationship between optimization capability of CE algorithm and distribution characteristic of chaotic system is conducted and analysed. From evaluation result, we find that distribution of chaotic system is an essential factor to influence optimization performance of CE algorithm. We propose a new interactive EC (IEC) algorithm, interactive chaotic evolution (ICE) that replaces fitness function with a real human in CE algorithm framework. There is a paired comparison-based mechanism behind CE search scheme in nature. A simulation experimental evaluation is conducted with a pseudo-IEC user to evaluate our proposed ICE algorithm. The evaluation result indicates that ICE algorithm can obtain a significant better performance than or the same performance as interactive DE. Some open topics on CE, ICE, fusion of these optimization techniques, algorithmic notation, and others are presented and discussed.
Compositional cokriging for mapping the probability risk of groundwater contamination by nitrates.
Pardo-Igúzquiza, Eulogio; Chica-Olmo, Mario; Luque-Espinar, Juan A; Rodríguez-Galiano, Víctor
2015-11-01
Contamination by nitrates is an important cause of groundwater pollution and represents a potential risk to human health. Management decisions must be made using probability maps that assess the nitrate concentration potential of exceeding regulatory thresholds. However these maps are obtained with only a small number of sparse monitoring locations where the nitrate concentrations have been measured. It is therefore of great interest to have an efficient methodology for obtaining those probability maps. In this paper, we make use of the fact that the discrete probability density function is a compositional variable. The spatial discrete probability density function is estimated by compositional cokriging. There are several advantages in using this approach: (i) problems of classical indicator cokriging, like estimates outside the interval (0,1) and order-relation problems, are avoided; (ii) secondary variables (e.g. aquifer parameters) can be included in the estimation of the probability maps; (iii) uncertainty maps of the probability maps can be obtained; (iv) finally, there are modelling advantages because the variograms and cross-variograms are those of real variables, which do not have the restrictions of indicator variograms and indicator cross-variograms. The methodology was applied to the Vega de Granada aquifer in Southern Spain and the advantages of the compositional cokriging approach were demonstrated. Copyright © 2015 Elsevier B.V. All rights reserved.
Nichols, James D.; Hines, James E.; Pollock, Kenneth H.; Hinz, Robert L.; Link, William A.
1994-01-01
The proportion of animals in a population that breeds is an important determinant of population growth rate. Usual estimates of this quantity from field sampling data assume that the probability of appearing in the capture or count statistic is the same for animals that do and do not breed. A similar assumption is required by most existing methods used to test ecologically interesting hypotheses about reproductive costs using field sampling data. However, in many field sampling situations breeding and nonbreeding animals are likely to exhibit different probabilities of being seen or caught. In this paper, we propose the use of multistate capture-recapture models for these estimation and testing problems. This methodology permits a formal test of the hypothesis of equal capture/sighting probabilities for breeding and nonbreeding individuals. Two estimators of breeding proportion (and associated standard errors) are presented, one for the case of equal capture probabilities and one for the case of unequal capture probabilities. The multistate modeling framework also yields formal tests of hypotheses about reproductive costs to future reproduction or survival or both fitness components. The general methodology is illustrated using capture-recapture data on female meadow voles, Microtus pennsylvanicus. Resulting estimates of the proportion of reproductively active females showed strong seasonal variation, as expected, with low breeding proportions in midwinter. We found no evidence of reproductive costs extracted in subsequent survival or reproduction. We believe that this methodological framework has wide application to problems in animal ecology concerning breeding proportions and phenotypic reproductive costs.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2011-01-01
The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both Hit-Miss and signal amplitude testing, where signal amplitudes are reduced to Hit-Miss by using a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are sequentially executed in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture critical inspection are established.
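As a simple numerical companion (a sketch of the standard binomial calculation, not the DOEPOD procedure itself), the exact lower 95% confidence bound on POD for a given number of hits in a given number of trials at a flaw size follows from the Clopper-Pearson interval; 29 hits in 29 trials is the classic minimal zero-miss sample demonstrating 90/95 POD:

```python
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    """Exact (Clopper-Pearson) lower confidence bound on probability of detection."""
    if hits == 0:
        return 0.0
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

for hits, trials in [(29, 29), (28, 29), (45, 46)]:
    lb = pod_lower_bound(hits, trials)
    status = "meets" if lb >= 0.90 else "fails"
    print(f"{hits}/{trials}: 95% lower bound on POD = {lb:.3f} ({status} 90/95)")
```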
Deriving Laws from Ordering Relations
NASA Technical Reports Server (NTRS)
Knuth, Kevin H.
2004-01-01
The effect of Richard T. Cox's contribution to probability theory was to generalize Boolean implication among logical statements to degrees of implication, which are manipulated using rules derived from consistency with Boolean algebra. These rules are known as the sum rule, the product rule and Bayes' Theorem, and the measure resulting from this generalization is probability. In this paper, I will describe how Cox's technique can be further generalized to include other algebras and hence other problems in science and mathematics. The result is a methodology that can be used to generalize an algebra to a calculus by relying on consistency with order theory to derive the laws of the calculus. My goals are to clear up the mysteries as to why the same basic structure found in probability theory appears in other contexts, to better understand the foundations of probability theory, and to extend these ideas to other areas by developing new mathematics and new physics. The relevance of this methodology will be demonstrated using examples from probability theory, number theory, geometry, information theory, and quantum mechanics.
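For reference, the three rules that Cox's consistency argument yields for degrees of implication, stated here in conventional notation rather than Cox's original form, are the sum rule, the product rule and Bayes' theorem:

```latex
% Sum rule, product rule, and Bayes' theorem for degrees of implication
\begin{align}
  p(A \mid C) + p(\bar{A} \mid C) &= 1 \\
  p(A B \mid C) &= p(A \mid B C)\, p(B \mid C) \\
  p(A \mid B C) &= \frac{p(B \mid A C)\, p(A \mid C)}{p(B \mid C)}
\end{align}
```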
The Long Exercise Test in Periodic Paralysis: A Bayesian Analysis.
Simmons, Daniel B; Lanning, Julie; Cleland, James C; Puwanant, Araya; Twydell, Paul T; Griggs, Robert C; Tawil, Rabi; Logigian, Eric L
2018-05-12
The long exercise test (LET) is used to assess the diagnosis of periodic paralysis (PP), but LET methodology and normal "cut-off" values vary. To determine optimal LET methodology and cut-offs, we reviewed LET data (abductor digiti minimi (ADM) motor response amplitude, area) from 55 PP patients (32 genetically definite) and 125 controls. Receiver operating characteristic (ROC) curves were constructed and area-under-the-curve (AUC) calculated to compare 1) peak-to-nadir versus baseline-to-nadir methodologies, and 2) amplitude versus area decrements. Using Bayesian principles, optimal "cut-off" decrements that achieved 95% post-test probability of PP were calculated for various pre-test probabilities (PreTPs). AUC was highest for peak-to-nadir methodology and equal for amplitude and area decrements. For PreTP ≤50%, optimal decrement cut-offs (peak-to-nadir) were >40% (amplitude) or >50% (area). For confirmation of PP, our data endorse the diagnostic utility of peak-to-nadir LET methodology using 40% amplitude or 50% area decrement cut-offs for PreTPs ≤50%. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
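A short sketch of the Bayesian step (illustrative only; the likelihood ratio below is a placeholder, not a value from the study): the post-test probability follows from converting the pre-test probability to odds, multiplying by the likelihood ratio of a positive LET, and converting back.

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayes' rule in odds form: post-test odds = pre-test odds x LR."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

lr_positive = 40.0   # hypothetical LR of a decrement beyond the cut-off
for pre in [0.10, 0.25, 0.50]:
    print(f"pre-test {pre:.2f} -> post-test {post_test_probability(pre, lr_positive):.3f}")
```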
Incorporating detection probability into northern Great Plains pronghorn population estimates
Jacques, Christopher N.; Jenks, Jonathan A.; Grovenburg, Troy W.; Klaver, Robert W.; DePerno, Christopher S.
2014-01-01
Pronghorn (Antilocapra americana) abundances commonly are estimated using fixed-wing surveys, but these estimates are likely to be negatively biased because of violations of key assumptions underpinning line-transect methodology. Reducing bias and improving precision of abundance estimates through use of detection probability and mark-resight models may allow for more responsive pronghorn management actions. Given their potential application in population estimation, we evaluated detection probability and mark-resight models for use in estimating pronghorn population abundance. We used logistic regression to quantify probabilities that detecting pronghorn might be influenced by group size, animal activity, percent vegetation, cover type, and topography. We estimated pronghorn population size by study area and year using mixed logit-normal mark-resight (MLNM) models. Pronghorn detection probability increased with group size, animal activity, and percent vegetation; overall detection probability was 0.639 (95% CI = 0.612–0.667) with 396 of 620 pronghorn groups detected. Despite model selection uncertainty, the best detection probability models were 44% (range = 8–79%) and 180% (range = 139–217%) greater than traditional pronghorn population estimates. Similarly, the best MLNM models were 28% (range = 3–58%) and 147% (range = 124–180%) greater than traditional population estimates. Detection probability of pronghorn was not constant but depended on both intrinsic and extrinsic factors. When pronghorn detection probability is a function of animal group size, animal activity, landscape complexity, and percent vegetation, traditional aerial survey techniques will result in biased pronghorn abundance estimates. Standardizing survey conditions, increasing resighting occasions, or accounting for variation in individual heterogeneity in mark-resight models will increase the accuracy and precision of pronghorn population estimates.
A methodology for probabilistic assessment of solar thermal power plants yield
NASA Astrophysics Data System (ADS)
Fernández-Peruchena, Carlos M.; Lara-Faneho, Vicente; Ramírez, Lourdes; Zarzalejo, Luis F.; Silva, Manuel; Bermejo, Diego; Gastón, Martín; Moreno, Sara; Pulgar, Jesús; Pavon, Manuel; Macías, Sergio; Valenzuela, Rita X.
2017-06-01
A detailed knowledge of the solar resource is a critical point to perform an economic feasibility analysis of Concentrating Solar Power (CSP) plants. This knowledge must include its magnitude (how much solar energy is available at an area of interest over a long time period), and its variability over time. In particular, DNI inter-annual variations may be large, increasing the return-on-investment risk in CSP plant projects. This risk is typically evaluated by means of the simulation of the energy delivered by the CSP plant during years with low solar irradiation, which are typically characterized by annual solar radiation datasets with a high probability of exceedance of their annual DNI values. In this context, this paper proposes the use of meteorological years representative of a given probability of exceedance of annual DNI in order to realistically assess the inter-annual variability of energy yields. The performance of this approach is evaluated in the location of Burns station (University of Oregon Solar Radiation Monitoring Laboratory), where a 34-year (from 1980 to 2013) measured data set of solar irradiance and temperature is available.
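As a minimal sketch of how such a representative year might be selected (not the authors' procedure; the annual DNI series below is synthetic), the annual DNI with a given probability of exceedance, e.g. P90, is the corresponding low percentile of the multi-year record, and the measured year closest to that value can serve as the representative meteorological year:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic 34-year record of annual DNI (kWh/m^2/yr).
annual_dni = rng.normal(2100.0, 120.0, 34)

def exceedance_value(series, p_exceed):
    """Annual value exceeded with probability p_exceed (e.g. 0.90 for P90)."""
    return np.percentile(series, 100.0 * (1.0 - p_exceed))

p90 = exceedance_value(annual_dni, 0.90)
representative_year = int(np.argmin(np.abs(annual_dni - p90)))
print(f"P90 annual DNI = {p90:.0f} kWh/m2, closest year index = {representative_year}")
```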
Reliability of Radioisotope Stirling Convertor Linear Alternator
NASA Technical Reports Server (NTRS)
Shah, Ashwin; Korovaichuk, Igor; Geng, Steven M.; Schreiber, Jeffrey G.
2006-01-01
Onboard radioisotope power systems being developed and planned for NASA's deep-space missions would require reliable design lifetimes of up to 14 years. Critical components and materials of Stirling convertors have been undergoing extensive testing and evaluation in support of reliable performance over the specified life span. Of significant importance to the successful development of the Stirling convertor is the design of a lightweight and highly efficient linear alternator. Alternator performance could vary due to small deviations in the permanent magnet properties, operating temperature, and component geometries. Durability prediction and reliability of the alternator may be affected by these deviations from nominal design conditions. Therefore, it is important to evaluate the effect of these uncertainties in predicting the reliability of the linear alternator performance. This paper presents a study in which a reliability-based methodology is used to assess alternator performance. The response surface characterizing the induced open-circuit voltage performance is constructed using 3-D finite element magnetic analysis. A fast probability integration method is used to determine the probability of the desired performance and its sensitivity to the alternator design parameters.
Lachenmeier, Dirk W; Schoeberl, Kerstin; Kanteres, Fotis; Kuballa, Thomas; Sohnius, Eva-Maria; Rehm, Jürgen
2011-03-01
Some European countries with high levels of unrecorded alcohol consumption have anomalously high rates of death attributable to liver cirrhosis. Hepatotoxic compounds in illegally produced spirits may be partly responsible. Based on a review of the evidence on the chemical composition and potential harm from unrecorded alcohol, the Alcohol Measures for Public Health Research Alliance (AMPHORA) project's methodology for identifying, analysing and toxicologically evaluating such alcohols is provided. A computer-assisted literature review concentrated on unrecorded alcohol. Additionally, we refer to our work in the capacity of governmental alcohol control authority and a number of pilot studies. The risk-oriented identification of substances resulted in the following compounds probably posing a public health risk in unrecorded alcohol: ethanol, methanol, acetaldehyde, higher alcohols, heavy metals, ethyl carbamate, biologically active flavourings (e.g. coumarin) and diethyl phthalate. Suggestions on a sampling strategy for identifying unrecorded alcohol that may be most prone to contamination include using probable distribution points such as local farmers and flea markets for selling surrogate alcohol (including denatured alcohol) to focusing on lower socio-economic status or alcohol-dependent individuals, and selecting home-produced fruit spirits prone to ethyl carbamate contamination. Standardized guidelines for the chemical and toxicological evaluation of unrecorded alcohol that will be used in a European-wide sampling and are applicable globally are provided. These toxicological guidelines may also be used by alcohol control laboratories for recorded alcohol products, and form a scientific foundation for establishing legislative limits. © 2011 The Authors, Addiction © 2011 Society for the Study of Addiction.
Ximenes, Ricardo Arraes de Alencar; Pereira, Leila Maria Beltrão; Martelli, Celina Maria Turchi; Merchán-Hamann, Edgar; Stein, Airton Tetelbom; Figueiredo, Gerusa Maria; Braga, Maria Cynthia; Montarroyos, Ulisses Ramos; Brasil, Leila Melo; Turchi, Marília Dalva; Fonseca, José Carlos Ferraz da; Lima, Maria Luiza Carvalho de; Alencar, Luis Cláudio Arraes de; Costa, Marcelo; Coral, Gabriela; Moreira, Regina Celia; Cardoso, Maria Regina Alves
2010-09-01
A population-based survey to provide information on the prevalence of hepatitis viral infection and the pattern of risk factors was carried out in the urban population of all Brazilian state capitals and the Federal District, between 2005 and 2009. This paper describes the design and methodology of the study, which involved a population aged 5 to 19 for hepatitis A and 10 to 69 for hepatitis B and C. Interviews and blood samples were obtained through household visits. The sample was selected using stratified multi-stage cluster sampling and was drawn with equal probability from each domain of study (region and age-group). Nationwide, 19,280 households and ~31,000 residents were selected. The study is large enough to detect a prevalence of viral infection of around 0.1% and to support risk factor assessment within each region. The methodology seems to be a viable way of differentiating between distinct epidemiological patterns of hepatitis A, B and C. These data will be of value for the evaluation of vaccination policies and for the design of control program strategies.
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of a methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
1985-11-26
etc.). Major decisions involving reliability studies, based on competing risk methodology, have been made in the past and will continue to be made... censoring mechanism. In such instances, the methodology for estimating relevant reliability probabilities has received considerable attention (cf. David...). ...proposal for a discussion of the general methodology.
Background for Joint Systems Aspects of AIR 6000
2000-04-01
Checkland's Soft Systems Methodology [7, 8, 9]. The analytical techniques that are proposed for joint systems work are based on calculating probability... SLMP Structural Life Management Plan; SOW Stand-Off Weapon; SSM Soft Systems Methodology; UAV Uninhabited Aerial... Systems Methodology in Action, John Wiley & Sons, Chichester, 1990. [10] Pearl, Judea, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible...
NASA Astrophysics Data System (ADS)
Diffenbaugh, N. S.
2017-12-01
Severe heat provides one of the most direct, acute, and rapidly changing impacts of climate on people and ecosystems. Theory, historical observations, and climate model simulations all suggest that global warming should increase the probability of hot events that fall outside of our historical experience. Given the acute impacts of extreme heat, quantifying the probability of historically unprecedented hot events at different levels of climate forcing is critical for climate adaptation and mitigation decisions. However, in practice that quantification presents a number of methodological challenges. This presentation will review those methodological challenges, including the limitations of the observational record and of climate model fidelity. The presentation will detail a comprehensive approach to addressing these challenges. It will then demonstrate the application of that approach to quantifying uncertainty in the probability of record-setting hot events in the current climate, as well as in periods with lower and higher greenhouse gas concentrations than the present.
Optical detection of chemical warfare agents and toxic industrial chemicals
NASA Astrophysics Data System (ADS)
Webber, Michael E.; Pushkarsky, Michael B.; Patel, C. Kumar N.
2004-12-01
We present an analytical model evaluating the suitability of optical-absorption-based spectroscopic techniques for detection of chemical warfare agents (CWAs) and toxic industrial chemicals (TICs) in ambient air. The sensor performance is modeled by simulating absorption spectra of a sample containing both the target and a multitude of interfering species, as well as appropriate stochastic noise, and determining the target concentrations from the simulated spectra via a least-squares fit (LSF) algorithm. The distribution of the LSF target concentrations determines the sensor sensitivity, probability of false positives (PFP), and probability of false negatives (PFN). The model was applied to a CO2-laser-based photoacoustic (L-PAS) CWA sensor and predicted single-digit ppb sensitivity with very low PFP rates in the presence of significant amounts of interferents. This approach will be useful for assessing sensor performance by developers and users alike; it also provides a methodology for inter-comparison of different sensing technologies.
A Computational Framework to Control Verification and Robustness Analysis
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2010-01-01
This paper presents a methodology for evaluating the robustness of a controller based on its ability to satisfy the design requirements. The framework proposed is generic since it allows for high-fidelity models, arbitrary control structures and arbitrary functional dependencies between the requirements and the uncertain parameters. The cornerstone of this contribution is the ability to bound the region of the uncertain parameter space where the degradation in closed-loop performance remains acceptable. The size of this bounding set, whose geometry can be prescribed according to deterministic or probabilistic uncertainty models, is a measure of robustness. The robustness metrics proposed herein are the parametric safety margin, the reliability index, the failure probability and upper bounds to this probability. The performance observed at the control verification setting, where the assumptions and approximations used for control design may no longer hold, will fully determine the proposed control assessment.
Farmer, William H.; Koltun, Greg
2017-01-01
Study regionThe state of Ohio in the United States, a humid, continental climate.Study focusThe estimation of nonexceedance probabilities of daily streamflows as an alternative means of establishing the relative magnitudes of streamflows associated with hydrologic and water-quality observations.New hydrological insights for the regionSeveral methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighboring index) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrent with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. In consideration of application, the degree of regulation in a set of sample sites helped to specify the streamgages required to implement kriging approaches successfully.
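For readers unfamiliar with the nonexceedance-probability framing, the sketch below (synthetic data, not the study's pooled ordinary kriging model) illustrates the simpler single-index transfer and the Spearman-rank evaluation used in the study.

```python
# Minimal sketch: estimate daily nonexceedance probabilities at an ungauged site
# from a single nearby "index" streamgage and evaluate them with a Spearman rank
# correlation against the flows actually measured at the site. All data are synthetic.
import numpy as np
from scipy.stats import spearmanr

def nonexceedance(record, value):
    """Weibull plotting-position estimate of P(Q <= value) from a daily record."""
    record = np.sort(np.asarray(record))
    rank = np.searchsorted(record, value, side="right")  # days with flow <= value
    return rank / (len(record) + 1.0)

rng = np.random.default_rng(0)
# Hypothetical long daily record at the index streamgage (log-space flows).
index_record = np.exp(rng.normal(3.0, 1.0, size=2500))

# Hypothetical sampling days: flows at the index gage and at the ungauged
# water-quality site share a regional signal, so they are correlated but not equal.
regional = rng.normal(size=12)
index_on_days = np.exp(3.0 + regional + 0.3 * rng.normal(size=12))
measured_at_site = np.exp(2.5 + regional + 0.3 * rng.normal(size=12))

# Single-index transfer: assign the index gage's nonexceedance probability on each
# sampling day to the ungauged site.
est_p = [nonexceedance(index_record, q) for q in index_on_days]

rho, _ = spearmanr(est_p, measured_at_site)
print(f"Spearman rank correlation, estimated probabilities vs measured flows: {rho:.2f}")
```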
High lifetime probability of screen-detected cervical abnormalities.
Pankakoski, Maiju; Heinävaara, Sirpa; Sarkeala, Tytti; Anttila, Ahti
2017-12-01
Objective Regular screening and follow-up is an important key to cervical cancer prevention; however, screening inevitably detects mild or borderline abnormalities that would never progress to a more severe stage. We analysed the cumulative probability and recurrence of cervical abnormalities in the Finnish organized screening programme during a 22-year follow-up. Methods Screening histories were collected for 364,487 women born between 1950 and 1965. Data consisted of 1,207,017 routine screens and 88,143 follow-up screens between 1991 and 2012. Probabilities of cervical abnormalities by age were estimated using logistic regression and generalized estimating equations methodology. Results The probability of experiencing any abnormality at least once at ages 30-64 was 34.0% (95% confidence interval [CI]: 33.3-34.6%). The probability was 5.4% (95% CI: 5.0-5.8%) for results warranting referral and 2.2% (95% CI: 2.0-2.4%) for results with histologically confirmed findings. Previous occurrences were associated with an increased risk of detecting new ones, specifically in older women. Conclusion A considerable proportion of women experience at least one abnormal screening result during their lifetime, and yet very few eventually develop an actual precancerous lesion. Re-evaluation of diagnostic criteria concerning mild abnormalities might improve the balance of harms and benefits of screening. Special monitoring of women with recurrent abnormalities, especially at older ages, may also be needed.
Methodological Gaps in Left Atrial Function Assessment by 2D Speckle Tracking Echocardiography
Rimbaş, Roxana Cristina; Dulgheru, Raluca Elena; Vinereanu, Dragoş
2015-01-01
The assessment of left atrial (LA) function is used in various cardiovascular diseases. LA plays a complementary role in cardiac performance by modulating left ventricular (LV) function. Transthoracic two-dimensional (2D) phasic volumes and Doppler echocardiography can measure LA function non-invasively. However, evaluation of LA deformation derived from 2D speckle tracking echocardiography (STE) is a new, feasible, and promising approach for assessment of LA mechanics. These parameters are able to detect subclinical LA dysfunction in different pathological conditions. Normal ranges for LA deformation and cut-off values to diagnose LA dysfunction in different diseases have been reported, but data are still conflicting, probably because of some methodological and technical issues. This review highlights the importance of a unique standardized technique to assess the LA phasic functions by STE, and discusses recent studies on the most important clinical applications of this technique. PMID:26761370
Evaluating Micrometeoroid and Orbital Debris Risk Assessments Using Anomaly Data
NASA Technical Reports Server (NTRS)
Squire, Michael
2017-01-01
The accuracy of micrometeoroid and orbital debris (MMOD) risk assessments can be difficult to evaluate. A team from the National Aeronautics and Space Administration (NASA) Engineering and Safety Center (NESC) has completed a study that compared MMOD-related failures on operational satellites to predictions of how many of those failures should occur according to NASA's MMOD risk assessment methodology and tools. The study team used the Poisson probability to quantify the degree of inconsistency between the predicted and reported numbers of failures. Many elements go into a risk assessment, and each of those elements represents a possible source of uncertainty or bias that will influence the end result. There are also challenges in obtaining accurate and useful data on MMOD-related failures.
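A minimal sketch of the Poisson consistency check described above, with invented counts in place of the study's satellite data:

```python
# Given a predicted number of MMOD-related failures over the observation period,
# ask how improbable the reported count is under a Poisson model.
from scipy.stats import poisson

predicted_failures = 4.2   # expected count from the risk assessment tools (assumed)
reported_failures = 9      # failures actually attributed to MMOD (assumed)

# Probability of seeing a count at least this far above the prediction.
p_at_least = poisson.sf(reported_failures - 1, mu=predicted_failures)
print(f"P(N >= {reported_failures} | mu = {predicted_failures}) = {p_at_least:.4f}")
# A very small probability flags an inconsistency between prediction and experience,
# pointing to bias or uncertainty in one of the risk-assessment inputs.
```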
Fuzzy risk analysis of a modern γ-ray industrial irradiator.
Castiglia, F; Giardina, M
2011-06-01
Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to more directly take into account the uncertainties in the error-promoting factors on which the methodology is based. Moreover, with regard to some identified accident scenarios, fuzzy radiological exposure risk, expressed in terms of potential annual death, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by International Commission on Radiological Protection.
NASA Astrophysics Data System (ADS)
Liu, Zhao; Zheng, Chaorong; Wu, Yue
2018-02-01
A boundary layer profiler (BLP) operated in the Doppler beam swinging mode was recently installed by the government in a coastal area of China to acquire wind field information in the atmospheric boundary layer for several purposes. The performance of the BLP is evaluated here under strong wind conditions. It is found that, even though the quality-controlled BLP data show good agreement with the balloon observations, a systematic bias can always be found in the BLP data. For low wind velocities, the BLP data tend to overestimate the atmospheric wind, whereas with increasing wind velocity they show a tendency toward underestimation. In order to remove the effect of poor-quality data on the bias correction, the probability distribution function of the differences between the two instruments is examined, and the t location-scale distribution is found to be the most suitable probability model among those compared. After the outliers with a large discrepancy, lying outside the 95% confidence interval of the t location-scale distribution, are discarded, the systematic bias can be successfully corrected using a first-order polynomial correction function. The bias-correction methodology used in this study can serve as a reference for correcting other wind-profiling radars and also lays a solid basis for further analysis of the wind profiles.
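A hedged sketch of this bias-correction workflow on synthetic numbers (the t location-scale fit, the 95% outlier screen, and the first-order polynomial correction); the data and coefficients below are not from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
balloon = rng.uniform(2.0, 30.0, size=300)                  # reference wind speed, m/s
bias = 1.5 - 0.1 * balloon                                  # over- then under-estimation
blp = balloon + bias + stats.t.rvs(df=4, scale=0.8, size=300, random_state=rng)

diff = blp - balloon
df_, loc, scale = stats.t.fit(diff)                         # t location-scale fit
lo, hi = stats.t.interval(0.95, df_, loc=loc, scale=scale)  # 95% interval of the fit
keep = (diff >= lo) & (diff <= hi)                          # discard outliers

# First-order (linear) correction of BLP speeds against the reference.
slope, intercept = np.polyfit(blp[keep], balloon[keep], deg=1)
blp_corrected = slope * blp + intercept
print(f"correction: v_corr = {slope:.3f} * v_blp + {intercept:.3f}")
```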
Methodology for building confidence measures
NASA Astrophysics Data System (ADS)
Bramson, Aaron L.
2004-04-01
This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
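A minimal sketch of this kind of propagation, with invented priors, reliabilities, and importance weights; the Bayesian-update form below is an assumption, not the paper's exact formula:

```python
import numpy as np

def combined_truth_probability(prior, source_reliabilities):
    """Posterior probability an element is true given confirming reports from
    independent sources, where reliability = probability of reporting correctly."""
    odds = prior / (1 - prior)
    for r in source_reliabilities:
        odds *= r / (1 - r)          # likelihood ratio of a confirming report
    return odds / (1 + odds)

elements = {
    # element: (prior, reliabilities of sources reporting it, importance weight)
    "aircraft present": (0.5, [0.8, 0.7], 0.6),
    "hostile intent":   (0.3, [0.6, 0.75, 0.65], 0.4),
}
confidence = sum(w * combined_truth_probability(p, rels)
                 for p, rels, w in elements.values())
print(f"confidence in the combined situation assessment: {confidence:.2f}")
```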
NASA Astrophysics Data System (ADS)
Alfano, M.; Bisagni, C.
2017-01-01
The objective of the running EU project DESICOS (New Robust DESign Guideline for Imperfection Sensitive COmposite Launcher Structures) is to formulate an improved shell design methodology in order to meet the aerospace industry's demand for lighter structures. Within the project, this article discusses the development of a probability-based methodology developed at Politecnico di Milano. It is based on the combination of the Stress-Strength Interference Method and the Latin Hypercube Method, with the aim of predicting the buckling response of three sandwich composite cylindrical shells, assuming a loading condition of pure compression. The three shells are made of the same material, but have different stacking sequences and geometric dimensions. One of them presents three circular cut-outs. Different types of input imperfections, treated as random variables, are taken into account independently and in combination: variability in longitudinal Young's modulus, ply misalignment, geometric imperfections, and boundary imperfections. The methodology enables a first assessment of the structural reliability of the shells through the calculation of a probabilistic buckling factor for a specified level of probability. The factor depends highly on the reliability level, on the number of adopted samples, and on the assumptions made in modeling the input imperfections. The main advantage of the developed procedure is its versatility, as it can be applied to the buckling analysis of laminated composite shells and sandwich composite shells including different types of imperfections.
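The sketch below illustrates the Latin Hypercube half of such a procedure on a toy surrogate model (all distributions, the load formula, and the reliability level are invented; the real methodology uses finite element buckling analyses):

```python
import numpy as np
from scipy.stats import qmc, norm

n_samples = 200
sampler = qmc.LatinHypercube(d=3, seed=42)
u = sampler.random(n_samples)                       # uniform samples in [0, 1)^3

# Map the unit hypercube to the random imperfections (all assumed distributions):
E_long = norm.ppf(u[:, 0], loc=140e9, scale=5e9)    # longitudinal Young's modulus, Pa
misalign = norm.ppf(u[:, 1], loc=0.0, scale=0.5)    # ply misalignment, degrees
geo_imp = u[:, 2] * 0.2                             # geometric imperfection amplitude, mm

def buckling_load(E, theta, w0, p_nominal=350e3):
    """Toy surrogate: nominal load knocked down by stiffness loss and imperfections."""
    return p_nominal * (E / 140e9) * (1 - 0.02 * abs(theta)) * (1 - 0.8 * w0)

loads = buckling_load(E_long, misalign, geo_imp)

# Probabilistic buckling factor: load level exceeded with 99% probability,
# normalised by the nominal (perfect-shell) load.
p_level = 0.99
factor = np.quantile(loads, 1 - p_level) / 350e3
print(f"probabilistic buckling factor at {p_level:.0%} reliability: {factor:.3f}")
```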
Seismic Evaluation of A Historical Structure In Kastamonu - Turkey
NASA Astrophysics Data System (ADS)
Pınar, USTA; Işıl ÇARHOĞLU, Asuman; EVCİ, Ahmet
2018-01-01
The Kastamonu province is a seismically active zone, and the city has many historical buildings made of stone masonry. In case of probable future earthquakes, existing buildings may suffer substantial or heavy damage. In the present study, one of the traditional historical houses located in Kastamonu was structurally investigated through a probabilistic seismic risk assessment methodology. The building was modeled using the Finite Element Modeling (FEM) software SAP2000. Time history analyses were carried out using 10 different ground motion records on the FEM model. Displacements were interpreted, and the results were displayed graphically and discussed.
Konchak, Chad; Prasad, Kislaya
2012-01-01
Objectives To develop a methodology for integrating social networks into traditional cost-effectiveness analysis (CEA) studies. This will facilitate the economic evaluation of treatment policies in settings where health outcomes are subject to social influence. Design This is a simulation study based on a Markov model. The lifetime health histories of a cohort are simulated, and health outcomes compared, under alternative treatment policies. Transition probabilities depend on the health of others with whom there are shared social ties. Setting The methodology developed is shown to be applicable in any healthcare setting where social ties affect health outcomes. The example of obesity prevention is used for illustration under the assumption that weight changes are subject to social influence. Main outcome measures Incremental cost-effectiveness ratio (ICER). Results When social influence increases, treatment policies become more cost effective (have lower ICERs). The policy of only treating individuals who span multiple networks can be more cost effective than the policy of treating everyone. This occurs when the network is more fragmented. Conclusions (1) When network effects are accounted for, they result in very different values of incremental cost-effectiveness ratios (ICERs). (2) Treatment policies can be devised to take network structure into account. The integration makes it feasible to conduct a cost-benefit evaluation of such policies. PMID:23117559
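A compact, hedged sketch of the general idea (all costs, utilities, network structure, and transition probabilities are invented): a two-state Markov cohort on a social network where transitions depend on contacts' states, compared with and without treatment via an ICER.

```python
import numpy as np

rng = np.random.default_rng(7)
n, years = 500, 20
# Ring network: each person is connected to two neighbours.
neighbours = [np.array([(i - 1) % n, (i + 1) % n]) for i in range(n)]

def simulate(treated, influence=0.15, base_p=0.03, treat_effect=0.5):
    state = np.zeros(n, dtype=bool)                # False = healthy, True = obese
    state[rng.random(n) < 0.1] = True              # 10% initial prevalence
    cost, qaly = 0.0, 0.0
    for _ in range(years):
        frac_obese = np.array([state[nb].mean() for nb in neighbours])
        p = base_p + influence * frac_obese        # social influence on transitions
        if treated is not None:
            p = np.where(treated, p * treat_effect, p)
            cost += 200.0 * treated.sum()          # annual treatment cost
        state = state | (rng.random(n) < p)        # obesity assumed absorbing here
        cost += 1000.0 * state.sum()               # annual cost of being obese
        qaly += np.where(state, 0.85, 1.0).sum()   # utility weights
    return cost, qaly

c0, q0 = simulate(treated=None)
c1, q1 = simulate(treated=np.ones(n, dtype=bool))  # treat everyone
icer = (c1 - c0) / (q1 - q0)
print(f"ICER of treating everyone vs no treatment: {icer:,.0f} per QALY gained")
```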
Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles
NASA Astrophysics Data System (ADS)
Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey
2013-09-01
Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability. The independent reviews confirmed the general validity of the performance standard approach and suggested potential updates to improve the accuracy of each of the example methods, especially to address reliability growth.
Conditional, Time-Dependent Probabilities for Segmented Type-A Faults in the WGCEP UCERF 2
Field, Edward H.; Gupta, Vipin
2008-01-01
This appendix presents elastic-rebound-theory (ERT) motivated time-dependent probabilities, conditioned on the date of last earthquake, for the segmented type-A fault models of the 2007 Working Group on California Earthquake Probabilities (WGCEP). These probabilities are included as one option in the WGCEP's Uniform California Earthquake Rupture Forecast 2 (UCERF 2), with the other options being time-independent Poisson probabilities and an 'Empirical' model based on observed seismicity rate changes. A more general discussion of the pros and cons of all methods for computing time-dependent probabilities, as well as the justification of those chosen for UCERF 2, are given in the main body of this report (and the 'Empirical' model is also discussed in Appendix M). What this appendix addresses is the computation of conditional, time-dependent probabilities when both single- and multi-segment ruptures are included in the model. Computing conditional probabilities is relatively straightforward when a fault is assumed to obey strict segmentation in the sense that no multi-segment ruptures occur (e.g., WGCEP (1988, 1990) or see Field (2007) for a review of all previous WGCEPs; from here we assume basic familiarity with conditional probability calculations). However, and as we'll see below, the calculation is not straightforward when multi-segment ruptures are included, in essence because we are attempting to apply a point-process model to a non-point process. The next section gives a review and evaluation of the single- and multi-segment rupture probability-calculation methods used in the most recent statewide forecast for California (WGCEP UCERF 1; Petersen et al., 2007). We then present results for the methodology adopted here for UCERF 2. We finish with a discussion of issues and possible alternative approaches that could be explored and perhaps applied in the future. A fault-by-fault comparison of UCERF 2 probabilities with those of previous studies is given in the main part of this report.
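For orientation, the snippet below computes a single-segment conditional probability of the general renewal form discussed here, P(event in (t, t+Δt] | no event by t); the recurrence parameters are hypothetical and a lognormal renewal model is substituted for the BPT model used in UCERF 2.

```python
from scipy.stats import lognorm
import numpy as np

mean_recurrence = 200.0          # years (assumed)
aperiodicity = 0.5               # coefficient of variation (assumed)
# Lognormal parameterised to match the chosen mean and aperiodicity.
sigma = np.sqrt(np.log(1 + aperiodicity**2))
scale = mean_recurrence / np.exp(sigma**2 / 2)
dist = lognorm(s=sigma, scale=scale)

t_since_last = 150.0             # years since the last rupture (assumed)
dt = 30.0                        # forecast window, years

cond_prob = (dist.cdf(t_since_last + dt) - dist.cdf(t_since_last)) / dist.sf(t_since_last)
poisson_prob = 1 - np.exp(-dt / mean_recurrence)
print(f"conditional 30-yr probability: {cond_prob:.3f} (Poisson: {poisson_prob:.3f})")
```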
Michaleff, Zoe A.; Maher, Chris G.; Verhagen, Arianne P.; Rebbeck, Trudy; Lin, Chung-Wei Christine
2012-01-01
Background: There is uncertainty about the optimal approach to screen for clinically important cervical spine (C-spine) injury following blunt trauma. We conducted a systematic review to investigate the diagnostic accuracy of the Canadian C-spine rule and the National Emergency X-Radiography Utilization Study (NEXUS) criteria, 2 rules that are available to assist emergency physicians to assess the need for cervical spine imaging. Methods: We identified studies by an electronic search of CINAHL, Embase and MEDLINE. We included articles that reported on a cohort of patients who experienced blunt trauma and for whom clinically important cervical spine injury detectable by diagnostic imaging was the differential diagnosis; evaluated the diagnostic accuracy of the Canadian C-spine rule or NEXUS or both; and used an adequate reference standard. We assessed the methodologic quality using the Quality Assessment of Diagnostic Accuracy Studies criteria. We used the extracted data to calculate sensitivity, specificity, likelihood ratios and post-test probabilities. Results: We included 15 studies of modest methodologic quality. For the Canadian C-spine rule, sensitivity ranged from 0.90 to 1.00 and specificity ranged from 0.01 to 0.77. For NEXUS, sensitivity ranged from 0.83 to 1.00 and specificity ranged from 0.02 to 0.46. One study directly compared the accuracy of these 2 rules using the same cohort and found that the Canadian C-spine rule had better accuracy. For both rules, a negative test was more informative for reducing the probability of a clinically important cervical spine injury. Interpretation: Based on studies with modest methodologic quality and only one direct comparison, we found that the Canadian C-spine rule appears to have better diagnostic accuracy than the NEXUS criteria. Future studies need to follow rigorous methodologic procedures to ensure that the findings are as free of bias as possible. PMID:23048086
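A short worked example of the post-test probability arithmetic underlying the review's conclusions, using illustrative values within the reported ranges rather than pooled estimates:

```python
def post_test_probability(pretest, sensitivity, specificity, test_positive):
    """Convert pretest probability to post-test probability via likelihood ratios."""
    lr = sensitivity / (1 - specificity) if test_positive else (1 - sensitivity) / specificity
    pre_odds = pretest / (1 - pretest)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

pretest = 0.02                        # assumed prevalence of important C-spine injury
sens, spec = 0.99, 0.40               # illustrative values for a decision rule
print("post-test prob after negative rule:",
      round(post_test_probability(pretest, sens, spec, test_positive=False), 4))
print("post-test prob after positive rule:",
      round(post_test_probability(pretest, sens, spec, test_positive=True), 4))
```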
What Is the Probability You Are a Bayesian?
ERIC Educational Resources Information Center
Wulff, Shaun S.; Robinson, Timothy J.
2014-01-01
Bayesian methodology continues to be widely used in statistical applications. As a result, it is increasingly important to introduce students to Bayesian thinking at early stages in their mathematics and statistics education. While many students in upper level probability courses can recite the differences in the Frequentist and Bayesian…
Exact Tests for the Rasch Model via Sequential Importance Sampling
ERIC Educational Resources Information Center
Chen, Yuguo; Small, Dylan
2005-01-01
Rasch proposed an exact conditional inference approach to testing his model but never implemented it because it involves the calculation of a complicated probability. This paper furthers Rasch's approach by (1) providing an efficient Monte Carlo methodology for accurately approximating the required probability and (2) illustrating the usefulness…
Risk analysis for roadways subjected to multiple landslide-related hazards
NASA Astrophysics Data System (ADS)
Corominas, Jordi; Mavrouli, Olga
2014-05-01
Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and risk assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at small (for example national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows, a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls, we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations for increases in two dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). This method differs from other methodologies for landslide-related hazards in the hazard scenarios and consequence profiles that are investigated. The depth of analysis makes it possible to account for local conditions concerning either the hazard or the consequences (the latter with respect to the very particular characteristics of the roadway such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated in Microsoft Excel spreadsheets. The results can be used to support decision-making for the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.
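A minimal sketch of the common-term idea (annual probabilities and consequence costs below are invented): combine the per-hazard probabilities of occurrence and weight them by consequences to obtain simple risk descriptors for one roadway section.

```python
annual_probability = {
    "rockfall reaching the road": 0.08,
    "debris flow reaching the road": 0.02,
    "retaining-wall failure": 0.005,     # e.g. from a hazard-index model, corrected
}
consequence_cost = {                      # direct + indirect cost per event (EUR)
    "rockfall reaching the road": 40_000,
    "debris flow reaching the road": 250_000,
    "retaining-wall failure": 120_000,
}

p_none = 1.0
expected_annual_loss = 0.0
for hazard, p in annual_probability.items():
    p_none *= (1.0 - p)                   # assumes the hazards occur independently
    expected_annual_loss += p * consequence_cost[hazard]

print(f"annual probability of at least one event: {1 - p_none:.3f}")
print(f"expected annual loss: {expected_annual_loss:,.0f} EUR")
```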
Joint probability of statistical success of multiple phase III trials.
Zhang, Jianliang; Zhang, Jenny J
2013-01-01
In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
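An illustrative computation of the paper's central point, with invented phase II and phase III parameters: because both phase III trials share the same phase II data, the joint predictive power is not the product of the individual predictive powers.

```python
import numpy as np
from scipy.stats import norm

delta_hat, se_phase2 = 0.30, 0.15        # phase II effect estimate and its SE (assumed)
se_phase3 = 0.10                         # SE of each phase III trial estimate (assumed)
z_crit = norm.ppf(0.975)

rng = np.random.default_rng(3)
true_delta = rng.normal(delta_hat, se_phase2, size=200_000)   # posterior draws (flat prior)

# Power of one phase III trial conditional on the true effect.
power_given_delta = norm.sf(z_crit - true_delta / se_phase3)

joint = np.mean(power_given_delta**2)          # both trials succeed
single = np.mean(power_given_delta)            # predictive power of one trial
print(f"predictive power of one trial:      {single:.3f}")
print(f"joint predictive power (correct):   {joint:.3f}")
print(f"naive product of single powers:     {single**2:.3f}")
```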
de Dianous, Valérie; Fiévez, Cécile
2006-03-31
Over the last two decades a growing interest in risk analysis has been noted in industry. The ARAMIS project has defined a methodology for risk assessment. This methodology has been built to help industrialists demonstrate that they have sufficient risk control on their sites. Risk analysis consists first in identifying all the major accidents, assuming that the safety functions in place are inefficient. This identification step uses bow-tie diagrams. Secondly, the safety barriers actually implemented on the site are taken into account. The barriers are identified on the bow-ties, and an evaluation of their performance (response time, efficiency, and level of confidence) is performed to validate that they are relevant for the expected safety function. Finally, the evaluation of their probability of failure enables the frequency of occurrence of the accident to be assessed. The demonstration of risk control, based on a severity/frequency-of-occurrence pair, is thus possible for all the accident scenarios. During the risk analysis, a practical tool called the risk graph is used to assess whether the number and reliability of the safety functions for a given cause are sufficient to achieve good risk control.
CARES/Life Ceramics Durability Evaluation Software Enhanced for Cyclic Fatigue
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.
1999-01-01
The CARES/Life computer program predicts the probability of a monolithic ceramic component's failure as a function of time in service. The program has many features and options for materials evaluation and component design. It couples commercial finite element programs--which resolve a component's temperature and stress distribution--to reliability evaluation and fracture mechanics routines for modeling strength-limiting defects. The capability, flexibility, and uniqueness of CARES/Life have attracted many users representing a broad range of interests and has resulted in numerous awards for technological achievements and technology transfer. Recent work with CARES/Life was directed at enhancing the program's capabilities with regard to cyclic fatigue. Only in the last few years have ceramics been recognized to be susceptible to enhanced degradation from cyclic loading. To account for cyclic loads, researchers at the NASA Lewis Research Center developed a crack growth model that combines the Power Law (time-dependent) and the Walker Law (cycle-dependent) crack growth models. This combined model has the characteristics of Power Law behavior (decreased damage) at high R ratios (minimum load/maximum load) and of Walker law behavior (increased damage) at low R ratios. In addition, a parameter estimation methodology for constant-amplitude, steady-state cyclic fatigue experiments was developed using nonlinear least squares and a modified Levenberg-Marquardt algorithm. This methodology is used to give best estimates of parameter values from cyclic fatigue specimen rupture data (usually tensile or flexure bar specimens) for a relatively small number of specimens. Methodology to account for runout data (unfailed specimens over the duration of the experiment) was also included.
FASP, an analytic resource appraisal program for petroleum play analysis
Crovelli, R.A.; Balay, R.H.
1986-01-01
An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented in a FORTRAN program termed FASP. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An established geologic model considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The program FASP produces resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and many laws of expectation and variance. © 1986.
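A small worked example of the analytic style of calculation FASP relies on (numbers hypothetical): the moments of resource = presence indicator × amount follow directly from the laws of expectation and variance.

```python
# Undiscovered resource R = B * A, where B is a Bernoulli indicator that the
# assessed hydrocarbon is present and A is the (lognormal) amount if present.
import numpy as np

p_present = 0.6                     # probability of hydrocarbon presence (assumed)
mu, sigma = np.log(50.0), 0.9       # log-scale parameters of the amount, MMBO (assumed)

mean_A = np.exp(mu + sigma**2 / 2)
var_A = (np.exp(sigma**2) - 1) * np.exp(2 * mu + sigma**2)

# E[R] = p E[A];  Var[R] = p Var[A] + p (1 - p) E[A]^2 for independent B and A.
mean_R = p_present * mean_A
var_R = p_present * var_A + p_present * (1 - p_present) * mean_A**2

print(f"E[R]  = {mean_R:.1f} MMBO")
print(f"SD[R] = {np.sqrt(var_R):.1f} MMBO")
```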
Resilience through adaptation.
Ten Broeke, Guus A; van Voorn, George A K; Ligtenberg, Arend; Molenaar, Jaap
2017-01-01
Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover's distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system.
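A minimal sketch of the proposed sensitivity measure on synthetic output (the ABM itself is not reproduced): the earth-mover's distance between output distributions with and without adaptation, plus a collapse probability read from the same replicate runs.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(11)
# Hypothetical resource levels at a fixed time from many replicate runs of the ABM.
resource_no_adapt = rng.normal(loc=20.0, scale=6.0, size=1000).clip(min=0)
resource_adapt = rng.normal(loc=35.0, scale=5.0, size=1000).clip(min=0)

emd = wasserstein_distance(resource_no_adapt, resource_adapt)
print(f"earth-mover's distance between output distributions: {emd:.2f}")

# A collapse probability can be read off the same replicate runs, e.g. the share
# of runs in which the common-pool resource falls below a critical threshold.
threshold = 5.0
print("collapse probability without adaptation:", np.mean(resource_no_adapt < threshold))
print("collapse probability with adaptation:   ", np.mean(resource_adapt < threshold))
```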
Luhnen, Miriam; Prediger, Barbara; Neugebauer, Edmund A M; Mathes, Tim
2017-12-02
The number of systematic reviews of economic evaluations is steadily increasing. This is probably related to the continuing pressure on health budgets worldwide which makes an efficient resource allocation increasingly crucial. In particular in recent years, the introduction of several high-cost interventions presents enormous challenges regarding universal accessibility and sustainability of health care systems. An increasing number of health authorities, inter alia, feel the need for analyzing economic evidence. Economic evidence might effectively be generated by means of systematic reviews. Nevertheless, no standard methods seem to exist for their preparation so far. The objective of this study was to analyze the methods applied for systematic reviews of health economic evaluations (SR-HE) with a focus on the identification of common challenges. The planned study is a systematic review of the characteristics and methods actually applied in SR-HE. We will combine validated search filters developed for the retrieval of economic evaluations and systematic reviews to identify relevant studies in MEDLINE (via Ovid, 2015-present). To be eligible for inclusion, studies have to conduct a systematic review of full economic evaluations. Articles focusing exclusively on methodological aspects and secondary publications of health technology assessment (HTA) reports will be excluded. Two reviewers will independently assess titles and abstracts and then full-texts of studies for eligibility. Methodological features will be extracted in a standardized, beforehand piloted data extraction form. Data will be summarized with descriptive statistical measures and systematically analyzed focusing on differences/similarities and methodological weaknesses. The systematic review will provide a detailed overview of characteristics of SR-HE and the applied methods. Differences and methodological shortcomings will be detected and their implications will be discussed. The findings of our study can improve the recommendations on the preparation of SR-HE. This can increase the acceptance and usefulness of systematic reviews in health economics for researchers and medical decision makers. The review will not be registered with PROSPERO as it does not meet the eligibility criterion of dealing with clinical outcomes.
Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.
2003-01-01
Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. We also believe that this methodological framework has wide application to problems in animal ecology concerning metapopulation and community dynamics.
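A hedged sketch (invented rates, perfect detection) of the first-order Markov colony-site dynamics described above, including the equilibrium occupancy implied by constant extinction and colonization probabilities:

```python
import numpy as np

gamma = 0.15    # probability an empty site is colonized in a year (assumed)
eps = 0.05      # probability an occupied colony site goes extinct in a year (assumed)

# Equilibrium probability that a site holds a colony under constant rates.
psi_eq = gamma / (gamma + eps)
print(f"equilibrium occupancy: {psi_eq:.2f}")

# Simulated presence-absence history for a set of colony sites (perfect detection).
rng = np.random.default_rng(5)
n_sites, n_years = 40, 25
occupied = rng.random(n_sites) < 0.3
history = [occupied.copy()]
for _ in range(n_years - 1):
    colonize = (~occupied) & (rng.random(n_sites) < gamma)
    persist = occupied & (rng.random(n_sites) >= eps)
    occupied = colonize | persist
    history.append(occupied.copy())
print("mean occupancy in final 5 years:", np.mean(history[-5:]))
```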
Fragility Analysis of Concrete Gravity Dams
NASA Astrophysics Data System (ADS)
Tekie, Paulos B.; Ellingwood, Bruce R.
2002-09-01
Concrete gravity dams are an important part of the nation's infrastructure. Many dams have been in service for over 50 years, during which time important advances in the methodologies for evaluation of natural phenomena hazards have caused the design-basis events to be revised upwards, in some cases significantly. Many existing dams fail to meet these revised safety criteria and structural rehabilitation to meet newly revised criteria may be costly and difficult. A probabilistic safety analysis (PSA) provides a rational safety assessment and decision-making tool managing the various sources of uncertainty that may impact dam performance. Fragility analysis, which depicts the uncertainty in the safety margin above specified hazard levels, is a fundamental tool in a PSA. This study presents a methodology for developing fragilities of concrete gravity dams to assess their performance against hydrologic and seismic hazards. Models of varying degree of complexity and sophistication were considered and compared. The methodology is illustrated using the Bluestone Dam on the New River in West Virginia, which was designed in the late 1930s. The hydrologic fragilities showed that the Bluestone Dam is unlikely to become unstable at the revised probable maximum flood (PMF), but it is likely that there will be significant cracking at the heel of the dam. On the other hand, the seismic fragility analysis indicated that sliding is likely if the dam were to be subjected to a maximum credible earthquake (MCE). Moreover, there will likely be tensile cracking at the neck of the dam at this level of seismic excitation. Probabilities of relatively severe limit states appear to be only marginally affected by extremely rare events (e.g. the PMF and MCE). Moreover, the risks posed by the extreme floods and earthquakes were not balanced for the Bluestone Dam, with seismic hazard posing a relatively higher risk.
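An illustrative fragility curve of the kind used in such analyses (the median capacity and dispersion below are invented, not the Bluestone results):

```python
# Lognormal fragility model: conditional probability of reaching a limit state,
# e.g. sliding, as a function of the seismic intensity measure.
import numpy as np
from scipy.stats import norm

median_capacity = 0.45   # intensity (g) at which sliding has a 50% probability (assumed)
beta = 0.5               # logarithmic standard deviation of the safety margin (assumed)

def fragility(im):
    """P(limit state | intensity measure im)."""
    return norm.cdf(np.log(im / median_capacity) / beta)

for im in (0.2, 0.45, 0.8):
    print(f"P(sliding | PGA = {im:.2f} g) = {fragility(im):.2f}")
```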
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrar, Charles R; Gobbato, Maurizio; Conte, Joel
2009-01-01
The extensive use of lightweight advanced composite materials in unmanned aerial vehicles (UAVs) drastically increases the sensitivity to both fatigue- and impact-induced damage of their critical structural components (e.g., wings and tail stabilizers) during service life. The spar-to-skin adhesive joints are considered one of the most fatigue-sensitive subcomponents of a lightweight UAV composite wing with damage progressively evolving from the wing root. This paper presents a comprehensive probabilistic methodology for predicting the remaining service life of adhesively-bonded joints in laminated composite structural components of UAVs. Non-destructive evaluation techniques and Bayesian inference are used to (i) assess the current state of damage of the system and (ii) update the probability distribution of the damage extent at various locations. A probabilistic model for future loads and a mechanics-based damage model are then used to stochastically propagate damage through the joint. Combined local (e.g., exceedance of a critical damage size) and global (e.g., flutter instability) failure criteria are finally used to compute the probability of component failure at future times. The applicability and the partial validation of the proposed methodology are then briefly discussed by analyzing the debonding propagation, along a pre-defined adhesive interface, in a simply supported laminated composite beam with solid rectangular cross section, subjected to a concentrated load applied at mid-span. A specially developed Euler-Bernoulli beam finite element with interlaminar slip along the damageable interface is used in combination with a cohesive zone model to study the fatigue-induced degradation in the adhesive material. The preliminary numerical results presented are promising for the future validation of the methodology.
Metric optimisation for analogue forecasting by simulated annealing
NASA Astrophysics Data System (ADS)
Bliefernicht, J.; Bárdossy, A.
2009-04-01
It is well known that weather patterns tend to recur from time to time. This property of the atmosphere is used by analogue forecasting techniques. They have a long history in weather forecasting and there are many applications predicting hydrological variables at the local scale for different lead times. The basic idea of the technique is to identify past weather situations that are similar (analogues) to the predicted one and to take the local conditions of the analogues as the forecast. But the forecast performance of the analogue method depends on user-defined criteria like the choice of the distance function and the size of the predictor domain. In this study we propose a new methodology for optimising both criteria by minimising the forecast error with simulated annealing. The performance of the methodology is demonstrated for the probability forecast of daily areal precipitation. It is compared with a traditional analogue forecasting algorithm, which is used operationally as an element of a hydrological forecasting system. The study is performed for several meso-scale catchments located in the Rhine basin in Germany. The methodology is validated by a jack-knife method in a perfect-prognosis framework for a period of 48 years (1958-2005). The predictor variables are derived from the NCEP/NCAR reanalysis data set. The Brier skill score and the economic value are determined to evaluate the forecast skill and value of the technique. In this presentation we will present the concept of the optimisation algorithm and the outcome of the comparison. It will also be demonstrated how a decision maker should apply a probability forecast to maximise the economic benefit from it.
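A compact sketch of the optimisation idea on synthetic data; scipy's dual_annealing stands in for a hand-written simulated-annealing loop, and the predictors, event definition, and analogue count are all assumptions:

```python
# Tune the weights of a weighted Euclidean distance so that analogue-based
# probability forecasts of a binary precipitation event minimise the Brier score.
import numpy as np
from scipy.optimize import dual_annealing

rng = np.random.default_rng(2)
n, n_pred, k = 400, 5, 25
X = rng.normal(size=(n, n_pred))                       # large-scale predictor fields
event = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n)) > 1.0  # wet day?

def brier_score(weights):
    w = np.abs(weights)
    score = 0.0
    for i in range(n):                                 # leave-one-out analogue forecast
        d = np.sqrt(((X - X[i])**2 * w).sum(axis=1))
        d[i] = np.inf                                  # exclude the forecast day itself
        analogues = np.argsort(d)[:k]
        prob = event[analogues].mean()
        score += (prob - event[i])**2
    return score / n

result = dual_annealing(brier_score, bounds=[(0.0, 1.0)] * n_pred,
                        maxiter=10, no_local_search=True, seed=2)
print("optimised predictor weights:", np.round(result.x, 2))
print("Brier score:", round(result.fun, 3))
```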
Bearing damage assessment using Jensen-Rényi Divergence based on EEMD
NASA Astrophysics Data System (ADS)
Singh, Jaskaran; Darpe, A. K.; Singh, S. P.
2017-03-01
An Ensemble Empirical Mode Decomposition (EEMD) and Jensen-Rényi divergence (JRD) based methodology is proposed for the degradation assessment of rolling element bearings using vibration data. The EEMD decomposes vibration signals into a set of intrinsic mode functions (IMFs). A systematic methodology to select IMFs that are sensitive and closely related to the fault is proposed in the paper. The change in the probability distribution of the energies of the sensitive IMFs is measured through JRD, which acts as a damage identification parameter. Evaluation of JRD with sensitive IMFs makes it largely unaffected by changes and fluctuations in operating conditions. Further, an algorithm based on Chebyshev's inequality is applied to JRD to identify exact points of change in bearing health and remove outliers. The identified change points are investigated for fault classification as possible locations where specific defect initiation could have taken place. For fault classification, two new parameters are proposed: 'α value' and Probable Fault Index, which together classify the fault. To standardize the degradation process, a Confidence Value parameter is proposed to quantify the bearing degradation value in a range of zero to unity. A simulation study is first carried out to demonstrate the robustness of the proposed JRD parameter under variable operating conditions of load and speed. The proposed methodology is then validated on experimental data (seeded defect data and accelerated bearing life test data). The first validation, on two different vibration datasets (inner/outer) obtained from seeded defect experiments, demonstrates the effectiveness of the JRD parameter in detecting a change in health state as the fault severity changes. The second validation is on two accelerated life tests. The results demonstrate the proposed approach as a potential tool for bearing performance degradation assessment.
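A hedged sketch of the damage indicator itself; the EEMD step is assumed to have been done already, and the IMF energy vectors below are invented for illustration:

```python
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    return np.log(np.sum(p**alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(dists, weights=None, alpha=0.5):
    """JRD = Renyi entropy of the weighted mixture minus the weighted entropies."""
    dists = [np.asarray(p, dtype=float) / np.sum(p) for p in dists]
    if weights is None:
        weights = np.full(len(dists), 1.0 / len(dists))
    mixture = sum(w * p for w, p in zip(weights, dists))
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(p, alpha) for w, p in zip(weights, dists))

healthy_energy = [0.42, 0.30, 0.15, 0.08, 0.05]   # energies of the sensitive IMFs
current_energy = [0.20, 0.22, 0.25, 0.18, 0.15]   # same IMFs, degraded bearing

jrd = jensen_renyi_divergence([healthy_energy, current_energy])
print(f"Jensen-Rényi divergence (damage indicator): {jrd:.4f}")
```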
Identification of phreatophytic groundwater dependent ecosystems using geospatial technologies
NASA Astrophysics Data System (ADS)
Perez Hoyos, Isabel Cristina
The protection of groundwater dependent ecosystems (GDEs) is increasingly being recognized as an essential aspect for the sustainable management and allocation of water resources. Ecosystem services are crucial for human well-being and for a variety of flora and fauna. However, the conservation of GDEs is only possible if knowledge about their location and extent is available. Several studies have focused on the identification of GDEs at specific locations using ground-based measurements. However, recent progress in technologies such as remote sensing and their integration with geographic information systems (GIS) has provided alternative ways to map GDEs at much larger spatial extents. This study is concerned with the discovery of patterns in geospatial data sets using data mining techniques for mapping phreatophytic GDEs in the United States at 1 km spatial resolution. A methodology to identify the probability of an ecosystem to be groundwater dependent is developed. Probabilities are obtained by modeling the relationship between the known locations of GDEs and main factors influencing groundwater dependency, namely water table depth (WTD) and aridity index (AI). A methodology is proposed to predict WTD at 1 km spatial resolution using relevant geospatial data sets calibrated with WTD observations. An ensemble learning algorithm called random forest (RF) is used in order to model the distribution of groundwater in three study areas: Nevada, California, and Washington, as well as in the entire United States. RF regression performance is compared with a single regression tree (RT). The comparison is based on contrasting training error, true prediction error, and variable importance estimates of both methods. Additionally, remote sensing variables are omitted from the process of fitting the RF model to the data to evaluate the deterioration in the model performance when these variables are not used as an input. Research results suggest that although the prediction accuracy of a single RT is reduced in comparison with RFs, single trees can still be used to understand the interactions that might be taking place between predictor variables and the response variable. Regarding RF, there is a great potential in using the power of an ensemble of trees for prediction of WTD. The superior capability of RF to accurately map water table position in Nevada, California, and Washington demonstrate that this technique can be applied at scales larger than regional levels. It is also shown that the removal of remote sensing variables from the RF training process degrades the performance of the model. Using the predicted WTD, the probability of an ecosystem to be groundwater dependent (GDE probability) is estimated at 1 km spatial resolution. The modeling technique is evaluated in the state of Nevada, USA to develop a systematic approach for the identification of GDEs and it is then applied in the United States. The modeling approach selected for the development of the GDE probability map results from a comparison of the performance of classification trees (CT) and classification forests (CF). Predictive performance evaluation for the selection of the most accurate model is achieved using a threshold independent technique, and the prediction accuracy of both models is assessed in greater detail using threshold-dependent measures. 
The resulting GDE probability map can potentially be used for the definition of conservation areas since it can be translated into a binary classification map with two classes: GDE and NON-GDE. These maps are created by selecting a probability threshold. It is demonstrated that the choice of this threshold has dramatic effects on deterministic model performance measures.
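A minimal sketch of the random-forest regression step on synthetic predictors (the real study uses geospatial covariates and observed water table depths):

```python
# Fit a random forest regressor, compare it with a single regression tree, and
# inspect variable importances, mirroring the comparison described above.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
n = 2000
X = np.column_stack([
    rng.uniform(0, 3000, n),        # elevation (m)
    rng.uniform(0, 30, n),          # slope (degrees)
    rng.uniform(0.05, 3.0, n),      # aridity index
    rng.uniform(0.0, 0.9, n),       # remote-sensing vegetation index (e.g. NDVI)
])
# Hypothetical water table depth (m) as a nonlinear-ish function of the predictors.
wtd = 2 + 0.01 * X[:, 0] + 0.5 * X[:, 1] - 8 * X[:, 2] - 15 * X[:, 3] \
      + rng.normal(scale=3.0, size=n)

rf = RandomForestRegressor(n_estimators=300, random_state=0)
rt = DecisionTreeRegressor(random_state=0)
print("RF  CV R^2:", cross_val_score(rf, X, wtd, cv=5).mean().round(3))
print("RT  CV R^2:", cross_val_score(rt, X, wtd, cv=5).mean().round(3))

rf.fit(X, wtd)
print("variable importances:", np.round(rf.feature_importances_, 3))
```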
Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence
Han, Paul K. J.
2014-01-01
The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891
NASA Astrophysics Data System (ADS)
Zhang, Zhong
In this work, motivated by the need to coordinate transmission maintenance scheduling among a multiplicity of self-interested entities in restructured power industry, a distributed decision support framework based on multiagent negotiation systems (MANS) is developed. An innovative risk-based transmission maintenance optimization procedure is introduced. Several models for linking condition monitoring information to the equipment's instantaneous failure probability are presented, which enable quantitative evaluation of the effectiveness of maintenance activities in terms of system cumulative risk reduction. Methodologies of statistical processing, equipment deterioration evaluation and time-dependent failure probability calculation are also described. A novel framework capable of facilitating distributed decision-making through multiagent negotiation is developed. A multiagent negotiation model is developed and illustrated that accounts for uncertainty and enables social rationality. Some issues of multiagent negotiation convergence and scalability are discussed. The relationships between agent-based negotiation and auction systems are also identified. A four-step MAS design methodology for constructing multiagent systems for power system applications is presented. A generic multiagent negotiation system, capable of inter-agent communication and distributed decision support through inter-agent negotiations, is implemented. A multiagent system framework for facilitating the automated integration of condition monitoring information and maintenance scheduling for power transformers is developed. Simulations of multiagent negotiation-based maintenance scheduling among several independent utilities are provided. It is shown to be a viable alternative solution paradigm to the traditional centralized optimization approach in today's deregulated environment. This multiagent system framework not only facilitates the decision-making among competing power system entities, but also provides a tool to use in studying competitive industry relative to monopolistic industry.
Evaluating a Modular Design Approach to Collecting Survey Data Using Text Messages
West, Brady T.; Ghimire, Dirgha; Axinn, William G.
2015-01-01
This article presents analyses of data from a pilot study in Nepal that was designed to provide an initial examination of the errors and costs associated with an innovative methodology for survey data collection. We embedded a randomized experiment within a long-standing panel survey, collecting data on a small number of items with varying sensitivity from a probability sample of 450 young Nepalese adults. Survey items ranged from simple demographics to indicators of substance abuse and mental health problems. Sampled adults were randomly assigned to one of three different modes of data collection: 1) a standard one-time telephone interview, 2) a “single sitting” back-and-forth interview with an interviewer using text messaging, and 3) an interview using text messages within a modular design framework (which generally involves breaking the survey response task into distinct parts over a short period of time). Respondents in the modular group were asked to respond (via text message exchanges with an interviewer) to only one question on a given day, rather than complete the entire survey. Both bivariate and multivariate analyses demonstrate that the two text messaging modes increased the probability of disclosing sensitive information relative to the telephone mode, and that respondents in the modular design group, while responding less frequently, found the survey to be significantly easier. Further, those who responded in the modular group were not unique in terms of available covariates, suggesting that the reduced item response rates only introduced limited nonresponse bias. Future research should consider enhancing this methodology, applying it with other modes of data collection (e. g., web surveys), and continuously evaluating its effectiveness from a total survey error perspective. PMID:26322137
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.
1998-10-01
The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while it prepares for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability (frequency) of post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
USGS Methodology for Assessing Continuous Petroleum Resources
Charpentier, Ronald R.; Cook, Troy A.
2011-01-01
The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.
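A minimal sketch of the general Monte Carlo structure described above, with probability distributions on the inputs and an output resource distribution; the input variables (cell count, success ratio, per-cell recovery), the distribution families, and every numerical value are illustrative assumptions, not the USGS input forms.

```python
# Hedged sketch: probabilistic inputs combined by Monte Carlo simulation into an
# output distribution of technically recoverable gas. All values are placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 50_000
n_cells = rng.triangular(1_000, 4_000, 9_000, size=n)               # untested cells (assumed)
success_ratio = rng.triangular(0.5, 0.7, 0.9, size=n)               # productive fraction (assumed)
eur_per_cell = rng.lognormal(mean=np.log(0.4), sigma=0.6, size=n)   # BCFG per cell (assumed)

resource = n_cells * success_ratio * eur_per_cell                   # total undiscovered gas, BCFG
for q, label in ((5, "F95"), (50, "F50"), (95, "F5")):
    print(f"{label} estimate: {np.percentile(resource, q):,.0f} BCFG")
```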
NASA Technical Reports Server (NTRS)
Tavana, Madjid
1995-01-01
The evaluation and prioritization of Engineering Support Requests (ESR's) is a particularly difficult task at the Kennedy Space Center (KSC) -- Shuttle Project Engineering Office. This difficulty is due to the complexities inherent in the evaluation process and the lack of structured information. The evaluation process must consider a multitude of relevant pieces of information concerning Safety, Supportability, O&M Cost Savings, Process Enhancement, Reliability, and Implementation. Various analytical and normative models developed in the past have helped decision makers at KSC utilize large volumes of information in the evaluation of ESR's. The purpose of this project is to build on the existing methodologies and develop a multiple criteria decision support system that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. The model utilizes the Analytic Hierarchy Process (AHP), subjective probabilities, the entropy concept, and the Maximize Agreement Heuristic (MAH) to enhance the decision maker's intuition in evaluating a set of ESR's.
Learning Probabilities in Computer Engineering by Using a Competency- and Problem-Based Approach
ERIC Educational Resources Information Center
Khoumsi, Ahmed; Hadjou, Brahim
2005-01-01
Our department has redesigned its electrical and computer engineering programs by adopting a learning methodology based on competence development, problem solving, and the realization of design projects. In this article, we show how this pedagogical approach has been successfully used for learning probabilities and their application to computer…
Teaching Probability for Conceptual Change (La Ensenanza de la Probabilidad por Cambio Conceptual).
ERIC Educational Resources Information Center
Castro, Cesar Saenz
1998-01-01
Presents a theoretical proposal of a methodology for the teaching of probability theory. Discusses the importance of the epistemological approach of Lakatos and the perspective of the conceptual change. Discusses research using a proposed didactic method with Spanish high school students (N=6). Concludes that significant differences on all…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tom Elicson; Bentley Harwood; Jim Bouchard
Over a 12 month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin Hypercube Sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for: • Development of time-dependent fire heat release rate profiles (required as input to CFAST), • Calculation of fire severity factors based on CFAST detailed fire modeling, and • Calculation of fire non-suppression probabilities.
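A hedged sketch of a time-dependent non-suppression model of the kind described: if manual suppression is treated as a Poisson process with a constant rate, the probability that the fire is not suppressed before a component's damage time is exponential in that time. The rate and the damage-time samples below are placeholders, not plant-specific data or the exact NUREG/CR-6850 curves.

```python
# Hedged sketch: P_NS(t) = exp(-lambda * t) evaluated at CFAST/LHS-style damage times.
import numpy as np

def non_suppression_probability(t_damage_min, lam_per_min):
    """Probability that suppression has not occurred by the component damage time."""
    return np.exp(-lam_per_min * np.asarray(t_damage_min, dtype=float))

damage_times = np.array([8.0, 12.0, 15.0, 22.0, 30.0])   # minutes from detection (hypothetical)
lam = 0.1                                                 # assumed suppression rate, 1/min
pns = non_suppression_probability(damage_times, lam)
print("P_NS per sample:", np.round(pns, 3))
print("mean P_NS:", round(float(pns.mean()), 3))
```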
Achana, Felix; Petrou, Stavros; Khan, Kamran; Gaye, Amadou; Modi, Neena
2018-01-01
A new methodological framework for assessing agreement between cost-effectiveness endpoints generated using alternative sources of data on treatment costs and effects for trial-based economic evaluations is proposed. The framework can be used to validate cost-effectiveness endpoints generated from routine data sources when comparable data is available directly from trial case report forms or from another source. We illustrate application of the framework using data from a recent trial-based economic evaluation of the probiotic Bifidobacterium breve strain BBG administered to babies less than 31 weeks of gestation. Cost-effectiveness endpoints are compared using two sources of information; trial case report forms and data extracted from the National Neonatal Research Database (NNRD), a clinical database created through collaborative efforts of UK neonatal services. Focusing on mean incremental net benefits at £30,000 per episode of sepsis averted, the study revealed no evidence of discrepancy between the data sources (two-sided p values >0.4), low probability estimates of miscoverage (ranging from 0.039 to 0.060) and concordance correlation coefficients greater than 0.86. We conclude that the NNRD could potentially serve as a reliable source of data for future trial-based economic evaluations of neonatal interventions. We also discuss the potential implications of increasing opportunity to utilize routinely available data for the conduct of trial-based economic evaluations.
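One of the agreement statistics cited above, the concordance correlation coefficient, can be computed directly from paired endpoints. The sketch below uses invented incremental-net-benefit pairs standing in for the CRF and NNRD values.

```python
# Minimal sketch: Lin's concordance correlation coefficient for paired endpoints.
import numpy as np

def concordance_correlation(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

inb_crf  = [1200.0, -300.0, 450.0, 800.0, 150.0, -50.0]   # hypothetical values
inb_nnrd = [1150.0, -280.0, 500.0, 760.0, 180.0, -20.0]
print(f"CCC = {concordance_correlation(inb_crf, inb_nnrd):.3f}")
```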
Methodology for finding and evaluating safe landing sites on small bodies
NASA Astrophysics Data System (ADS)
Rodgers, Douglas J.; Ernst, Carolyn M.; Barnouin, Olivier S.; Murchie, Scott L.; Chabot, Nancy L.
2016-12-01
Here we develop and demonstrate a three-step strategy for finding a safe landing ellipse for a legged spacecraft on a small body such as an asteroid or planetary satellite. The first step, acquisition of a high-resolution terrain model of a candidate landing region, is simulated using existing statistics on block abundances measured at Phobos, Eros, and Itokawa. The synthetic terrain model is generated by randomly placing hemispheric shaped blocks with the empirically determined size-frequency distribution. The resulting terrain is much rockier than typical lunar or martian landing sites. The second step, locating a landing ellipse with minimal hazards, is demonstrated for an assumed approach to landing that uses Autonomous Landing and Hazard Avoidance Technology. The final step, determination of the probability distribution for orientation of the landed spacecraft, is demonstrated for cases of differing regional slope. The strategy described here is both a prototype for finding a landing site during a flight mission and provides tools for evaluating the design of small-body landers. We show that for bodies with Eros-like block distributions, there may be >99% probability of landing stably at a low tilt without blocks impinging on spacecraft structures so as to pose a survival hazard.
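A hedged sketch of the first step, populating a synthetic landing region with blocks drawn from an empirical cumulative size-frequency distribution of the form N(>D) = k * D^-b; the coefficients, diameter range, and region size are assumptions, not the Phobos/Eros/Itokawa fits used in the paper.

```python
# Hedged sketch: synthetic block field from an assumed power-law size-frequency law.
import numpy as np

rng = np.random.default_rng(1)
area_m2 = 100.0 * 100.0                  # candidate landing region
k, b = 0.02, 2.5                         # assumed CSFD coefficients (N(>1 m) = k per m^2)
d_min, d_max = 0.2, 10.0                 # modelled block diameter range, m

n_blocks = int(area_m2 * k * d_min ** (-b))          # expected count above d_min
u = rng.random(n_blocks)                             # inverse-CDF sampling of the truncated power law
diam = (d_min ** (-b) - u * (d_min ** (-b) - d_max ** (-b))) ** (-1.0 / b)
x, y = rng.random(n_blocks) * 100.0, rng.random(n_blocks) * 100.0   # random block centres

print(f"placed {n_blocks} blocks; {int(np.sum(diam > 1.0))} exceed 1 m; largest {diam.max():.1f} m")
```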
Dose coverage calculation using a statistical shape model—applied to cervical cancer radiotherapy
NASA Astrophysics Data System (ADS)
Tilly, David; van de Schoot, Agustinus J. A. J.; Grusell, Erik; Bel, Arjan; Ahnesjö, Anders
2017-05-01
A comprehensive methodology for treatment simulation and evaluation of dose coverage probabilities is presented where a population-based statistical shape model (SSM) provides samples of fraction-specific patient geometry deformations. The learning data consist of vector fields from deformable image registration of repeated imaging giving intra-patient deformations which are mapped to an average patient serving as a common frame of reference. The SSM is created by extracting the most dominating eigenmodes through principal component analysis of the deformations from all patients. The sampling of a deformation is thus reduced to sampling weights for enough of the most dominating eigenmodes that describe the deformations. For the cervical cancer patient datasets in this work, we found seven eigenmodes to be sufficient to capture 90% of the variance in the deformations, and only three eigenmodes to be sufficient for stability in the simulated dose coverage probabilities. The normality assumption of the eigenmode weights was tested and found relevant for the 20 most dominating eigenmodes except for the first. Individualization of the SSM is demonstrated to be improved using two deformation samples from a new patient. The probabilistic evaluation provided additional information about the trade-offs compared to the conventional single dataset treatment planning.
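A minimal sketch of the SSM construction and sampling steps described above, using stand-in learning data: PCA (via SVD) extracts the dominant eigenmodes of the mapped deformation fields, and new deformations are simulated by sampling normally distributed eigenmode weights. Shapes, the variance cutoff, and the synthetic data are assumptions.

```python
# Hedged sketch: PCA-based statistical shape model of deformation fields and sampling.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_dofs, true_modes = 60, 3 * 5000, 5
basis = rng.normal(size=(true_modes, n_dofs))
X = rng.normal(size=(n_samples, true_modes)) @ basis        # stand-in learning deformations
X += 0.05 * rng.normal(size=(n_samples, n_dofs))            # small residual variability

mean_def = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean_def, full_matrices=False) # principal components
var = s**2 / (n_samples - 1)
n_modes = int(np.searchsorted(np.cumsum(var) / var.sum(), 0.90)) + 1   # ~90% of variance

def sample_deformation():
    w = rng.normal(size=n_modes) * np.sqrt(var[:n_modes])   # normally distributed mode weights
    return mean_def + w @ Vt[:n_modes]

print(f"retained {n_modes} eigenmodes; sampled field shape: {sample_deformation().shape}")
```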
Cigrang, J A; Todd, S L; Carbone, E G
2000-01-01
A significant proportion of people entering the military are discharged within the first 6 months of enlistment. Mental health related problems are often cited as the cause of discharge. This study evaluated the utility of stress inoculation training in helping reduce the attrition of a sample of Air Force trainees at risk for discharge from basic military training. Participants were 178 trainees referred for a psychological evaluation from basic training. Participants were randomly assigned to a 2-session stress management group or a usual-care control condition. Compared with past studies that used less rigorous methodology, this study did not find that exposure to stress management information increased the probability of graduating basic military training. Results are discussed in terms of possible reasons for the lack of treatment effects and directions for future research.
The Many Hazards of Trend Evaluation
NASA Astrophysics Data System (ADS)
Henebry, G. M.; de Beurs, K.; Zhang, X.; Kimball, J. S.; Small, C.
2014-12-01
Given the awareness in the scientific community of global scale drivers such as population growth, globalization, and climatic variation and change, many studies seek to identify temporal patterns in data that may be plausibly related to one or more aspect of global change. Here we explore two questions: "What constitutes a trend in a time series?" and "How can a trend be misinterpreted?" There are manifold hazards—both methodological and psychological—in detecting a trend, quantifying its magnitude, assessing its significance, identifying probable causes, and evaluating the implications of the trend. These hazards can combine to elevate the risk of misinterpreting the trend. In contrast, evaluation of multiple trends within a biogeophysical framework can attenuate the risk of misinterpretation. We review and illustrate these hazards and demonstrate the efficacy of an approach using multiple indicators detecting significant trends (MIDST) applied to time series of remote sensing data products.
On the use of Bayesian Monte-Carlo in evaluation of nuclear data
NASA Astrophysics Data System (ADS)
De Saint Jean, Cyrille; Archier, Pascal; Privas, Edwin; Noguere, Gilles
2017-09-01
As model parameters, necessary ingredients of theoretical models, are not always predicted by theory, a formal mathematical framework associated with the evaluation work is needed to obtain the best set of parameters (resonance parameters, optical models, fission barriers, average widths, multigroup cross sections) with Bayesian statistical inference by comparing theory to experiment. The formal rule behind this methodology is to estimate the posterior probability density function of a set of parameters by solving an equation of the following type: pdf(posterior) ∝ pdf(prior) × likelihood. A fitting procedure can thus be seen as an estimation of the posterior probability density of a set of parameters (referred to as the vector x), knowing prior information on these parameters and a likelihood which gives the probability density of observing a data set given x. To solve this problem, two major paths can be taken: add approximations and hypotheses to obtain an equation solved numerically (minimum of a cost function, i.e., the Generalized Least Squares method, referred to as GLS), or use Monte Carlo sampling of all prior distributions and estimate the final posterior distribution. Monte Carlo methods are a natural solution for Bayesian inference problems. They avoid the approximations of the traditional adjustment procedure based on chi-square minimization and offer alternatives in the choice of probability density distributions for priors and likelihoods. This paper proposes the use of what we call Bayesian Monte Carlo (referred to as BMC in the rest of the manuscript) over the whole energy range (thermal, resonance, and continuum) for all nuclear reaction models at these energies. Algorithms based on Monte Carlo sampling and Markov chains are presented. The objectives of BMC are to provide a reference calculation for validating the GLS calculations and approximations, to test the effects of probability density distributions, and to provide a framework for finding the global minimum when several local minima exist. Applications to resolved resonance, unresolved resonance, and continuum evaluations, as well as multigroup cross-section data assimilation, are presented.
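The BMC idea reduces to sampling the prior and weighting each sample by its likelihood. The toy sketch below applies this to an assumed two-parameter linear model with synthetic data, purely to show the mechanics; it is not a nuclear reaction model.

```python
# Hedged sketch: Bayesian Monte Carlo by prior sampling and likelihood weighting.
import numpy as np

rng = np.random.default_rng(3)
x_obs = np.linspace(0.0, 1.0, 10)
y_obs = 2.0 * x_obs + 1.0 + rng.normal(0.0, 0.1, x_obs.size)   # synthetic "experiment"
sigma = 0.1                                                     # assumed experimental uncertainty

n = 100_000
theta = rng.normal(loc=[0.0, 0.0], scale=[5.0, 5.0], size=(n, 2))   # prior samples (slope, intercept)
resid = y_obs - (theta[:, :1] * x_obs + theta[:, 1:])                # model-to-data residuals
loglike = -0.5 * np.sum((resid / sigma) ** 2, axis=1)
w = np.exp(loglike - loglike.max())
w /= w.sum()                                                         # posterior weights

post_mean = w @ theta
post_cov = (theta - post_mean).T @ ((theta - post_mean) * w[:, None])
print("posterior mean:", np.round(post_mean, 3))
print("posterior covariance:\n", np.round(post_cov, 5))
```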
A Bayesian Method for Evaluating and Discovering Disease Loci Associations
Jiang, Xia; Barmada, M. Michael; Cooper, Gregory F.; Becich, Michael J.
2011-01-01
Background A genome-wide association study (GWAS) typically involves examining representative SNPs in individuals from some population. A GWAS data set can concern a million SNPs and may soon concern billions. Researchers investigate the association of each SNP individually with a disease, and it is becoming increasingly commonplace to also analyze multi-SNP associations. Techniques for handling so many hypotheses include the Bonferroni correction and recently developed Bayesian methods. These methods can encounter problems. Most importantly, they are not applicable to a complex multi-locus hypothesis which has several competing hypotheses rather than only a null hypothesis. A method that computes the posterior probability of complex hypotheses is a pressing need. Methodology/Findings We introduce the Bayesian network posterior probability (BNPP) method which addresses the difficulties. The method represents the relationship between a disease and SNPs using a directed acyclic graph (DAG) model, and computes the likelihood of such models using a Bayesian network scoring criterion. The posterior probability of a hypothesis is computed based on the likelihoods of all competing hypotheses. The BNPP can not only be used to evaluate a hypothesis that has previously been discovered or suspected, but also to discover new disease loci associations. The results of experiments using simulated and real data sets are presented. Our results concerning simulated data sets indicate that the BNPP exhibits both better evaluation and discovery performance than does a p-value based method. For the real data sets, previous findings in the literature are confirmed and additional findings are found. Conclusions/Significance We conclude that the BNPP resolves a pressing problem by providing a way to compute the posterior probability of complex multi-locus hypotheses. A researcher can use the BNPP to determine the expected utility of investigating a hypothesis further. Furthermore, we conclude that the BNPP is a promising method for discovering disease loci associations. PMID:21853025
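A hedged sketch of the final step described above: converting model scores of a set of competing hypotheses into posterior probabilities. The hypothesis names, log-likelihood values, and uniform prior are invented for illustration; they are not output of the BNPP scoring criterion.

```python
# Hedged sketch: posterior probabilities over competing hypotheses from their scores.
import numpy as np

log_likelihood = {                       # hypothetical marginal log-likelihoods of DAG models
    "no association":       -1510.2,
    "SNP1 -> disease":      -1504.8,
    "SNP2 -> disease":      -1509.1,
    "SNP1,SNP2 -> disease": -1503.9,
}
prior = {h: 1.0 / len(log_likelihood) for h in log_likelihood}   # uniform prior (assumed)

logs = np.array([log_likelihood[h] + np.log(prior[h]) for h in log_likelihood])
post = np.exp(logs - logs.max())
post /= post.sum()
for h, p in zip(log_likelihood, post):
    print(f"P({h} | data) = {p:.3f}")
```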
A methodology for physically based rockfall hazard assessment
NASA Astrophysics Data System (ADS)
Crosta, G. B.; Agliardi, F.
Rockfall hazard assessment is not simple to achieve in practice, and sound, physically based assessment methodologies are still missing. The mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities with minimal runout. Rockfall hazard assessment involves complex definitions for "occurrence probability" and "intensity". This paper is an attempt to evaluate rockfall hazard using the results of 3-D numerical modelling on a topography described by a DEM. Maps portraying the maximum frequency of passages, velocity and height of blocks at each model cell are easily combined in a GIS in order to produce physically based rockfall hazard maps. Different methods are suggested and discussed for rockfall hazard mapping at a regional and local scale, both along linear features and within exposed areas. An objective approach based on three-dimensional matrixes providing both a positional "Rockfall Hazard Index" and a "Rockfall Hazard Vector" is presented. The opportunity of combining different parameters in the 3-D matrixes has been evaluated to better express the relative increase in hazard. Furthermore, the sensitivity of the hazard index with respect to the included variables and their combinations is preliminarily discussed in order to constrain assessment criteria that are as objective as possible.
Reliability and Maintainability Analysis of a High Air Pressure Compressor Facility
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Ring, Robert W.; Cole, Stuart K.
2013-01-01
This paper discusses a Reliability, Availability, and Maintainability (RAM) independent assessment conducted to support the refurbishment of the Compressor Station at the NASA Langley Research Center (LaRC). The paper discusses the methodologies used by the assessment team to derive the repair by replacement (RR) strategies to improve the reliability and availability of the Compressor Station (Ref.1). This includes a RAPTOR simulation model that was used to generate the statistical data analysis needed to derive a 15-year investment plan to support the refurbishment of the facility. To summarize, study results clearly indicate that the air compressors are well past their design life. The major failures of Compressors indicate that significant latent failure causes are present. Given the occurrence of these high-cost failures following compressor overhauls, future major failures should be anticipated if compressors are not replaced. Given the results from the RR analysis, the study team recommended a compressor replacement strategy. Based on the data analysis, the RR strategy will lead to sustainable operations through significant improvements in reliability, availability, and the probability of meeting the air demand with acceptable investment cost that should translate, in the long run, into major cost savings. For example, the probability of meeting air demand improved from 79.7 percent for the Base Case to 97.3 percent. Expressed in terms of a reduction in the probability of failing to meet demand (1 in 5 days to 1 in 37 days), the improvement is about 700 percent. Similarly, compressor replacement improved the operational availability of the facility from 97.5 percent to 99.8 percent. Expressed in terms of a reduction in system unavailability (1 in 40 to 1 in 500), the improvement is better than 1000 percent (an order of magnitude improvement). It is worthy to note that the methodologies, tools, and techniques used in the LaRC study can be used to evaluate similar high value equipment components and facilities. Also, lessons learned in data collection and maintenance practices derived from the observations, findings, and recommendations of the study are extremely important in the evaluation and sustainment of new compressor facilities.
Toward uniform probabilistic seismic hazard assessments for Southeast Asia
NASA Astrophysics Data System (ADS)
Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.
2017-12-01
Although most Southeast Asian countries have seismic hazard maps, differing methodologies and data quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases implemented for national seismic hazard maps. Another crucial input for seismic hazard assessment is proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparison of instrumental observations and felt intensities for recent earthquakes with predicted ground shaking from published GMPEs. We then incorporate the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on the database and proper GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near those faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates the important fact that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony then is that in areas of low to moderate probability, where building codes usually provide less seismic resilience, seismic risk is likely to be greater. Infrastructural damage in East Malaysia during the 2015 Sabah earthquake offers a case in point.
Application of tolerance limits to the characterization of image registration performance.
Fedorov, Andriy; Wells, William M; Kikinis, Ron; Tempany, Clare M; Vangel, Mark G
2014-07-01
Deformable image registration is used increasingly in image-guided interventions and other applications. However, validation and characterization of registration performance remain areas that require further study. We propose an analysis methodology for deriving tolerance limits on the initial conditions for deformable registration that reliably lead to a successful registration. This approach results in a concise summary of the probability of registration failure, while accounting for the variability in the test data. The (β, γ) tolerance limit can be interpreted as a value of the input parameter that leads to a successful registration outcome in at least 100β% of cases with 100γ% confidence. The utility of the methodology is illustrated by summarizing the performance of a deformable registration algorithm evaluated in three different experimental setups of increasing complexity. Our examples are based on clinical data collected during MRI-guided prostate biopsy, registered using a publicly available deformable registration tool. The results indicate that the proposed methodology can be used to generate concise graphical summaries of the experiments, as well as a probabilistic estimate of the registration outcome for a future sample. Its use may facilitate improved objective assessment, comparison and retrospective stress-testing of deformable registration.
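A hedged sketch of how such a (β, γ) tolerance limit might be located from binary success/failure outcomes: take the largest value of the input parameter (here, initial misregistration) for which a one-sided 100γ% lower confidence bound on the success proportion still reaches β. The data are simulated, and the Clopper-Pearson bound is one reasonable choice rather than necessarily the authors' procedure.

```python
# Hedged sketch: locating a (beta, gamma) tolerance limit on an initial condition.
import numpy as np
from scipy.stats import beta as beta_dist

def lower_confidence_bound(successes, trials, gamma):
    """One-sided Clopper-Pearson lower bound on a success proportion."""
    if successes == 0:
        return 0.0
    if successes == trials:
        return (1.0 - gamma) ** (1.0 / trials)
    return beta_dist.ppf(1.0 - gamma, successes, trials - successes + 1)

rng = np.random.default_rng(7)
misreg = rng.uniform(0.0, 30.0, 200)                                # initial misregistration, mm
success = rng.random(200) < np.clip(1.2 - misreg / 25.0, 0.0, 1.0)  # synthetic outcomes

b, g = 0.90, 0.95
limit = 0.0
for thr in np.sort(misreg):
    mask = misreg <= thr
    if lower_confidence_bound(int(success[mask].sum()), int(mask.sum()), g) >= b:
        limit = thr
print(f"(beta={b}, gamma={g}) tolerance limit on initial misregistration: {limit:.1f} mm")
```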
A risk assessment method for multi-site damage
NASA Astrophysics Data System (ADS)
Millwater, Harry Russell, Jr.
This research focused on developing probabilistic methods suitable for computing small probabilities of failure, e.g., 10^-6, of structures subject to multi-site damage (MSD). MSD is defined as the simultaneous development of fatigue cracks at multiple sites in the same structural element such that the fatigue cracks may coalesce to form one large crack. MSD is modeled as an array of collinear cracks with random initial crack lengths with the centers of the initial cracks spaced uniformly apart. The data used was chosen to be representative of aluminum structures. The structure is considered failed whenever any two adjacent cracks link up. A fatigue computer model is developed that can accurately and efficiently grow a collinear array of arbitrary length cracks from initial size until failure. An algorithm is developed to compute the stress intensity factors of all cracks considering all interaction effects. The probability of failure of two to 100 cracks is studied. Lower bounds on the probability of failure are developed based upon the probability of the largest crack exceeding a critical crack size. The critical crack size is based on the initial crack size that will grow across the ligament when the neighboring crack has zero length. The probability is evaluated using extreme value theory. An upper bound is based on the probability of the maximum sum of initial cracks being greater than a critical crack size. A weakest link sampling approach is developed that can accurately and efficiently compute small probabilities of failure. This methodology is based on predicting the weakest link, i.e., the two cracks to link up first, for a realization of initial crack sizes, and computing the cycles-to-failure using these two cracks. Criteria to determine the weakest link are discussed. Probability results using the weakest link sampling method are compared to Monte Carlo-based benchmark results. The results indicate that very small probabilities can be computed accurately in a few minutes using a Hewlett-Packard workstation.
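A heavily simplified, hedged illustration of the weakest-link sampling idea (not the dissertation's stress-intensity-based fatigue model): for each realization of random initial crack sizes, the adjacent pair with the largest combined length is taken as the predicted weakest link and grown with a toy power-law rule to estimate the probability of link-up before a design life. Every constant below is a placeholder.

```python
# Hedged toy model: weakest-link sampling for a collinear crack array.
import numpy as np

rng = np.random.default_rng(11)
trials, n_cracks, pitch = 20_000, 20, 25.0     # realizations, cracks per array, centre spacing (mm)
design_life = 5_000                            # cycles
C, m = 1e-5, 3.0                               # toy growth-law constants (assumed)

a0 = rng.lognormal(mean=np.log(0.8), sigma=0.5, size=(trials, n_cracks))  # initial crack lengths, mm
pair_sum = a0[:, :-1] + a0[:, 1:]
weakest = pair_sum.max(axis=1)                 # predicted weakest link in each realization

# Closed-form life of the toy law da/dN = C * a**m, growing the pair's combined length
# from its initial value to the spacing at which the two cracks link up.
life = (weakest ** (1 - m) - pitch ** (1 - m)) / (C * (m - 1))
print(f"estimated P(link-up before {design_life} cycles) ~ {np.mean(life < design_life):.4f}")
```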
Willems, Sjw; Schat, A; van Noorden, M S; Fiocco, M
2018-02-01
Censored data make survival analysis more complicated because exact event times are not observed. Statistical methodology developed to account for censored observations assumes that patients' withdrawal from a study is independent of the event of interest. However, in practice, some covariates might be associated to both lifetime and censoring mechanism, inducing dependent censoring. In this case, standard survival techniques, like Kaplan-Meier estimator, give biased results. The inverse probability censoring weighted estimator was developed to correct for bias due to dependent censoring. In this article, we explore the use of inverse probability censoring weighting methodology and describe why it is effective in removing the bias. Since implementing this method is highly time consuming and requires programming and mathematical skills, we propose a user friendly algorithm in R. Applications to a toy example and to a medical data set illustrate how the algorithm works. A simulation study was carried out to investigate the performance of the inverse probability censoring weighted estimators in situations where dependent censoring is present in the data. In the simulation process, different sample sizes, strengths of the censoring model, and percentages of censored individuals were chosen. Results show that in each scenario inverse probability censoring weighting reduces the bias induced in the traditional Kaplan-Meier approach where dependent censoring is ignored.
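A minimal sketch of the IPCW mechanics: estimate the censoring survival curve G(t) (here marginally via Kaplan-Meier applied to the censoring indicator; under dependent censoring G would be modelled on covariates) and weight each observed event by 1/G(t-) when estimating the event-time distribution. The data are simulated with independent censoring purely to show the bookkeeping, and are not the article's example or R implementation.

```python
# Hedged sketch: inverse probability of censoring weighted estimate of P(event by t).
import numpy as np

rng = np.random.default_rng(5)
n = 500
event_time = rng.exponential(10.0, n)
cens_time = rng.exponential(15.0, n)
t_obs = np.minimum(event_time, cens_time)
is_event = event_time <= cens_time

def censoring_survival_steps(times, is_censored):
    """Kaplan-Meier curve of the censoring distribution (censorings treated as events)."""
    steps, s = [], 1.0
    for u in np.unique(times[is_censored]):
        d = np.sum((times == u) & is_censored)
        r = np.sum(times >= u)
        s *= 1.0 - d / r
        steps.append((u, s))
    return steps

def G_just_before(steps, t):
    """Evaluate the censoring survival step function at t- (just before t)."""
    s = 1.0
    for u, sv in steps:
        if u < t:
            s = sv
        else:
            break
    return s

steps = censoring_survival_steps(t_obs, ~is_event)
G_minus = np.array([G_just_before(steps, t) for t in t_obs])
weights = np.where(is_event, 1.0 / np.clip(G_minus, 1e-3, None), 0.0)   # IPCW weights

for t in (5.0, 10.0, 20.0):
    F_hat = np.mean(weights * (t_obs <= t))          # weighted estimate of P(event by t)
    print(f"t={t:5.1f}: IPCW F(t) = {F_hat:.3f}   true F(t) = {1 - np.exp(-t / 10.0):.3f}")
```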
Improving default risk prediction using Bayesian model uncertainty techniques.
Kazemi, Reza; Mosleh, Ali
2012-11-01
Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.
Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D
2016-08-31
The evaluation and interpretation of forensic DNA mixture evidence face greater interpretational challenges due to increasingly complex mixture evidence. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. The most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE); exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
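The CPI/CPE arithmetic itself is compact: at each locus the probability of inclusion is the square of the summed population frequencies of the alleles observed in the mixture, and loci are combined by multiplication, with CPE = 1 - CPI. The loci and allele frequencies below are invented for illustration.

```python
# Minimal sketch: Combined Probability of Inclusion/Exclusion from per-locus allele frequencies.
mixture = {                                             # alleles observed in the mixture and their
    "D8S1179": {"12": 0.14, "13": 0.30, "14": 0.22},    # (invented) population frequencies
    "D21S11":  {"28": 0.16, "29": 0.20, "30": 0.26, "31": 0.07},
    "FGA":     {"21": 0.17, "22": 0.19, "24": 0.14},
}

cpi = 1.0
for locus, freqs in mixture.items():
    pi_locus = sum(freqs.values()) ** 2                 # probability of inclusion at this locus
    cpi *= pi_locus
    print(f"{locus}: PI = {pi_locus:.4f}")
print(f"CPI = {cpi:.6f}  ->  CPE = {1 - cpi:.6f}")
```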
NASA Astrophysics Data System (ADS)
Pereira, Robson A.; Anconi, Cleber P. A.; Nascimento, Clebio S.; De Almeida, Wagner B.; Dos Santos, Hélio F.
2015-07-01
The present letter reports results from a comprehensive theoretical analysis of the inclusion process involving 2,4-dichlorophenoxyacetic acid (2,4-D) and β-cyclodextrin (β-CD) for which the experimental data of formation is available. Spatial arrangement and stabilization energies were evaluated in gas phase and aqueous solution through density functional theory (DFT) and through the use of SMD implicit solvation approach. The discussed methodology was applied to predict the stability and identify the most favorable form (deprotonated or neutral) as well as the most probable spatial arrangement of the studied inclusion compound.
Tsunami hazard assessments with consideration of uncertain earthquakes characteristics
NASA Astrophysics Data System (ADS)
Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.
2017-12-01
The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, it must adopt an uncertainty propagation method to determine tsunami uncertainties with a feasible computational cost. In this study we propose a new methodology, which improves the existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics, the slip distribution and location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by Le Veque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution, by avoiding post sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied on a real case. We study tsunamis generated at the site of the 2014 Chilean earthquake. We generate earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates consistent earthquake samples with respect to the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
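A hedged sketch of the Karhunen-Loeve sampling step, reduced to a 1-D fault discretization with an assumed exponential correlation kernel; the rupture geometry, marginal distributions, and translation process of the actual methodology are not reproduced, and all statistics are invented.

```python
# Hedged sketch: K-L expansion sampling of a random slip distribution along strike.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 300.0, 150)              # along-strike position, km
corr_len, sigma, mean_slip = 40.0, 1.5, 3.0   # assumed correlation length (km), std and mean slip (m)

C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # covariance matrix
lam, phi = np.linalg.eigh(C)
order = np.argsort(lam)[::-1]
lam, phi = lam[order], phi[:, order]

n_modes = int(np.searchsorted(np.cumsum(lam) / lam.sum(), 0.95)) + 1  # keep ~95% of the variance
z = rng.normal(size=n_modes)                                          # K-L weights
slip = mean_slip + phi[:, :n_modes] @ (np.sqrt(lam[:n_modes]) * z)
slip = np.clip(slip, 0.0, None)   # crude positivity step; the paper instead uses a translation process
print(f"retained {n_modes} K-L modes; mean slip of this sample = {slip.mean():.2f} m")
```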
Increasing accuracy in the assessment of motion sickness: A construct methodology
NASA Technical Reports Server (NTRS)
Stout, Cynthia S.; Cowings, Patricia S.
1993-01-01
The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that when applied will probably resolve some of these problems are described in detail.
A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis
2012-01-01
probability distribution for the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre polynomials for uniformly distributed parameters) ... parameters and windfields will drive our simulations. We will use uncertainty quantification methodology (polynomial chaos quadrature) in combination with data integration to complete the DDDAS loop.
Systematic analysis of EOS data system for operations
NASA Technical Reports Server (NTRS)
Moe, K. L.; Dasgupta, R.
1985-01-01
A data management analysis methodology is being proposed. The objective of the methodology is to assist mission managers by identifying a series of ordered activities to be systematically followed in order to arrive at an effective ground system design. Existing system engineering tools and concepts have been assembled into a structured framework to facilitate the work of a mission planner. It is intended that this methodology can be gainfully applied (with probable modifications and/or changes) to the EOS payloads and their associated data systems.
A design methodology for nonlinear systems containing parameter uncertainty
NASA Technical Reports Server (NTRS)
Young, G. E.; Auslander, D. M.
1983-01-01
In the present design methodology for nonlinear systems containing parameter uncertainty, a generalized sensitivity analysis is incorporated which employs parameter space sampling and statistical inference. For the case of a system with j adjustable and k nonadjustable parameters, this methodology (which includes an adaptive random search strategy) is used to determine the combination of j adjustable parameter values which maximize the probability of those performance indices which simultaneously satisfy design criteria in spite of the uncertainty due to k nonadjustable parameters.
Field, Edward H.
2015-01-01
A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
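For orientation, a conditional renewal probability of the kind involved can be computed from a Brownian passage time (inverse Gaussian) model given a mean recurrence interval, an aperiodicity, and the elapsed time since the last event; the parameter values below are illustrative, and the along-fault averaging that defines the methodology is omitted.

```python
# Hedged sketch: BPT conditional rupture probability for one renewal element.
from scipy.stats import invgauss

def bpt_conditional_probability(mean_ri, alpha, t_elapsed, dt):
    # BPT(mean=mean_ri, aperiodicity=alpha) corresponds to scipy's invgauss with
    # shape parameter alpha**2 and scale mean_ri / alpha**2 (mean = mean_ri, CV = alpha).
    dist = invgauss(alpha**2, scale=mean_ri / alpha**2)
    return (dist.sf(t_elapsed) - dist.sf(t_elapsed + dt)) / dist.sf(t_elapsed)

mean_ri, alpha = 180.0, 0.5              # years and dimensionless aperiodicity (assumed)
for t in (50.0, 150.0, 250.0):
    p = bpt_conditional_probability(mean_ri, alpha, t, 30.0)
    print(f"elapsed {t:5.1f} yr: P(rupture in next 30 yr) = {p:.3f}")
```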
Analyzing seasonal patterns of wildfire exposure factors in Sardinia, Italy.
Salis, Michele; Ager, Alan A; Alcasena, Fermin J; Arca, Bachisio; Finney, Mark A; Pellizzaro, Grazia; Spano, Donatella
2015-01-01
In this paper, we applied landscape scale wildfire simulation modeling to explore the spatiotemporal patterns of wildfire likelihood and intensity in the island of Sardinia (Italy). We also performed wildfire exposure analysis for selected highly valued resources on the island to identify areas characterized by high risk. We observed substantial variation in burn probability, fire size, and flame length among time periods within the fire season, which starts in early June and ends in late September. Peak burn probability and flame length were observed in late July. We found that patterns of wildfire likelihood and intensity were mainly related to spatiotemporal variation in ignition locations, fuel moisture, and wind vectors. Our modeling approach allowed consideration of historical patterns of winds, ignition locations, and live and dead fuel moisture on fire exposure factors. The methodology proposed can be useful for analyzing potential wildfire risk and effects at landscape scale, evaluating historical changes and future trends in wildfire exposure, as well as for addressing and informing fuel management and risk mitigation issues.
Optimal Mission Abort Policy for Systems Operating in a Random Environment.
Levitin, Gregory; Finkelstein, Maxim
2018-04-01
Many real-world critical systems, e.g., aircraft, manned space flight systems, and submarines, utilize mission aborts to enhance their survivability. Specifically, a mission can be aborted when a certain malfunction condition is met and a rescue or recovery procedure is then initiated. For systems exposed to external impacts, the malfunctions are often caused by the consequences of these impacts. Traditional system reliability models typically cannot address a possibility of mission aborts. Therefore, in this article, we first develop the corresponding methodology for modeling and evaluation of the mission success probability and survivability of systems experiencing both internal failures and external shocks. We consider a policy when a mission is aborted and a rescue procedure is activated upon occurrence of the mth shock. We demonstrate the tradeoff between the system survivability and the mission success probability that should be balanced by the proper choice of the decision variable m. A detailed illustrative example of a mission performed by an unmanned aerial vehicle is presented. © 2017 Society for Risk Analysis.
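A toy Monte Carlo (not the paper's analytical model) makes the trade-off concrete: shocks arrive as a Poisson process during the mission, each shock destroys the vehicle with some probability, and aborting on the m-th shock replaces the remaining mission with a shorter rescue leg that is still exposed to shocks. All parameter values are invented.

```python
# Hedged toy simulation: mission success vs. system survival as a function of the abort threshold m.
import numpy as np

rng = np.random.default_rng(9)
tau, tau_rescue, lam, q = 10.0, 3.0, 0.4, 0.12   # mission/rescue durations, shock rate, kill probability

def simulate(m, trials=50_000):
    success = survive = 0
    for _ in range(trials):
        t, shocks, alive, aborted = 0.0, 0, True, False
        horizon = tau
        while True:
            t += rng.exponential(1.0 / lam)
            if t > horizon:
                break                          # no further shocks before the end of the current leg
            shocks += 1
            if rng.random() < q:
                alive = False                  # this shock destroys the vehicle
                break
            if not aborted and shocks >= m:
                aborted = True
                horizon = t + tau_rescue       # abort: only the rescue leg remains exposed
        if alive:
            survive += 1
            if not aborted:
                success += 1                   # completed the primary mission
    return success / trials, survive / trials

for m in (1, 2, 3, 99):
    ps, pv = simulate(m)
    print(f"abort on shock m={m:2d}: P(mission success) = {ps:.3f}   P(system survives) = {pv:.3f}")
```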
Charpentier, Ronald R.; Moore, Thomas E.; Gautier, D.L.
2017-11-15
The methodological procedures used in the geologic assessments of the 2008 Circum-Arctic Resource Appraisal (CARA) were based largely on the methodology developed for the 2000 U.S. Geological Survey World Petroleum Assessment. The main variables were probability distributions for numbers and sizes of undiscovered accumulations with an associated risk of occurrence. The CARA methodology expanded on the previous methodology in providing additional tools and procedures more applicable to the many Arctic basins that have little or no exploration history. Most importantly, geologic analogs from a database constructed for this study were used in many of the assessments to constrain numbers and sizes of undiscovered oil and gas accumulations.
Probabilistic design of fibre concrete structures
NASA Astrophysics Data System (ADS)
Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.
2017-09-01
Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing, and shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, with or without prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness, and ductility are described in the paper. Since the variability of fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability, and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty or randomness of the material properties, obtained from material tests, is accounted for in the random distributions. Furthermore, degradation of reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented methodology is illustrated with results from two probabilistic studies of different types of concrete structures related to practical applications and made from various materials (with parameters obtained from real material tests).
Karnon, Jonathan; Caffrey, Orla; Pham, Clarabelle; Grieve, Richard; Ben-Tovim, David; Hakendorf, Paul; Crotty, Maria
2013-06-01
Cost-effectiveness analysis is well established for pharmaceuticals and medical technologies but not for evaluating variations in clinical practice. This paper describes a novel methodology--risk adjusted cost-effectiveness (RAC-E)--that facilitates the comparative evaluation of applied clinical practice processes. In this application, risk adjustment is undertaken with a multivariate matching algorithm that balances the baseline characteristics of patients attending different settings (e.g., hospitals). Linked, routinely collected data are used to analyse patient-level costs and outcomes over a 2-year period, as well as to extrapolate costs and survival over patient lifetimes. The study reports the relative cost-effectiveness of alternative forms of clinical practice, including a full representation of the statistical uncertainty around the mean estimates. The methodology is illustrated by a case study that evaluates the relative cost-effectiveness of services for patients presenting with acute chest pain across the four main public hospitals in South Australia. The evaluation finds that services provided at two hospitals were dominated, and of the remaining services, the more effective hospital gained life years at a low mean additional cost and had an 80% probability of being the most cost-effective hospital at realistic cost-effectiveness thresholds. Potential determinants of the estimated variation in costs and effects were identified, although more detailed analyses to identify specific areas of variation in clinical practice are required to inform improvements at the less cost-effective institutions. Copyright © 2012 John Wiley & Sons, Ltd.
Evaluation of the National Weather Service Extreme Cold Warning Experiment in North Dakota
Chiu, Cindy H.; Vagi, Sara J.; Wolkin, Amy F.; Martin, John Paul; Noe, Rebecca S.
2016-01-01
Dangerously cold weather threatens life and property. During periods of extreme cold due to wind chill, the National Weather Service (NWS) issues wind chill warnings to prompt the public to take action to mitigate risks. Wind chill warnings are based on ambient temperatures and wind speeds. Since 2010, NWS has piloted a new extreme cold warning issued for cold temperatures in wind and nonwind conditions. The North Dakota Department of Health, NWS, and the Centers for Disease Control and Prevention collaborated in conducting household surveys in Burleigh County, North Dakota, to evaluate this new warning. The objectives of the evaluation were to assess whether residents heard the new warning and to determine if protective behaviors were prompted by the warning. This was a cross-sectional survey design using the Community Assessment for Public Health Emergency Response (CASPER) methodology to select a statistically representative sample of households from Burleigh County. From 10 to 11 April 2012, 188 door-to-door household interviews were completed. The CASPER methodology uses probability sampling with weighted analysis to estimate the number and percentage of households with a specific response within Burleigh County. The majority of households reported having heard both the extreme cold and wind chill warnings, and both warnings prompted protective behaviors. These results suggest this community heard the new warning and took protective actions after hearing the warning. PMID:27239260
A New Method for Generating Probability Tables in the Unresolved Resonance Region
Holcomb, Andrew M.; Leal, Luiz C.; Rahnema, Farzad; ...
2017-04-18
A new method for constructing probability tables in the unresolved resonance region (URR) has been developed. This new methodology is an extensive modification of the single-level Breit-Wigner (SLBW) pseudo-resonance pair sequence method commonly used to generate probability tables in the URR. The new method uses a Monte Carlo process to generate many pseudo-resonance sequences by first sampling the average resonance parameter data in the URR and then converting the sampled resonance parameters to the more robust R-matrix limited (RML) format. Furthermore, for each sampled set of pseudo-resonance sequences, the temperature-dependent cross sections are reconstructed on a small grid around the energy of reference using the Reich-Moore formalism and the Leal-Hwang Doppler broadening methodology. We then use the effective cross sections calculated at the energies of reference to construct probability tables in the URR. The RML cross-section reconstruction algorithm has been rigorously tested for a variety of isotopes, including 16O, 19F, 35Cl, 56Fe, 63Cu, and 65Cu. The new URR method also produced normalized cross-section factor probability tables for 238U that were found to be in agreement with current standards. The modified 238U probability tables were shown to produce results in excellent agreement with several standard benchmarks, including the IEU-MET-FAST-007 (BIG TEN), IEU-MET-FAST-003, and IEU-COMP-FAST-004 benchmarks.
Rediscovery of Good-Turing estimators via Bayesian nonparametrics.
Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye
2016-03-01
The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, designs of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, which is a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library. © 2015, The International Biometric Society.
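The Good-Turing discovery probability at the centre of this comparison has a one-line form: the probability that the next observation is a previously unseen species is estimated by the fraction of singletons, m1/n. A toy sample of species labels illustrates it.

```python
# Minimal sketch: Good-Turing estimate of the discovery probability.
from collections import Counter

sample = list("AABACDDEAABBFABAGAAH")                 # toy "species" labels
counts = Counter(sample)
n = len(sample)
m1 = sum(1 for c in counts.values() if c == 1)        # species observed exactly once

print(f"n = {n}, singletons m1 = {m1}, Good-Turing discovery probability = {m1 / n:.3f}")
```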
Application of a Probabilistic Sizing Methodology for Ceramic Structures
NASA Astrophysics Data System (ADS)
Rancurel, Michael; Behar-Lafenetre, Stephanie; Cornillon, Laurence; Leroy, Francois-Henri; Coe, Graham; Laine, Benoit
2012-07-01
Ceramics are increasingly used in the space industry to take advantage of their stability and high specific stiffness. Their brittle behaviour often leads to sizing them by increasing the safety factors applied to the maximum stresses, which results in oversized structures. This is inconsistent with the major driver in space architecture, the mass criterion. This paper presents a methodology to size ceramic structures based on their failure probability. From failure tests on samples, the Weibull law which characterizes the strength distribution of the material is obtained. An A-value (Q0.0195%) and a B-value (Q0.195%) are then assessed to take into account the limited number of samples. A knocked-down Weibull law that interpolates the A- and B-values is also obtained. From these two laws, a most-likely and a knocked-down prediction of failure probability are computed for complex ceramic structures. The application of this methodology and its validation by test are reported in the paper.
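A hedged sketch of the two-parameter Weibull strength law underlying this sizing logic, P_f = 1 - exp(-(sigma/sigma0)^m); the knocked-down curve is mimicked simply by lowering the scale parameter, and all parameter values are invented rather than taken from the paper's test data.

```python
# Hedged sketch: Weibull failure probability for a most-likely and a knocked-down law.
import numpy as np

m, sigma0 = 12.0, 300.0          # Weibull modulus and scale (MPa), most-likely fit (assumed)
sigma0_kd = 270.0                # assumed knocked-down scale reflecting the limited sample size

def weibull_pf(sigma, m, sigma0):
    return 1.0 - np.exp(-(np.asarray(sigma, dtype=float) / sigma0) ** m)

for s in (150.0, 200.0, 250.0):
    print(f"sigma = {s:5.1f} MPa: most-likely P_f = {weibull_pf(s, m, sigma0):.2e}, "
          f"knocked-down P_f = {weibull_pf(s, m, sigma0_kd):.2e}")
```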
Interplanetary approach optical navigation with applications
NASA Technical Reports Server (NTRS)
Jerath, N.
1978-01-01
The use of optical data from onboard television cameras for the navigation of interplanetary spacecraft during the planet approach phase is investigated. Three optical data types were studied: the planet limb with auxiliary celestial references, the satellite-star, and the planet-star two-camera methods. Analysis and modelling issues related to the nature and information content of the optical methods were examined. Dynamic and measurement system modelling, data sequence design, measurement extraction, model estimation and orbit determination, as relating optical navigation, are discussed, and the various error sources were analyzed. The methodology developed was applied to the Mariner 9 and the Viking Mars missions. Navigation accuracies were evaluated at the control and knowledge points, with particular emphasis devoted to the combined use of radio and optical data. A parametric probability analysis technique was developed to evaluate navigation performance as a function of system reliabilities.
Scott Pippin; Shana Jones; Cassandra Johnson Gaither
2017-01-01
This report presents a methodology for identifying land parcels that have an increased probability of being heirs property. Heirs property is inherited land passed to successive generations intestate, without clear title, typically to family members. This land ownership type is widespread among rural, African-American populations and is also thought to be pervasive in...
Probabilistic Based Modeling and Simulation Assessment
2010-06-01
different crash and blast scenarios. With the integration of the high fidelity neck and head model, a methodology to calculate the probability of injury...variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment impact, and...first area of focus is to develop a methodology to integrate probabilistic analysis into finite element analysis of vehicle collisions and blast. The
Nuclear Targeting Terms for Engineers and Scientists
DOE Office of Scientific and Technical Information (OSTI.GOV)
St Ledger, John W.
The Department of Defense has a methodology for targeting nuclear weapons, and a jargon that is used to communicate between the analysts, planners, aircrews, and missile crews. The typical engineer or scientist in the Department of Energy may not have been exposed to the nuclear weapons targeting terms and methods. This report provides an introduction to the terms and methodologies used for nuclear targeting. Its purpose is to prepare engineers and scientists to participate in wargames, exercises, and discussions with the Department of Defense. Terms such as Circular Error Probable, probability of hit and damage, damage expectancy, and the physical vulnerability system are discussed. Methods for compounding damage from multiple weapons applied to one target are presented.
Assessment of precursory information in seismo-electromagnetic phenomena
NASA Astrophysics Data System (ADS)
Han, P.; Hattori, K.; Zhuang, J.
2017-12-01
Previous statistical studies showed that there were correlations between seismo-electromagnetic phenomena and sizeable earthquakes in Japan. In this study, utilizing Molchan's error diagram, we evaluate whether these phenomena contain precursory information and discuss how they can be used in short-term forecasting of large earthquake events. In practice, for given series of precursory signals and related earthquake events, each prediction strategy is characterized by the leading time of alarms, the length of alarm window, the alarm radius (area) and magnitude. The leading time is the time length between a detected anomaly and its following alarm, and the alarm window is the duration that an alarm lasts. The alarm radius and magnitude are maximum predictable distance and minimum predictable magnitude of earthquake events, respectively. We introduce the modified probability gain (PG') and the probability difference (D') to quantify the forecasting performance and to explore the optimal prediction parameters for a given electromagnetic observation. The above methodology is firstly applied to ULF magnetic data and GPS-TEC data. The results show that the earthquake predictions based on electromagnetic anomalies are significantly better than random guesses, indicating the data contain potential useful precursory information. Meanwhile, we reveal the optimal prediction parameters for both observations. The methodology proposed in this study could be also applied to other pre-earthquake phenomena to find out whether there is precursory information, and then on this base explore the optimal alarm parameters in practical short-term forecast.
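The forecast-evaluation quantities involved can be sketched as follows: for a given alarm strategy, the Molchan error diagram compares the miss rate nu with the fraction tau of space-time occupied by alarms, and the probability gain relative to random guessing is G = (1 - nu)/tau. The exact definitions of PG' and D' in the study involve further modifications; this is only the unmodified textbook version with invented toy numbers.

    def molchan_point(n_events, n_hits, total_time, alarm_time):
        """Return (miss rate nu, alarm fraction tau, probability gain, probability difference)."""
        nu = 1.0 - n_hits / n_events        # fraction of target events missed
        tau = alarm_time / total_time       # fraction of time covered by alarms
        gain = (1.0 - nu) / tau             # >1 means better than random guessing
        diff = (1.0 - nu) - tau             # simple probability-difference analogue
        return nu, tau, gain, diff

    # Toy numbers: 20 target earthquakes, 14 fell inside alarms covering 15% of the period
    print(molchan_point(n_events=20, n_hits=14, total_time=3650.0, alarm_time=547.5))
    # -> nu = 0.30, tau = 0.15, gain about 4.7, difference 0.55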
Will current probabilistic climate change information, as such, improve adaptation?
NASA Astrophysics Data System (ADS)
Lopez, A.; Smith, L. A.
2012-04-01
Probabilistic climate scenarios are currently being provided to end users, to employ as probabilities in adaptation decision making, with the explicit suggestion that they quantify the impacts of climate change relevant to a variety of sectors. These "probabilities" are, however, rather sensitive to the assumptions in, and the structure of, the modelling approaches used to generate them. It is often argued that stakeholders require probabilistic climate change information to adequately evaluate and plan adaptation pathways. On the other hand, some circumstantial evidence suggests that on-the-ground decision making rarely uses well-defined probability distributions of climate change as inputs. Nevertheless, it is within this context of probability distributions of climate change that we discuss possible drawbacks of supplying information that, while presented as robust and decision relevant, is in fact unlikely to be so due to known flaws both in the underlying models and in the methodology used to "account for" those known flaws. How might one use a probability forecast that is expected to change in the future, not due to a refinement in our information but due to fundamental flaws in its construction? What then are the alternatives? While the answer will depend on the context of the problem at hand, a good approach will be strongly informed by the timescale of the given planning decision, and the consideration of all the non-climatic factors that have to be taken into account in the corresponding risk assessment. Using a water resources system as an example, we illustrate an alternative approach to deal with these challenges and make robust adaptation decisions today.
Le Moual, Nicole; Zock, Jan-Paul; Dumas, Orianne; Lytras, Theodore; Andersson, Eva; Lillienberg, Linnéa; Schlünssen, Vivi; Benke, Geza; Kromhout, Hans
2018-07-01
We aimed to update an asthmagen job exposure matrix (JEM) developed in the late 1990s. Main reasons were: the number of suspected and recognised asthmagens has since tripled; understanding of the aetiological role of irritants in asthma and methodological insights in application of JEMs have emerged in the period. For each agent of the new occupational asthma-specific JEM (OAsJEM), a working group of three experts out of eight evaluated exposure for each International Standard Classification of Occupations, 1988 (ISCO-88) job code into three categories: 'high' (high probability of exposure and moderate-to-high intensity), 'medium' (low-to-moderate probability or low intensity) and 'unexposed'. Within a working group, experts evaluated exposures independently from each other. If expert assessments were inconsistent the final decision was taken by consensus. Specificity was favoured over sensitivity, that is, jobs were classified with high exposure only if the probability of exposure was high and the intensity moderate-to-high. In the final review, all experts checked assigned exposures and proposed/improved recommendations for expert re-evaluation after default application of the JEM. The OAsJEM covers exposures to 30 sensitisers/irritants, including 12 newly recognised, classified into seven broad groups. Initial agreement between the three experts was mostly fair to moderate (κ values 0.2-0.5). Out of 506 ISCO-88 codes, the majority was classified as unexposed (from 82.6% (organic solvents) to 99.8% (persulfates)) and a minority as 'high-exposed' (0.2% (persulfates) to 2.6% (organic solvents)). The OAsJEM developed to improve occupational exposure assessment may improve evaluations of associations with asthma in epidemiological studies and contribute to assessment of the burden of work-related asthma. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Astrophysics Data System (ADS)
Tan, Elcin
A new physically-based methodology for probable maximum precipitation (PMP) estimation is developed over the American River Watershed (ARW) using the Weather Research and Forecast (WRF-ARW) model. A persistent moisture flux convergence pattern, called Pineapple Express, is analyzed for 42 historical extreme precipitation events, and it is found that Pineapple Express causes extreme precipitation over the basin of interest. An average correlation between moisture flux convergence and maximum precipitation is estimated as 0.71 for 42 events. The performance of the WRF model is verified for precipitation by means of calibration and independent validation of the model. The calibration procedure is performed only for the first ranked flood event 1997 case, whereas the WRF model is validated for 42 historical cases. Three nested model domains are set up with horizontal resolutions of 27 km, 9 km, and 3 km over the basin of interest. As a result of Chi-square goodness-of-fit tests, the hypothesis that "the WRF model can be used in the determination of PMP over the ARW for both areal average and point estimates" is accepted at the 5% level of significance. The sensitivities of model physics options on precipitation are determined using 28 microphysics, atmospheric boundary layer, and cumulus parameterization schemes combinations. It is concluded that the best triplet option is Thompson microphysics, Grell 3D ensemble cumulus, and YSU boundary layer (TGY), based on 42 historical cases, and this TGY triplet is used for all analyses of this research. Four techniques are proposed to evaluate physically possible maximum precipitation using the WRF: 1. Perturbations of atmospheric conditions; 2. Shift in atmospheric conditions; 3. Replacement of atmospheric conditions among historical events; and 4. Thermodynamically possible worst-case scenario creation. Moreover, climate change effect on precipitation is discussed by emphasizing temperature increase in order to determine the physically possible upper limits of precipitation due to climate change. The simulation results indicate that the meridional shift in atmospheric conditions is the optimum method to determine maximum precipitation in consideration of cost and efficiency. Finally, exceedance probability analyses of the model results of 42 historical extreme precipitation events demonstrate that the 72-hr basin averaged probable maximum precipitation is 21.72 inches for the exceedance probability of 0.5 percent. On the other hand, the current operational PMP estimation for the American River Watershed is 28.57 inches as published in the hydrometeorological report no. 59 and a previous PMP value was 31.48 inches as published in the hydrometeorological report no. 36. According to the exceedance probability analyses of this proposed method, the exceedance probabilities of these two estimations correspond to 0.036 percent and 0.011 percent, respectively.
Particle Filtering Methods for Incorporating Intelligence Updates
2017-03-01
methodology for incorporating intelligence updates into a stochastic model for target tracking. Due to the non-parametric assumptions of the PF...samples are taken with replacement from the remaining non-zero weighted particles at each iteration. With this methodology, a zero-weighted particle is...incorporation of information updates. A common method for incorporating information updates is Kalman filtering. However, given the probable nonlinear and non
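The resampling step alluded to in these fragments, drawing with replacement from the non-zero-weighted particles so that zero-weight particles are never propagated, can be written as a short sketch. The state model and weights below are placeholders, not the report's tracking model.

    import numpy as np

    def resample(particles, weights, rng):
        """Multinomial resampling: draw with replacement in proportion to the weights,
        so zero-weighted particles are never selected; weights are then reset to uniform."""
        weights = np.asarray(weights, dtype=float)
        weights = weights / weights.sum()
        idx = rng.choice(len(particles), size=len(particles), replace=True, p=weights)
        new_particles = particles[idx]
        new_weights = np.full(len(particles), 1.0 / len(particles))
        return new_particles, new_weights

    rng = np.random.default_rng(1)
    particles = rng.normal(size=(500, 2))    # toy 2-D target-state particles
    weights = rng.random(500)
    weights[::7] = 0.0                       # some particles ruled out by an information update
    particles, weights = resample(particles, weights, rng)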
ERIC Educational Resources Information Center
Singer, Judith D.; Willett, John B.
The National Center for Education Statistics (NCES) is exploring the possibility of conducting a large-scale multi-year study of teachers' careers. The proposed new study is intended to follow a national probability sample of teachers over an extended period of time. A number of methodological issues need to be addressed before the study can be…
Theory and methodology for utilizing genes as biomarkers to determine potential biological mixtures.
Shrestha, Sadeep; Smith, Michael W; Beaty, Terri H; Strathdee, Steffanie A
2005-01-01
Genetically determined mixture information can be used as a surrogate for physical or behavioral characteristics in epidemiological studies examining research questions related to socially stigmatized behaviors and horizontally transmitted infections. A new measure, the probability of mixture discrimination (PMD), was developed to aid mixture analysis that estimates the ability to differentiate single from multiple genomes in biological mixtures. Four autosomal short tandem repeats (STRs) were identified, genotyped and evaluated in African American, European American, Hispanic, and Chinese individuals to estimate PMD. Theoretical PMD frameworks were also developed for autosomal and sex-linked (X and Y) STR markers in potential male/male, male/female and female/female mixtures. Autosomal STRs genetically determine the presence of multiple genomes in mixture samples of unknown genders with more power than the apparently simpler X and Y chromosome STRs. Evaluation of four autosomal STR loci enables the detection of mixtures of DNA from multiple sources with above 99% probability in all four racial/ethnic populations. The genetic-based approach has applications in epidemiology that provide viable alternatives to survey-based study designs. The analysis of genes as biomarkers can be used as a gold standard for validating measurements from self-reported behaviors that tend to be sensitive or socially stigmatizing, such as those involving sex and drugs.
NASA Technical Reports Server (NTRS)
Bowles, Roland L.; Buck, Bill K.
2009-01-01
The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used due to the near impossibility of obtaining sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) Radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides for one means, but not the only means, by which an applicant can demonstrate compliance with the FAA directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risks to aircraft, passengers, and crew; and were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this defined methodology for calculating the probability of missed and false hazard indications, taking into account the effect of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.
Population trends, survival, and sampling methodologies for a population of Rana draytonii
Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A.W.; Halstead, Brian J.
2017-01-01
Estimating population trends provides valuable information for resource managers, but monitoring programs face trade-offs between the quality and quantity of information gained and the number of sites surveyed. We compared the effectiveness of monitoring techniques for estimating population trends of Rana draytonii (California Red-legged Frog) at Point Reyes National Seashore, California, USA, over a 13-yr period. Our primary goals were to: 1) estimate trends for a focal pond at Point Reyes National Seashore, and 2) evaluate whether egg mass counts could reliably estimate an index of abundance relative to more-intensive capture–mark–recapture methods. Capture–mark–recapture (CMR) surveys of males indicated a stable population from 2005 to 2009, despite low annual apparent survival (26.3%). Egg mass counts from 2000 to 2012 indicated that despite some large fluctuations, the breeding female population was generally stable or increasing, with annual abundance varying between 26 and 130 individuals. Minor modifications to egg mass counts, such as marking egg masses, can allow estimation of egg mass detection probabilities necessary to convert counts to abundance estimates, even when closure of egg mass abundance cannot be assumed within a breeding season. High egg mass detection probabilities (mean per-survey detection probability = 0.98 [0.89–0.99]) indicate that egg mass surveys can be an efficient and reliable method for monitoring population trends of federally threatened R. draytonii. Combining egg mass surveys to estimate trends at many sites with CMR methods to evaluate factors affecting adult survival at focal populations is likely a profitable path forward to enhance understanding and conservation of R. draytonii.
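The conversion from counts to abundance mentioned above amounts to dividing the count by the estimated detection probability. A minimal sketch using the reported per-survey detection probability; the count value is illustrative, not from the study:

    def abundance_from_count(count, detection_prob):
        """Point estimate of abundance from a single count corrected for imperfect detection."""
        return count / detection_prob

    # With the reported per-survey egg mass detection probability of 0.98,
    # a count of 110 egg masses corresponds to roughly 112 egg masses present.
    print(round(abundance_from_count(110, 0.98)))   # -> 112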
Automatic seed selection for segmentation of liver cirrhosis in laparoscopic sequences
NASA Astrophysics Data System (ADS)
Sinha, Rahul; Marcinczak, Jan Marek; Grigat, Rolf-Rainer
2014-03-01
For computer aided diagnosis based on laparoscopic sequences, image segmentation is one of the basic steps which define the success of all further processing. However, many image segmentation algorithms require prior knowledge which is given by interaction with the clinician. We propose an automatic seed selection algorithm for segmentation of liver cirrhosis in laparoscopic sequences which assigns each pixel a probability of being cirrhotic liver tissue or background tissue. Our approach is based on a trained classifier using SIFT and RGB features with PCA. Due to the unique illumination conditions in laparoscopic sequences of the liver, a very low dimensional feature space can be used for classification via logistic regression. The methodology is evaluated on 718 cirrhotic liver and background patches that are taken from laparoscopic sequences of 7 patients. Using a linear classifier we achieve a precision of 91% in a leave-one-patient-out cross-validation. Furthermore, we demonstrate that with logistic probability estimates, seeds with high certainty of being cirrhotic liver tissue can be obtained. For example, our precision of liver seeds increases to 98.5% if only seeds with more than 95% probability of being liver are used. Finally, these automatically selected seeds can be used as priors in Graph Cuts which is demonstrated in this paper.
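A minimal sketch of the seed-selection idea (PCA-reduced features fed to logistic regression, with seeds kept only above a high probability threshold). The feature matrix and labels are random placeholders, not SIFT/RGB features from laparoscopic frames.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.normal(size=(718, 20))          # placeholder patch features (e.g., SIFT + RGB statistics)
    y = rng.integers(0, 2, size=718)        # 1 = cirrhotic liver patch, 0 = background

    clf = make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000))
    clf.fit(X, y)

    # Keep as liver seeds only patches classified with >95% posterior probability
    proba_liver = clf.predict_proba(X)[:, 1]
    liver_seeds = np.where(proba_liver > 0.95)[0]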
Subjective expectations in the context of HIV/AIDS in Malawi
Delavande, Adeline; Kohler, Hans-Peter
2009-01-01
In this paper we present a newly developed interactive elicitation methodology for collecting probabilistic expectations in a developing country context with low levels of literacy and numeracy, and we evaluate the feasibility and success of this method for a wide range of outcomes in rural Malawi. We find that respondents’ answers about their subjective expectations take into account basic properties of probabilities, and vary meaningfully with observable characteristics and past experience. From a substantive point of view, the elicited expectations indicate that individuals are generally aware of differential risks. For example, individuals with lower incomes and less land rightly feel at greater risk of financial distress than people with higher socioeconomic status (SES), and people who are divorced or widowed rightly feel at greater risk of being infected with HIV than currently married individuals. Meanwhile many expectations—including the probability of being currently infected with HIV—are well-calibrated compared to actual probabilities, but mortality expectations are substantially overestimated compared to life table estimates. This overestimation may lead individuals to underestimate the benefits of adopting HIV risk-reduction strategies. The skewed distribution of expectations about condom use also suggests that a small group of innovators are the forerunners in the adoption of condoms within marriage for HIV prevention. PMID:19946378
NASA Astrophysics Data System (ADS)
Pai, Akshay; Samala, Ravi K.; Zhang, Jianying; Qian, Wei
2010-03-01
Mammography reading by radiologists and breast tissue image interpretation by pathologists often leads to high False Positive (FP) Rates. Similarly, current Computer Aided Diagnosis (CADx) methods tend to concentrate more on sensitivity, thus increasing the FP rates. A novel method is introduced here which employs similarity based method to decrease the FP rate in the diagnosis of microcalcifications. This method employs the Principal Component Analysis (PCA) and the similarity metrics in order to achieve the proposed goal. The training and testing set is divided into generalized (Normal and Abnormal) and more specific (Abnormal, Normal, Benign) classes. The performance of this method as a standalone classification system is evaluated in both the cases (general and specific). In another approach the probability of each case belonging to a particular class is calculated. If the probabilities are too close to classify, the augmented CADx system can be instructed to have a detailed analysis of such cases. In case of normal cases with high probability, no further processing is necessary, thus reducing the computation time. Hence, this novel method can be employed in cascade with CADx to reduce the FP rate and also avoid unnecessary computational time. Using this methodology, a false positive rate of 8% and 11% is achieved for mammography and cellular images respectively.
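One way to read the "probabilities too close to classify" rule is as a margin test on similarity-derived class scores in the PCA subspace. The sketch below uses distances to class centroids and a softmax-style normalization to obtain pseudo-probabilities; the data are random placeholders, and the paper's actual similarity metric may differ.

    import numpy as np
    from sklearn.decomposition import PCA

    def class_probabilities(x, centroids):
        """Turn negative Euclidean distances to class centroids into pseudo-probabilities."""
        d = np.linalg.norm(centroids - x, axis=1)
        scores = np.exp(-d)
        return scores / scores.sum()

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))            # placeholder ROI feature vectors
    y = rng.integers(0, 2, size=200)          # 0 = normal, 1 = abnormal

    pca = PCA(n_components=10).fit(X)
    Z = pca.transform(X)
    centroids = np.vstack([Z[y == c].mean(axis=0) for c in (0, 1)])

    p = class_probabilities(Z[0], centroids)
    if abs(p[0] - p[1]) < 0.1:                # probabilities too close: defer to full CADx analysis
        print("send to detailed CADx analysis")
    else:
        print("confident class:", int(p.argmax()), "with probability", p.max())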
Modeling and clustering water demand patterns from real-world smart meter data
NASA Astrophysics Data System (ADS)
Cheifetz, Nicolas; Noumir, Zineb; Samé, Allou; Sandraz, Anne-Claire; Féliers, Cédric; Heim, Véronique
2017-08-01
Nowadays, drinking water utilities need an acute comprehension of the water demand on their distribution network, in order to efficiently operate the optimization of resources, manage billing and propose new customer services. With the emergence of smart grids, based on automated meter reading (AMR), a better understanding of the consumption modes is now accessible for smart cities with more granularities. In this context, this paper evaluates a novel methodology for identifying relevant usage profiles from the water consumption data produced by smart meters. The methodology is fully data-driven using the consumption time series which are seen as functions or curves observed with an hourly time step. First, a Fourier-based additive time series decomposition model is introduced to extract seasonal patterns from time series. These patterns are intended to represent the customer habits in terms of water consumption. Two functional clustering approaches are then used to classify the extracted seasonal patterns: the functional version of K-means, and the Fourier REgression Mixture (FReMix) model. The K-means approach produces a hard segmentation and K representative prototypes. On the other hand, the FReMix is a generative model and also produces K profiles as well as a soft segmentation based on the posterior probabilities. The proposed approach is applied to a smart grid deployed on the largest water distribution network (WDN) in France. The two clustering strategies are evaluated and compared. Finally, a realistic interpretation of the consumption habits is given for each cluster. The extensive experiments and the qualitative interpretation of the resulting clusters allow one to highlight the effectiveness of the proposed methodology.
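A minimal sketch of the pipeline described above: extract a seasonal (here, daily) pattern from each hourly consumption series via a truncated Fourier fit, then cluster the patterns with the functional version of K-means (implemented here as ordinary K-means on the fitted pattern values). The series are synthetic placeholders, and the number of harmonics and clusters are assumptions.

    import numpy as np
    from sklearn.cluster import KMeans

    def daily_pattern(series, n_harmonics=3, period=24):
        """Least-squares fit of a truncated Fourier series to capture the daily pattern."""
        t = np.arange(len(series))
        cols = [np.ones_like(t, dtype=float)]
        for k in range(1, n_harmonics + 1):
            cols.append(np.cos(2 * np.pi * k * t / period))
            cols.append(np.sin(2 * np.pi * k * t / period))
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, series, rcond=None)
        return A[:period] @ coef                      # one reconstructed 24-h pattern

    rng = np.random.default_rng(0)
    t = np.arange(24 * 60)                            # 60 days of hourly readings per meter
    series = [np.sin(2 * np.pi * t / 24 + rng.uniform(0, np.pi)) + 0.3 * rng.normal(size=t.size)
              for _ in range(100)]                    # synthetic smart-meter series

    patterns = np.vstack([daily_pattern(s) for s in series])
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(patterns)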
Respondent-Driven Sampling: An Assessment of Current Methodology.
Gile, Krista J; Handcock, Mark S
2010-08-01
Respondent-Driven Sampling (RDS) employs a variant of a link-tracing network sampling strategy to collect data from hard-to-reach populations. By tracing the links in the underlying social network, the process exploits the social structure to expand the sample and reduce its dependence on the initial (convenience) sample. The current estimators of population averages make strong assumptions in order to treat the data as a probability sample. We evaluate three critical sensitivities of the estimators: to bias induced by the initial sample, to uncontrollable features of respondent behavior, and to the without-replacement structure of sampling. Our analysis indicates: (1) that the convenience sample of seeds can induce bias, and the number of sample waves typically used in RDS is likely insufficient for the type of nodal mixing required to obtain the reputed asymptotic unbiasedness; (2) that preferential referral behavior by respondents leads to bias; (3) that when a substantial fraction of the target population is sampled the current estimators can have substantial bias. This paper sounds a cautionary note for the users of RDS. While current RDS methodology is powerful and clever, the favorable statistical properties claimed for the current estimates are shown to be heavily dependent on often unrealistic assumptions. We recommend ways to improve the methodology.
Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng
2010-10-01
Different types of uncertain information-linguistic, probabilistic, and possibilistic-exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs together, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, and fuzzy sets or possibility distributions, and linguistic variables according to their nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
Alba, A; Casal, J; Napp, S; Martin, P A J
2010-11-01
Compulsory surveillance programmes for avian influenza (AI) have been implemented in domestic poultry and wild birds in all the European Member States since 2005. The implementation of these programmes is complex and requires a close evaluation. A good indicator to assess their efficacy is the sensitivity (Se) of the surveillance system. In this study, the sensitivities for different sampling designs proposed by the Spanish authorities for the commercial poultry population of Catalonia were assessed, using the scenario tree model methodology. These samplings were stratified throughout the territory of Spain and took into account the species, the types of production and their specific risks. The probabilities of detecting infection at different prevalences at both individual and holding level were estimated. Furthermore, those subpopulations that contributed more to the Se of the system were identified. The model estimated that all the designs met the requirements of the European Commission. The probability of detecting AI circulating in Catalonian poultry did not change significantly when the within-holding design prevalence varied from 30% to 10%. In contrast, when the among-holding design prevalence decreased from 5% to 1%, the probability of detecting AI was drastically reduced. The sampling of duck and goose holdings, and to a lesser extent the sampling of turkey and game bird holdings, increased the Se substantially. The Se of passive surveillance in chickens for highly pathogenic avian influenza (HPAI) and low pathogenicity avian influenza (LPAI) were also assessed. The probability of the infected birds manifesting apparent clinical signs and the awareness of veterinarians and farmers had great influence on the probability of detecting AI. In order to increase the probability of an early detection of HPAI in chicken, the probability of performing AI specific tests when AI is suspected would need to be increased. Copyright © 2010 Elsevier B.V. All rights reserved.
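The component-level arithmetic behind such scenario-tree estimates can be illustrated with the standard survey-sensitivity formula: if n holdings are sampled, each detected (when infected) with probability SeH, and the among-holding design prevalence is P*, then the surveillance sensitivity is SSe = 1 - (1 - P* x SeH)^n. The numbers below are illustrative, not the Catalonian inputs, but they reproduce the qualitative effect of lowering the among-holding design prevalence.

    def surveillance_sensitivity(n_holdings_sampled, holding_sensitivity, design_prevalence):
        """Probability of detecting at least one infected holding, assuming holdings are
        sampled at random from a large population at the given design prevalence."""
        p_detect_one_draw = design_prevalence * holding_sensitivity
        return 1.0 - (1.0 - p_detect_one_draw) ** n_holdings_sampled

    # Illustrative inputs: 200 sampled holdings, holding-level sensitivity 0.7
    print(surveillance_sensitivity(200, 0.7, design_prevalence=0.05))  # about 0.999
    print(surveillance_sensitivity(200, 0.7, design_prevalence=0.01))  # about 0.75, much lower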
Tsalatsanis, Athanasios; Barnes, Laura E; Hozo, Iztok; Djulbegovic, Benjamin
2011-12-23
Despite the well documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or delayed. Decision systems to improve the hospice referral process are sorely needed. We present a novel theoretical framework that is based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. In order to enable patients and physicians in making these complex decisions in real-time, we developed an easily accessible web-based decision support system available at the point of care. The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of threshold probability at which he/she is indifferent between continuation of treatment and of hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. We present a theoretical framework to facilitate the hospice referral process. Further rigorous clinical evaluation including testing in a prospective randomized controlled trial is required and planned.
Nuclear Electric Vehicle Optimization Toolset (NEVOT)
NASA Technical Reports Server (NTRS)
Tinker, Michael L.; Steincamp, James W.; Stewart, Eric T.; Patton, Bruce W.; Pannell, William P.; Newby, Ronald L.; Coffman, Mark E.; Kos, Larry D.; Qualls, A. Lou; Greene, Sherrell
2004-01-01
The Nuclear Electric Vehicle Optimization Toolset (NEVOT) optimizes the design of all major nuclear electric propulsion (NEP) vehicle subsystems for a defined mission within constraints and optimization parameters chosen by a user. The tool uses a genetic algorithm (GA) search technique to combine subsystem designs and evaluate the fitness of the integrated design to fulfill a mission. The fitness of an individual is used within the GA to determine its probability of survival through successive generations in which the designs with low fitness are eliminated and replaced with combinations or mutations of designs with higher fitness. The program can find optimal solutions for different sets of fitness metrics without modification and can create and evaluate vehicle designs that might never be considered through traditional design techniques. It is anticipated that the flexible optimization methodology will expand present knowledge of the design trade-offs inherent in designing nuclear powered space vehicles and lead to improved NEP designs.
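The survival mechanism described above, in which the probability of surviving to the next generation is proportional to fitness, is the classic roulette-wheel selection step of a genetic algorithm. A minimal sketch with a placeholder fitness function standing in for the integrated NEP vehicle evaluation:

    import numpy as np

    rng = np.random.default_rng(0)

    def fitness(design):
        """Placeholder fitness; real NEVOT evaluations would score an integrated NEP design."""
        return 1.0 / (1.0 + np.sum((design - 0.3) ** 2))

    pop = rng.random((50, 8))                          # 50 candidate designs, 8 design variables
    for generation in range(100):
        f = np.array([fitness(d) for d in pop])
        probs = f / f.sum()                            # survival probability proportional to fitness
        parents = pop[rng.choice(len(pop), size=len(pop), p=probs)]
        # uniform crossover followed by a small Gaussian mutation
        mates = parents[rng.permutation(len(parents))]
        mask = rng.random(pop.shape) < 0.5
        children = np.where(mask, parents, mates) + 0.01 * rng.normal(size=pop.shape)
        pop = np.clip(children, 0.0, 1.0)

    best = pop[np.argmax([fitness(d) for d in pop])]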
The book availability study as an objective measure of performance in a health sciences library.
Kolner, S J; Welch, E C
1985-01-01
In its search for an objective overall diagnostic evaluation, the University of Illinois Library of the Health Sciences' Program Evaluation Committee selected a book availability measure; it is easy to administer and repeat, results are reproducible, and comparable data exist for other academic and health sciences libraries. The study followed the standard methodology in the literature with minor modifications. Patrons searching for particular books were asked to record item(s) needed and the outcome of the search. Library staff members then determined the reasons for failures in obtaining desired items. The results of the study are five performance scores. The first four represent the percentage probability of a library's operating with ideal effectiveness; the last provides an overall performance score. The scores of the Library of the Health Sciences demonstrated no unusual availability problems. The study was easy to implement and provided meaningful, quantitative, and objective data. PMID:3995202
Traverse Planning Experiments for Future Planetary Surface Exploration
NASA Technical Reports Server (NTRS)
Hoffman, Stephen J.; Voels, Stephen A.; Mueller, Robert P.; Lee, Pascal C.
2012-01-01
The purpose of the investigation is to evaluate methodology and data requirements for a remotely-assisted robotic traverse of an extraterrestrial planetary surface in support of a human exploration program, to assess opportunities for in-transit science operations, and to validate landing site survey and selection techniques during a planetary surface exploration mission analog demonstration at Haughton Crater on Devon Island, Nunavut, Canada. Additional objectives are to: 1) identify the quality of remote observation data sets (i.e., surface imagery from orbit) required for effective pre-traverse route planning, and determine whether surface-level data (i.e., onboard robotic imagery or other sensor data) are required for a successful traverse and whether additional surface-level data can improve traverse efficiency or probability of success (TRPF Experiment); 2) evaluate the feasibility and techniques for conducting opportunistic science investigations during this type of traverse (OSP Experiment); and 3) assess the utility of a remotely-assisted robotic vehicle for landing site validation surveys (LSV Experiment).
2014-01-01
Background The objective of this study was to perform a systematic review and a meta-analysis in order to estimate the diagnostic accuracy of diffusion weighted imaging (DWI) in the preoperative assessment of deep myometrial invasion in patients with endometrial carcinoma. Methods Studies evaluating DWI for the detection of deep myometrial invasion in patients with endometrial carcinoma were systematically searched for in the MEDLINE, EMBASE, and Cochrane Library from January 1995 to January 2014. Methodologic quality was assessed by using the Quality Assessment of Diagnostic Accuracy Studies tool. Bivariate random-effects meta-analytic methods were used to obtain pooled estimates of sensitivity, specificity, diagnostic odds ratio (DOR) and receiver operating characteristic (ROC) curves. The study also evaluated the clinical utility of DWI in preoperative assessment of deep myometrial invasion. Results Seven studies enrolling a total of 320 individuals met the study inclusion criteria. The summary area under the ROC curve was 0.91. There was no evidence of publication bias (P = 0.90, bias coefficient analysis). Sensitivity and specificity of DWI for detection of deep myometrial invasion across all studies were 0.90 and 0.89, respectively. Positive and negative likelihood ratios with DWI were 8 and 0.11 respectively. In patients with high pre-test probabilities, DWI enabled confirmation of deep myometrial invasion; in patients with low pre-test probabilities, DWI enabled exclusion of deep myometrial invasion. The worst case scenario (pre-test probability, 50%) post-test probabilities were 89% and 10% for positive and negative DWI results, respectively. Conclusion DWI has high sensitivity and specificity for detecting deep myometrial invasion and more importantly can reliably rule out deep myometrial invasion. Therefore, it would be worthwhile to add a DWI sequence to the standard MRI protocols in preoperative evaluation of endometrial cancer in order to detect deep myometrial invasion, which along with other poor prognostic factors like age, tumor grade, and LVSI would be useful in stratifying high risk groups thereby helping in the tailoring of surgical approach in patient with low risk of endometrial carcinoma. PMID:25608571
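The post-test probabilities quoted for the worst-case scenario follow directly from Bayes' rule in odds form: post-test odds = pre-test odds x likelihood ratio. A short check using the pooled likelihood ratios reported above:

    def post_test_probability(pre_test_prob, likelihood_ratio):
        """Convert a pre-test probability to a post-test probability via the odds form of Bayes' rule."""
        pre_odds = pre_test_prob / (1.0 - pre_test_prob)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1.0 + post_odds)

    # Worst-case pre-test probability of 50% with the pooled LR+ = 8 and LR- = 0.11
    print(post_test_probability(0.50, 8))     # about 0.89 after a positive DWI result
    print(post_test_probability(0.50, 0.11))  # about 0.10 after a negative DWI result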
Cipoli, Daniel E; Martinez, Edson Z; Castro, Margaret de; Moreira, Ayrton C
2012-12-01
To estimate the pretest probability of Cushing's syndrome (CS) diagnosis by a Bayesian approach using intuitive clinical judgment. Physicians were requested, in seven endocrinology meetings, to answer three questions: "Based on your personal expertise, after obtaining clinical history and physical examination, without using laboratorial tests, what is your probability of diagnosing Cushing's Syndrome?"; "For how long have you been practicing Endocrinology?"; and "Where do you work?". A Bayesian beta regression, using the WinBugs software was employed. We obtained 294 questionnaires. The mean pretest probability of CS diagnosis was 51.6% (95%CI: 48.7-54.3). The probability was directly related to experience in endocrinology, but not with the place of work. Pretest probability of CS diagnosis was estimated using a Bayesian methodology. Although pretest likelihood can be context-dependent, experience based on years of practice may help the practitioner to diagnosis CS.
Groth, Katrina M.; Smith, Curtis L.; Swiler, Laura P.
2014-04-05
In the past several years, several international agencies have begun to collect data on human performance in nuclear power plant simulators [1]. This data provides a valuable opportunity to improve human reliability analysis (HRA), but these improvements will not be realized without implementation of Bayesian methods. Bayesian methods are widely used to incorporate sparse data into models in many parts of probabilistic risk assessment (PRA), but Bayesian methods have not been adopted by the HRA community. In this article, we provide a Bayesian methodology to formally use simulator data to refine the human error probabilities (HEPs) assigned by existing HRA methods. We demonstrate the methodology with a case study, wherein we use simulator data from the Halden Reactor Project to update the probability assignments from the SPAR-H method. The case study demonstrates the ability to use performance data, even sparse data, to improve existing HRA methods. Furthermore, this paper also serves as a demonstration of the value of Bayesian methods to improve the technical basis of HRA.
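A minimal sketch of the kind of Bayesian refinement described above: treat the HEP from the existing HRA method as the mean of a Beta prior, then update it with observed simulator failures via beta-binomial conjugacy. The prior strength and the simulator counts here are invented for illustration, and the paper's actual formulation may differ.

    def update_hep(prior_hep, prior_strength, n_trials, n_failures):
        """Conjugate beta-binomial update of a human error probability.

        prior_hep       -- HEP assigned by the existing HRA method (prior mean)
        prior_strength  -- equivalent number of prior observations (analyst judgement)
        n_trials        -- simulator crews attempting the task
        n_failures      -- crews that failed the task
        """
        alpha = prior_hep * prior_strength + n_failures
        beta = (1.0 - prior_hep) * prior_strength + (n_trials - n_failures)
        return alpha / (alpha + beta)          # posterior mean HEP

    # Illustrative: the HRA method assigns HEP = 0.01; 2 failures are seen in 30 simulator runs
    print(update_hep(prior_hep=0.01, prior_strength=50, n_trials=30, n_failures=2))  # about 0.031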
Ducci, Daniela; de Melo, M Teresa Condesso; Preziosi, Elisabetta; Sellerino, Mariangela; Parrone, Daniele; Ribeiro, Luis
2016-11-01
The natural background level (NBL) concept is revisited and combined with indicator kriging method to analyze the spatial distribution of groundwater quality within a groundwater body (GWB). The aim is to provide a methodology to easily identify areas with the same probability of exceeding a given threshold (which may be a groundwater quality criteria, standards, or recommended limits for selected properties and constituents). Three case studies with different hydrogeological settings and located in two countries (Portugal and Italy) are used to derive NBL using the preselection method and validate the proposed methodology illustrating its main advantages over conventional statistical water quality analysis. Indicator kriging analysis was used to create probability maps of the three potential groundwater contaminants. The results clearly indicate the areas within a groundwater body that are potentially contaminated because the concentrations exceed the drinking water standards or even the local NBL, and cannot be justified by geogenic origin. The combined methodology developed facilitates the management of groundwater quality because it allows for the spatial interpretation of NBL values. Copyright © 2016 Elsevier B.V. All rights reserved.
Rolling-Bearing Service Life Based on Probable Cause for Removal: A Tutorial
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.; Branzai, Emanuel V.
2017-01-01
In 1947 and 1952, Gustaf Lundberg and Arvid Palmgren developed what is now referred to as the Lundberg-Palmgren Model for Rolling Bearing Life Prediction based on classical rolling-element fatigue. Today, bearing fatigue probably accounts for less than 5 percent of bearings removed from service for cause. A bearing service life prediction methodology and tutorial indexed to eight probable causes for bearing removal, including fatigue, are presented, which incorporate strict series reliability; Weibull statistical analysis; available published field data from the Naval Air Rework Facility; and 224,000 rolling-element bearings removed for rework from commercial aircraft engines.
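The strict series reliability used in such methodologies combines component lives through the Weibull slope e: (1/L_sys)^e = sum_i (1/L_i)^e, so the system life is shorter than any single component life. A minimal sketch with illustrative lives; the slope of 10/9 (about 1.11) is the value commonly assumed for rolling-element bearings, and the life values are assumptions for illustration only.

    def system_life_strict_series(component_lives, weibull_slope=10.0 / 9.0):
        """L10-type system life under strict series reliability:
        (1/L_sys)^e = sum_i (1/L_i)^e, with e the Weibull slope."""
        e = weibull_slope
        return sum(1.0 / L ** e for L in component_lives) ** (-1.0 / e)

    # Illustrative lives (hours) attributed to different removal causes at one bearing position
    print(system_life_strict_series([8000.0, 12000.0, 30000.0]))  # about 4,560 h, below the shortest life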
[Radiotherapy phase I trials' methodology: Features].
Rivoirard, R; Vallard, A; Langrand-Escure, J; Guy, J-B; Ben Mrad, M; Yaoxiong, X; Diao, P; Méry, B; Pigne, G; Rancoule, C; Magné, N
2016-12-01
In clinical research, biostatistical methods allow the rigorous analysis of data collection and should be defined from the trial design to obtain the appropriate experimental approach. Thus, if the main purpose of phase I is to determine the dose to use during phase II, methodology should be finely adjusted to experimental treatment(s). Today, the methodology for chemotherapy and targeted therapy is well known. For radiotherapy and chemoradiotherapy phase I trials, the primary endpoint must reflect both effectiveness and potential treatment toxicities. Methodology should probably be complex to limit failures in the following phases. However, there are very few data about methodology design in the literature. The present study focuses on these particular trials and their characteristics. It should help to raise existing methodological patterns shortcomings in order to propose new and better-suited designs. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
Probability genotype imputation method and integrated weighted lasso for QTL identification.
Demetrashvili, Nino; Van den Heuvel, Edwin R; Wit, Ernst C
2013-12-30
Many QTL studies have two common features: (1) often there is missing marker information, (2) among many markers involved in the biological process only a few are causal. In statistics, the second issue falls under the headings "sparsity" and "causal inference". The goal of this work is to develop a two-step statistical methodology for QTL mapping for markers with binary genotypes. The first step introduces a novel imputation method for missing genotypes. Outcomes of the proposed imputation method are probabilities which serve as weights to the second step, namely in weighted lasso. The sparse phenotype inference is employed to select a set of predictive markers for the trait of interest. Simulation studies validate the proposed methodology under a wide range of realistic settings. Furthermore, the methodology outperforms alternative imputation and variable selection methods in such studies. The methodology was applied to an Arabidopsis experiment, containing 69 markers for 165 recombinant inbred lines of a F8 generation. The results confirm previously identified regions, however several new markers are also found. On the basis of the inferred ROC behavior these markers show good potential for being real, especially for the germination trait Gmax. Our imputation method shows higher accuracy in terms of sensitivity and specificity compared to alternative imputation method. Also, the proposed weighted lasso outperforms commonly practiced multiple regression as well as the traditional lasso and adaptive lasso with three weighting schemes. This means that under realistic missing data settings this methodology can be used for QTL identification.
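One common way to realize a lasso with per-marker penalty weights is to rescale each column by its weight, fit an ordinary lasso, and rescale the coefficients back. The sketch below uses this device with random placeholder genotypes; whether the imputation probabilities enter as penalty factors or as their inverses is a modeling choice, and the paper's exact weighting scheme may differ.

    import numpy as np
    from sklearn.linear_model import Lasso

    def weighted_lasso(X, y, penalty_weights, alpha=0.05):
        """Lasso with per-coefficient penalty weights w_j (larger w_j -> stronger penalty),
        implemented by rescaling columns: fit on X_j / w_j, then rescale coefficients back."""
        w = np.asarray(penalty_weights, dtype=float)
        model = Lasso(alpha=alpha, max_iter=50_000).fit(X / w, y)
        return model.coef_ / w

    rng = np.random.default_rng(0)
    n_lines, n_markers = 165, 69                     # sizes echoing the Arabidopsis example
    X = rng.integers(0, 2, size=(n_lines, n_markers)).astype(float)   # binary genotypes
    beta_true = np.zeros(n_markers); beta_true[[4, 20, 55]] = 1.0
    y = X @ beta_true + rng.normal(scale=0.5, size=n_lines)

    # Placeholder weights standing in for quantities derived from the imputation probabilities
    weights = rng.uniform(0.5, 1.5, size=n_markers)
    coef = weighted_lasso(X, y, weights)
    selected = np.flatnonzero(coef)                  # candidate QTL markers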
Statistical analysis of the uncertainty related to flood hazard appraisal
NASA Astrophysics Data System (ADS)
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency can cause flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard corresponding to the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or piecemeal, which prevents a reliable flood hazard analysis from being carried out directly; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
METHODOLOGICAL QUALITY OF ECONOMIC EVALUATIONS ALONGSIDE TRIALS OF KNEE PHYSIOTHERAPY.
García-Pérez, Lidia; Linertová, Renata; Arvelo-Martín, Alejandro; Guerra-Marrero, Carolina; Martínez-Alberto, Carlos Enrique; Cuéllar-Pompa, Leticia; Escobar, Antonio; Serrano-Aguilar, Pedro
2017-01-01
The methodological quality of an economic evaluation performed alongside a clinical trial can be underestimated if the paper does not report key methodological features. This study discusses methodological assessment issues on the example of a systematic review on cost-effectiveness of physiotherapy for knee osteoarthritis. Six economic evaluation studies included in the systematic review and related clinical trials were assessed using the 10-question check-list by Drummond and the Physiotherapy Evidence Database (PEDro) scale. All economic evaluations were performed alongside a clinical trial but the studied interventions were too heterogeneous to be synthesized. Methodological quality of the economic evaluations reported in the papers was not free of drawbacks, and in some cases, it improved when information from the related clinical trial was taken into account. Economic evaluation papers dedicate little space to methodological features of related clinical trials; therefore, the methodological quality can be underestimated if evaluated separately from the trials. Future economic evaluations should follow more strictly the recommendations about methodology and the authors should pay special attention to the quality of reporting.
García-Alonso, Carlos; Pérez-Naranjo, Leonor
2009-01-01
Introduction Knowledge management, based on information transfer between experts and analysts, is crucial for the validity and usability of data envelopment analysis (DEA). Aim To design and develop a methodology: i) to assess technical efficiency of small health areas (SHA) in an uncertainty environment, and ii) to transfer information between experts and operational models, in both directions, for improving expert’s knowledge. Method A procedure derived from knowledge discovery from data (KDD) is used to select, interpret and weigh DEA inputs and outputs. Based on KDD results, an expert-driven Monte-Carlo DEA model has been designed to assess the technical efficiency of SHA in Andalusia. Results In terms of probability, SHA 29 is the most efficient being, on the contrary, SHA 22 very inefficient. 73% of analysed SHA have a probability of being efficient (Pe) >0.9 and 18% <0.5. Conclusions Expert knowledge is necessary to design and validate any operational model. KDD techniques make the transfer of information from experts to any operational model easy and results obtained from the latter improve expert’s knowledge.
Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis
NASA Technical Reports Server (NTRS)
Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. The output of this calculation approach is, rather than a single Pc value, an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
Patriarca, Peter A; Van Auken, R Michael; Kebschull, Scott A
2018-01-01
Benefit-risk evaluations of drugs have been conducted since the introduction of modern regulatory systems more than 50 years ago. Such judgments are typically made on the basis of qualitative or semiquantitative approaches, often without the aid of quantitative assessment methods, the latter having often been applied asymmetrically to place emphasis on benefit more so than harm. In an effort to preliminarily evaluate the utility of lives lost or saved, or quality-adjusted life-years (QALY) lost and gained as a means of quantitatively assessing the potential benefits and risks of a new chemical entity, we focused our attention on the unique scenario in which a drug was initially approved based on one set of data, but later withdrawn from the market based on a second set of data. In this analysis, a dimensionless risk to benefit ratio was calculated in each instance, based on the risk and benefit quantified in similar units. The results indicated that FDA decisions to approve the drug corresponded to risk to benefit ratios less than or equal to 0.136, and that decisions to withdraw the drug from the US market corresponded to risk to benefit ratios greater than or equal to 0.092. The probability of FDA approval was then estimated using logistic regression analysis. The results of this analysis indicated that there was a 50% probability of FDA approval if the risk to benefit ratio was 0.121, and that the probability approaches 100% for values much less than 0.121, and the probability approaches 0% for values much greater than 0.121. The large uncertainty in these estimates due to the small sample size and overlapping data may be addressed in the future by applying the methodology to other drugs.
MODFLOW 2000 Head Uncertainty, a First-Order Second Moment Method
Glasgow, H.S.; Fortney, M.D.; Lee, J.; Graettinger, A.J.; Reeves, H.W.
2003-01-01
A computationally efficient method to estimate the variance and covariance in piezometric head results computed through MODFLOW 2000 using a first-order second moment (FOSM) approach is presented. This methodology employs a first-order Taylor series expansion to combine model sensitivity with uncertainty in geologic data. MODFLOW 2000 is used to calculate both the ground water head and the sensitivity of head to changes in input data. From a limited number of samples, geologic data are extrapolated and their associated uncertainties are computed through a conditional probability calculation. Combining the spatially related sensitivity and input uncertainty produces the variance-covariance matrix, the diagonal of which is used to yield the standard deviation in MODFLOW 2000 head. The variance in piezometric head can be used for calibrating the model, estimating confidence intervals, directing exploration, and evaluating the reliability of a design. A case study illustrates the approach, where aquifer transmissivity is the spatially related uncertain geologic input data. The FOSM methodology is shown to be applicable for calculating output uncertainty for (1) spatially related input and output data, and (2) multiple input parameters (transmissivity and recharge).
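The FOSM step itself is a single matrix product: with J the matrix of head sensitivities to the uncertain inputs and C the input covariance from the conditional geostatistical analysis, the head covariance is approximately J C J^T. A minimal sketch with placeholder numbers; in the methodology described above, MODFLOW 2000 would supply the sensitivities and the conditional probability calculation would supply the input covariance.

    import numpy as np

    def fosm_head_covariance(J, C_input):
        """First-order second moment propagation: Cov(head) ~ J . Cov(input) . J^T."""
        return J @ C_input @ J.T

    # Placeholder sensitivities of 3 head locations to 2 uncertain inputs (e.g., transmissivity zones)
    J = np.array([[0.8, 0.1],
                  [0.4, 0.5],
                  [0.1, 0.9]])
    C_input = np.array([[0.25, 0.05],    # input variances/covariances from the conditional analysis
                        [0.05, 0.40]])

    C_head = fosm_head_covariance(J, C_input)
    head_std = np.sqrt(np.diag(C_head))  # standard deviation of the simulated heads
    print(head_std)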
Del Giudice, G; Padulano, R; Siciliano, D
2016-01-01
The lack of geometrical and hydraulic information about sewer networks often excludes the adoption of in-depth modeling tools for obtaining prioritization strategies for funds management. The present paper describes a novel statistical procedure for defining a prioritization scheme for preventive maintenance strategies based on a small sample of failure data collected by the Sewer Office of the Municipality of Naples (IT). Novel aspects include, among others, treating sewer parameters as continuous statistical variables and accounting for their interdependences. After a statistical analysis of maintenance interventions, the most important available factors affecting the process are selected and their mutual correlations identified. Then, after a Box-Cox transformation of the original variables, a methodology is provided for the evaluation of a vulnerability map of the sewer network by adopting a joint multivariate normal distribution with different parameter sets. The goodness-of-fit is eventually tested for each distribution by means of a multivariate plotting position. The developed methodology is expected to assist municipal engineers in identifying critical sewers and prioritizing sewer inspections in order to fulfill rehabilitation requirements.
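The following hedged sketch illustrates the general flavor of such a procedure (not the authors' implementation): Box-Cox transform skewed pipe attributes, fit a joint multivariate normal to transformed failure records, and score candidate pipes by their joint density. The variables and data are hypothetical.

```python
"""Hedged sketch of the general idea, not the paper's method: Box-Cox
transform skewed sewer attributes, fit a joint multivariate normal, and use
the joint density as a relative vulnerability score."""
import numpy as np
from scipy import stats
from scipy.special import boxcox

rng = np.random.default_rng(1)
# Hypothetical failure records: pipe age [yr], diameter [mm], burial depth [m].
X = np.column_stack([
    rng.gamma(5.0, 8.0, 300),
    rng.lognormal(6.0, 0.4, 300),
    rng.gamma(3.0, 0.8, 300),
])

# Box-Cox requires strictly positive data; estimate one lambda per variable.
Xt = np.empty_like(X)
lambdas = []
for j in range(X.shape[1]):
    Xt[:, j], lam = stats.boxcox(X[:, j])
    lambdas.append(lam)

mvn = stats.multivariate_normal(mean=Xt.mean(axis=0), cov=np.cov(Xt, rowvar=False))

def vulnerability(pipe):
    """Relative score: a higher joint density under the fitted failure model
    means the pipe's attributes are more typical of past failures."""
    z = [boxcox(v, lam) for v, lam in zip(pipe, lambdas)]
    return mvn.pdf(z)

print(vulnerability([45.0, 400.0, 2.5]))
```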
Barros, Lorena; Retamal, Christian; Torres, Héctor; Zúñiga, Rommy N; Troncoso, Elizabeth
2016-12-01
A new in vitro mechanical gastric system (IMGS) was fabricated that incorporates a J-shaped stomach, a mechanical system with realistic peristaltic frequency and force magnitude, and a reproduction of the gastric pH curve. To evaluate the impact of a more realistic gastric peristalsis on the intestinal lipolysis of protein-stabilized O/W emulsions, the emulsions were subjected to two different in vitro digestion methodologies: (i) gastric digestion in the IMGS and intestinal digestion in a stirred beaker (SB), and (ii) gastric and intestinal digestion assays carried out in SBs. At the end of the intestinal digestion, the total amount of free fatty acids released was significantly higher for the first methodology (IMGS-SB) in comparison with the second one (27.5% vs. 23.0%), probably due to the higher physical instability induced by the IMGS in the gastric contents. These results reaffirm that O/W emulsion stability plays a crucial role in controlling the final extent of lipolysis of this kind of food-grade emulsion. Copyright © 2016 Elsevier Ltd. All rights reserved.
Projecting adverse event incidence rates using empirical Bayes methodology.
Ma, Guoguang Julie; Ganju, Jitendra; Huang, Jing
2016-08-01
Although there is considerable interest in adverse events observed in clinical trials, projecting adverse event incidence rates in an extended period can be of interest when the trial duration is limited compared to clinical practice. A naïve method for making projections might involve modeling the observed rates into the future for each adverse event. However, such an approach overlooks the information that can be borrowed across all the adverse event data. We propose a method that weights each projection using a shrinkage factor; the adverse event-specific shrinkage is a probability, based on empirical Bayes methodology, estimated from all the adverse event data, reflecting evidence in support of the null or non-null hypotheses. Also proposed is a technique to estimate the proportion of true nulls, called the common area under the density curves, which is a critical step in arriving at the shrinkage factor. The performance of the method is evaluated by projecting from interim data and then comparing the projected results with observed results. The method is illustrated on two data sets. © The Author(s) 2013.
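A generic empirical-Bayes shrinkage sketch along these lines is shown below. It is not the authors' estimator (which weights projections by hypothesis-level evidence and uses the common area under the density curves to estimate the proportion of true nulls); counts and exposures are made up.

```python
"""Generic empirical-Bayes shrinkage sketch: each adverse event's observed
incidence rate is shrunk toward the pooled rate, with a weight playing the
role of the probability that the event-specific signal is null."""
import numpy as np

events = np.array([4, 0, 12, 2, 7, 1])                       # AE counts per event type
exposure = np.array([500., 480., 520., 510., 495., 505.])    # patient-years

rate = events / exposure
pooled = events.sum() / exposure.sum()

# Method-of-moments estimate of the between-event variance (empirical Bayes).
var_within = pooled / exposure            # Poisson sampling variance of each rate
var_between = max(np.var(rate) - var_within.mean(), 1e-12)

# Shrinkage weight: fraction of total variance explained by sampling noise;
# acts like the probability of "no real difference from the pooled rate".
w = var_within / (var_within + var_between)

projected = w * pooled + (1 - w) * rate   # rates used for projection
for e, r, p in zip(events, rate, projected):
    print(f"count={e:2d}  observed={r:.4f}  shrunk={p:.4f}")
```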
Capone, A; Cicchetti, A; Mennini, F S; Marcellusi, A; Baio, G; Favato, G
2016-01-01
Healthcare expenses will be the most relevant policy issue for most governments in the EU and the USA. This expenditure can be associated with two major categories of drivers: demographic and economic. The factors driving healthcare expenditure have rarely been recognised, measured and comprehended. An improvement in health data generation and analysis is mandatory, and in order to tackle healthcare spending growth, it may be useful to design and implement an effective, advanced system to generate and analyse these data. A methodological approach based on Health Data Entanglement (HDE) can be a suitable option. By definition, in the HDE a large number of data sets from several sources are functionally interconnected and computed through learning machines that generate patterns of highly probable future health conditions of a population. The entanglement concept is borrowed from quantum physics and means that multiple particles (information) are linked together in such a way that the measurement of one particle's quantum state (individual health conditions and related economic requirements) determines the possible quantum states of the other particles (population health forecasts to predict their impact). The value created by the HDE is based on the combined evaluation of clinical, economic and social effects generated by health interventions. To predict the future health conditions of a population, analyses of data are performed using self-learning AI, in which sequential decisions are based on Bayesian algorithmic probabilities. HDE and AI-based analysis can be adopted to improve the effectiveness of the health governance system in ways that also lead to better quality of care.
Performances of the New Real Time Tsunami Detection Algorithm applied to tide gauges data
NASA Astrophysics Data System (ADS)
Chierici, F.; Embriaco, D.; Morucci, S.
2017-12-01
Real-time tsunami detection algorithms play a key role in any Tsunami Early Warning System. We have developed a new algorithm for tsunami detection (TDA) based on real-time tide removal and real-time band-pass filtering of seabed pressure time series acquired by Bottom Pressure Recorders. The TDA algorithm greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability with respect to the most widely used tsunami detection algorithm, while containing the computational cost. The algorithm is designed to be used also in autonomous early warning systems, with a set of input parameters and procedures which can be reconfigured in real time. We have also developed a methodology based on Monte Carlo simulations to test tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability and the detection delay, which are functions of the tsunami amplitude and wavelength, and the rate of false alarms. In this work we present the performance of the TDA algorithm applied to tide gauge data. We have adapted the new tsunami detection algorithm and the Monte Carlo test methodology to tide gauges. Sea level data acquired by coastal tide gauges in different locations and environmental conditions have been used in order to consider real working scenarios in the test. We also present an application of the algorithm to the tsunami event generated by the Tohoku earthquake on March 11, 2011, using data recorded by several tide gauges scattered all over the Pacific area.
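A much-simplified sketch of that kind of processing chain (tide removal, band-pass filtering in an assumed tsunami band, amplitude thresholding) is given below; the sampling rate, band edges, threshold and synthetic record are illustrative assumptions, not the TDA settings.

```python
"""Simplified detection sketch in the spirit of the TDA: band-pass the
de-meaned pressure record in an assumed tsunami band and flag a detection
when the filtered amplitude exceeds a threshold."""
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1.0 / 15.0                      # one pressure sample every 15 s [Hz]
t = np.arange(0, 12 * 3600, 15.0)    # 12 hours of synthetic data

# Synthetic record: semidiurnal tide + noise + a small "tsunami" pulse.
rng = np.random.default_rng(2)
tide = 1.5 * np.sin(2 * np.pi * t / (12.42 * 3600))
tsunami = 0.05 * np.exp(-((t - 8 * 3600) / 600.0) ** 2) * np.sin(2 * np.pi * t / 900.0)
p = tide + tsunami + 0.005 * rng.standard_normal(t.size)

# Band-pass for periods between ~2 min and ~2 h (an assumed tsunami band).
low, high = 1.0 / 7200.0, 1.0 / 120.0
b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
filtered = filtfilt(b, a, p - p.mean())

threshold = 0.02                     # metres; illustrative only
hits = np.where(np.abs(filtered) > threshold)[0]
if hits.size:
    print(f"first detection at t = {t[hits[0]] / 3600.0:.2f} h")
else:
    print("no detection")
```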
Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance
NASA Astrophysics Data System (ADS)
Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra
2017-06-01
In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for quantifying risk and ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient captures risk influenced by uncertainty and sensitivity, besides other factors. The third, the hazardous risk coefficient, accounts for anticipated hazards that may occur in the future; this risk is deduced from criteria covering consequences for safety, the environment, maintenance and economics, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence, random number simulation is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The ranking of critical items obtained with the developed mathematical model for risk assessment should be useful for minimizing financial losses and for timing maintenance actions.
Defining a reference set to support methodological research in drug safety.
Ryan, Patrick B; Schuemie, Martijn J; Welebob, Emily; Duke, Jon; Valentine, Sarah; Hartzema, Abraham G
2013-10-01
Methodological research to evaluate the performance of methods requires a benchmark to serve as a referent comparison. In drug safety, the performance of analyses of spontaneous adverse event reporting databases and observational healthcare data, such as administrative claims and electronic health records, has been limited by the lack of such standards. The objective was to establish a reference set of test cases containing both positive and negative controls, which can serve as the basis for methodological research in evaluating method performance in identifying drug safety issues. Systematic literature review and natural language processing of structured product labeling were performed to identify evidence to support the classification of drugs as either positive controls or negative controls for four outcomes: acute liver injury, acute kidney injury, acute myocardial infarction, and upper gastrointestinal bleeding. Three hundred and ninety-nine test cases comprising 165 positive controls and 234 negative controls were identified across the four outcomes. The majority of positive controls for acute kidney injury and upper gastrointestinal bleeding were supported by randomized clinical trial evidence, while the majority of positive controls for acute liver injury and acute myocardial infarction were supported only by published case reports. Literature estimates for the positive controls show substantial variability that limits the ability to establish a reference set with known effect sizes. A reference set of test cases can be established to facilitate methodological research in drug safety. Creating a sufficient sample of drug-outcome pairs with a binary classification of having no effect (negative controls) or an increased effect (positive controls) is possible and can enable estimation of predictive accuracy through discrimination. Since the magnitude of the positive effects cannot be reliably obtained and the quality of evidence may vary across outcomes, assumptions are required to use the test cases in real data for purposes of measuring bias, mean squared error, or coverage probability.
An Inverse Problem for a Class of Conditional Probability Measure-Dependent Evolution Equations
Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.
2016-01-01
We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by Partial Differential Equation (PDE) models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach. PMID:28316360
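In schematic form, with notation simplified from the abstract, the estimator is a constrained least squares problem over admissible probability measures, typically discretized by finite convex combinations of Dirac measures (a standard Prohorov-metric-style approximation; the symbols below are illustrative, not the paper's exact notation).

```latex
% Least-squares estimation of a conditional probability measure (schematic):
% \mathcal{P}(\Omega) is the admissible set of measures, u(t_j;\mu) the PDE
% model output, and d_j the observations.
\hat{\mu} \;=\; \operatorname*{arg\,min}_{\mu \in \mathcal{P}(\Omega)}
  \sum_{j=1}^{n} \bigl| u(t_j;\mu) - d_j \bigr|^{2},
\qquad
\mu \;\approx\; \sum_{k=1}^{K} w_k\, \delta_{x_k},
\quad w_k \ge 0,\ \ \sum_{k=1}^{K} w_k = 1 .
```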
Pollitz, F.F.; Schwartz, D.P.
2008-01-01
We construct a viscoelastic cycle model of plate boundary deformation that includes the effects of time-dependent interseismic strain accumulation, coseismic strain release, and viscoelastic relaxation of the substrate beneath the seismogenic crust. For a given fault system, time-averaged stress changes at any point (not on a fault) are constrained to zero; that is, kinematic consistency is enforced for the fault system. The dates of last rupture, mean recurrence times, and the slip distributions of the (assumed) repeating ruptures are key inputs into the viscoelastic cycle model. This simple formulation allows construction of stress evolution at all points in the plate boundary zone for purposes of probabilistic seismic hazard analysis (PSHA). Stress evolution is combined with a Coulomb failure stress threshold at representative points on the fault segments to estimate the times of their respective future ruptures. In our PSHA we consider uncertainties in a four-dimensional parameter space: the rupture periodicities, slip distributions, times of last earthquake (for prehistoric ruptures) and Coulomb failure stress thresholds. We apply this methodology to the San Francisco Bay region using a recently determined fault chronology of area faults. Assuming single-segment rupture scenarios, we find that future rupture probabilities of area faults in the coming decades are highest for the southern Hayward, Rodgers Creek, and northern Calaveras faults. This conclusion is qualitatively similar to that of the Working Group on California Earthquake Probabilities, but the probabilities derived here are significantly higher. Given that fault rupture probabilities are highly model-dependent, no single model should be used to assess time-dependent rupture probabilities. We suggest that several models, including the present one, be used in a comprehensive PSHA methodology, as was done by the Working Group on California Earthquake Probabilities.
Crovelli, R.A.; Balay, R.H.
1991-01-01
A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model, with an analytic aggregation methodology based on probability theory rather than Monte Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g., geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where, for example, the probability of more than F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations: (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in the Turbo Pascal 4.0 language to perform the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate aggregation routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT compatible microcomputer with 256 kbytes of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132-column printer. A graphics adapter and color display are optional.
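A small sketch of the analytic aggregation logic described above (not the TRIAGG code) is shown below: triangular moments per component, then aggregation under the two polar dependence assumptions. The input estimates are hypothetical.

```python
"""Sketch of analytic aggregation with triangular(min, mode, max) components:
moments per component, then aggregation under complete independence
(variances add) and perfect positive correlation (standard deviations add)."""
import math

# (minimum, most likely, maximum) resource estimates per geologic province
components = [(0.0, 2.0, 10.0), (1.0, 3.0, 6.0), (0.5, 1.0, 4.0)]

def tri_moments(a, m, b):
    mean = (a + m + b) / 3.0
    var = (a * a + m * m + b * b - a * m - a * b - m * b) / 18.0
    return mean, var

means, variances = zip(*(tri_moments(*c) for c in components))

agg_mean = sum(means)
sd_independent = math.sqrt(sum(variances))              # complete independence
sd_correlated = sum(math.sqrt(v) for v in variances)    # perfect positive correlation

print(f"aggregate mean = {agg_mean:.2f}")
print(f"aggregate s.d.: {sd_independent:.2f} (independent) "
      f"to {sd_correlated:.2f} (perfectly correlated)")
```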
Wind/tornado design criteria, development to achieve required probabilistic performance goals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ng, D.S.
1991-06-01
This paper describes the strategy for developing new design criteria for a critical facility to withstand loading induced by the wind/tornado hazard. The proposed design requirements for resisting wind/tornado loads are based on probabilistic performance goals. The proposed design criteria were prepared by a Working Group consisting of six experts in wind/tornado engineering and meteorology. Utilizing their best technical knowledge and judgment in the wind/tornado field, they met and discussed the methodologies and reviewed available data. A review of the available wind/tornado hazard model for the site, structural response evaluation methods, and conservative acceptance criteria led to proposed design criteria that have a high probability of achieving the required performance goals.
Fracture Probability of MEMS Optical Devices for Space Flight Applications
NASA Technical Reports Server (NTRS)
Fettig, Rainer K.; Kuhn, Jonathan L.; Moseley, S. Harvey; Kutyrev, Alexander S.; Orloff, Jon
1999-01-01
A bending fracture test specimen design is presented for thin elements used in optical devices for space flight applications. The specimen design is insensitive to load position, avoids end effect complications, and can be used to measure strength of membranes less than 2 microns thick. The theoretical equations predicting stress at failure are presented, and a detailed finite element model is developed to validate the equations for this application. An experimental procedure using a focused ion beam machine is outlined, and results from preliminary tests of 1.9 microns thick single crystal silicon are presented. These tests are placed in the context of a methodology for the design and evaluation of mission critical devices comprised of large arrays of cells.
Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López
2014-12-01
This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled as discrete and Boolean variables with constant failure rates via fault trees. Nevertheless, in many cases, it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distributions. Thus, this approach makes the methodology particularly useful in cases where the available data for quantification of hazardous events probabilities are scarce or nonexistent, there is dependence among events, or when nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating, storage, and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. Thus, this natural gas is liquefied for shipping and the storage and regasification process usually occurs at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.
Experimental Evaluation Methodology for Spacecraft Proximity Maneuvers in a Dynamic Environment
2017-06-01
Naval Postgraduate School, Monterey, California. Dissertation (2014-2017). Approved for public release; distribution is unlimited.
Beginning without a Conclusion.
ERIC Educational Resources Information Center
Frazier, Richard
1988-01-01
Describes a series of activities without conclusions to introduce scientific reasoning in a ninth grade physical science course. Uses popcorn popping to get students to think about the concepts of graphing, histograms, frequency, probability, and scientific methodology. (CW)
Smith, Adam B; Cocks, Kim; Parry, David; Taylor, Matthew
2014-04-01
The inclusion of patient-reported outcome (PRO) instruments to record patient health-related quality of life (HRQOL) data has virtually become the norm in oncology randomised controlled trials (RCTs). Despite this fact, recent concerns have focused on the quality of reporting of HRQOL. The primary aim of this study was to evaluate the quality of reporting of HRQOL data from two common instruments in oncology RCTs. A meta-review was undertaken of systematic reviews reporting HRQOL data collected using PRO instruments in oncology RCTs. English language articles published between 2000 and 2012 were included and evaluated against a methodology checklist. Four hundred and thirty-five potential articles were identified. Six systematic reviews were included in the analysis. A total of 70,403 patients had completed PROs. The European Organization for Research and Treatment of Cancer QLQ-C30 and the Functional Assessment of Cancer Therapy-General questionnaire accounted for 55% of RCTs. Eighty per cent of RCTs had used psychometrically validated instruments; 70% reported culturally valid instruments and almost all reported the assessment timing (96%). Thirty per cent of RCTs reported clinical significance and missing data. In terms of methodological design, only 25% of RCTs could be categorised as probably robust. The majority of oncology RCTs have shortcomings in terms of reporting HRQOL data when assessed against regulatory and methodology guidelines. These limitations will need to be addressed if HRQOL data are to be used successfully to support clinical decision-making, treatment options and labelling claims in oncology.
Survival prediction of trauma patients: a study on US National Trauma Data Bank.
Sefrioui, I; Amadini, R; Mauro, J; El Fallahi, A; Gabbrielli, M
2017-12-01
Exceptional circumstances like major incidents or natural disasters may cause a huge number of victims who might not be immediately and simultaneously saved. In these cases it is important to define priorities, avoiding the waste of time and resources on victims who cannot be saved. The Trauma and Injury Severity Score (TRISS) methodology is the well-known, standard system usually used by practitioners to predict the survival probability of trauma patients. However, practitioners have noted that the accuracy of TRISS predictions is unacceptable, especially for severely injured patients. Thus, alternative methods should be proposed. In this work we evaluate different approaches for predicting whether a patient will survive or not according to simple and easily measurable observations. We conducted a rigorous, comparative study based on the most important prediction techniques using real clinical data from the US National Trauma Data Bank. Empirical results show that well-known machine learning classifiers can outperform the TRISS methodology. Based on our findings, the best approach we evaluated is Random Forest: it has the best accuracy, the best area under the curve and k-statistic, as well as the second-best sensitivity and specificity. It also has a good calibration curve. Furthermore, its performance monotonically increases as the dataset size grows, meaning that it can be very effective at exploiting incoming knowledge. Considering the whole dataset, it is always better than TRISS. Finally, we implemented a new tool to compute the survival of victims, which will help medical practitioners obtain better accuracy than the TRISS tools. Random Forest may be a good candidate solution for improving survival predictions over the standard TRISS methodology.
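The sketch below illustrates this kind of comparison workflow on simulated data rather than the National Trauma Data Bank: a TRISS-style logistic model (coefficients learned from the simulated data, not the published TRISS coefficients) against a Random Forest. It demonstrates the pipeline only, not the paper's results.

```python
"""Illustrative comparison on hypothetical data: a TRISS-style logistic
model versus a Random Forest trained on the same simple predictors."""
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 5000
rts = rng.uniform(0, 7.84, n)            # Revised Trauma Score (hypothetical)
iss = rng.integers(1, 75, n)             # Injury Severity Score (hypothetical)
age = rng.integers(16, 90, n)
# Hypothetical "true" survival mechanism used only to simulate labels.
logit = -0.5 + 0.9 * rts - 0.08 * iss - 0.02 * age
survived = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([rts, iss, age])
Xtr, Xte, ytr, yte = train_test_split(X, survived, test_size=0.3, random_state=0)

triss_like = LogisticRegression(max_iter=1000).fit(Xtr, ytr)   # TRISS-style form
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(Xtr, ytr)

for name, model in [("logistic (TRISS-like)", triss_like), ("random forest", rf)]:
    auc = roc_auc_score(yte, model.predict_proba(Xte)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```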
Potential of SNP markers for the characterization of Brazilian cassava germplasm.
de Oliveira, Eder Jorge; Ferreira, Cláudia Fortes; da Silva Santos, Vanderlei; de Jesus, Onildo Nunes; Oliveira, Gilmara Alvarenga Fachardo; da Silva, Maiane Suzarte
2014-06-01
High-throughput markers, such as SNPs, along with different methodologies were used to evaluate the applicability of the Bayesian approach and multivariate analysis in structuring the genetic diversity of cassava. The objective of the present work was to evaluate the diversity and genetic structure of the largest cassava germplasm bank in Brazil. Complementary methodological approaches such as discriminant analysis of principal components (DAPC), Bayesian analysis and analysis of molecular variance (AMOVA) were used to understand the structure and diversity of 1,280 accessions genotyped using 402 single nucleotide polymorphism markers. The genetic diversity (0.327) and the average observed heterozygosity (0.322) were high considering the bi-allelic markers. In terms of population structure, a complex genetic structure was observed, indicating the formation of 30 clusters by DAPC and 34 clusters by Bayesian analysis. Both methodologies presented difficulties and controversies in terms of allocating some accessions to specific clusters. However, the clusters suggested by the DAPC analysis seemed more consistent, as they presented a higher probability of allocation of the accessions within the clusters. Prior information related to breeding patterns and geographic origins of the accessions was not sufficient to provide clear differentiation between the clusters according to the AMOVA. In contrast, the F ST was maximized when considering the clusters suggested by the Bayesian and DAPC analyses. The high frequency of germplasm exchange between producers and the subsequent alteration of the name of the same material may be one of the causes of the low association between genetic diversity and geographic origin. The results of this study may benefit cassava germplasm conservation programs and contribute to the maximization of genetic gains in breeding programs.
Coleman, Laci S.; Ford, W. Mark; Dobony, Christopher A.; Britzke, Eric R.
2014-01-01
Concomitant with the emergence and spread of white-nose syndrome (WNS) and the precipitous decline of many bat species in North America, natural resource managers need modified and/or new techniques for bat inventory and monitoring that provide robust occupancy estimates. We used Anabat acoustic detectors to determine the most efficient passive acoustic sampling design for optimizing detection probabilities of multiple bat species in a WNS-impacted environment in New York, USA. Our sampling protocol included six acoustic stations deployed for the entire duration of monitoring, as well as a 4 x 4 grid and five transects of 5-10 acoustic units deployed for 6-8 night sample durations during the summers of 2011-2012. We used Program PRESENCE to determine detection probability and site occupancy estimates. Overall, the grid produced the highest detection probabilities for most species because it contained the most detectors and intercepted the greatest spatial area. However, big brown bats (Eptesicus fuscus) and species not impacted by WNS were detected easily regardless of sampling array. Endangered Indiana bats (Myotis sodalis), little brown bats (Myotis lucifugus) and tri-colored bats (Perimyotis subflavus) showed declines in detection probabilities over our study, potentially indicative of continued WNS-associated declines. Identification of species presence through efficient methodologies is vital for future conservation efforts as bat populations decline further due to WNS and other factors.
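For readers unfamiliar with the occupancy framework behind these estimates, the following generic single-season occupancy sketch (a stand-in for the Program PRESENCE analysis, with made-up detection histories and a constant detection probability) shows how occupancy and nightly detection probability are jointly estimated by maximum likelihood.

```python
"""Minimal single-season occupancy sketch: maximize the likelihood of
detection histories to estimate occupancy (psi) and nightly detection
probability (p)."""
import numpy as np
from scipy.optimize import minimize

# rows = sites, columns = survey nights (1 = species detected); made up
Y = np.array([
    [0, 1, 0, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [1, 1, 0, 1, 1, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
])

def neg_log_lik(params):
    psi, p = 1 / (1 + np.exp(-params))   # logit parameterization keeps (0, 1)
    det = Y.sum(axis=1)
    J = Y.shape[1]
    lik_if_occupied = psi * p**det * (1 - p)**(J - det)
    lik_site = lik_if_occupied + (1 - psi) * (det == 0)   # all-zero histories
    return -np.sum(np.log(lik_site))

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(f"occupancy = {psi_hat:.2f}, nightly detection probability = {p_hat:.2f}")
```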
[CALCULATION OF THE PROBABILITY OF METALS INPUT INTO AN ORGANISM WITH DRINKING POTABLE WATERS].
Tunakova, Yu A; Fayzullin, R I; Valiev, V S
2015-01-01
The work was performed in the framework of the State program for improving the competitiveness of Kazan (Volga) Federal University among the world's leading research and education centers, and under subsidies allocated to Kazan Federal University for carrying out state-assigned tasks in the field of scientific research. The current methodological recommendations, "Guide for assessing the risk to public health under the influence of chemicals that pollute the environment" (P 2.1.10.1920-04), regulate the determination of quantitative and/or qualitative characteristics of the harmful effects to human health from exposure to environmental factors. We propose to complement the methodological approaches presented in P 2.1.10.1920-04 with an estimate of the probability of pollutants entering the body with drinking water, which is greater the more the actual concentrations of the substances exceed background concentrations. The paper proposes a method for calculating the probability that actual concentrations of metal cations exceed background levels in samples of drinking water consumed by the population; samples were collected at the end points of consumption, in houses and apartments, to account for secondary pollution in water pipelines and distribution paths. The research was performed on the example of Kazan, divided into zones. The probabilities were calculated using Bayes' theorem.
Revision of Time-Independent Probabilistic Seismic Hazard Maps for Alaska
Wesson, Robert L.; Boyd, Oliver S.; Mueller, Charles S.; Bufe, Charles G.; Frankel, Arthur D.; Petersen, Mark D.
2007-01-01
We present here time-independent probabilistic seismic hazard maps of Alaska and the Aleutians for peak ground acceleration (PGA) and 0.1, 0.2, 0.3, 0.5, 1.0 and 2.0 second spectral acceleration at probability levels of 2 percent in 50 years (annual probability of 0.000404), 5 percent in 50 years (annual probability of 0.001026) and 10 percent in 50 years (annual probability of 0.0021). These maps represent a revision of existing maps based on newly obtained data and assumptions reflecting best current judgments about methodology and approach. These maps have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States. A significant improvement relative to the 2002 methodology is the ability to include variable slip rate along a fault where appropriate. These maps incorporate new data, the responses to comments received at workshops held in Fairbanks and Anchorage, Alaska, in May, 2005, and comments received after draft maps were posted on the National Seismic Hazard Mapping Web Site. These maps will be proposed for adoption in future revisions to the International Building Code. In this documentation we describe the maps and in particular explain and justify changes that have been made relative to the 1999 maps. We are also preparing a series of experimental maps of time-dependent hazard that will be described in future documents.
NASA Technical Reports Server (NTRS)
Scalzo, F.
1983-01-01
Sensor redundancy management (SRM) requires a system which will detect failures and reconfigure the avionics accordingly. A probability density function for determining false alarm rates was generated using an algorithmic approach. Microcomputer software was developed to print out tables of values for the cumulative probability of being in the domain of failure, system reliability, and the false alarm probability given that a signal is in the domain of failure. The microcomputer software was applied to sensor output data for various AFTI F-16 flights and sensor parameters. Practical recommendations for further research were made.
Characterization of autoregressive processes using entropic quantifiers
NASA Astrophysics Data System (ADS)
Traversaro, Francisco; Redelico, Francisco O.
2018-01-01
The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow distinguishing between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
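A brief sketch of the Bandt-Pompe probability estimate underlying such planes is given below; the embedding dimension, the test series and the AR(1) coefficient are illustrative choices, not the paper's settings.

```python
"""Bandt-Pompe sketch: ordinal-pattern frequencies of a time series and the
resulting normalized permutation entropy."""
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, d=4, tau=1):
    """Normalized Shannon entropy of ordinal patterns of length d."""
    patterns = Counter()
    for i in range(len(x) - (d - 1) * tau):
        window = x[i:i + d * tau:tau]
        patterns[tuple(np.argsort(window))] += 1
    total = sum(patterns.values())
    probs = [c / total for c in patterns.values()]
    H = -sum(p * math.log(p) for p in probs)
    return H / math.log(math.factorial(d))   # normalize to [0, 1]

rng = np.random.default_rng(4)
# AR(1) process: correlated, so its entropy sits below that of white noise.
ar = np.zeros(5000)
for i in range(1, ar.size):
    ar[i] = 0.8 * ar[i - 1] + rng.standard_normal()

print("AR(1):      ", round(permutation_entropy(ar), 3))
print("white noise:", round(permutation_entropy(rng.standard_normal(5000)), 3))
```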
[Evaluation of arguments in research reports].
Botes, A
1999-06-01
Some authors on research methodology are of the opinion that research reports are based on the logic of reasoning and that such reports communicate with the reader by presenting logical, coherent arguments (Böhme, 1975:206; Mouton, 1996:69). This view implies that researchers draw specific conclusions and that such conclusions are justified by way of reasoning (Doppelt, 1998:105; Giere, 1984:26; Harre, 1965:11; Leherer & Wagner, 1983; Pitt, 1988:7). The structure of a research report thus consists mainly of conclusions and reasons for such conclusions (Booth, Colomb & Williams, 1995:97). From this it appears that justification by means of reasoning is a standard procedure in research and research reports. Despite the fact that the logic of research is based on reasoning, that the justification of research findings by way of reasoning appears to be standard procedure, and that the structure of a research report comprises arguments, the evaluation or assessment of research as described in most textbooks on research methodology (Burns & Grove, 1993:647; Creswell, 1994:193; LoBiondo-Wood & Haber, 1994:441/481) does not focus on the arguments of research. The evaluation criteria for research reports set in these textbooks relate to the way in which the research process is carried out and focus on the measures for internal, external, theoretical, measurement and inferential validity. This means that criteria for the evaluation of research are comprehensive and must be very specific for each type of research (for example quantitative or qualitative). When the evaluation of research reports is focused on arguments and logic, there could probably be one set of universal standards against which all types of human science research reports can be assessed. Such a universal set of standards could simplify the evaluation of research reports in the human sciences, since it could be used to assess all the critical aspects of research reports. As arguments form the basic structure of research reports and are probably also important in the evaluation of research reports in the human sciences, the following questions, which this paper aims to answer, are relevant: What are the standards which the reasoning in research reports in the human sciences should meet? How can research reports in the human sciences be assessed or evaluated according to these standards? In answering the first question, the logical demands made on reasoning in research are investigated. From these demands, the acceptability of the statements and the relevance and support of the premises to the conclusion are set as standards for reasoning in research. In answering the second question, a research article is used to demonstrate how the macro- and micro-arguments of research reports can be assessed or evaluated according to these standards. The evaluation indicates that the aspects of internal, external, theoretical, measurement and inferential validity can be assessed according to these standards.
Combined UMC- DFT prediction of electron-hole coupling in unit cells of pentacene crystals.
Leal, Luciano Almeida; de Souza Júnior, Rafael Timóteo; de Almeida Fonseca, Antonio Luciano; Ribeiro Junior, Luiz Antonio; Blawid, Stefan; da Silva Filho, Demetrio Antonio; da Cunha, Wiliam Ferreira
2017-05-01
Pentacene is an organic semiconductor that draws special attention from the scientific community due to the high mobility of its charge carriers. As electron-hole interactions are an important aspect of this property, a computationally inexpensive method to predict the coupling between these quasi-particles is highly desired. In this work, we propose a hybrid methodology combining Uncoupled Monte Carlo (UMC) simulations and Density Functional Theory (DFT) to obtain a good compromise between computational feasibility and accuracy. As a first step towards a Pentacene crystal, we describe its unit cell: the Pentacene dimer. Because many conformations can be encountered for the dimer, and considering the complexity of the system, we make use of UMC to find the most probable structures and relative orientations of the Pentacene-Pentacene complex. We then carry out electronic structure calculations within DFT with the goal of describing the electron-hole coupling for the most probable configurations obtained by UMC. The comparison of our results with previously reported data in the literature suggests that the methodology is well suited for describing transfer integrals of organic semiconductors. The observed accuracy, together with the smaller computational cost required by our approach, allows us to conclude that this methodology may be an important tool for describing systems of higher complexity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swamy, S.A.; Bhowmick, D.C.; Prager, D.E.
The regulatory requirements for postulated pipe ruptures have changed significantly since the first nuclear plants were designed. The Leak-Before-Break (LBB) methodology is now accepted as a technically justifiable approach for eliminating the postulation of double-ended guillotine breaks (DEGB) in high energy piping systems. The previous pipe rupture design requirements for nuclear power plant applications are responsible for the numerous and massive pipe whip restraints and jet shields installed at each plant. This results in significant plant congestion, increased labor costs and radiation dosage for normal maintenance and inspection. The restraints also increase the probability of interference between the piping and supporting structures during plant heatup, thereby potentially impacting overall plant reliability. The LBB approach to eliminating postulated ruptures in high energy piping systems is a significant improvement over former regulatory methodologies, and therefore the LBB approach to design is gaining worldwide acceptance. However, the methods and criteria for LBB evaluation depend upon the policies of individual countries, and significant effort continues towards accomplishing uniformity on a global basis. In this paper the historical development of the U.S. LBB criteria will be traced and the results of an LBB evaluation for a typical Japanese PWR primary loop applying U.S. NRC approved methods will be presented. In addition, another approach using the Japanese LBB criteria will be shown and compared with the U.S. criteria. The comparison will be highlighted in this paper with detailed discussion.
Evaluation of Enhanced Risk Monitors for Use on Advanced Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Veeramany, Arun; Bonebrake, Christopher A.
This study provides an overview of the methodology for integrating time-dependent failure probabilities into nuclear power reactor risk monitors. This prototypic enhanced risk monitor (ERM) methodology was evaluated using a hypothetical probabilistic risk assessment (PRA) model, generated using a simplified design of a liquid-metal-cooled advanced reactor (AR). Component failure data from an industry compilation of failures of components similar to those in the simplified AR model were used to initialize the PRA model. Core damage frequency (CDF) over time was computed and analyzed. In addition, a study on alternative risk metrics for ARs was conducted. Risk metrics that quantify the normalized cost of repairs, replacements, or other operations and management (O&M) actions were defined and used, along with an economic model, to compute the likely economic risk of future actions such as deferred maintenance based on the anticipated change in CDF due to current component condition and future anticipated degradation. Such integration of conventional-risk metrics with alternate-risk metrics provides a convenient mechanism for assessing the impact of O&M decisions on the safety and economics of the plant. It is expected that, when integrated with supervisory control algorithms, such integrated-risk monitors will provide a mechanism for real-time control decision-making that ensures safety margins are maintained while operating the plant in an economically viable manner.
Computer-aided personal interviewing. A new technique for data collection in epidemiologic surveys.
Birkett, N J
1988-03-01
Most epidemiologic studies involve the collection of data directly from selected respondents. Traditionally, interviewers are provided with the interview in booklet form on paper and answers are recorded therein. On receipt at the study office, the interview results are coded, transcribed, and keypunched for analysis. The author's team has developed a method of personal interviewing which uses a structured interview stored on a lap-sized computer. Responses are entered into the computer and are subject to immediate error-checking and correction. All skip-patterns are automatic. Data entry to the final data-base involves no manual data transcription. A pilot evaluation with a preliminary version of the system using tape-recorded interviews in a test/re-test methodology revealed a slightly higher error rate, probably related to weaknesses in the pilot system and the training process. Computer interviews tended to be longer but other features of the interview process were not affected by computer. The author's team has now completed 2,505 interviews using this system in a community-based blood pressure survey. It has been well accepted by both interviewers and respondents. Failure to complete an interview on the computer was uncommon (5 per cent) and well-handled by paper back-up questionnaires. The results show that computer-aided personal interviewing in the home is feasible but that further evaluation is needed to establish the impact of this methodology on overall data quality.
Event-scale power law recession analysis: quantifying methodological uncertainty
NASA Astrophysics Data System (ADS)
Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.
2017-01-01
The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship between the power-law recession scale parameter and catchment antecedent wetness varies depending on recession definition and fitting choices. Considering study results, we recommend a combination of four key methodological decisions to maximize the quality of fitted recession curves, and to minimize bias in the related populations of fitted recession parameters.
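For concreteness, a minimal sketch of one common fitting choice for a single event, ordinary least squares in log-log space for -dQ/dt = a Q^b, is given below on a synthetic recession; it represents just one of the many definition and fitting combinations whose sensitivity the study examines.

```python
"""Minimal sketch: fit the power-law recession model -dQ/dt = a Q^b to one
recession event by ordinary least squares in log-log space."""
import numpy as np

# Synthetic daily recession generated from known parameters (a=0.05, b=1.5).
a_true, b_true = 0.05, 1.5
Q = [10.0]
for _ in range(20):
    Q.append(Q[-1] - a_true * Q[-1] ** b_true)   # explicit Euler, dt = 1 day
Q = np.array(Q)

dQdt = np.diff(Q)                # per-day change (negative during recession)
Qmid = 0.5 * (Q[:-1] + Q[1:])    # evaluate Q at interval midpoints

X = np.column_stack([np.ones(Qmid.size), np.log(Qmid)])
coef, *_ = np.linalg.lstsq(X, np.log(-dQdt), rcond=None)
a_hat, b_hat = np.exp(coef[0]), coef[1]
print(f"fitted a = {a_hat:.3f}, b = {b_hat:.2f} (true: {a_true}, {b_true})")
```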
Estimating the probability that the Taser directly causes human ventricular fibrillation.
Sun, H; Haemmerich, D; Rahko, P S; Webster, J G
2010-04-01
This paper describes the first methodology and results for estimating the order of probability for Tasers directly causing human ventricular fibrillation (VF). The probability of an X26 Taser causing human VF was estimated using: (1) current density near the human heart estimated by using 3D finite-element (FE) models; (2) prior data of the maximum dart-to-heart distances that caused VF in pigs; (3) minimum skin-to-heart distances measured in erect humans by echocardiography; and (4) dart landing distribution estimated from police reports. The estimated mean probability of human VF was 0.001 for data from a pig having a chest wall resected to the ribs and 0.000006 for data from a pig with no resection when inserting a blunt probe. The VF probability for a given dart location decreased with the dart-to-heart horizontal distance (radius) on the skin surface.
NASA Astrophysics Data System (ADS)
Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.
2018-02-01
Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
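The sketch below conveys the random-set bounding idea with plain Monte Carlo rather than subset simulation (so it is suitable only for moderate failure probabilities); the probability boxes, limit state and monotonicity assumption are hypothetical.

```python
"""Illustrative random-set / probability-box calculation: each input has an
interval-valued mean, every sample yields a focal interval, and counting
gives lower and upper bounds on the failure probability."""
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n = 200_000
threshold = 12.0                      # failure when x1 + x2 > threshold

# Probability boxes: normal CDFs with an epistemic interval on the mean.
mu_bounds = [(4.0, 4.5), (5.0, 5.6)]  # (lower mean, upper mean) per input
sigma = [1.0, 1.2]

alpha = rng.random((n, 2))            # one uniform level per input per sample
lo = np.column_stack([norm.ppf(alpha[:, i], mu_bounds[i][0], sigma[i]) for i in range(2)])
hi = np.column_stack([norm.ppf(alpha[:, i], mu_bounds[i][1], sigma[i]) for i in range(2)])

# The response x1 + x2 is monotone in both inputs, so the corners of each
# focal box give its extreme values.
lower_pf = np.mean(lo.sum(axis=1) > threshold)   # every point of the box fails
upper_pf = np.mean(hi.sum(axis=1) > threshold)   # at least one point fails
print(f"failure probability bounds: [{lower_pf:.4f}, {upper_pf:.4f}]")
```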
Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005
ERIC Educational Resources Information Center
Coffman, Julia, Ed.
2005-01-01
This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sacks, H.K.; Novak, T.
2008-03-15
During the past decade, several methane/air explosions in abandoned or sealed areas of underground coal mines have been attributed to lightning. Previously published work by the authors showed, through computer simulations, that currents from lightning could propagate down steel-cased boreholes and ignite explosive methane/air mixtures. The presented work expands on the model and describes a methodology based on IEEE Standard 1410-2004 to estimate the probability of an ignition. The methodology provides a means to better estimate the likelihood that an ignition could occur underground and, more importantly, allows the calculation of what-if scenarios to investigate the effectiveness of engineering controls to reduce the hazard. The computer software used for calculating fields and potentials is also verified by comparing computed results with an independently developed theoretical model of electromagnetic field propagation through a conductive medium.
Methodology for the systems engineering process. Volume 3: Operational availability
NASA Technical Reports Server (NTRS)
Nelson, J. H.
1972-01-01
A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
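As a small illustration of the availability concept discussed in the report (a generic steady-state formula, not the report's operational availability formulation), the following sketch compares the analytic uptime ratio with a Monte Carlo simulation of alternating exponential up and down periods; the rates are hypothetical.

```python
"""Generic availability sketch: steady-state ratio of uptime to total time,
checked against a Monte Carlo simulation of alternating exponential up and
down periods."""
import numpy as np

mtbf, mttr = 400.0, 8.0                   # mean up and down durations [h]
a_analytic = mtbf / (mtbf + mttr)

rng = np.random.default_rng(9)
up = rng.exponential(mtbf, 10_000)
down = rng.exponential(mttr, 10_000)
a_simulated = up.sum() / (up.sum() + down.sum())

print(f"analytic availability  = {a_analytic:.4f}")
print(f"simulated availability = {a_simulated:.4f}")
```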
Seismic Characterization of EGS Reservoirs
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.
2014-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS dataset and compare it to another EGS dataset. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
The fuzzy cube and causal efficacy: representation of concomitant mechanisms in stroke.
Jobe, Thomas H.; Helgason, Cathy M.
1998-04-01
Twentieth century medical science has embraced nineteenth century Boolean probability theory based upon two-valued Aristotelian logic. With the later addition of bit-based, von Neumann structured computational architectures, an epistemology based on randomness has led to a bivalent epidemiological methodology that dominates medical decision making. In contrast, fuzzy logic, based on twentieth century multi-valued logic, and computational structures that are content-addressed and adaptively modified, has advanced a new scientific paradigm for the twenty-first century. Diseases such as stroke involve multiple concomitant causal factors that are difficult to represent using conventional statistical methods. We tested which paradigm best represented this complex multi-causal clinical phenomenon: stroke. We show that the fuzzy logic paradigm better represented clinical complexity in cerebrovascular disease than current probability theory based methodology. We believe this finding is generalizable to all of clinical science, since multiple concomitant causal factors are involved in nearly all known pathological processes.
The Cylindrical Component Methodology Evaluation Module for MUVES-S2
2017-04-01
ARL-TR-7990, April 2017. US Army Research Laboratory. By David S Butler, Marianne Kunkel, and Brian G Smith.
Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.
2016-02-16
Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
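A schematic sketch of this type of calculation is given below, with synthetic data and a plain percentile bootstrap standing in for the authors' uncertainty treatment; the quantile-matching step and the ensembles are illustrative assumptions.

```python
"""Sketch of the general approach: rescale the event threshold into model
space by quantile matching, compute exceedance probabilities in factual and
counterfactual ensembles, and bootstrap a one-sided lower bound on RR."""
import numpy as np

rng = np.random.default_rng(6)
obs = rng.normal(28.0, 2.0, 60)            # observed summer temperatures (synthetic)
model_fact = rng.normal(27.0, 2.5, 400)    # factual (all-forcings) ensemble
model_cfact = rng.normal(25.8, 2.5, 400)   # counterfactual (natural-only) ensemble
event = 30.0                               # magnitude of the event of interest

# Quantile matching: find the event's quantile in the observations and use
# the corresponding factual-model quantile as the model-space threshold.
q = np.mean(obs <= event)
thresh_model = np.quantile(model_fact, q)

def risk_ratio(fact, cfact, thr):
    p1 = np.mean(fact > thr)
    p0 = np.mean(cfact > thr)
    return np.inf if p0 == 0 else p1 / p0

rr_hat = risk_ratio(model_fact, model_cfact, thresh_model)

# Percentile bootstrap for a one-sided 90% lower bound on RR.
boot = [risk_ratio(rng.choice(model_fact, model_fact.size),
                   rng.choice(model_cfact, model_cfact.size), thresh_model)
        for _ in range(2000)]
print(f"RR estimate: {rr_hat:.2f}, 90% lower bound: {np.quantile(boot, 0.10):.2f}")
```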
Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P
2014-01-01
Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.
NASA Technical Reports Server (NTRS)
Dec, John A.; Braun, Robert D.
2011-01-01
A finite element ablation and thermal response program is presented for simulation of three-dimensional transient thermostructural analysis. The three-dimensional governing differential equations and finite element formulation are summarized. A novel probabilistic design methodology for thermal protection systems is presented. The design methodology is an eight-step process beginning with a parameter sensitivity study and followed by a deterministic analysis whereby an optimum design can be determined. The design process concludes with a Monte Carlo simulation where the probabilities of exceeding design specifications are estimated. The design methodology is demonstrated by applying it to the carbon phenolic compression pads of the Crew Exploration Vehicle. The maximum allowed values of bondline temperature and tensile stress are used as the design specifications in this study.
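A minimal sketch of the concluding Monte Carlo step, estimating the probability of exceeding two design specifications; the response surfaces, input distributions, and allowables are placeholders, not the CEV compression-pad model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical uncertain inputs (placeholders for material/trajectory parameters).
heat_load = rng.normal(1.00, 0.08, n)      # normalized heating
conductivity = rng.normal(1.00, 0.05, n)   # normalized through-thickness conductivity
modulus = rng.normal(1.00, 0.06, n)        # normalized modulus

# Hypothetical response surfaces from a prior deterministic/sensitivity study.
bondline_temp = 480.0 * heat_load * conductivity    # K
tensile_stress = 55.0 * heat_load * modulus         # MPa

T_ALLOW, S_ALLOW = 560.0, 70.0    # illustrative design specifications

print("P(T > allowable)      =", np.mean(bondline_temp > T_ALLOW))
print("P(stress > allowable) =", np.mean(tensile_stress > S_ALLOW))
print("P(either exceeded)    =", np.mean((bondline_temp > T_ALLOW) | (tensile_stress > S_ALLOW)))
```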
Method of self-consistent evaluation of absolute emission probabilities of particles and gamma rays
NASA Astrophysics Data System (ADS)
Badikov, Sergei; Chechev, Valery
2017-09-01
Under the assumption of a well-established decay scheme, the method provides a) exact balance relationships, b) lower (compared to the traditional techniques) uncertainties of recommended absolute emission probabilities of particles and gamma rays, and c) evaluation of correlations between the recommended emission probabilities (for the same and different decay modes). Application of the method to the decay data evaluation for even curium isotopes led to paradoxical results. The multidimensional confidence regions for the probabilities of the most intensive alpha transitions constructed on the basis of the present and the ENDF/B-VII.1, JEFF-3.1, and DDEP evaluations are inconsistent, whereas the confidence intervals for the evaluated probabilities of single transitions agree with each other.
NASA Astrophysics Data System (ADS)
Machado, Milena; Santos, Jane Meri; Reisen, Valdério Anselmo; Reis, Neyval Costa; Mavroidis, Ilias; Lima, Ana T.
2018-06-01
Air quality standards for settleable particulate matter (SPM) are found in many countries around the world. As is well known, annoyance caused by SPM can be considered a community problem even if only a small proportion of the population is bothered on rather infrequent occasions. Many authors have shown that SPM causes soiling in residential and urban environments and degradation of materials (e.g., objects and painted surfaces) that can impair the use and enjoyment of property and alter the normal activities of society. In this context, the main contribution of this paper is to propose guidance for establishing air quality standards for annoyance caused by SPM in metropolitan industrial areas. To attain this objective, a new methodology is proposed based on the nonlinear correlation between perceived annoyance (a qualitative variable) and the particle deposition rate (a quantitative variable). Since the response variable is binary (annoyed or not annoyed), a logistic regression model is used to estimate the probability of people being annoyed at different levels of particle deposition rate and to compute the odds ratio function, which gives, for a specific level of particle deposition rate, the estimated expected value of the population's perceived annoyance. The proposed methodology is verified on a data set measured in the metropolitan area of Great Vitória, Espirito Santo, Brazil. As a general conclusion, the estimated probability function of perceived annoyance as a function of SPM shows that 17% of inhabitants report annoyance at the very low particle deposition level of 5 g/(m2•30 days). In addition, for an increase of 1 g/(m2•30 days) of SPM, the smallest estimated odds ratio of perceived annoyance is a factor of 1.5, implying that the probability of occurrence of annoyance is almost 2 times as large as the probability of no occurrence.
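A short sketch of the core statistical step, a logistic fit of a binary annoyance response against deposition rate and the per-unit odds ratio, on synthetic data generated from an assumed relationship (statsmodels is used here as one convenient choice, not the authors' software):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Synthetic survey: particle deposition rate in g/(m2.30 days) and a binary
# annoyance response drawn from an assumed "true" logistic relationship.
deposition = rng.uniform(2, 20, 400)
p_true = 1 / (1 + np.exp(-(-2.3 + 0.15 * deposition)))
annoyed = rng.binomial(1, p_true)

X = sm.add_constant(deposition)
fit = sm.Logit(annoyed, X).fit(disp=0)
b0, b1 = fit.params

# Estimated probability of annoyance at a given deposition level,
# and the odds ratio for a 1 g/(m2.30 days) increase.
def p_annoyed(x):
    return 1 / (1 + np.exp(-(b0 + b1 * x)))

print("P(annoyed | 5 g/(m2.30d)) =", round(p_annoyed(5.0), 3))
print("odds ratio per unit SPM   =", round(np.exp(b1), 3))
```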
An alternative methodology for interpretation and reporting of hand hygiene compliance data.
DiDiodato, Giulio
2012-05-01
Since 2009, all hospitals in Ontario have been mandated to publicly report health care provider compliance with hand hygiene opportunities (http://www.health.gov.on.ca/patient_safety/index.html). Hand hygiene compliance (HHC) is reported for 2 of the 4 moments during the health care provider-patient encounter. This study analyzes the HHC data by using an alternative methodology for interpretation and reporting. Annualized HHC data were available for fiscal years 2009 and 2010 for each of the 5 hospital corporations (6 sites) in the North Simcoe Muskoka Local Health Integration Network. The weighted average for HHC was used to estimate the overall observed rate for HHC for each hospital and reporting period. Using Bayes' probability theorem, this estimate was used to predict the probability that any patient would experience HHC for at least 75% of hand hygiene moments. This probability was categorized as excellent (≥75%), above average (50%-74%), below average (25%-49%), or poor (<25%). The results were reported using a balanced scorecard display. The overall observed rates for HHC ranged from 50% to 87% (mean, 75% ± 11%, P = .079). Using the alternative methodology for reporting, 6 of the 12 reporting periods would be categorized as excellent, 1 as above average, 2 as below average, and 3 as poor. Population-level HHC data can be converted to patient-level risk information. Reporting this information to the public may increase the value and understandability of this patient safety indicator. Copyright © 2012 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
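A minimal sketch of one way the conversion from an observed compliance rate to a patient-level probability could look, treating the 4 moments of a patient encounter as independent Bernoulli trials at the observed rate; this binomial reading and the helper names are assumptions for illustration, not the paper's exact Bayes-theorem calculation:

```python
from math import comb, ceil

def prob_patient_hhc(compliance, n_moments=4, threshold=0.75):
    """Probability that a patient experiences hand hygiene compliance on at
    least `threshold` of `n_moments` opportunities, assuming each opportunity
    is an independent Bernoulli trial at the observed compliance rate."""
    k_min = ceil(threshold * n_moments)
    return sum(comb(n_moments, k) * compliance**k * (1 - compliance) ** (n_moments - k)
               for k in range(k_min, n_moments + 1))

def categorize(p):
    # Categories taken from the abstract's balanced-scorecard cut points.
    if p >= 0.75: return "excellent"
    if p >= 0.50: return "above average"
    if p >= 0.25: return "below average"
    return "poor"

for hhc in (0.50, 0.75, 0.87):
    p = prob_patient_hhc(hhc)
    print(f"observed HHC {hhc:.0%}: P(>=75% of moments) = {p:.2f} -> {categorize(p)}")
```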
Data-driven probability concentration and sampling on manifold
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soize, C., E-mail: christian.soize@univ-paris-est.fr; Ghanem, R., E-mail: ghanem@usc.edu
2016-09-15
A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) a MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
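A heavily simplified sketch of the density-estimation and resampling ingredients only (steps (i) and (ii)); the circle-shaped synthetic dataset is an assumption, and the diffusion-maps reduction and manifold-constrained MCMC of the full method are omitted:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)

# Synthetic dataset concentrated near a curved (circle-like) subset of R^2.
theta = rng.uniform(0, 2 * np.pi, 300)
data = np.vstack([np.cos(theta), np.sin(theta)]) + 0.05 * rng.standard_normal((2, 300))

# Multidimensional kernel-density estimate of the unknown distribution,
# then draw new realizations statistically consistent with the dataset.
kde = gaussian_kde(data)          # expects shape (dim, n_samples)
new_samples = kde.resample(1000)

print(new_samples.shape)          # (2, 1000)
```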
2012-09-01
A logistic regression model was used to predict the probability of eligibility for the survey (known eligibility vs. unknown eligibility). A second logistic regression model was used to predict the probability of response among eligible sample members (complete response vs. non-response).
A framework for assessing the adequacy and effectiveness of software development methodologies
NASA Technical Reports Server (NTRS)
Arthur, James D.; Nance, Richard E.
1990-01-01
Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.
Meffre, R; Gehin, C; Schmitt, P M; De Oliveira, F; Dittmar, A
2006-01-01
Pressure ulcers constitute an important health problem. They affect many people with mobility disorders, and they are difficult to detect and prevent because the damage begins in the muscle. This paper proposes a new approach to studying pressure ulcers. We aim at developing a methodology to analyse the probability that a patient will develop a pressure ulcer and to detect risky situations. The idea is to relate the mobility disorder to autonomic nervous system (ANS) dysfunction. More precisely, the evaluation of the consequences of discomfort on the ANS (stress induced by discomfort) can be relevant for the early detection of pressure ulcers. Mobility is evaluated through movement measurement. This evaluation, at the interface between soft living tissues and any support, has to consider the specificity of the human environment. Soft living tissues have non-linear mechanical properties, making conventional rigid sensors unsuitable for interface parameter measurement. A new actimeter system has been designed in order to study movements of the human body, whatever its support, while seated. The device is based on elementary active cells, and the number of pressure cells can be easily adapted to the application. The spatial resolution is about 4 cm2. In this paper, we compare activity measurements of a seated subject with his autonomic nervous system activity, recorded by the E.motion device, which was developed to record six parameters: skin potential, skin resistance, skin temperature, skin blood rate, instantaneous cardiac frequency and instantaneous respiratory frequency. The design, instrumentation, and first results are presented.
The 12th International Conference on Computer Safety, Reliability and Security
1993-10-29
... The adequacy of the proposed methodology is shown through the design and the validation of a simple control system: a train set example satisfying the safety condition. In this paper a methodology is presented which can be used for the design of safety-critical systems. One example considered has a Burner but no Detector (or the Detector is permanently non-active); the probability matrices are ...
SURVIAC Bulletin: RPG Encounter Modeling, Vol 27, Issue 1, 2012
2012-01-01
... return a probability of hit (PHIT) for the scenario. In the model, PHIT depends on the presented area of the targeted system and a set of errors ... The model makes simplifying assumptions, is data-driven, and uses simple yet proven methodologies to determine PHIT. The inputs to THREAT describe the target, the RPG, and ... The determination of PHIT by THREAT is performed using one of two possible methodologies.
Solar energy program evaluation: an introduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
deLeon, P.
The Program Evaluation Methodology provides an overview of the practice and methodology of program evaluation and defines more precisely the evaluation techniques and methodologies that would be most appropriate to government organizations which are actively involved in the research, development, and commercialization of solar energy systems. Formal evaluation cannot be treated as a single methodological approach for assessing a program. There are four basic types of evaluation designs - the pre-experimental design; the quasi-experimental design based on time series; the quasi-experimental design based on comparison groups; and the true experimental design. This report is organized to first introduce the role and issues of evaluation. This is to provide a set of issues to organize the subsequent sections detailing the national solar energy programs. Then, these two themes are integrated by examining the evaluation strategies and methodologies tailored to fit the particular needs of the various individual solar energy programs. (MCW)
Fracture mechanics analysis of cracked structures using weight function and neural network method
NASA Astrophysics Data System (ADS)
Chen, J. G.; Zang, F. G.; Yang, Y.; Shi, K. K.; Fu, X. L.
2018-06-01
Stress intensity factors (SIFs) due to thermal-mechanical loads have been established using the weight function method. Two reference stress states were used to determine the coefficients in the weight function. Results were evaluated using data from the literature and show good agreement. The SIFs can therefore be determined quickly using the obtained weight function when cracks are subjected to arbitrary loads, and the presented method can be used for probabilistic fracture mechanics analysis. A probabilistic methodology combining Monte Carlo simulation with a neural network (MCNN) has been developed. The results indicate that an accurate probabilistic characterization of KI can be obtained using the developed method. The probability of failure increases with increasing load, and the relationship between the two is nonlinear.
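A hedged sketch of the two ingredients described above: a weight-function evaluation of K_I (here the classical center-crack form as a stand-in for the paper's thermal-mechanical reference states) and a probability-of-failure estimate (direct Monte Carlo in place of the neural-network surrogate); all material and load numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def k1_weight_function(sigma, a, n_nodes=400):
    """Mode-I SIF for a center crack of half-length a under a symmetric
    crack-face stress sigma(x), via the classical weight-function integral
        K_I = 2*sqrt(a/pi) * integral_0^{pi/2} sigma(a*sin(t)) dt
    (the substitution x = a*sin(t) removes the integrable singularity)."""
    t = (np.arange(n_nodes) + 0.5) * (np.pi / 2) / n_nodes   # midpoint rule
    return 2.0 * np.sqrt(a / np.pi) * np.mean(sigma(a * np.sin(t))) * (np.pi / 2)

a = 0.02                                           # crack half-length, m
# Sanity check against the uniform-stress closed form K_I = sigma*sqrt(pi*a).
print(k1_weight_function(lambda x: 100e6, a) / (100e6 * np.sqrt(np.pi * a)))

# Probabilistic step: fix a linearly decreasing stress shape, randomize the
# amplitude and the toughness, and count cases with K_I > K_IC.
k1_unit = k1_weight_function(lambda x: 1.0 - 0.3 * x / a, a)   # K_I per unit amplitude
amplitude = rng.normal(150e6, 20e6, 100_000)                   # Pa
k_ic = rng.normal(40e6, 4e6, 100_000)                          # Pa*sqrt(m)
print("P_f =", np.mean(amplitude * k1_unit > k_ic))
```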
Petroleum refinery operational planning using robust optimization
NASA Astrophysics Data System (ADS)
Leiras, A.; Hamacher, S.; Elkamel, A.
2010-12-01
In this article, the robust optimization methodology is applied to deal with uncertainties in the prices of saleable products, operating costs, product demand, and product yield in the context of refinery operational planning. A numerical study demonstrates the effectiveness of the proposed robust approach. The benefits of incorporating uncertainty in the different model parameters were evaluated in terms of the cost of ignoring uncertainty in the problem. The calculations suggest that this benefit is equivalent to 7.47% of the deterministic solution value, which indicates that the robust model may offer advantages to those involved with refinery operational planning. In addition, the probability bounds of constraint violation are calculated to help the decision-maker adopt a more appropriate parameter to control robustness and judge the tradeoff between conservatism and total profit.
On the use of attractor dimension as a feature in structural health monitoring
Nichols, J.M.; Virgin, L.N.; Todd, M.D.; Nichols, J.D.
2003-01-01
Recent works in the vibration-based structural health monitoring community have emphasised the use of correlation dimension as a discriminating statistic in separating a damaged from an undamaged response. This paper explores the utility of attractor dimension as a 'feature' and offers some comparisons between different metrics reflecting dimension. The focus is on evaluating the performance of two different measures of dimension as damage indicators in a structural health monitoring context. Results indicate that the correlation dimension is probably a poor choice of statistic for the purpose of signal discrimination. Other measures of dimension may be used for the same purposes with a higher degree of statistical reliability. The question of competing methodologies is placed in a hypothesis testing framework and answered with experimental data taken from a cantilevered beam.
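For reference, a compact Grassberger-Procaccia style estimate of correlation dimension; the noisy-circle data and radius range below are illustrative, not the beam experiment:

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(points, r_values):
    """Slope of log C(r) versus log r, where C(r) is the fraction of
    point pairs closer than r (the correlation sum)."""
    d = pdist(points)                                   # all pairwise distances
    c = np.array([np.mean(d < r) for r in r_values])    # correlation sum C(r)
    mask = c > 0
    slope, _ = np.polyfit(np.log(r_values[mask]), np.log(c[mask]), 1)
    return slope

# Synthetic "attractor": noisy points on a circle, true dimension close to 1.
rng = np.random.default_rng(6)
theta = rng.uniform(0, 2 * np.pi, 2000)
pts = np.column_stack([np.cos(theta), np.sin(theta)]) + 0.01 * rng.standard_normal((2000, 2))

r_vals = np.logspace(-1.3, -0.3, 10)
print("estimated correlation dimension:", round(correlation_dimension(pts, r_vals), 2))
```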
A hierarchical approach to reliability modeling of fault-tolerant systems. M.S. Thesis
NASA Technical Reports Server (NTRS)
Gossman, W. E.
1986-01-01
A methodology for performing fault-tolerant system reliability analysis is presented. The method decomposes a system into its subsystems, evaluates event rates derived from each subsystem's conditional state probability vector, and incorporates those results into a hierarchical Markov model of the system. This is done in a manner that addresses the failure sequence dependence associated with the system's redundancy management strategy. The method is derived for application to a specific system definition. Results are presented that compare the hierarchical model's unreliability prediction to that of a more complicated standard Markov model of the system. The results for the example given indicate that the hierarchical method predicts system unreliability to a desirable level of accuracy while achieving significant computational savings relative to a component-level Markov model of the system.
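A minimal sketch of the kind of Markov reliability calculation involved, for a generic duplex subsystem with a coverage parameter; the rates and state structure are illustrative assumptions, not the thesis's hierarchical decomposition:

```python
import numpy as np
from scipy.linalg import expm

# Continuous-time Markov model of a duplex subsystem:
# state 0: both units up, state 1: one unit up, state 2: system failed (absorbing).
lam, mu, c = 1e-4, 1e-2, 0.99        # failure rate, repair rate, coverage (illustrative)
Q = np.array([
    [-2 * lam,       2 * lam * c,  2 * lam * (1 - c)],
    [      mu,       -(mu + lam),                lam],
    [     0.0,               0.0,                0.0],
])

p0 = np.array([1.0, 0.0, 0.0])       # start with both units operational
t = 10_000.0                          # mission time, hours
p_t = p0 @ expm(Q * t)                # state probabilities at time t

print("unreliability P(failed by t) =", p_t[2])
```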
NASA Technical Reports Server (NTRS)
Matsui, Toshihisa; Zeng, Xiping; Tao, Wei-Kuo; Masunaga, Hirohiko; Olson, William S.; Lang, Stephen
2008-01-01
This paper proposes a methodology known as the Tropical Rainfall Measuring Mission (TRMM) Triple-Sensor Three-step Evaluation Framework (T3EF) for the systematic evaluation of precipitating cloud types and microphysics in a cloud-resolving model (CRM). T3EF utilizes multi-frequency satellite simulators and novel statistics of multi-frequency radiance and backscattering signals observed from the TRMM satellite. Specifically, T3EF compares CRM and satellite observations in the form of combined probability distributions of precipitation radar (PR) reflectivity, polarization-corrected microwave brightness temperature (Tb), and infrared Tb to evaluate the candidate CRM. T3EF is used to evaluate the Goddard Cumulus Ensemble (GCE) model for cases involving the South China Sea Monsoon Experiment (SCSMEX) and Kwajalein Experiment (KWAJEX). This evaluation reveals that the GCE properly captures the satellite-measured frequencies of different precipitating cloud types in the SCSMEX case but underestimates the frequencies of deep convective and deep stratiform types in the KWAJEX case. Moreover, the GCE tends to simulate excessively large and abundant frozen condensates in deep convective clouds as inferred from the overestimated GCE-simulated radar reflectivities and microwave Tb depressions. Unveiling the detailed errors in the GCE's performance provides the best direction for model improvements.
Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno
2016-01-01
Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision. PMID:27303323
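For concreteness, one standard inverted-S probability weighting function evaluated at a few hypothetical individual parameter values; this is only one of the parameterizations such studies compare, not the hierarchical Bayesian estimates reported here:

```python
import numpy as np

def weight(p, gamma):
    """Inverted-S probability weighting (Tversky & Kahneman 1992 form):
    overweights small probabilities and underweights large ones for gamma < 1."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

p = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
for gamma in (0.4, 0.7, 1.0):          # illustrative individual differences
    print(f"gamma={gamma}:", np.round(weight(p, gamma), 3))
```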
Heintz, Emelie; Gerber-Grote, Andreas; Ghabri, Salah; Hamers, Francoise F; Rupel, Valentina Prevolnik; Slabe-Erker, Renata; Davidson, Thomas
2016-01-01
The objectives of this study were to review current methodological guidelines for economic evaluations of all types of technologies in the 33 countries with organizations involved in the European Network for Health Technology Assessment (EUnetHTA), and to provide a general framework for economic evaluation at a European level. Methodological guidelines for health economic evaluations used by EUnetHTA partners were collected through a survey. Information from each guideline was extracted using a pre-tested extraction template. On the basis of the extracted information, a summary describing the methods used by the EUnetHTA countries was written for each methodological item. General recommendations were formulated for methodological issues where the guidelines of the EUnetHTA partners were in agreement or where the usefulness of economic evaluations may be increased by presenting the results in a specific way. At least one contact person from all 33 EUnetHTA countries (100 %) responded to the survey. In total, the review included 51 guidelines, representing 25 countries (eight countries had no methodological guideline for health economic evaluations). On the basis of the results of the extracted information from all 51 guidelines, EUnetHTA issued ten main recommendations for health economic evaluations. The presented review of methodological guidelines for health economic evaluations and the consequent recommendations will hopefully improve the comparability, transferability and overall usefulness of economic evaluations performed within EUnetHTA. Nevertheless, there are still methodological issues that need to be investigated further.
Asynchronous threat awareness by observer trials using crowd simulation
NASA Astrophysics Data System (ADS)
Dunau, Patrick; Huber, Samuel; Stein, Karin U.; Wellig, Peter
2016-10-01
The last few years have shown that there is a high risk of asynchronous threats in everyday life. Especially in large crowds, a high probability of asynchronous attacks is evident. High observational abilities to detect threats are desirable; consequently, highly trained security and observation personnel are needed. This paper evaluates the effectiveness of a training methodology to enhance the performance of observation personnel engaging in a specific target identification task. For this purpose a crowd simulation video is utilized. The study first provides a measurement of the base performance before the training sessions. A training procedure is then performed, and base performance is compared to the after-training performance in order to look for a training effect. A thorough evaluation of both the training sessions and the overall performance is given in this paper. A specific hypothesis-based metric is used. Results are discussed in order to provide guidelines for the design of training for observational tasks.
Wu, Cai; Li, Liang
2018-05-15
This paper focuses on quantifying and estimating the predictive accuracy of prognostic models for time-to-event outcomes with competing events. We consider the time-dependent discrimination and calibration metrics, including the receiver operating characteristics curve and the Brier score, in the context of competing risks. To address censoring, we propose a unified nonparametric estimation framework for both discrimination and calibration measures, by weighting the censored subjects with the conditional probability of the event of interest given the observed data. The proposed method can be extended to time-dependent predictive accuracy metrics constructed from a general class of loss functions. We apply the methodology to a data set from the African American Study of Kidney Disease and Hypertension to evaluate the predictive accuracy of a prognostic risk score in predicting end-stage renal disease, accounting for the competing risk of pre-end-stage renal disease death, and evaluate its numerical performance in extensive simulation studies. Copyright © 2018 John Wiley & Sons, Ltd.
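A simplified numpy sketch of a censoring-weighted (IPCW) Brier score at a fixed horizon under competing risks; the helper names, the synthetic data, and the handling of ties and of the left limit of the censoring survival are simplifying assumptions, not the authors' full estimation framework:

```python
import numpy as np

def km_censoring_survival(times, status):
    """Kaplan-Meier estimate of G(t) = P(censoring time > t); censoring is status == 0."""
    order = np.argsort(times)
    t_s = times[order]
    d_s = (status[order] == 0).astype(float)
    at_risk = len(times) - np.arange(len(times))
    surv = np.cumprod(1.0 - d_s / at_risk)

    def G(t):
        idx = np.searchsorted(t_s, t, side="right") - 1
        return np.where(idx < 0, 1.0, surv[np.clip(idx, 0, None)])
    return G

def ipcw_brier_competing(times, status, pred_cif, t0):
    """IPCW Brier score at horizon t0 for the event of interest (status == 1),
    with status == 2 a competing event and status == 0 censoring.
    pred_cif is each subject's predicted cumulative incidence of cause 1 by t0."""
    G = km_censoring_survival(times, status)
    y = ((times <= t0) & (status == 1)).astype(float)    # observed cause-1 event by t0
    w = np.zeros(len(times))
    had_any_event = (times <= t0) & (status != 0)
    w[had_any_event] = 1.0 / G(times[had_any_event])      # weight by G at the observed time
    still_at_risk = times > t0
    w[still_at_risk] = 1.0 / G(np.array([t0]))            # weight by G at the horizon
    return np.sum(w * (y - pred_cif) ** 2) / len(times)   # censored-before-t0 get weight 0

# Tiny synthetic check with an uninformative constant prediction.
rng = np.random.default_rng(7)
n = 500
t_event, t_comp, t_cens = rng.exponential(10, n), rng.exponential(25, n), rng.exponential(20, n)
times = np.minimum.reduce([t_event, t_comp, t_cens])
status = np.select([t_event == times, t_comp == times], [1, 2], default=0)
print(ipcw_brier_competing(times, status, pred_cif=np.full(n, 0.4), t0=5.0))
```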
Spatial planning using probabilistic flood maps
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano
2015-04-01
Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise Prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made when the probabilities of occurrence of the options are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. Thus the methodology may be appropriately considered when making decisions based on uncertain information.
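A small sketch of a prospect-theory evaluation of two hypothetical flood-plain decisions; the outcomes, the 10% flood probability, and the Tversky-Kahneman parameter values are illustrative assumptions, not the case-study numbers:

```python
import numpy as np

# Prospect-theory style evaluation of flood-management options.
# Parameter values follow Tversky & Kahneman (1992); outcomes are illustrative.
ALPHA, LAMBDA, GAMMA = 0.88, 2.25, 0.69

def value(x):
    """S-shaped value function: concave for gains, convex and steeper for losses."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, 1.0, -LAMBDA) * np.abs(x) ** ALPHA

def weight(p):
    """Inverted-S probability weighting."""
    p = np.asarray(p, dtype=float)
    return p**GAMMA / (p**GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

def prospect(outcomes, probs):
    return float(np.sum(weight(probs) * value(outcomes)))

# Option A: build on the floodplain; large gain if no flood, large loss if flooded.
# Option B: build elsewhere; smaller but certain gain. The 10% flood probability
# stands in for a value read off a probabilistic flood map.
print("A:", prospect([500, -2000], [0.90, 0.10]))
print("B:", prospect([300,     0], [1.00, 0.00]))
```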
Dirler, Julia; Winkler, Gertrud; Lachenmeier, Dirk W
2018-06-01
The International Agency for Research on Cancer (IARC) evaluates "very hot (>65 °C) beverages" as probably carcinogenic to humans. However, there is a lack of research regarding what temperatures consumers actually perceive as "very hot" or as "too hot". A method for sensory analysis of such threshold temperatures was developed. The participants were asked to mix a very hot coffee step by step into a cooler coffee, so that the coffee to be tasted incrementally increased in temperature during the test. The participants took a sip at every addition until they perceived the beverage as too hot for consumption. The protocol was evaluated in a pilot study with 87 participants. Interestingly, the average pain threshold of the test group (67 °C) and the preferred drinking temperature (63 °C) bracketed the IARC threshold for carcinogenicity. The developed methodology was found to be fit for purpose and may be applied in larger studies.
Vibroacoustic test plan evaluation: Parameter variation study
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloef, H. R.
1976-01-01
Statistical decision models are shown to provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology developed provides a major step toward the development of a realistic tool to quantitatively tailor test programs to specific payloads. Testing is considered at the no-test, component, subassembly, or system level of assembly. Component redundancy and partial loss of flight data are considered. Most and probabilistic costs are considered, and incipient failures resulting from ground tests are treated. Optima defining both component and assembly test levels are indicated for the modified test plans considered. Modeling simplifications must be considered in interpreting the results relative to a particular payload. New parameters introduced were a no-test option, flight-by-flight failure probabilities, and a cost to design components for higher vibration requirements. Parameters varied were the shuttle payload bay internal acoustic environment, the STS launch cost, the component retest/repair cost, and the amount of redundancy in the housekeeping section of the payload reliability model.
van der Meer, Esther W C; van Dongen, Johanna M; Boot, Cécile R L; van der Gulden, Joost W J; Bosmans, Judith E; Anema, Johannes R
2016-05-01
The aim of this study was to evaluate the cost-effectiveness of a multifaceted implementation strategy for the prevention of hand eczema in comparison with a control group among healthcare workers. A total of 48 departments (n=1,649) were randomly allocated to the implementation strategy or the control group. Data on hand eczema and costs were collected at baseline and every 3 months. Cost-effectiveness analyses were performed using linear multilevel analyses. The probability of the implementation strategy being cost-effective gradually increased with an increasing willingness-to-pay, to 0.84 at a ceiling ratio of €590,000 per person with hand eczema prevented (societal perspective). The implementation strategy appeared to be not cost-effective in comparison with the control group (societal perspective), nor was it cost-beneficial to the employer. However, this study had some methodological problems which should be taken into account when interpreting the results.
NASA Technical Reports Server (NTRS)
Tinker, Michael L.; Steincamp, James W.; Stewart, Eric T.; Patton, Bruce W.; Pannell, William P.; Newby, Ronald L.; Coffman, Mark E.; Qualls, A. L.; Bancroft, S.; Molvik, Greg
2003-01-01
The Nuclear Electric Vehicle Optimization Toolset (NEVOT) optimizes the design of all major Nuclear Electric Propulsion (NEP) vehicle subsystems for a defined mission within constraints and optimization parameters chosen by a user. The tool uses a Genetic Algorithm (GA) search technique to combine subsystem designs and evaluate the fitness of the integrated design to fulfill a mission. The fitness of an individual is used within the GA to determine its probability of survival through successive generations in which the designs with low fitness are eliminated and replaced with combinations or mutations of designs with higher fitness. The program can find optimal solutions for different sets of fitness metrics without modification and can create and evaluate vehicle designs that might never be conceived of through traditional design techniques. It is anticipated that the flexible optimization methodology will expand present knowledge of the design trade-offs inherent in designing nuclear powered space vehicles and lead to improved NEP designs.
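A toy sketch of the fitness-proportional (roulette-wheel) survival step described above; the design tuples and fitness function are placeholders, not NEVOT's subsystem models:

```python
import random

def fitness_proportional_survivors(population, fitness_fn, n_survivors):
    """Roulette-wheel selection: an individual's probability of surviving into
    the next generation is proportional to its fitness, so low-fitness designs
    tend to be eliminated and replaced by combinations/mutations of fitter ones."""
    fitnesses = [fitness_fn(ind) for ind in population]
    return random.choices(population, weights=fitnesses, k=n_survivors)

# Toy "vehicle designs": tuples of (power_level, radiator_area); the fitness
# function here is a hypothetical stand-in for the mission-level evaluation.
random.seed(0)
population = [(random.uniform(10, 100), random.uniform(1, 20)) for _ in range(20)]
fitness = lambda d: d[0] / (1.0 + abs(d[1] - d[0] / 8.0))   # illustrative only
survivors = fitness_proportional_survivors(population, fitness, n_survivors=10)
print(len(survivors), "designs selected for crossover/mutation")
```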
Safety belt and motorcycle helmet use in Virginia : the December 2003 update.
DOT National Transportation Integrated Search
2004-01-01
The Virginia Transportation Research Council has been collecting safety belt use data in Virginia since 1974. Beginning in 1992, the data gathering methodology was changed to a statistically valid probability-based sampling plan in accordance with fe...
Polynomial chaos representation of databases on manifolds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soize, C., E-mail: christian.soize@univ-paris-est.fr; Ghanem, R., E-mail: ghanem@usc.edu
2017-04-15
Characterizing the polynomial chaos expansion (PCE) of a vector-valued random variable with probability distribution concentrated on a manifold is a relevant problem in data-driven settings. The probability distribution of such random vectors is multimodal in general, leading to potentially very slow convergence of the PCE. In this paper, we build on a recent development for estimating and sampling from probabilities concentrated on a diffusion manifold. The proposed methodology constructs a PCE of the random vector together with an associated generator that samples from the target probability distribution which is estimated from data concentrated in the neighborhood of the manifold. The method is robust and remains efficient for high dimension and large datasets. The resulting polynomial chaos construction on manifolds permits the adaptation of many uncertainty quantification and statistical tools to emerging questions motivated by data-driven queries.
Leue, Anja; Cano Rodilla, Carmen; Beauducel, André
2015-01-01
Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated. PMID:26783525
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.; Bandte, Oliver; Schrage, Daniel P.
1996-01-01
This paper outlines an approach for the determination of economically viable robust design solutions using the High Speed Civil Transport (HSCT) as a case study. Furthermore, the paper states the advantages of a probability based aircraft design over the traditional point design approach. It also proposes a new methodology called Robust Design Simulation (RDS) which treats customer satisfaction as the ultimate design objective. RDS is based on a probabilistic approach to aerospace systems design, which views the chosen objective as a distribution function introduced by so called noise or uncertainty variables. Since the designer has no control over these variables, a variability distribution is defined for each one of them. The cumulative effect of all these distributions causes the overall variability of the objective function. For cases where the selected objective function depends heavily on these noise variables, it may be desirable to obtain a design solution that minimizes this dependence. The paper outlines a step by step approach on how to achieve such a solution for the HSCT case study and introduces an evaluation criterion which guarantees the highest customer satisfaction. This customer satisfaction is expressed by the probability of achieving objective function values less than a desired target value.
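A minimal sketch of the probabilistic viewpoint described here: propagate assumed noise-variable distributions through a placeholder objective and compare two candidate design points by P(objective <= target); the functions and numbers are illustrative, not the HSCT study:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200_000

def objective(design_var, fuel_price, demand):
    """Hypothetical economic objective (lower is better) as a function of one
    control variable and two noise variables the designer cannot control."""
    return 5.0 + 0.02 * (design_var - 60) ** 2 + 1.5 * fuel_price + 0.8 / demand

# Assumed variability distributions for the noise variables.
fuel_price = rng.lognormal(mean=0.0, sigma=0.25, size=n)
demand = rng.uniform(0.6, 1.4, size=n)

target = 9.0
for design_var in (55.0, 70.0):                 # two candidate design points
    obj = objective(design_var, fuel_price, demand)
    p_satisfied = np.mean(obj <= target)        # "customer satisfaction" criterion
    print(f"design {design_var}: P(objective <= target) = {p_satisfied:.3f}")
```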
Bentur, J S; Andow, D A; Cohen, M B; Romena, A M; Gould, F
2000-10-01
Using the F2 screen methodology, we estimated the frequency of alleles conferring resistance to the Cry1Ab toxin of Bacillus thuringiensis Berliner in a Philippine population of the stem borer Scirpophaga incertulas (Walker). Evaluation of >450 isofemale lines for survival of F2 larvae on cry1Ab plants did not detect the presence of an allele conferring a high level of resistance. The frequency of such an allele in the sampled population was conservatively estimated to be <3.6 x 10(-3) with 95% confidence and a detection probability of 94%. However, there was evidence of the presence of alleles conferring partial resistance to Cry1Ab. The frequency of alleles for partial resistance was estimated as 4.8 x 10(-3) with a 95% CI between 1.3 x 10(-3) and 1.04 x 10(-2) and a detection probability of 94%. Our results suggest that the frequency of alleles conferring resistance to Cry1Ab in the population of S. incertulas sampled is not too high to preclude successful implementation of the high dose/refuge resistance management strategy.
Buij, R.; McShea, W.J.; Campbell, P.; Lee, M.E.; Dallmeier, F.; Guimondou, S.; Mackaga, L.; Guisseougou, N.; Mboumba, S.; Hines, J.E.; Nichols, J.D.; Alonso, A.
2007-01-01
The importance of human activity and ecological features in influencing African forest elephant ranging behaviour was investigated in the Rabi-Ndogo corridor of the Gamba Complex of Protected Areas in southwest Gabon. Locations in a wide geographical area with a range of environmental variables were selected for patch-occupancy surveys using elephant dung to assess seasonal presence and absence of elephants. Patch-occupancy procedures allowed for covariate modelling evaluating hypotheses for both occupancy in relation to human activity and ecological features, and detection probability in relation to vegetation density. The best fitting models for old and fresh dung data sets indicate that (1) detection probability for elephant dung is negatively related to the relative density of the vegetation, and (2) human activity, such as presence and infrastructure, are more closely associated with elephant distribution patterns than are ecological features, such as the presence of wetlands and preferred fresh fruit. Our findings emphasize the sensitivity of elephants to human disturbance, in this case infrastructure development associated with gas and oil production. Patch-occupancy methodology offers a viable alternative to current transect protocols for monitoring programs with multiple covariates.
POD evaluation using simulation: A phased array UT case on a complex geometry part
NASA Astrophysics Data System (ADS)
Dominguez, Nicolas; Reverdy, Frederic; Jenson, Frederic
2014-02-01
The use of Probability of Detection (POD) for NDT performance demonstration is a key link in product lifecycle management. The POD approach is to apply the given NDT procedure to a series of known flaws in order to estimate the probability of detection as a function of flaw size. A POD is relevant if and only if NDT operations are carried out within the range of variability authorized by the procedure. Such experimental campaigns require the collection of datasets large enough to cover the range of variability with sufficient occurrences to build reliable POD statistics, leading to high costs to obtain POD curves. In the last decade, research activities have been carried out in the USA with the MAPOD group and later in Europe with the SISTAE and PICASSO projects, based on the idea of using models and simulation tools to feed POD estimations. This paper proposes an example of application of POD using simulation on the inspection procedure of a complex (full 3-D) geometry part using phased array ultrasonic testing. It illustrates the methodology and the associated tools developed in the CIVA software. The paper finally provides elements of further progress in the domain.
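A short sketch of a hit/miss POD fit of the kind such studies produce, using a log-size logistic model on synthetic inspection outcomes (which, in a model-assisted study, would come from simulated inspections); the data and the a90 read-off are illustrative:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)

# Synthetic hit/miss inspection data: detection probability rises with flaw size.
size = rng.uniform(0.5, 5.0, 300)                      # flaw size, mm
p_true = 1 / (1 + np.exp(-(size - 2.0) / 0.4))
hit = rng.binomial(1, p_true)

X = sm.add_constant(np.log(size))                      # classic log-size POD model
fit = sm.Logit(hit, X).fit(disp=0)
b0, b1 = fit.params

pod = lambda a: 1 / (1 + np.exp(-(b0 + b1 * np.log(a))))
a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)            # size detected with 90% probability
print("a90 =", round(float(a90), 2), "mm; POD(a90) =", round(float(pod(a90)), 3))
```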
Chatziprodromidou, I P; Apostolou, T
2018-04-01
The aim of the study was to estimate the sensitivity and specificity of enzyme-linked immunosorbent assay (ELISA) and immunoblot (IB) for detecting antibodies to Neospora caninum in dairy cows, in the absence of a gold standard. The study complies with STRADAS-paratuberculosis guidelines for reporting the accuracy of the tests. We first tried to apply Bayesian models that do not require conditional independence of the tests under evaluation, but as convergence problems appeared, we used a Bayesian methodology that does not assume conditional dependence of the tests. Informative prior probability distributions were constructed, based on scientific inputs regarding the sensitivity and specificity of the IB test and the prevalence of the disease in the studied populations. IB sensitivity and specificity were estimated to be 98.8% and 91.3%, respectively, while the respective estimates for ELISA were 60% and 96.7%. A sensitivity analysis, in which modified prior probability distributions concerning IB diagnostic accuracy were applied, showed a limited effect on the posterior assessments. We concluded that ELISA can be used to screen bulk milk and, secondly, that IB can be used whenever needed.
Costa, J B G; Ahola, J K; Weller, Z D; Peel, R K; Whittier, J C; Barcellos, J O J
2016-06-01
The objective of this research was to define and analyze drops in reticulo-rumen temperature (Trr) as an indicator of calving time in Holstein females. Data were collected from 111 primiparous and 150 parous Holstein females between November 2012 and March 2013. Between -15 and -5 d relative to the anticipated calving date, each female received an orally administered temperature-sensing reticulo-rumen bolus that collected temperatures hourly. Daily mean Trr was calculated from d -5 to 0 relative to calving, using either all Trr values (A-Trr) or only Trr values ≥37.7°C not altered by water intake (W-Trr). To identify a Trr drop, 2 methodologies for computing the baseline temperature were used. Generalized linear models (GLM) were used to estimate the probability of calving within the next 12 or 24 h for primiparous, parous, and all females, based on the size of the Trr drop. For all GLM, a large drop in Trr corresponded with a large estimated probability of calving. The predictive power of the GLM was assessed using receiver-operating characteristic (ROC) curves. The ROC curve analyses showed that all models, regardless of the methodology used to calculate the baseline or the category tested (primiparous or parous), were able to predict calving; however, area under the ROC curve values, an indication of prediction quality, were greater for methods predicting calving within 24 h. Further comparisons between the GLM for primiparous and parous females, and between baselines 1 and 2, provide insight into the differences in predictive performance. Based on the GLM, Trr drops of 0.2, 0.3, and 0.4°C were identified as useful indicators of parturition and further analyzed using sensitivity, specificity, and diagnostic odds ratios. Based on sensitivity, specificity, and diagnostic odds ratios, the best indicator of calving was an average Trr drop ≥0.2°C, regardless of the methodology used to compute the baseline or the category of animal evaluated. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Guan, Xuefei; Rasselkorde, El Mahjoub; Abbasi, Waheed; Zhou, S. Kevin
2015-03-01
The study presents a data processing methodology for weld build-up using multiple scan patterns. To achieve an overall high probability of detection for flaws with different orientations, an inspection procedure with three different scan patterns is proposed. The three scan patterns are radial-tangential longitude wave pattern, axial-radial longitude wave pattern, and tangential shear wave pattern. Scientific fusion of the inspection data is implemented using volume reconstruction techniques. The idea is to perform spatial domain forward data mapping for all sampling points. A conservative scheme is employed to handle the case that multiple sampling points are mapped to one grid location. The scheme assigns the maximum value for the grid location to retain the largest equivalent reflector size for the location. The methodology is demonstrated and validated using a realistic ring of weld build-up. Tungsten balls and bars are embedded to the weld build-up during manufacturing process to represent natural flaws. Flat bottomed holes and side drilled holes are installed as artificial flaws. Automatic flaw identification and extraction are demonstrated. Results indicate the inspection procedure with multiple scan patterns can identify all the artificial and natural flaws.
Saikia, Ruprekha; Baruah, Bhargav; Kalita, Dipankar; Pant, Kamal K; Gogoi, Nirmali; Kataki, Rupam
2018-04-01
The objective of the present investigation was to optimize the pyrolysis conditions of an abundantly available and low-cost perennial grass of north-east India, Saccharum ravannae L. (S. ravannae), using response surface methodology based on a central composite design. A kinetic study of the biomass was conducted at four different heating rates of 10, 20, 40 and 60 °C min -1 and the results were interpreted by the Friedman, Kissinger-Akahira-Sunose and Flynn-Wall-Ozawa methods. The average activation energy of 151.45 kJ mol -1 was used for evaluation of the reaction mechanism following the Criado master plot. A maximum bio-oil yield of 38.1 wt% was obtained at a pyrolysis temperature of 550 °C, a heating rate of 20 °C min -1 and a nitrogen flow rate of 226 mL min -1 . Analysis of the bio-oil quality revealed a high hydrocarbon content, antioxidant activity, total phenolic content and metal chelating capacity. These findings open up probable applications of S. ravannae bio-oil in fields including fuel, the food industry and the biomedical domain. Copyright © 2018 Elsevier Ltd. All rights reserved.
Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability
2015-07-01
12th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP12), Vancouver, Canada, July 12-15, 2015. Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability, by Marwan M. Harajli (Graduate Student, Dept. of Civil and Environmental Engineering). ... The criterion is usually the failure probability. In this paper, we examine the buffered failure probability as an attractive alternative to the failure probability.
Roberson, A.M.; Andersen, D.E.; Kennedy, P.L.
2005-01-01
Broadcast surveys using conspecific calls are currently the most effective method for detecting northern goshawks (Accipiter gentilis) during the breeding season. These surveys typically use alarm calls during the nestling phase and juvenile food-begging calls during the fledgling-dependency phase. Because goshawks are most vocal during the courtship phase, we hypothesized that this phase would be an effective time to detect goshawks. Our objective was to improve current survey methodology by evaluating the probability of detecting goshawks at active nests in northern Minnesota in 3 breeding phases and at 4 broadcast distances and to determine the effective area surveyed per broadcast station. Unlike previous studies, we broadcast calls at only 1 distance per trial. This approach better quantifies (1) the relationship between distance and probability of detection, and (2) the effective area surveyed (EAS) per broadcast station. We conducted 99 broadcast trials at 14 active breeding areas. When pooled over all distances, detection rates were highest during the courtship (70%) and fledgling-dependency phases (68%). Detection rates were lowest during the nestling phase (28%), when there appeared to be higher variation in likelihood of detecting individuals. EAS per broadcast station was 39.8 ha during courtship and 24.8 ha during fledgling-dependency. Consequently, in northern Minnesota, broadcast stations may be spaced 712 m and 562 m apart when conducting systematic surveys during courtship and fledgling-dependency, respectively. We could not calculate EAS for the nestling phase because probability of detection was not a simple function of distance from nest. Calculation of EAS could be applied to other areas where the probability of detection is a known function of distance.
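A small sketch of how an effective area surveyed can be obtained from a distance-dependent detection function, EAS = 2*pi * integral of p(r)*r dr; the logistic p(r) and its parameters are hypothetical, not the fitted values from this study:

```python
import numpy as np

def effective_area_surveyed(p_of_r, r_max, n=4000):
    """EAS = 2*pi * integral_0^r_max p(r)*r dr, where p(r) is the probability
    of detecting an occupied nest at distance r from the broadcast station."""
    r = np.linspace(0.0, r_max, n)
    f = p_of_r(r) * r
    return 2.0 * np.pi * np.sum((f[:-1] + f[1:]) / 2.0) * (r[1] - r[0])  # trapezoid rule

# Hypothetical logistic detection function: ~0.9 near the nest, falling to
# about half that around 350 m (illustrative, not the curve fitted in this study).
p_courtship = lambda r: 0.9 / (1.0 + np.exp((r - 350.0) / 80.0))

eas_m2 = effective_area_surveyed(p_courtship, r_max=800.0)
print("EAS =", round(eas_m2 / 1e4, 1), "ha")   # m^2 to hectares
```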
2014-01-01
Background Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore, assessment of intervention effects in randomised clinical trials deserves more rigour in order to become more valid. Methods Several methodologies for assessing the statistical and clinical significance of intervention effects in randomised clinical trials were considered. Balancing simplicity and comprehensiveness, a simple five-step procedure was developed. Results For a more valid assessment of results from a randomised clinical trial we propose the following five-steps: (1) report the confidence intervals and the exact P-values; (2) report Bayes factor for the primary outcome, being the ratio of the probability that a given trial result is compatible with a ‘null’ effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance threshold if the trial is stopped early or if interim analyses have been conducted; (4) adjust the confidence intervals and the P-values for multiplicity due to number of outcome comparisons; and (5) assess clinical significance of the trial results. Conclusions If the proposed five-step procedure is followed, this may increase the validity of assessments of intervention effects in randomised clinical trials. PMID:24588900
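A minimal sketch of one common normal-approximation form of the Bayes factor described in step (2), comparing a point null against the effect assumed in the sample size calculation; the formula choice and the trial numbers are assumptions for illustration, not necessarily the authors' exact recommendation:

```python
from math import exp

def bayes_factor_normal(effect_est, se, effect_h1):
    """Likelihood of the observed effect under H0 (effect = 0) divided by its
    likelihood under H1 (effect = the value assumed in the sample size
    calculation), using a normal approximation for the effect estimate."""
    z0 = effect_est / se                    # standardized distance from the null
    z1 = (effect_est - effect_h1) / se      # standardized distance from H1
    return exp(-0.5 * z0**2) / exp(-0.5 * z1**2)

# Illustrative trial: observed risk difference 0.04 (SE 0.02), while the sample
# size calculation assumed a risk difference of 0.10.
bf = bayes_factor_normal(0.04, 0.02, 0.10)
print(f"Bayes factor (H0 vs H1) = {bf:.2f}")   # > 1 favors H0, < 1 favors H1
```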
NASA Astrophysics Data System (ADS)
Kim, Dong Hyeok; Lee, Ouk Sub; Kim, Hong Min; Choi, Hye Bin
2008-11-01
A modified Split Hopkinson Pressure Bar (SHPB) technique with aluminum pressure bars and a pulse shaper was used to achieve a closer impedance match between the pressure bars and specimen materials such as hot-temperature-degraded POM (polyoxymethylene) and PP (polypropylene). More distinguishable experimental signals were thereby obtained to evaluate more accurately the dynamic deformation behavior of the materials under high strain rate loading conditions. A pulse shaping technique is introduced to reduce the non-equilibrium in the dynamic material response by modulating the incident wave during the short test period; this increases the rise time of the incident pulse in the SHPB experiment. For the dynamic stress-strain curve obtained from the SHPB experiment, the Johnson-Cook model is applied as a constitutive equation. The applicability of this constitutive equation is verified using a probabilistic reliability estimation method. Two reliability methodologies, the FORM and the SORM, are applied. The limit state function (LSF) includes the Johnson-Cook model and the applied stresses; the LSF in this study allows more statistical flexibility on the yield stress than a previously published formulation. It is found that the failure probability estimated using the SORM is more reliable than that of the FORM. It is also noted that the failure probability increases with increasing applied stress. Moreover, according to the sensitivity analysis, the Johnson-Cook parameters A and n and the applied stress are found to affect the failure probability more severely than the other random variables.
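A direct Monte Carlo sketch of a limit state built on a Johnson-Cook flow stress (temperature term omitted), with A, n and the applied stress treated as random; the parameter values are illustrative, not the degraded POM/PP fits, and the FORM/SORM approximations are replaced here by plain sampling:

```python
import numpy as np

rng = np.random.default_rng(10)
n_mc = 200_000

# Johnson-Cook flow stress (temperature term omitted for this sketch):
#   sigma_JC = (A + B*eps^n) * (1 + C*ln(eps_dot/eps_dot0))
# Values below are purely illustrative placeholders.
A = rng.normal(60.0, 5.0, n_mc)        # MPa
B, C = 25.0, 0.08
n_exp = rng.normal(0.35, 0.03, n_mc)
eps, eps_rate, eps_rate0 = 0.10, 1.0e3, 1.0e-3

sigma_jc = (A + B * eps**n_exp) * (1.0 + C * np.log(eps_rate / eps_rate0))
sigma_applied = rng.normal(120.0, 12.0, n_mc)   # MPa

g = sigma_jc - sigma_applied                    # limit state: failure when g < 0
print("estimated failure probability:", np.mean(g < 0))
```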
NASA Astrophysics Data System (ADS)
Baklanov, A.; Mahura, A.; Sørensen, J. H.
2003-06-01
There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second during the operation stage. The suggested methodology is applied to an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed identification of the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, forecasts of possible consequences during phases of the high and medium potential risk levels, based on a unit hypothetical release (e.g. 1 Bq), are performed. The analysis showed that possible deposition fractions of 10^-11 (Bq/m2) over the Kola Peninsula, and 10^-12 to 10^-13 (Bq/m2) for the remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.
Prediction of road accidents: A Bayesian hierarchical approach.
Deublein, Markus; Schubert, Matthias; Adey, Bryan T; Köhler, Jochen; Faber, Michael H
2013-03-01
In this paper a novel methodology for the prediction of the occurrence of road accidents is presented. The methodology utilizes a combination of three statistical methods: (1) gamma-updating of the occurrence rates of injury accidents and injured road users, (2) hierarchical multivariate Poisson-lognormal regression analysis taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data e.g. over-dispersion, and (3) Bayesian inference algorithms, which are applied by means of data mining techniques supported by Bayesian Probabilistic Networks in order to represent non-linearity between risk indicating and model response variables, as well as different types of uncertainties which might be present in the development of the specific models. Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis of the observed frequencies of the model response variables, e.g. the occurrence of an accident, and observed values of the risk indicating variables, e.g. degree of road curvature. Subsequently, parameter learning is done using updating algorithms, to determine the posterior predictive probability distributions of the model response variables, conditional on the values of the risk indicating variables. The methodology is illustrated through a case study using data of the Austrian rural motorway network. In the case study, on randomly selected road segments the methodology is used to produce a model to predict the expected number of accidents in which an injury has occurred and the expected number of light, severe and fatally injured road users. Additionally, the methodology is used for geo-referenced identification of road sections with increased occurrence probabilities of injury accident events on a road link between two Austrian cities. It is shown that the proposed methodology can be used to develop models to estimate the occurrence of road accidents for any road network provided that the required data are available. Copyright © 2012 Elsevier Ltd. All rights reserved.
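Step (1) of the methodology above, the gamma-updating of occurrence rates, has a simple conjugate form that the sketch below illustrates. The prior parameters, counts, and exposure values are hypothetical placeholders; the full approach additionally involves multivariate Poisson-lognormal regression and Bayesian network learning, which are not reproduced here.

```python
import numpy as np

# Conjugate gamma-Poisson updating of an injury-accident occurrence rate.
# Prior: lambda ~ Gamma(shape=a0, rate=b0), in accidents per km-year (hypothetical values).
a0, b0 = 2.0, 4.0

# Observed data for one road segment: yearly accident counts and exposure (km-years).
counts = np.array([3, 1, 4, 2])
exposure = np.array([1.2, 1.0, 1.3, 1.1])

# Posterior: Gamma(a0 + sum(counts), b0 + sum(exposure))
a_post = a0 + counts.sum()
b_post = b0 + exposure.sum()

print("posterior mean rate:", a_post / b_post)
print("95% credible interval:",
      np.round(np.quantile(np.random.default_rng(1).gamma(a_post, 1.0 / b_post, 100_000),
                           [0.025, 0.975]), 3))
```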
A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels
NASA Astrophysics Data System (ADS)
Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian
2016-08-01
Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and project cost increases, this research presents a methodology for dynamic risk assessment and management of rockbursts in drill-and-blast (D&B) tunnels. The basic idea of dynamic risk assessment and management of rockbursts is set out, and methods associated with each step in the rockburst risk assessment and management process are given. The main parts include a microseismic method for early warning of the occurrence probability of rockburst, an estimation method for assessing the potential consequences of rockburst risk, an evaluation method that uses a new quantitative index considering both occurrence probability and consequences to determine the level of rockburst risk, and a dynamic updating procedure. The research briefly describes the referenced microseismic method of rockburst warning, but focuses on the analysis of consequences and the associated risk assessment and management. Using the proposed method, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified using cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (with a total length of 11.6 km of D&B tunnels).
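The quantitative index described above combines an occurrence probability with a consequence measure. The snippet below is only a generic probability-times-consequence scoring with made-up thresholds, included to make the idea concrete; it is not the index or the classification bands proposed by the authors.

```python
def rockburst_risk_level(p_occurrence, consequence_score):
    """Generic probability x consequence risk index (illustrative thresholds only,
    not the index defined in the paper). consequence_score is scaled to [0, 1]."""
    risk = p_occurrence * consequence_score
    if risk < 0.05:
        return risk, "low"
    elif risk < 0.15:
        return risk, "moderate"
    elif risk < 0.35:
        return risk, "high"
    return risk, "extremely high"

# e.g. warning probability 0.6 and a moderate consequence score of 0.4
print(rockburst_risk_level(0.6, 0.4))
```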
Sulis, William H
2017-10-01
Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of a mean and/or variance and of conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.
The Probability Evaluation Game: An Instrument to Highlight the Skill of Reflexive Listening
ERIC Educational Resources Information Center
Butler, Clare
2016-01-01
This paper describes the development of the Probability Evaluation Game (PEG): an innovative teaching instrument that emphasises the sophistication of listening and highlights listening as a key skill for accounting practitioners. Whilst in a roundtable format, playing PEG involves participants individually evaluating a series of probability terms…
Analyzing time-ordered event data with missed observations.
Dokter, Adriaan M; van Loon, E Emiel; Fokkema, Wimke; Lameris, Thomas K; Nolet, Bart A; van der Jeugd, Henk P
2017-09-01
A common problem with observational datasets is that not all events of interest may be detected. For example, observing animals in the wild can be difficult when animals move, hide, or cannot be closely approached. We consider time series of events recorded in conditions where events are occasionally missed by observers or observational devices. These time series are not restricted to behavioral protocols, but can be any cyclic or recurring process where discrete outcomes are observed. Undetected events cause biased inferences on the process of interest, and statistical analyses are needed that can identify and correct the compromised detection processes. Missed observations in time series lead to observed time intervals between events at multiples of the true inter-event time, which conveys information on their detection probability. We derive the theoretical probability density function for observed intervals between events that includes a probability of missed detection. Methodology and software tools are provided for analysis of event data with potential observation bias and its removal. The methodology was applied to simulation data and a case study of defecation rate estimation in geese, which is commonly used to estimate their digestive throughput and energetic uptake, or to calculate goose usage of a feeding site from dropping density. Simulations indicate that at a moderate chance of missing arrival events (p = 0.3), uncorrected arrival intervals were biased upward by up to a factor of 3, while parameter values corrected for missed observations were within 1% of their true simulated value. A field case study shows that not accounting for missed observations leads to substantial underestimates of the true defecation rate in geese, and to spurious rate differences between sites introduced by differences in observational conditions. These results show that the derived methodology can be used to effectively remove observational biases in time-ordered event data.
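A minimal simulation of the mechanism described above (observed intervals falling at multiples of the true inter-event time, with geometric weights set by the miss probability) is sketched below. It assumes perfectly regular events and an independent miss probability per event, which is a simplification of the general density derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
tau, p_miss, n_events = 4.0, 0.3, 100_000

# Regularly recurring events; each event detected independently with prob 1 - p_miss.
times = np.arange(n_events) * tau
detected = times[rng.random(n_events) > p_miss]
obs_intervals = np.diff(detected)

# Observed intervals equal k*tau with probability (1 - p_miss) * p_miss**(k - 1).
k = np.round(obs_intervals / tau).astype(int)
for kk in range(1, 5):
    empirical = np.mean(k == kk)
    theory = (1 - p_miss) * p_miss ** (kk - 1)
    print(f"k={kk}: empirical {empirical:.3f}  vs  theory {theory:.3f}")

# Upward bias of the naive mean interval is roughly 1 / (1 - p_miss)
print("bias factor on mean interval:", obs_intervals.mean() / tau)
```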
Methodology for evaluation of railroad technology research projects
DOT National Transportation Integrated Search
1981-04-01
This Project memorandum presents a methodology for evaluating railroad research projects. The methodology includes consideration of industry and societal benefits, with special attention given to technical risks, implementation considerations, and po...
Ballesteros, Mónica; Montero, Nadia; López-Pousa, Antonio; Urrútia, Gerard; Solà, Ivan; Rada, Gabriel; Pardo-Hernandez, Hector; Bonfill, Xavier
2017-09-07
Gastrointestinal Stromal Tumours (GISTs) are the most common mesenchymal tumours. Currently, different pharmacological and surgical options are used to treat localised and metastatic GISTs, although this research field is broad and the body of evidence is scattered and expanding. Our objectives are to identify, describe and organise the currently available evidence for GIST through an evidence mapping approach. We followed the methodology of Global Evidence Mapping (GEM). We searched Pubmed, EMBASE, The Cochrane Library and Epistemonikos in order to identify systematic reviews (SRs) with or without meta-analyses published between 1990 and March 2016. Two authors assessed eligibility and extracted data. Methodological quality of the included systematic reviews was assessed using AMSTAR. We organised the results according to identified PICO questions and presented the evidence map in tables and a bubble plot. A total of 17 SRs met eligibility criteria. These reviews included 66 individual studies, of which three quarters were either observational or uncontrolled clinical trials. Overall, the quality of the included SRs was moderate or high. In total, we extracted 14 PICO questions from them and the corresponding results mostly favoured the intervention arm. Non-experimental studies have been the most common type of study used to evaluate therapeutic interventions in GISTs. However, the majority of the interventions are reported as beneficial or probably beneficial by the respective authors of the SRs. Evidence mapping is a useful and reliable methodology to identify and present the existing evidence about therapeutic interventions.
[Evaluation of work-related stress in call-center workers: application of a methodology].
Ansaloni, Gianluca; Cichella, Patrizia; Morelli, Carla; Alberghini, Villiam; Finardi, Elisabetta; Guglielmin, Antonia Maria; Nini, Donatella; Sacenti, Elisabetta; Stagni, Cristina
2014-01-01
Several studies have highlighted a correlation between call-centre working conditions and psychosocial and ergonomic hazards. The aim of this study is to provide an operating methodology for the risk assessment of work-related stress. The study involved 554 call-centre workers employed in three insurance organizations, and a mixed work group (worker, company and public health representatives) was defined to manage the study. We tested an objective self-made checklist and then administered a modified version of the OSI (Occupational Stress Indicator) questionnaire. We obtained complementary information from the two different data collection methods. The findings highlight a low level of perceived stress and health complaints compared with other studies previously carried out mainly in 'outsourcing' call centres: workers do not show stress symptoms without adopting coping strategies. Moreover, the study underlines an acceptable level of work satisfaction, although there are few career opportunities. These results are probably due to the low job seniority combined with high job security--the large majority of respondents, 87%, were permanent workers--and to working time consisting mainly of daily shifts five days a week. Our methodology seems able to detect the level of work-related stress with a good degree of coherence. Furthermore, the presence of a mixed work group produced a good level of involvement among the workers: 464 out of 554 operators completed and returned the questionnaire, a response rate of about 84%.
Methodology of management of dredging operations II. Applications.
Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D
2006-04-01
This paper presents a new methodology for the management of dredging operations. Derived partly from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, combining the qualities and complementarities of the previous methodologies. The methodology has been applied at the Port of Dunkirk (France). A characterization of the sediments of this site allowed a zoning of the port to be established into zones of probable sediment homogeneity. Moreover, sources of pollution have been identified with a view to prevention. Ways of adding value to the dredged waste have also been developed to meet regional needs, from the perspective of competitive and territorial intelligence. Their development required a pooling of resources between professionals, research centres and local communities, according to the principles of industrial ecology. Lastly, a multicriteria decision-making aid (MCDMA) tool has been used to determine the most relevant scenario (or alternative, or action) for a dredging operation planned by the Port of Dunkirk. These applications have confirmed the relevance of this methodology for the management of dredging operations.
2004-03-01
probabilistic by design. Finally, as the fragments disperse, fragment density decreases, and the probability of a fragment strike drops rapidly. Given the... Any PPE subjected to such testing needs to be exposed repeatedly to several mines in order to obtain a sufficient number of strikes. This will allow... velocity of each fragment, and the location of fragment strikes cannot be controlled precisely. This means that the same test must be repeated a
A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.
Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen
2014-01-01
Risk classification and survival probability prediction are two major goals in survival data analysis since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on data, and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate finite sample performance of the proposed method under various settings. Applications to glioma tumor data and breast cancer gene expression survival data are shown to illustrate the new methodology in real data analysis.
Rajeswaran, Jeevanantham; Blackstone, Eugene H; Ehrlinger, John; Li, Liang; Ishwaran, Hemant; Parides, Michael K
2018-01-01
Atrial fibrillation is an arrhythmic disorder where the electrical signals of the heart become irregular. The probability of atrial fibrillation (binary response) is often time varying in a structured fashion, as is the influence of associated risk factors. A generalized nonlinear mixed effects model is presented to estimate the time-related probability of atrial fibrillation using a temporal decomposition approach to reveal the pattern of the probability of atrial fibrillation and their determinants. This methodology generalizes to patient-specific analysis of longitudinal binary data with possibly time-varying effects of covariates and with different patient-specific random effects influencing different temporal phases. The motivation and application of this model is illustrated using longitudinally measured atrial fibrillation data obtained through weekly trans-telephonic monitoring from an NIH sponsored clinical trial being conducted by the Cardiothoracic Surgery Clinical Trials Network.
Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization
Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.
2014-01-01
Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406
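For readers unfamiliar with the two best-fitting forms mentioned above, the sketch below evaluates the two-parameter Prelec (Prelec-2) and Linear-in-Log-Odds weighting functions at a few probabilities. The parameter values are arbitrary illustrations, not estimates from the reported experiments.

```python
import numpy as np

def prelec2(p, gamma, delta):
    """Two-parameter Prelec weighting function: w(p) = exp(-delta * (-ln p)**gamma)."""
    return np.exp(-delta * (-np.log(p)) ** gamma)

def lin_log_odds(p, gamma, delta):
    """Linear-in-Log-Odds weighting function: w(p) = delta*p**gamma / (delta*p**gamma + (1-p)**gamma)."""
    num = delta * p ** gamma
    return num / (num + (1 - p) ** gamma)

p = np.linspace(0.01, 0.99, 5)
print("p         :", np.round(p, 2))
print("Prelec-2  :", np.round(prelec2(p, gamma=0.6, delta=1.0), 3))
print("LinLogOdds:", np.round(lin_log_odds(p, gamma=0.6, delta=0.8), 3))
```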
Forecasting a winner for Malaysian Cup 2013 using soccer simulation model
NASA Astrophysics Data System (ADS)
Yusof, Muhammad Mat; Fauzee, Mohd Soffian Omar; Latif, Rozita Abdul
2014-07-01
This paper investigates, through soccer simulation, the calculation of the probability of each team winning the Malaysia Cup 2013. Our methodology is to predict the outcomes of individual matches and then simulate the Malaysia Cup 2013 tournament 5000 times. As match outcomes are always a matter of uncertainty, a statistical model, in particular a double Poisson model, is used to predict the number of goals scored and conceded by each team. Maximum likelihood estimation is used to measure the attacking strength and defensive weakness of each team. Based on our simulation results, LionXII has the highest probability of becoming the winner, followed by Selangor, ATM, JDT and Kelantan. Meanwhile, T-Team, Negeri Sembilan and Felda United have lower probabilities of winning the Malaysia Cup 2013. In summary, we find that the probability of each team becoming the winner is small, indicating that the level of competitive balance in the Malaysia Cup 2013 is quite high.
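A minimal version of the simulation approach described above is sketched below: a double Poisson model generates match scores from assumed attacking and defensive strengths, and repeated round-robin simulations estimate each team's probability of finishing top. The strengths, the home advantage, and the three-team league are invented placeholders; the study estimated strengths by maximum likelihood for the full Malaysia Cup field.

```python
import numpy as np

rng = np.random.default_rng(2013)

def simulate_match(attack_a, defence_a, attack_b, defence_b, home_adv=0.25):
    """Double Poisson match: each team's goals are Poisson with a rate built from
    its attacking strength and the opponent's defensive weakness."""
    lam_a = np.exp(attack_a - defence_b + home_adv)   # team A plays at home
    lam_b = np.exp(attack_b - defence_a)
    return rng.poisson(lam_a), rng.poisson(lam_b)

# Hypothetical (attack, defence) strengths; real values would be MLE fits to past results.
teams = {"LionXII": (0.40, 0.30), "Selangor": (0.35, 0.25), "ATM": (0.20, 0.10)}

wins = {t: 0 for t in teams}
n_sim = 5000
names = list(teams)
for _ in range(n_sim):
    points = {t: 0 for t in teams}
    for i, a in enumerate(names):                 # single round robin, first-listed team at home
        for b in names[i + 1:]:
            ga, gb = simulate_match(*teams[a], *teams[b])
            if ga > gb: points[a] += 3
            elif gb > ga: points[b] += 3
            else: points[a] += 1; points[b] += 1
    wins[max(points, key=points.get)] += 1

print({t: round(w / n_sim, 3) for t, w in wins.items()})
```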
Mauro, John C; Loucks, Roger J; Balakrishnan, Jitendra; Raghavan, Srikanth
2007-05-21
The thermodynamics and kinetics of a many-body system can be described in terms of a potential energy landscape in multidimensional configuration space. The partition function of such a landscape can be written in terms of a density of states, which can be computed using a variety of Monte Carlo techniques. In this paper, a new self-consistent Monte Carlo method for computing density of states is described that uses importance sampling and a multiplicative update factor to achieve rapid convergence. The technique is then applied to compute the equilibrium quench probability of the various inherent structures (minima) in the landscape. The quench probability depends on both the potential energy of the inherent structure and the volume of its corresponding basin in configuration space. Finally, the methodology is extended to the isothermal-isobaric ensemble in order to compute inherent structure quench probabilities in an enthalpy landscape.
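The self-consistent scheme described above (importance sampling with a multiplicative update factor for the density of states) is closely related to the Wang-Landau algorithm. The sketch below applies a Wang-Landau-style update to a toy 1D Ising chain rather than a continuous energy landscape; the system, the flatness criterion, and all numerical settings are illustrative assumptions, not the authors' implementation, and no quench probabilities are computed.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10                                    # spins in a 1D Ising chain, periodic boundary

def energy(s):
    return int(-np.sum(s * np.roll(s, 1)))

spins = rng.choice([-1, 1], size=N)
E = energy(spins)

ln_g = {}        # running estimate of ln(density of states)
hist = {}        # visit histogram for the flatness check
ln_f = 1.0       # multiplicative update: ln g(E) += ln_f on each visit

while ln_f > 1e-4:
    for _ in range(20_000):               # unoptimized sweep; fine for a toy system
        i = rng.integers(N)
        new = spins.copy()
        new[i] *= -1
        E_new = energy(new)
        # Importance sampling toward rarely visited energies: accept with g(E)/g(E_new)
        if np.log(rng.random()) < ln_g.get(E, 0.0) - ln_g.get(E_new, 0.0):
            spins, E = new, E_new
        ln_g[E] = ln_g.get(E, 0.0) + ln_f
        hist[E] = hist.get(E, 0) + 1
    if min(hist.values()) > 0.8 * np.mean(list(hist.values())):   # rough flatness check
        ln_f /= 2.0                        # tighten the update factor
        hist = {}

# Normalise so the total number of states equals 2**N and print the estimated DOS
shifted = {e: v - max(ln_g.values()) for e, v in ln_g.items()}
Z = sum(np.exp(v) for v in shifted.values())
print({e: round(2**N * np.exp(v) / Z, 1) for e, v in sorted(shifted.items())})
```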
Medical Problem-Solving: A Critique of the Literature.
ERIC Educational Resources Information Center
McGuire, Christine H.
1985-01-01
Prescriptive, decision-analysis of medical problem-solving has been based on decision theory that involves calculation and manipulation of complex probability and utility values to arrive at optimal decisions that will maximize patient benefits. The studies offer a methodology for improving clinical judgment. (Author/MLW)
Fault Tree Analysis: An Emerging Methodology for Instructional Science.
ERIC Educational Resources Information Center
Wood, R. Kent; And Others
1979-01-01
Describes Fault Tree Analysis, a tool for systems analysis which attempts to identify possible modes of failure in systems to increase the probability of success. The article defines the technique and presents the steps of FTA construction, focusing on its application to education. (RAO)
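To make the basic calculation concrete, the sketch below evaluates the top-event probability of a small hypothetical fault tree by combining independent basic-event probabilities through AND and OR gates; the events and numbers are invented for illustration and are not drawn from the article.

```python
from math import prod

def and_gate(probs):
    """Probability that all independent basic events occur."""
    return prod(probs)

def or_gate(probs):
    """Probability that at least one independent basic event occurs."""
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical instructional-failure tree: the top event occurs if the materials
# are defective OR both the delivery and the assessment fail.
p_materials = 0.02
p_delivery = 0.10
p_assessment = 0.05

p_top = or_gate([p_materials, and_gate([p_delivery, p_assessment])])
print(f"Top-event failure probability: {p_top:.4f}")   # ~0.0249
```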
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar; Goebel, Kai
2013-01-01
This paper investigates the use of the inverse first-order reliability method (inverse-FORM) to quantify the uncertainty in the remaining useful life (RUL) of aerospace components. The prediction of remaining useful life is an integral part of system health prognosis, and directly helps in online health monitoring and decision-making. However, the prediction of remaining useful life is affected by several sources of uncertainty, and therefore it is necessary to quantify the uncertainty in the remaining useful life prediction. While system parameter uncertainty and physical variability can be easily included in inverse-FORM, this paper extends the methodology to include: (1) future loading uncertainty, (2) process noise; and (3) uncertainty in the state estimate. The inverse-FORM method has been used in this paper to (1) quickly obtain probability bounds on the remaining useful life prediction; and (2) calculate the entire probability distribution of remaining useful life prediction, and the results are verified against Monte Carlo sampling. The proposed methodology is illustrated using a numerical example.
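The probability bounds and full RUL distribution mentioned above are verified against Monte Carlo sampling in the paper. The sketch below shows only that Monte Carlo baseline, for a hypothetical linear damage-growth model with an uncertain state estimate, uncertain future loading, and additive process noise; it is not the inverse-FORM algorithm itself, and every distribution and constant is a placeholder.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
threshold = 1.0                               # failure when damage reaches this level

# Sources of uncertainty (all values hypothetical):
d0 = rng.normal(0.35, 0.03, n)                # uncertain current state estimate
rate = rng.lognormal(np.log(0.01), 0.2, n)    # nominal damage growth per cycle
load_factor = rng.normal(1.0, 0.15, n)        # future loading uncertainty

rul = np.zeros(n)
damage = d0.copy()
step = 0
alive = damage < threshold
while alive.any() and step < 500:
    step += 1
    # propagate damage with process noise added at every cycle
    damage[alive] += rate[alive] * load_factor[alive] + rng.normal(0, 0.002, alive.sum())
    just_failed = alive & (damage >= threshold)
    rul[just_failed] = step
    alive &= damage < threshold

rul[alive] = step                             # censor the few trajectories that never failed
print("RUL percentiles (cycles):",
      {q: round(np.percentile(rul, q), 1) for q in (5, 50, 95)})
```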
How Root Cause Analysis Can Improve the Value Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wixson, James Robert
2002-05-01
Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether it is high cost/poor value, poor quality, or poor reliability. Once the most probable causes for these problems have been arrived at, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.
Evaluation Methodologies for Estimating the Likelihood of Program Implementation Failure
ERIC Educational Resources Information Center
Durand, Roger; Decker, Phillip J.; Kirkman, Dorothy M.
2014-01-01
Despite our best efforts as evaluators, program implementation failures abound. A wide variety of valuable methodologies have been adopted to explain and evaluate the "why" of these failures. Yet, typically these methodologies have been employed concurrently (e.g., project monitoring) or to the post-hoc assessment of program activities.…
Probabilistic simulation of multi-scale composite behavior
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.
1993-01-01
A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
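The sketch below shows the general idea of propagating constituent scatter to a composite-scale property and ranking the influential random variables, using a simple rule-of-mixtures relation in place of the micromechanics implemented in PICAN. All distributions are hypothetical, and correlation is used only as a crude stand-in for the code's sensitivity measures.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Constituent properties and a fabrication variable (hypothetical means and scatter)
Ef = rng.normal(230e9, 23e9, n)       # fiber modulus, Pa (10% CoV)
Em = rng.normal(3.5e9, 0.35e9, n)     # matrix modulus, Pa
Vf = rng.normal(0.60, 0.03, n)        # fiber volume fraction

# Longitudinal ply modulus by the rule of mixtures
E1 = Vf * Ef + (1.0 - Vf) * Em

print(f"E1 mean = {E1.mean()/1e9:.1f} GPa, CoV = {E1.std()/E1.mean():.3f}")

# Crude sensitivity ranking: correlation of each random input with the output
for name, x in [("Ef", Ef), ("Em", Em), ("Vf", Vf)]:
    print(name, "correlation with E1:", round(np.corrcoef(x, E1)[0, 1], 3))
```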
Superior model for fault tolerance computation in designing nano-sized circuit systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, N. S. S., E-mail: narinderjit@petronas.com.my; Muthuvalu, M. S., E-mail: msmuthuvalu@gmail.com; Asirvadam, V. S., E-mail: vijanth-sagayan@petronas.com.my
2014-10-24
As CMOS technology scales down to the nanometre range, reliability turns out to be a decisive subject in the design methodology of nano-sized circuit systems. As a result, several computational approaches have been developed to compute and evaluate the reliability of desired nano-electronic circuits. The process of computing reliability becomes very troublesome and time consuming as the computational complexity builds up with the desired circuit size. Therefore, being able to measure reliability instantly and accurately is fast becoming necessary in designing modern logic integrated circuits. For this purpose, the paper first looks into the development of an automated reliability evaluation tool based on the generalization of the Probabilistic Gate Model (PGM) and Boolean Difference-based Error Calculator (BDEC) models. The Matlab-based tool allows users to significantly speed up the task of reliability analysis for very large numbers of nano-electronic circuits. Second, using the developed automated tool, the paper carries out a comparative study involving reliability computation and evaluation by the PGM and BDEC models for different implementations of same-functionality circuits. Based on the reliability analysis, BDEC gives exact and transparent reliability measures, but as the complexity of the same-functionality circuits with respect to gate error increases, the reliability measure by BDEC tends to be lower than that by PGM. The lower reliability measure by BDEC is explained in this paper using the distribution of different signal input patterns over time for same-functionality circuits. Simulation results conclude that the reliability measure by BDEC depends not only on faulty gates but also on circuit topology, the probability of input signals being one or zero, and the probability of error on signal lines.
Screening for Chlamydia trachomatis: a systematic review of the economic evaluations and modelling
Roberts, T E; Robinson, S; Barton, P; Bryan, S; Low, N
2006-01-01
Objective To review systematically and critically, evidence used to derive estimates of costs and cost effectiveness of chlamydia screening. Methods Systematic review. A search of 11 electronic bibliographic databases from the earliest date available to August 2004 using keywords including chlamydia, pelvic inflammatory disease, economic evaluation, and cost. We included studies of chlamydia screening in males and/or females over 14 years, including studies of diagnostic tests, contact tracing, and treatment as part of a screening programme. Outcomes included cases of chlamydia identified and major outcomes averted. We assessed methodological quality and the modelling approach used. Results Of 713 identified papers we included 57 formal economic evaluations and two cost studies. Most studies found chlamydia screening to be cost effective, partner notification to be an effective adjunct, and testing with nucleic acid amplification tests, and treatment with azithromycin to be cost effective. Methodological problems limited the validity of these findings: most studies used static models that are inappropriate for infectious diseases; restricted outcomes were used as a basis for policy recommendations; and high estimates of the probability of chlamydia associated complications might have overestimated cost effectiveness. Two high quality dynamic modelling studies found opportunistic screening to be cost effective but poor reporting or uncertainty about complication rates make interpretation difficult. Conclusion The inappropriate use of static models to study interventions to prevent a communicable disease means that uncertainty remains about whether chlamydia screening programmes are cost effective or not. The results of this review can be used by health service managers in the allocation of resources, and health economists and other researchers who are considering further research in this area. PMID:16731666
Roberge, Jason; O'Rourke, Mary Kay; Meza-Montenegro, Maria Mercedes; Gutiérrez-Millán, Luis Enrique; Burgess, Jefferey L; Harris, Robin B
2012-04-01
The Binational Arsenic Exposure Survey (BAsES) was designed to evaluate probable arsenic exposures in selected areas of southern Arizona and northern Mexico, two regions with known elevated levels of arsenic in groundwater reserves. This paper describes the methodology of BAsES and the relationship between estimated arsenic intake from beverages and arsenic output in urine. Households from eight communities were selected for their varying groundwater arsenic concentrations in Arizona, USA and Sonora, Mexico. Adults responded to questionnaires and provided dietary information. A first morning urine void and water from all household drinking sources were collected. Associations between urinary arsenic concentration (total, organic, inorganic) and estimated level of arsenic consumed from water and other beverages were evaluated through crude associations and by random effects models. Median estimated total arsenic intake from beverages among participants from Arizona communities ranged from 1.7 to 14.1 µg/day compared to 0.6 to 3.4 µg/day among those from Mexico communities. In contrast, median urinary inorganic arsenic concentrations were greatest among participants from Hermosillo, Mexico (6.2 µg/L) whereas a high of 2.0 µg/L was found among participants from Ajo, Arizona. Estimated arsenic intake from drinking water was associated with urinary total arsenic concentration (p < 0.001), urinary inorganic arsenic concentration (p < 0.001), and urinary sum of species (p < 0.001). Urinary arsenic concentrations increased between 7% and 12% for each one percent increase in arsenic consumed from drinking water. Variability in arsenic intake from beverages and urinary arsenic output yielded counter intuitive results. Estimated intake of arsenic from all beverages was greatest among Arizonans yet participants in Mexico had higher urinary total and inorganic arsenic concentrations. Other contributors to urinary arsenic concentrations should be evaluated.
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
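As a small illustration of two of the concepts above (discounting and cost-effectiveness comparison), the sketch below computes discounted QALYs from yearly utility values and an incremental cost-effectiveness ratio. The utility profiles, costs, and 3% discount rate are hypothetical examples, not figures from the article.

```python
def discounted_qalys(utilities, rate=0.03):
    """QALYs from yearly health-state utilities, discounted at `rate` per year."""
    return sum(u / (1.0 + rate) ** t for t, u in enumerate(utilities))

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio (cost per QALY gained)."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical 5-year utility profiles and costs for two interventions
q_new = discounted_qalys([0.85, 0.84, 0.83, 0.82, 0.80])
q_old = discounted_qalys([0.75, 0.73, 0.71, 0.70, 0.68])
print(f"QALYs: new {q_new:.2f}, old {q_old:.2f}")
print(f"ICER: {icer(12000, q_new, 7000, q_old):.0f} per QALY gained")
```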
Evaluation of the HARDMAN comparability methodology for manpower, personnel and training
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.
1984-01-01
The methodology evaluation and recommendations are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting manpower, personnel, and training (MPT) to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility to the Army. Recommendations are provided to address the problem areas revealed through the evaluation.
Design Of An Intelligent Robotic System Organizer Via Expert System Techniques
NASA Astrophysics Data System (ADS)
Yuan, Peter H.; Valavanis, Kimon P.
1989-02-01
Intelligent Robotic Systems are a special type of Intelligent Machines. When modeled based on the theory of Intelligent Controls, they are composed of three interactive levels, namely: organization, coordination, and execution, ordered according to the Principle of Increasing Intelligence with Decreasing Precision. Expert System techniques are used to design an Intelligent Robotic System Organizer with a dynamic Knowledge Base and an interactive Inference Engine. Task plans are formulated using either or both of a Probabilistic Approach and a Forward Chaining Methodology, depending on pertinent information associated with a specific requested job. The Intelligent Robotic System Organizer is implemented and tested on a prototype system operating in an uncertain environment. An evaluation of the performance of the prototype system is conducted based upon the probability of generating a successful task sequence versus the number of trials taken by the organizer.
Zakharov, Sergey
2011-03-01
The relevance and admissibility of expert medical testimony in relation to medical malpractice suits requires a more successful development of formal criteria and a more intentional compliance with efficient judicial procedures. The American judicial system provides an excellent model for implementation of a critical approach to knowledge collection, the evaluation of the validity of scientifically sound information, and the examination of expert's testimony on the basis of a sound methodology. An analysis of the assessment and application of reliability yields evidence that assuring standards to improve the quality of expert medical testimony will increase the overall probability of a fair outcome during the judicial process. Applying these beneficial strategies in medical malpractice cases will continue to support further considerations of promoting justice and solving problems through sufficient scientific means.
Evaluation of a hydrological model based on Bidirectional Reach (BReach)
NASA Astrophysics Data System (ADS)
Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Verhoest, Niko E. C.
2016-04-01
Evaluation and discrimination of model structures is crucial to ensure an appropriate use of hydrological models. When evaluating model results by aggregating their quality in (a subset of) individual observations, the overall results of this analysis can conceal important detailed information about model structural deficiencies. Analyzing model results within their local (time) context can uncover this detailed information. In this research, a methodology called Bidirectional Reach (BReach) is proposed to evaluate and analyze the results of a hydrological model by assessing the maximum left and right reach in each observation point used for model evaluation. These maximum reaches express the capability of the model to describe a subset of the evaluation data both in the direction of the previous (left) and of the following data (right). This capability is evaluated on two levels. First, on the level of individual observations, the combination of a parameter set and an observation is classified as non-acceptable if the deviation between the accompanying model result and the measurement exceeds the observational uncertainty. Second, the behavior in a sequence of observations is evaluated by means of a tolerance degree. This tolerance degree expresses the condition for satisfactory model behavior in a data series and is defined by the percentage of observations within this series that may have non-acceptable model results. Based on both criteria, the maximum left and right reaches of a model in an observation represent the data points in the direction of the previous and the following observations, respectively, beyond which none of the sampled parameter sets is both satisfactory and results in an acceptable deviation. After assessing these reaches for a variety of tolerance degrees, the results can be plotted in a combined BReach plot that shows temporal changes in the behavior of model results. The methodology is applied to a Probability Distributed Model (PDM) of the river Grote Nete upstream of Geel-Zammel, with 10^6 randomly sampled parameter sets for three separate years. Acceptable model results must fit within the 95 % uncertainty bounds of the observed discharges, and tolerance degrees of 0 %, 5 %, 10 %, 20 % and 40 % are applied. An evaluation of BReach results with regard to other variables, such as the magnitude and the rate of change of the observed discharges, enables the detection of recurring patterns in model errors. This results in an augmented understanding of the model's structural deficiencies, revealing the inability of the PDM model to simulate both high and low flows with a single parameter set for this catchment. As the methodology can be applied to different hydrological model structures, it is a useful tool to gain understanding of the differences in behavior of competing models.
ERIC Educational Resources Information Center
Davis, Jamie D., Ed.; Erickson, Jill Shepard, Ed.; Johnson, Sharon R., Ed.; Marshall, Catherine A., Ed.; Running Wolf, Paulette, Ed.; Santiago, Rolando L., Ed.
This first symposium of the Work Group on American Indian Research and Program Evaluation Methodology (AIRPEM) explored American Indian and Alaska Native cultural considerations in relation to "best practices" in research and program evaluation. These cultural considerations include the importance of tribal consultation on research…
Choice-Based Segmentation as an Enrollment Management Tool
ERIC Educational Resources Information Center
Young, Mark R.
2002-01-01
This article presents an approach to enrollment management based on target marketing strategies developed from a choice-based segmentation methodology. Students are classified into "switchable" or "non-switchable" segments based on their probability of selecting specific majors. A modified multinomial logit choice model is used to identify…
Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation
2016-05-01
ARL-TN-0756, US Army Research Laboratory, May 2016; by Clayton M Weiss, Oak Ridge Institute for Science and Education. The excerpted text references work on identifying and mapping flaw size distributions on glass surfaces for predicting mechanical response (International Journal of Applied Glass ...).
ERIC Educational Resources Information Center
Frank, Martina W.; Walker-Moffat, Wendy
This study considered how 25 highly diverse after-school programs with funding of $5.6 million were evaluated during a 10-month period. The paper describes the evaluation methodologies used and determined which methodologies were most effective within a diverse and political context. The Bayview Fund for Youth Development (name assumed for…
NASA Astrophysics Data System (ADS)
Zhang, Jiaxin; Shields, Michael D.
2018-01-01
This paper addresses the problem of uncertainty quantification and propagation when data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation are retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and associated probabilities that each method is the best model in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model that encapsulates the full uncertainty caused by lack of data and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty achieved with several orders of magnitude reduction in computational cost. It is shown how the model can be updated to adaptively accommodate added data and added candidate probability models. The method is applied for uncertainty analysis of plate buckling strength where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
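The first stage of the approach described above (identifying plausible candidate densities and the probability that each is the best model in the Kullback-Leibler sense) can be illustrated with Akaike weights, as in the sketch below. The candidate set, sample size, and data are invented, and the subsequent steps of the paper (Bayesian parameter densities and importance-sampling propagation) are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = rng.lognormal(mean=1.0, sigma=0.4, size=15)   # scarce data; true model treated as unknown

candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
}

# AIC for each candidate distribution fit by maximum likelihood
aic = {}
for name, dist in candidates.items():
    params = dist.fit(data)
    loglik = np.sum(dist.logpdf(data, *params))
    aic[name] = 2 * len(params) - 2 * loglik

# Akaike weights: probability that each candidate is the best (KL-closest) model
delta = {k: v - min(aic.values()) for k, v in aic.items()}
weights = {k: np.exp(-0.5 * d) for k, d in delta.items()}
total = sum(weights.values())
print({k: round(w / total, 3) for k, w in weights.items()})
```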
Methodological reviews of economic evaluations in health care: what do they target?
Hutter, Maria-Florencia; Rodríguez-Ibeas, Roberto; Antonanzas, Fernando
2014-11-01
An increasing number of published studies of economic evaluations of health technologies have been reviewed and summarized with different purposes, among them to facilitate decision-making processes. These reviews have covered different aspects of economic evaluations, using a variety of methodological approaches. The aim of this study is to analyze the methodological characteristics of the reviews of economic evaluations in health care, published during the period 1990-2010, to identify their main features and the potential missing elements. This may help to develop a common procedure for elaborating these kinds of reviews. We performed systematic searches in electronic databases (Scopus, Medline and PubMed) of methodological reviews published in English, period 1990-2010. We selected the articles whose main purpose was to review and assess the methodology applied in the economic evaluation studies. We classified the data according to the study objectives, period of the review, number of reviewed studies, methodological and non-methodological items assessed, medical specialty, type of disease and technology, databases used for the review and their main conclusions. We performed a descriptive statistical analysis and checked how generalizability issues were considered in the reviews. We identified 76 methodological reviews, 42 published in the period 1990-2001 and 34 during 2002-2010. The items assessed most frequently (by 70% of the reviews) were perspective, type of economic study, uncertainty and discounting. The reviews also described the type of intervention and disease, funding sources, country in which the evaluation took place, type of journal and author's characteristics. Regarding the intertemporal comparison, higher frequencies were found in the second period for two key methodological items: the source of effectiveness data and the models used in the studies. However, the generalizability issues that apparently are creating a growing interest in the economic evaluation literature did not receive as much attention in the reviews of the second period. The remaining items showed similar frequencies in both periods. Increasingly more reviews of economic evaluation studies aim to analyze the application of methodological principles, and offer summaries of papers classified by either diseases or health technologies. These reviews are useful for finding literature trends, aims of studies and possible deficiencies in the implementation of methods of specific health interventions. As no significant methodological improvement was clearly detected in the two periods analyzed, it would be convenient to pay more attention to the methodological aspects of the reviews.
NASA Astrophysics Data System (ADS)
Engeland, Kolbjorn; Steinsland, Ingelin
2014-05-01
This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criteria for the evaluation of multivariate forecasts. A post-processing approach is used, and a Gaussian model is applied to transformed variables. The post-processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions of forecasted inflow for each catchment and lead time, whereas the dependency model is used to estimate the full multivariate distribution of forecasts, i.e. the covariances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles that inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated on the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology is flexible enough to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding-window climatology forecasts, and it allows the relative weights of these forecasts to vary with both catchment and lead time. When evaluating predictive performance in the original space using cross validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding-window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study, such as heteroscedasticity, lead-time-varying between-lead-time dependency and lead-time-varying between-catchment dependency, are captured. Two criteria were used for evaluating the added value of the dependency model. The first was the energy score (ES), a multi-dimensional generalization of the continuous ranked probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was the CRPS of forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas the CRPS of accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
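As a small numerical companion to the evaluation criteria discussed above, the sketch below computes sample-based estimates of the energy score and of the CRPS of an aggregated (summed) variable for two synthetic ensembles that differ only in their between-catchment dependency. All numbers are invented; the intent is only to show the two scores, not to reproduce the study's results.

```python
import numpy as np

def crps_ensemble(ens, obs):
    """Sample-based CRPS for a univariate ensemble: E|X - y| - 0.5 E|X - X'|."""
    ens = np.asarray(ens, float)
    term1 = np.mean(np.abs(ens - obs))
    term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
    return term1 - term2

def energy_score(ens, obs):
    """Multivariate energy score; `ens` has shape (members, dimensions)."""
    ens = np.asarray(ens, float)
    term1 = np.mean(np.linalg.norm(ens - obs, axis=1))
    diffs = ens[:, None, :] - ens[None, :, :]
    term2 = 0.5 * np.mean(np.linalg.norm(diffs, axis=2))
    return term1 - term2

rng = np.random.default_rng(9)
obs = np.array([10.0, 12.0])                                          # observed inflow, two catchments
ens_dep = rng.multivariate_normal(obs, [[4, 3], [3, 4]], size=500)    # correlated members
ens_ind = rng.multivariate_normal(obs, [[4, 0], [0, 4]], size=500)    # independence assumed

print("ES, dependent ensemble    :", round(energy_score(ens_dep, obs), 3))
print("ES, independent ensemble  :", round(energy_score(ens_ind, obs), 3))
print("CRPS of summed inflow, dep:", round(crps_ensemble(ens_dep.sum(axis=1), obs.sum()), 3))
print("CRPS of summed inflow, ind:", round(crps_ensemble(ens_ind.sum(axis=1), obs.sum()), 3))
```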
Methodological convergence of program evaluation designs.
Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa
2014-01-01
Nowadays, the confronting dichotomous view between experimental/quasi-experimental and non-experimental/ethnographic studies still exists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements in methodological quality in primary studies in systematic reviews and ethnographic research. We specify the relevant design elements that should be taken into account in order to improve validity and generalization in program evaluation practice in different methodologies from a practical methodological and complementary view. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.
Alarm Variables for Dengue Outbreaks: A Multi-Centre Study in Asia and Latin America
Bowman, Leigh R.; Tejeda, Gustavo S.; Coelho, Giovanini E.; Sulaiman, Lokman H.; Gill, Balvinder S.; McCall, Philip J.; Olliaro, Piero L.; Ranzinger, Silvia R.; Quang, Luong C.; Ramm, Ronald S.; Kroeger, Axel; Petzold, Max G.
2016-01-01
Background Worldwide, dengue is an unrelenting economic and health burden. Dengue outbreaks have become increasingly common, which place great strain on health infrastructure and services. Early warning models could allow health systems and vector control programmes to respond more cost-effectively and efficiently. Methodology/Principal Findings The Shewhart method and Endemic Channel were used to identify alarm variables that may predict dengue outbreaks. Five country datasets were compiled by epidemiological week over the years 2007–2013. These data were split between the years 2007–2011 (historic period) and 2012–2013 (evaluation period). Associations between alarm/outbreak variables were analysed using logistic regression during the historic period while alarm and outbreak signals were captured during the evaluation period. These signals were combined to form alarm/outbreak periods, where 2 signals were equal to 1 period. Alarm periods were quantified and used to predict subsequent outbreak periods. Across Mexico and Dominican Republic, an increase in probable cases predicted outbreaks of hospitalised cases with sensitivities and positive predictive values (PPV) of 93%/83% and 97%/86% respectively, at a lag of 1–12 weeks. An increase in mean temperature ably predicted outbreaks of hospitalised cases in Mexico and Brazil, with sensitivities and PPVs of 79%/73% and 81%/46% respectively, also at a lag of 1–12 weeks. Mean age was predictive of hospitalised cases at sensitivities and PPVs of 72%/74% and 96%/45% in Mexico and Malaysia respectively, at a lag of 4–16 weeks. Conclusions/Significance An increase in probable cases was predictive of outbreaks, while meteorological variables, particularly mean temperature, demonstrated predictive potential in some countries, but not all. While it is difficult to define uniform variables applicable in every country context, the use of probable cases and meteorological variables in tailored early warning systems could be used to highlight the occurrence of dengue outbreaks or indicate increased risk of dengue transmission. PMID:27348752
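One common formulation of the Endemic Channel referred to above is a per-week historic mean plus two standard deviations, with an alarm flagged when current counts exceed that bound; the sketch below implements that simple version on simulated weekly counts. The data are synthetic, and the exact thresholds and Shewhart rules applied in the study may differ.

```python
import numpy as np

rng = np.random.default_rng(16)

# Historic weekly probable-case counts, 5 years x 52 weeks (simulated placeholder data)
historic = rng.poisson(lam=20 + 10 * np.sin(np.linspace(0, 2 * np.pi, 52)), size=(5, 52))

# Endemic channel: mean + 2 SD of the historic counts for each calendar week
mean_wk = historic.mean(axis=0)
upper_wk = mean_wk + 2 * historic.std(axis=0, ddof=1)

# Current season: flag alarm weeks where observed counts exceed the channel
current = rng.poisson(lam=25 + 15 * np.sin(np.linspace(0, 2 * np.pi, 52)))
alarms = np.where(current > upper_wk)[0] + 1        # 1-based epidemiological weeks
print("Alarm weeks:", alarms.tolist())
```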
Le, Quang A; Doctor, Jason N
2011-05-01
As quality-adjusted life years have become the standard metric in health economic evaluations, mapping health-profile or disease-specific measures onto preference-based measures to obtain quality-adjusted life years has become a solution when health utilities are not directly available. However, current mapping methods are limited due to their predictive validity, reliability, and/or other methodological issues. We employ probability theory together with a graphical model, called a Bayesian network, to convert health-profile measures into preference-based measures and to compare the results to those estimated with current mapping methods. A sample of 19,678 adults who completed both the 12-item Short Form Health Survey (SF-12v2) and EuroQoL 5D (EQ-5D) questionnaires from the 2003 Medical Expenditure Panel Survey was split into training and validation sets. Bayesian networks were constructed to explore the probabilistic relationships between each EQ-5D domain and 12 items of the SF-12v2. The EQ-5D utility scores were estimated on the basis of the predicted probability of each response level of the 5 EQ-5D domains obtained from the Bayesian inference process using the following methods: Monte Carlo simulation, expected utility, and most-likely probability. Results were then compared with current mapping methods including multinomial logistic regression, ordinary least squares, and censored least absolute deviations. The Bayesian networks consistently outperformed other mapping models in the overall sample (mean absolute error=0.077, mean square error=0.013, and R overall=0.802), in different age groups, number of chronic conditions, and ranges of the EQ-5D index. Bayesian networks provide a new robust and natural approach to map health status responses into health utility measures for health economic evaluations.
Tracer methodology: an appropriate tool for assessing compliance with accreditation standards?
Bouchard, Chantal; Jean, Olivier
2017-10-01
Tracer methodology has been used by Accreditation Canada since 2008 to collect evidence on the quality and safety of care and services, and to assess compliance with accreditation standards. Given the importance of this methodology in the accreditation program, the objective of this study is to assess the quality of the methodology and identify its strengths and weaknesses. A mixed quantitative and qualitative approach was adopted to evaluate consistency, appropriateness, effectiveness and stakeholder synergy in applying the methodology. An online questionnaire was sent to 468 Accreditation Canada surveyors. According to surveyors' perceptions, tracer methodology is an effective tool for collecting useful, credible and reliable information to assess compliance with Qmentum program standards and priority processes. The results show good coherence between methodology components (appropriateness of the priority processes evaluated, activities to evaluate a tracer, etc.). The main weaknesses are the time constraints faced by surveyors and management's lack of cooperation during the evaluation of tracers. The inadequate amount of time allowed for the methodology to be applied properly raises questions about the quality of the information obtained. This study paves the way for a future, more in-depth exploration of the identified weaknesses to help the accreditation organization make more targeted improvements to the methodology. Copyright © 2016 John Wiley & Sons, Ltd.
Critical reflections on methodological challenge in arts and dementia evaluation and research.
Gray, Karen; Evans, Simon Chester; Griffiths, Amanda; Schneider, Justine
2017-01-01
Methodological rigour, or its absence, is often a focus of concern for the emerging field of evaluation and research around arts and dementia. However, this paper suggests that critical attention should also be paid to the way in which individual perceptions, hidden assumptions and underlying social and political structures influence methodological work in the field. Such attention will be particularly important for addressing methodological challenges relating to contextual variability, ethics, value judgement and signification identified through a literature review on this topic. Understanding how, where and when evaluators and researchers experience such challenges may help to identify fruitful approaches for future evaluation.
Impairment: The Case of Phonotactic Probability and Nonword Repetition
ERIC Educational Resources Information Center
McKean, Cristina; Letts, Carolyn; Howard, David
2013-01-01
Purpose: In this study, the authors aimed to explore the relationship between lexical and phonological knowledge in children with primary language impairment (PLI) through the application of a developmental methodology. Specifically, they tested whether there is evidence for an impairment in the process of phonological abstraction in this group of…
Using landslide risk analysis to protect fish habitat
R. M. Rice
1986-01-01
The protection of anadromous fish habitat is an important water quality concern in the Pacific Northwest. Sediment from logging-related debris avalanches can cause habitat degradation. Research on conditions associated with the sites where debris avalanches originate has resulted in a risk assessment methodology based on linear discriminant analysis. The probability...
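A minimal sketch of a discriminant-based site risk assessment, using synthetic site attributes (slope, soil depth, drainage area are assumed illustrative predictors, not the study's variables) and scikit-learn's linear discriminant analysis to estimate a probability of debris-avalanche initiation for a new site:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Synthetic attributes for stable vs. failed sites: [slope (deg), soil depth (m), drainage area (ha)].
stable = rng.normal([25.0, 1.2, 4.0], [6.0, 0.4, 1.5], size=(200, 3))
failed = rng.normal([34.0, 0.8, 6.0], [6.0, 0.4, 1.5], size=(60, 3))
X = np.vstack([stable, failed])
y = np.array([0] * len(stable) + [1] * len(failed))

lda = LinearDiscriminantAnalysis().fit(X, y)
new_site = [[36.0, 0.7, 6.5]]
print("P(debris avalanche initiation):", lda.predict_proba(new_site)[0, 1])
```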
Self-Employment among Italian Female Graduates
ERIC Educational Resources Information Center
Rosti, Luisa; Chelli, Francesco
2009-01-01
Purpose: The purpose of this paper is to investigate the gender impact of tertiary education on the probability of entering and remaining in self-employment. Design/methodology/approach: A data set on labour market flows produced by the Italian National Statistical Office is exploited by interviewing about 62,000 graduate and non-graduate…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curry, J J; Gallagher, D W; Modarres, M
Appendices are presented concerning isolation condenser makeup; vapor suppression system; station air system; reactor building closed cooling water system; turbine building secondary closed water system; service water system; emergency service water system; fire protection system; emergency ac power; dc power system; event probability estimation; methodology of accident sequence quantification; and assignment of dominant sequences to release categories.
IR-Raman Correlation of Shocked Minerals in Csátalja Meteorite — Clues for Shock Stages
NASA Astrophysics Data System (ADS)
Gyollai, I.; Kereszturi, A.; Fintor, K.; Kereszty, Zs.; Szabo, M.; Walter, H.
2017-11-01
The analyzed meteorite, Csátalja, is an H chondrite (H4, S2, W2) and, based on the differences between certain of its parts, is probably a breccia. The aim of this methodological testing is to characterize its shock deformation and heterogeneity.
ERIC Educational Resources Information Center
BOEDDINGHAUS, WALTER
The apparent disappointment and slackening of enthusiastic interest in language laboratory instruction is most probably not due to a fundamental lack of effectiveness, but to methodological and organizational problems yet to be solved. Most important, the restrictive dependence of laboratory material on classroom lessons must be abandoned. Only…
[Clinical practice guidelines in Peru: evaluation of its quality using the AGREE II instrument].
Canelo-Aybar, Carlos; Balbin, Graciela; Perez-Gomez, Ángela; Florez, Iván D
2016-01-01
To evaluate the methodological quality of clinical practice guidelines (CPGs) put into practice by the Peruvian Ministry of Health (MINSA), 17 CPGs from the ministry, published between 2009 and 2014, were independently evaluated by three methodological experts using the AGREE II instrument. The scores of the AGREE II domains were low and very low in all CPGs: scope and purpose (median, 44%), clarity of presentation (median, 47%), participation of decision-makers (median, 8%), methodological rigor (median, 5%), applicability (median, 5%), and editorial independence (median, 8%). In conclusion, the methodological quality of the CPGs implemented by MINSA is low; consequently, their use cannot be recommended. The implementation of the methodology for the development of CPGs described in the recently published CPG methodological preparation manual in Peru is a pressing need.
DOT National Transportation Integrated Search
2001-03-05
A systems modeling approach is presented for assessment of harm in the automotive accident environment. The methodology is presented in general form and then applied to evaluate vehicle aggressivity in frontal crashes. The methodology consists of par...
Evaluation of counterfactuality in counterfactual communication protocols
NASA Astrophysics Data System (ADS)
Arvidsson-Shukur, D. R. M.; Barnes, C. H. W.; Gottfries, A. N. O.
2017-12-01
We provide an in-depth investigation of parameter estimation in nested Mach-Zehnder interferometers (NMZIs) using two information measures: the Fisher information and the Shannon mutual information. Protocols for counterfactual communication have, so far, been based on two different definitions of counterfactuality. In particular, some schemes have been based on NMZI devices, and have recently been subject to criticism. We provide a methodology for evaluating the counterfactuality of these protocols, based on an information-theoretical framework. More specifically, we make the assumption that any realistic quantum channel in MZI structures will have some weak uncontrolled interaction. We then use the Fisher information of this interaction to measure counterfactual violations. The measure is used to evaluate the suggested counterfactual communication protocol of H. Salih et al. [Phys. Rev. Lett. 110, 170502 (2013), 10.1103/PhysRevLett.110.170502]. The protocol of D. R. M. Arvidsson-Shukur and C. H. W. Barnes [Phys. Rev. A 94, 062303 (2016), 10.1103/PhysRevA.94.062303], based on a different definition, is evaluated with a probability measure. Our results show that the definition of Arvidsson-Shukur and Barnes is satisfied by their scheme, while that of Salih et al. is only satisfied by perfect quantum channels. For realistic devices the latter protocol does not achieve its objective.
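A generic illustration of the information measure used above: the Fisher information of a weak interaction parameter in a discrete outcome distribution, computed numerically. The toy two-outcome channel and its dependence on theta are assumptions for illustration, not the authors' interferometer model:

```python
import numpy as np

def fisher_information(prob_fn, theta, eps=1e-5):
    """Fisher information of a discrete distribution p(x; theta):
    I(theta) = sum_x (d p / d theta)^2 / p, via central differences."""
    p = np.asarray(prob_fn(theta))
    dp = (np.asarray(prob_fn(theta + eps)) - np.asarray(prob_fn(theta - eps))) / (2 * eps)
    return float(np.sum(dp**2 / p))

# Toy two-outcome channel: a weak interaction of strength theta shifts the click probability.
def click_probs(theta):
    p1 = 0.5 + 0.5 * np.sin(theta)     # purely illustrative dependence on theta
    return [p1, 1.0 - p1]

print(fisher_information(click_probs, theta=0.1))
```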
in silico Surveillance: evaluating outbreak detection with simulation models
2013-01-01
Background Detecting outbreaks is a crucial task for public health officials, yet gaps remain in the systematic evaluation of outbreak detection protocols. The authors’ objectives were to design, implement, and test a flexible methodology for generating detailed synthetic surveillance data that provides realistic geographical and temporal clustering of cases, and to use these data to evaluate outbreak detection protocols. Methods A detailed representation of the Boston area was constructed, based on data about individuals, locations, and activity patterns. Influenza-like illness (ILI) transmission was simulated, producing 100 years of in silico ILI data. Six different surveillance systems were designed and developed using gathered cases from the simulated disease data. Performance was measured by inserting test outbreaks into the surveillance streams and analyzing the likelihood and timeliness of detection. Results Detection of outbreaks varied from 21% to 95%. Increased coverage did not linearly improve detection probability for all surveillance systems. Relaxing the decision threshold for signaling outbreaks greatly increased false-positives, improved outbreak detection slightly, and led to earlier outbreak detection. Conclusions Geographical distribution can be more important than coverage level. Detailed simulations of infectious disease transmission can be configured to represent nearly any conceivable scenario. They are a powerful tool for evaluating the performance of surveillance systems and methods used for outbreak detection. PMID:23343523
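A minimal sketch of the evaluation idea: inject a synthetic outbreak into a Poisson baseline and measure how often, and how quickly, a simple threshold detector fires. The baseline rate, outbreak shape and Shewhart-style threshold are illustrative assumptions, not the paper's six surveillance systems:

```python
import numpy as np

rng = np.random.default_rng(1)
baseline_rate, n_days, n_trials = 20.0, 120, 500
outbreak_start, outbreak_extra = 60, [5, 10, 15, 10, 5]     # injected excess cases per day
threshold = baseline_rate + 3 * np.sqrt(baseline_rate)       # simple Shewhart-style rule

detections, delays = 0, []
for _ in range(n_trials):
    counts = rng.poisson(baseline_rate, n_days).astype(float)
    counts[outbreak_start:outbreak_start + len(outbreak_extra)] += outbreak_extra
    alarm_days = np.flatnonzero(counts[outbreak_start:] > threshold)
    if alarm_days.size:
        detections += 1
        delays.append(alarm_days[0])

print("detection probability:", detections / n_trials)
print("mean delay (days):", np.mean(delays) if delays else None)
```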
Liu, Hongzhuo; Feng, Liang; Tolia, Gaurav; Liddell, Mark R.; Hao, Jinsong; Li, S. Kevin
2013-01-01
A convenient and efficient in vitro diffusion cell method to evaluate formulations for inner ear delivery via the intratympanic route is currently not available. The existing in vitro diffusion cell systems commonly used to evaluate drug formulations do not resemble the physical dimensions of the middle ear and round window membrane. The objectives of this study were to examine a modified in vitro diffusion cell system of a small diffusion area for studying sustained release formulations in inner ear drug delivery and to identify a formulation for sustained drug delivery to the inner ear. Four formulations and a control were examined in this study using cidofovir as the model drug. Drug release from the formulations in the modified diffusion cell system was slower than that in the conventional diffusion cell system due to the decrease in the diffusion surface area of the modified diffusion cell system. The modified diffusion cell system was able to show different drug release behaviors among the formulations and allowed formulation evaluation better than the conventional diffusion cell system. Among the formulations investigated, poly(lactic-co-glycolic acid)–poly(ethylene glycol)–poly(lactic-co-glycolic acid) triblock copolymer systems provided the longest sustained drug delivery, probably due to their rigid gel structures and/or polymer-to-cidofovir interactions. PMID:23631539
Continuation of probability density functions using a generalized Lyapunov approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baars, S., E-mail: s.baars@rug.nl; Viebahn, J.P., E-mail: viebahn@cwi.nl; Mulder, T.E., E-mail: t.e.mulder@uu.nl
Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation. Key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.
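Under the small-noise approximation near a stable fixed point, the stationary probability density of dx = A x dt + B dW is Gaussian with covariance C solving the Lyapunov equation A C + C Aᵀ + B Bᵀ = 0. A minimal dense-matrix sketch of that step is below; the paper's contribution is the low-rank iterative solver needed when A comes from a large PDE discretisation, which this toy example does not attempt:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-1.0, 0.5],
              [ 0.0, -2.0]])      # stable Jacobian at the fixed point (illustrative)
B = np.array([[0.3], [0.1]])      # noise input matrix (illustrative)

# Stationary covariance C of the linearised SDE: A C + C A^T + B B^T = 0
C = solve_continuous_lyapunov(A, -B @ B.T)
print(C)  # covariance of the approximate Gaussian probability density near the fixed point
```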
Burst wait time simulation of CALIBAN reactor at delayed super-critical state
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humbert, P.; Authier, N.; Richard, B.
2012-07-01
In the past, the super prompt critical wait time probability distribution was measured on the CALIBAN fast burst reactor [4]. Afterwards, these experiments were simulated with a very good agreement by solving the non-extinction probability equation [5]. Recently, the burst wait time probability distribution has been measured at CEA-Valduc on CALIBAN at different delayed super-critical states [6]. However, in the delayed super-critical case the non-extinction probability does not give access to the wait time distribution. In this case it is necessary to compute the time dependent evolution of the full neutron count number probability distribution. In this paper we present the point model deterministic method used to calculate the probability distribution of the wait time before a prescribed count level, taking into account prompt neutrons and delayed neutron precursors. This method is based on the solution of the time dependent adjoint Kolmogorov master equations for the number of detections using the generating function methodology [8,9,10] and inverse discrete Fourier transforms. The obtained results are then compared to the measurements and Monte-Carlo calculations based on the algorithm presented in [7].
Harvesting model uncertainty for the simulation of interannual variability
NASA Astrophysics Data System (ADS)
Misra, Vasubandhu
2009-08-01
An innovative modeling strategy is introduced to account for uncertainty in the convective parameterization (CP) scheme of a coupled ocean-atmosphere model. The methodology involves calling the CP scheme several times at every given time step of the model integration to pick the most probable convective state. Each call of the CP scheme is unique in that one of its critical parameter values (which is unobserved but required by the scheme) is chosen randomly over a given range. This methodology is tested with the relaxed Arakawa-Schubert CP scheme in the Center for Ocean-Land-Atmosphere Studies (COLA) coupled general circulation model (CGCM). Relative to the control COLA CGCM, this methodology shows improvement in the El Niño-Southern Oscillation simulation and the Indian summer monsoon precipitation variability.
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
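A minimal sketch contrasting the two design philosophies mentioned above: a deterministic safety factor versus a Monte Carlo estimate of the probability of failure P(stress > strength). The load and strength distributions are illustrative assumptions, not values from the report:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
stress = rng.normal(300.0, 40.0, n)                        # MPa, illustrative load scatter
strength = rng.lognormal(mean=6.05, sigma=0.08, size=n)    # MPa, illustrative material scatter

pf = np.mean(stress > strength)                            # Monte Carlo probability of failure
print(f"P_f ≈ {pf:.2e}  (deterministic safety factor ≈ {np.exp(6.05) / 300:.2f})")
```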
A decision model for planetary missions
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.; Brigadier, W. L.
1976-01-01
Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission (science), the cost of flying the mission, and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and costs as a function of spacecraft and mission design is presented, and an example application of the decision methodology is given for various potential alternatives in a comet Encke mission.
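A minimal sketch of comparing mission alternatives by simulated expected utility, with science value, cost, failure probability and the concave utility function all chosen as hypothetical placeholders rather than the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(7)

def expected_utility(science_mean, science_sd, cost_mean, cost_sd, p_failure,
                     risk_aversion=1.5, n=100_000):
    """Monte Carlo expected utility of one mission alternative (all inputs hypothetical)."""
    science = rng.normal(science_mean, science_sd, n)
    cost = rng.normal(cost_mean, cost_sd, n)
    success = rng.random(n) > p_failure
    net = np.where(success, science, 0.0) - cost
    return np.mean(1.0 - np.exp(-net / risk_aversion))   # concave (risk-averse) utility

alternatives = {"flyby": (3.0, 0.5, 1.0, 0.1, 0.05),
                "rendezvous": (6.0, 1.5, 2.5, 0.4, 0.20)}
for name, params in alternatives.items():
    print(name, round(expected_utility(*params), 3))
```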
Transient Reliability of Ceramic Structures For Heat Engine Applications
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama M.
2002-01-01
The objective of this report was to develop a methodology to predict the time-dependent reliability (probability of failure) of brittle material components subjected to transient thermomechanical loading, taking into account the change in material response with time. This methodology for computing the transient reliability in ceramic components subjected to fluctuating thermomechanical loading was developed assuming SCG (slow crack growth) as the delayed mode of failure. It takes into account the effect of the Weibull modulus and material parameters varying with time. It was also coded into a beta version of NASA's CARES/Life code, and an example demonstrating its viability was presented.
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
Assessing agreement with relative area under the coverage probability curve.
Barnhart, Huiman X
2016-08-15
There has been substantial statistical literature in the last several decades on assessing agreement, and the coverage probability approach was selected as a preferred index for assessing and improving measurement agreement in a core laboratory setting. With this approach, a satisfactory agreement is based on a pre-specified high satisfactory coverage probability (e.g., 95%), given one pre-specified acceptable difference. In practice, we may want to have quality control on more than one pre-specified difference, or we may simply want to summarize the agreement based on differences up to a maximum acceptable difference. We propose to assess agreement via the coverage probability curve that provides a full spectrum of measurement error at various differences/disagreement. The relative area under the coverage probability curve is proposed for the summary of overall agreement, and this new summary index can be used for comparison of different intra-method or inter-method/lab/observer agreement. Simulation studies and a blood pressure example are used for illustration of the methodology. Copyright © 2016 John Wiley & Sons, Ltd.
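A minimal sketch of the proposed index, assuming paired measurements from two methods: estimate CP(d) = P(|difference| ≤ d) over a grid up to a maximum acceptable difference, then take the relative area under that curve. The blood-pressure-like data are synthetic, and the maximum difference of 10 is an assumed choice:

```python
import numpy as np

def coverage_probability_curve(x, y, d_max, n_grid=100):
    """CP(d) = P(|x - y| <= d) for paired measurements, for d up to a maximum acceptable difference."""
    diffs = np.abs(np.asarray(x) - np.asarray(y))
    d = np.linspace(0.0, d_max, n_grid)
    cp = np.array([(diffs <= di).mean() for di in d])
    return d, cp

def relative_area(d, cp):
    """Relative area under the CP curve; 1.0 indicates perfect agreement up to d_max."""
    area = np.sum((cp[1:] + cp[:-1]) / 2 * np.diff(d))    # trapezoidal rule
    return area / (d[-1] - d[0])

rng = np.random.default_rng(3)
reference = rng.normal(120, 15, 500)                 # e.g., reference blood pressure readings
device = reference + rng.normal(0, 4, 500)           # second method with measurement error
d, cp = coverage_probability_curve(reference, device, d_max=10.0)
print("relative area under the CP curve:", round(relative_area(d, cp), 3))
```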
Probability concepts in quality risk management.
Claycamp, H Gregg
2012-01-01
Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based (frequency-based) measures of probability and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.
Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Mital, Subodh K.; Shah, Ashwin R.
1997-01-01
The properties of ceramic matrix composites (CMC's) are known to display a considerable amount of scatter due to variations in fiber/matrix properties, interphase properties, interphase bonding, amount of matrix voids, and many geometry- or fabrication-related parameters, such as ply thickness and ply orientation. This paper summarizes preliminary studies in which formal probabilistic descriptions of the material-behavior- and fabrication-related parameters were incorporated into micromechanics and macromechanics for CMC'S. In this process two existing methodologies, namely CMC micromechanics and macromechanics analysis and a fast probability integration (FPI) technique are synergistically coupled to obtain the probabilistic composite behavior or response. Preliminary results in the form of cumulative probability distributions and information on the probability sensitivities of the response to primitive variables for a unidirectional silicon carbide/reaction-bonded silicon nitride (SiC/RBSN) CMC are presented. The cumulative distribution functions are computed for composite moduli, thermal expansion coefficients, thermal conductivities, and longitudinal tensile strength at room temperature. The variations in the constituent properties that directly affect these composite properties are accounted for via assumed probabilistic distributions. Collectively, the results show that the present technique provides valuable information about the composite properties and sensitivity factors, which is useful to design or test engineers. Furthermore, the present methodology is computationally more efficient than a standard Monte-Carlo simulation technique; and the agreement between the two solutions is excellent, as shown via select examples.
HMM for hyperspectral spectrum representation and classification with endmember entropy vectors
NASA Astrophysics Data System (ADS)
Arabi, Samir Y. W.; Fernandes, David; Pizarro, Marco A.
2015-10-01
Hyperspectral images, owing to their good spectral resolution, are extensively used for classification, but their high number of bands requires a higher bandwidth for data transmission, a higher data storage capability and a higher computational capability in processing systems. This work presents a new methodology for hyperspectral data classification that can work with a reduced number of spectral bands and achieve good results, comparable with processing methods that require all hyperspectral bands. The proposed method for hyperspectral spectra classification is based on the Hidden Markov Model (HMM) associated with each endmember (EM) of a scene and the conditional probabilities that each EM belongs to each other EM. The EM conditional probabilities are transformed into an EM entropy vector, and those vectors are used as reference vectors for the classes in the scene. The conditional probabilities of a spectrum to be classified are also transformed into a spectrum entropy vector, which is assigned to a given class by the minimum Euclidean distance (ED) between it and the EM entropy vectors. The methodology was tested with good results using AVIRIS spectra of a scene with 13 EMs, considering the full 209 bands and reduced sets of 128, 64 and 32 spectral bands. For the test area it is shown that only 32 spectral bands can be used instead of the original 209 bands, without significant loss in the classification process.
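A simplified sketch of the final classification stage only: turn each class's conditional-probability matrix into an entropy vector and assign a spectrum to the class with the minimum Euclidean distance. The random Dirichlet probabilities stand in for the HMM-derived conditional probabilities, which are not reproduced here:

```python
import numpy as np

def entropy_vector(cond_probs):
    """Shannon entropy of each row of a conditional-probability matrix (one row per endmember)."""
    p = np.clip(np.asarray(cond_probs, float), 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=1)

rng = np.random.default_rng(5)
n_em = 13
# Stand-in for the per-class conditional probabilities of each EM given each other EM (rows sum to 1).
ref = rng.dirichlet(np.ones(n_em), size=(n_em, n_em))       # one matrix per endmember class
ref_vectors = np.array([entropy_vector(m) for m in ref])

# Spectrum to classify: its own endmember-conditional probabilities -> entropy vector.
spectrum = rng.dirichlet(np.ones(n_em), size=n_em)
v = entropy_vector(spectrum)

label = int(np.argmin(np.linalg.norm(ref_vectors - v, axis=1)))  # minimum Euclidean distance
print("assigned class:", label)
```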
2018-03-01
We apply our methodology to the criticism text written in the flight-training program student evaluations in order to construct a model that ... factors.
1978-03-01
... for the risk of rupture for a unidirectionally laminated composite subjected to pure bending. This equation can be simplified further by use of ... Evaluation of the Three Parameter Weibull Distribution Function for Predicting Fracture Probability in Composite Materials. Thesis, AFIT/GAE ...
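For reference, the cumulative form of the three-parameter Weibull distribution named in the title gives the fracture probability as P_f = 1 − exp(−((σ − σ_u)/σ_0)^m) for σ above the location (threshold) stress σ_u. A minimal sketch with illustrative parameter values:

```python
import numpy as np

def fracture_probability(stress, location, scale, shape):
    """Three-parameter Weibull CDF: P_f = 1 - exp(-((s - s_u)/s_0)^m) for s > s_u, else 0."""
    s = np.asarray(stress, dtype=float)
    z = np.clip((s - location) / scale, 0.0, None)
    return 1.0 - np.exp(-z**shape)

# Illustrative parameters (MPa): threshold stress s_u, scale s_0, and Weibull modulus m.
print(fracture_probability([300, 400, 500], location=250.0, scale=180.0, shape=8.0))
```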
NASA Astrophysics Data System (ADS)
Sciazko, Anna; Komatsu, Yosuke; Brus, Grzegorz; Kimijima, Shinji; Szmyd, Janusz S.
2014-09-01
For a mathematical model based on the results of physical measurements, it is possible to determine their influence on the final solution and its accuracy. However, in classical approaches, the influence of different model simplifications on the reliability of the obtained results is usually not comprehensively discussed. This paper presents a novel approach to the study of methane/steam reforming kinetics based on an advanced methodology called the Orthogonal Least Squares method. Previously published kinetics of the reforming process are mutually divergent. To obtain the most probable values of the kinetic parameters and enable direct and objective model verification, an appropriate calculation procedure needs to be proposed. The applied Generalized Least Squares (GLS) method includes all the experimental results in the mathematical model, which becomes internally contradictory, as the number of equations is greater than the number of unknown variables. The GLS method is adopted to select the most probable values of the results and simultaneously determine the uncertainty coupled with all the variables in the system. In this paper, the reaction rate was evaluated after a pre-determination of the reaction rate made by preliminary calculations based on experimental results obtained over a nickel/yttria-stabilized zirconia catalyst.
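A minimal sketch of how a generalized least-squares fit propagates measurement uncertainty into both the parameter estimates and their covariance; this is a basic linear GLS with a toy Arrhenius-type model and hypothetical data, not the paper's full orthogonal (implicit-constraint) formulation:

```python
import numpy as np

def gls_fit(X, y, cov_y):
    """Generalized least squares: beta = (X^T W X)^-1 X^T W y with W = cov_y^-1,
    returning the estimates and their covariance (X^T W X)^-1."""
    W = np.linalg.inv(cov_y)
    cov_beta = np.linalg.inv(X.T @ W @ X)
    beta = cov_beta @ (X.T @ W @ y)
    return beta, cov_beta

# Toy Arrhenius-type fit: ln k = ln A - E/(R T), with unequal measurement variances (hypothetical).
R = 8.314
T = np.array([873.0, 923.0, 973.0, 1023.0])
lnk = np.array([-2.9, -2.2, -1.6, -1.1])                 # hypothetical measured rate constants
X = np.column_stack([np.ones_like(T), -1.0 / (R * T)])
cov_y = np.diag([0.04, 0.02, 0.02, 0.05])                # hypothetical measurement variances
beta, cov_beta = gls_fit(X, lnk, cov_y)
print("ln A, E [J/mol]:", beta)
print("std. uncertainties:", np.sqrt(np.diag(cov_beta)))
```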
NASA Astrophysics Data System (ADS)
Ismaila, Aminu; Md Kasmani, Rafiziana; Meng-Hock, Koh; Termizi Ramli, Ahmad
2017-10-01
This paper deals with the assessment of an external explosion resulting from an accidental release of jet fuel from a large commercial airliner at a nuclear power plant (NPP). The study used three widely used prediction methods, the Trinitrotoluene equivalency (TNT), multi-energy (TNO) and Baker-Strehlow (BST) methods, to determine the unconfined vapour cloud explosion (UVCE) overpressure within distances of 100-1400 m from the first impact location. The containment building was taken as the reference position. Fatalities and structural damage were estimated using the probit methodology. Analysis of the results shows that both the reactor building and the control room would be heavily damaged, with risk consequences and probabilities depending on the assumed position of the crash. Structures at a radial distance of 600 m may suffer major structural damage with probabilities ranging from 25 to 100%. Minor structural damage was observed throughout the bounds of the plant complex. People working within a 250 m radius may be affected, with fatality probabilities ranging from 28 to 100%. The findings of this study are valuable for evaluating the safety improvements needed on the NPP site and the risks and consequences associated with hydrocarbon fuel releases/fires due to external hazards.
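A minimal sketch of the probit methodology referred to above: a probit value Y is computed from the blast load (here peak overpressure) and converted to a probability via the standard normal CDF, P = Φ(Y − 5). The coefficients below are placeholders for one damage category, not values from the study:

```python
from math import log
from scipy.stats import norm

def probit_probability(overpressure_pa, a, b):
    """Probit vulnerability model: Y = a + b*ln(P); probability = Phi(Y - 5)."""
    y = a + b * log(overpressure_pa)
    return norm.cdf(y - 5.0)

# Placeholder probit coefficients for one damage category (illustrative only).
for p in (5_000.0, 20_000.0, 50_000.0):   # peak overpressure in Pa
    print(p, round(probit_probability(p, a=-15.0, b=2.0), 3))
```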
Yoshioka, Akio; Fukuzawa, Kaori; Mochizuki, Yuji; Yamashita, Katsumi; Nakano, Tatsuya; Okiyama, Yoshio; Nobusawa, Eri; Nakajima, Katsuhisa; Tanaka, Shigenori
2011-09-01
Ab initio electronic-state calculations for influenza virus hemagglutinin (HA) trimer complexed with Fab antibody were performed on the basis of the fragment molecular orbital (FMO) method at the second and third-order Møller-Plesset (MP2 and MP3) perturbation levels. For the protein complex containing 2351 residues and 36,160 atoms, the inter-fragment interaction energies (IFIEs) were evaluated to illustrate the effective interactions between all the pairs of amino acid residues. By analyzing the calculated data on the IFIEs, we first discussed the interactions and their fluctuations between multiple domains contained in the trimer complex. Next, by combining the IFIE data between the Fab antibody and each residue in the HA antigen with experimental data on the hemadsorption activity of HA mutants, we proposed a protocol to predict probable mutations in HA. The proposed protocol based on the FMO-MP2.5 calculation can explain the historical facts concerning the actual mutations after the emergence of A/Hong Kong/1/68 influenza virus with subtype H3N2, and thus provides a useful methodology to enumerate those residue sites likely to mutate in the future. Copyright © 2011 Elsevier Inc. All rights reserved.
Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi
2016-10-07
Climate change significantly impacts vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of the Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that in fact there is no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine the vegetation-related drought risk over China from a perspective based on joint probability. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate vegetation-related drought risk and develop drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied to vegetation-related drought risk assessment in other regions worldwide.
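A minimal empirical sketch of the conditional probability at the core of such a joint model: P(low NDVI | low precipitation) versus P(low NDVI | high precipitation), estimated by counting on synthetic paired growing-season series. The data, the 20th/80th percentile scenario thresholds, and the use of simple empirical counts (rather than a fitted parametric joint distribution or copula) are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 500
precip = rng.gamma(shape=4.0, scale=25.0, size=n)               # synthetic growing-season rainfall (mm)
ndvi = 0.3 + 0.002 * precip + rng.normal(0.0, 0.05, n)          # synthetic vegetation response

low_ndvi = ndvi < np.percentile(ndvi, 20)        # "vegetation-related drought" threshold
dry = precip < np.percentile(precip, 20)         # dry scenario
wet = precip > np.percentile(precip, 80)         # wet scenario

p_given_dry = low_ndvi[dry].mean()               # P(vegetation drought | dry)
p_given_wet = low_ndvi[wet].mean()               # P(vegetation drought | wet)
print("P(drought | dry):", round(p_given_dry, 2), " P(drought | wet):", round(p_given_wet, 2))
```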
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.
The United States Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).
Forster, Jeri E.; MaWhinney, Samantha; Ball, Erika L.; Fairclough, Diane
2011-01-01
Dropout is common in longitudinal clinical trials and when the probability of dropout depends on unobserved outcomes even after conditioning on available data, it is considered missing not at random and therefore nonignorable. To address this problem, mixture models can be used to account for the relationship between a longitudinal outcome and dropout. We propose a Natural Spline Varying-coefficient mixture model (NSV), which is a straightforward extension of the parametric Conditional Linear Model (CLM). We assume that the outcome follows a varying-coefficient model conditional on a continuous dropout distribution. Natural cubic B-splines are used to allow the regression coefficients to semiparametrically depend on dropout and inference is therefore more robust. Additionally, this method is computationally stable and relatively simple to implement. We conduct simulation studies to evaluate performance and compare methodologies in settings where the longitudinal trajectories are linear and dropout time is observed for all individuals. Performance is assessed under conditions where model assumptions are both met and violated. In addition, we compare the NSV to the CLM and a standard random-effects model using an HIV/AIDS clinical trial with probable nonignorable dropout. The simulation studies suggest that the NSV is an improvement over the CLM when dropout has a nonlinear dependence on the outcome. PMID:22101223
Journal: A Review of Some Tracer-Test Design Equations for ...
Determination of necessary tracer mass, initial sample-collection time, and subsequent sample-collection frequency are the three most difficult aspects to estimate for a proposed tracer test prior to conducting the tracer test. To facilitate tracer-mass estimation, 33 mass-estimation equations are reviewed here, 32 of which were evaluated using previously published tracer-test design examination parameters. Comparison of the results produced a wide range of estimated tracer mass, but no means is available by which one equation may be reasonably selected over the others. Each equation produces a simple approximation for tracer mass. Most of the equations are based primarily on estimates or measurements of discharge, transport distance, and suspected transport times. Although the basic field parameters commonly employed are appropriate for estimating tracer mass, the 33 equations are problematic in that they were all probably based on the original developers' experience in a particular field area and not necessarily on measured hydraulic parameters or solute-transport theory. Suggested sampling frequencies are typically based primarily on probable transport distance, but with little regard to expected travel times. This too is problematic in that it tends to result in false negatives or data aliasing. Simulations from the recently developed efficient hydrologic tracer-test design methodology (EHTD) were compared with those obtained from 32 of the 33 published tracer-
Improving detection probabilities for pests in stored grain.
Elmouttie, David; Kiermeier, Andreas; Hamilton, Grant
2010-12-01
The presence of insects in stored grain is a significant problem for grain farmers, bulk grain handlers and distributors worldwide. Inspection of bulk grain commodities is essential to detect pests and thereby to reduce the risk of their presence in exported goods. It has been well documented that insect pests cluster in response to factors such as microclimatic conditions within bulk grain. Statistical sampling methodologies for grain, however, have typically considered pests and pathogens to be homogeneously distributed throughout grain commodities. In this paper, a sampling methodology is demonstrated that accounts for the heterogeneous distribution of insects in bulk grain. It is shown that failure to account for the heterogeneous distribution of pests may lead to overestimates of the capacity for a sampling programme to detect insects in bulk grain. The results indicate the importance of the proportion of grain that is infested in addition to the density of pests within the infested grain. It is also demonstrated that the probability of detecting pests in bulk grain increases as the number of subsamples increases, even when the total volume or mass of grain sampled remains constant. This study underlines the importance of considering an appropriate biological model when developing sampling methodologies for insect pests. Accounting for a heterogeneous distribution of pests leads to a considerable improvement in the detection of pests over traditional sampling models. Copyright © 2010 Society of Chemical Industry.
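A minimal sketch of the point being made: compare the probability of detecting at least one insect when the same overall density is concentrated in a small infested fraction of the grain versus spread homogeneously. The two-stage model (subsample hits an infested pocket with some probability, then Poisson counts within it) and all parameter values are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def detection_prob_heterogeneous(n_subsamples, frac_infested, insects_per_kg, kg_per_subsample):
    """Each subsample hits an infested pocket with probability frac_infested; if it does,
    insect counts are Poisson with mean insects_per_kg * kg_per_subsample."""
    p_hit = 1.0 - np.exp(-insects_per_kg * kg_per_subsample)
    p_per_subsample = frac_infested * p_hit
    return 1.0 - (1.0 - p_per_subsample) ** n_subsamples

def detection_prob_homogeneous(n_subsamples, mean_density, kg_per_subsample):
    """Same total infestation assumed spread evenly through the bulk."""
    return 1.0 - np.exp(-mean_density * kg_per_subsample * n_subsamples)

# Same overall density (0.1 insects/kg) concentrated in 5% of the grain vs spread evenly.
print("heterogeneous:", detection_prob_heterogeneous(10, 0.05, 2.0, 1.0))
print("homogeneous:  ", detection_prob_homogeneous(10, 0.1, 1.0))
```

With these illustrative numbers the homogeneous assumption suggests a much higher detection probability than the clustered case, which is the overestimation the abstract warns about.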
Tennessee long-range transportation plan : project evaluation system
DOT National Transportation Integrated Search
2005-12-01
The Project Evaluation System (PES) Report is an analytical methodology to aid programming efforts and prioritize multimodal investments. The methodology consists of both quantitative and qualitative evaluation criteria built upon the Guiding Princip...
[Imaging techniques for studying functional recovery following a stroke: I. Methodological aspects].
Ramos-Cabrer, P; Agulla, J; Argibay, B; Brea, D; Campos, F; Castillo, J
2011-03-16
Many patients who survive a stroke have to face serious functional disabilities for the rest of their lives, which is a personal drama for them and their relatives and a heavy burden for society. Functional recovery following stroke should therefore be a key objective in the development of new therapeutic approaches. In this series of two works we review the strategies and tools available nowadays for the evaluation of multiple aspects related to brain function (both in humans and in research animals), and how they are helping neuroscientists to better understand the processes of restoration and reorganization of brain function that are triggered following stroke. We have mainly focused on magnetic resonance applications, probably the most versatile neuroimaging technique available nowadays, which every day surprises us with new and exciting applications. However, we also address other alternative and complementary techniques, since a multidisciplinary approach allows a wider perspective on the underlying mechanisms behind tissue repair, plastic reorganization of the brain and the compensatory mechanisms that are triggered after stroke. The first work of this series focuses on methodological aspects that help us to understand how it is possible to assess brain function on the basis of different physical and physiological principles. In the second work we will focus on practical issues related to the application of the techniques discussed here.
Crossing trend analysis methodology and application for Turkish rainfall records
NASA Astrophysics Data System (ADS)
Şen, Zekâi
2018-01-01
Trend analyses are the necessary tools for depicting possible general increase or decrease in a given time series. There are many versions of trend identification methodologies such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, regression line, and Şen's innovative trend analysis. The literature has many papers about the use, cons and pros, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend from the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has dependent or independent structure and also without any dependence on the type of the probability distribution function. The validity of this method is presented through extensive Monte Carlo simulation technique and its comparison with other existing trend identification methodologies. The application of the methodology is presented for a set of annual daily extreme rainfall time series from different parts of Turkey and they have physically independent structure.
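A minimal sketch of one reading of the crossing criterion: among candidate trend lines through the centroid of the series, keep the slope whose detrended residuals cross zero the greatest number of times. The slope grid, the synthetic rainfall-anomaly series and the zero-crossing count are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def crossing_trend_slope(y, slopes=None):
    """Pick the trend through the series centroid whose detrended series has the
    maximum number of up-/down-crossings (a simplified reading of the crossing criterion)."""
    y = np.asarray(y, float)
    t = np.arange(len(y))
    tc, yc = t.mean(), y.mean()                          # centroid of the time series
    if slopes is None:
        slopes = np.linspace(-1.0, 1.0, 2001) * (y.max() - y.min()) / max(len(y), 1)
    best_slope, best_crossings = 0.0, -1
    for s in slopes:
        residual = y - (yc + s * (t - tc))
        crossings = np.count_nonzero(np.diff(np.sign(residual)) != 0)
        if crossings > best_crossings:
            best_slope, best_crossings = s, crossings
    return best_slope, best_crossings

rng = np.random.default_rng(2)
series = 0.02 * np.arange(200) + rng.normal(0, 1, 200)   # synthetic annual rainfall anomaly with a trend
print(crossing_trend_slope(series))
```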
Huter, Kai; Dubas-Jakóbczyk, Katarzyna; Kocot, Ewa; Kissimova-Skarbek, Katarzyna; Rothgang, Heinz
2018-01-01
In the light of demographic developments health promotion interventions for older people are gaining importance. In addition to methodological challenges arising from the economic evaluation of health promotion interventions in general, there are specific methodological problems for the particular target group of older people. There are especially four main methodological challenges that are discussed in the literature. They concern measurement and valuation of informal caregiving, accounting for productivity costs, effects of unrelated cost in added life years and the inclusion of 'beyond-health' benefits. This paper focuses on the question whether and to what extent specific methodological requirements are actually met in applied health economic evaluations. Following a systematic review of pertinent health economic evaluations, the included studies are analysed on the basis of four assessment criteria that are derived from methodological debates on the economic evaluation of health promotion interventions in general and economic evaluations targeting older people in particular. Of the 37 studies included in the systematic review, only very few include cost and outcome categories discussed as being of specific relevance to the assessment of health promotion interventions for older people. The few studies that consider these aspects use very heterogeneous methods, thus there is no common methodological standard. There is a strong need for the development of guidelines to achieve better comparability and to include cost categories and outcomes that are relevant for older people. Disregarding these methodological obstacles could implicitly lead to discrimination against the elderly in terms of health promotion and disease prevention and, hence, an age-based rationing of public health care.
Communicating the Threat of a Tropical Cyclone to the Eastern Range
NASA Technical Reports Server (NTRS)
Winters, Katherine A.; Roeder, William P.; McAleenan, Mike; Belson, Brian L.; Shafer, Jaclyn A.
2012-01-01
The 45th Weather Squadron (45 WS) has developed a tool to help visualize the Wind Speed Probability product from the National Hurricane Center (NHC) and to help communicate that information to space launch customers and decision makers at the 45th Space Wing (45 SW) and Kennedy Space Center (KSC) located in east central Florida. This paper reviews previous work and presents the new visualization tool, including initial feedback as well as the pros and cons. The NHC began issuing their Wind Speed Probability product for tropical cyclones publicly in 2006. The 45 WS uses this product to provide a threat assessment to 45 SW and KSC leadership for risk evaluations with an approaching tropical cyclone. Although the wind speed probabilities convey the uncertainty of a tropical cyclone well, communicating this information to customers is a challenge. The 45 WS continually strives to provide the wind speed probability information to customers in a context which clearly communicates the threat of a tropical cyclone. First, an intern from the Florida Institute of Technology (FIT) Atmospheric Sciences department, sponsored by Scitor Corporation, independently evaluated the NHC wind speed probability product. This work was later extended into a M.S. thesis at FIT, partially funded by Scitor Corporation and KSC. A second thesis at FIT further extended the evaluation partially funded by KSC. Using this analysis, the 45 WS categorized the probabilities into five probability interpretation categories: Very Low, Low, Moderate, High, and Very High. These probability interpretation categories convert the forecast probability and forecast interval into easily understood categories that are consistent across all ranges of probabilities and forecast intervals. As a follow-on project, KSC funded a summer intern to evaluate the human factors of the probability interpretation categories, which ultimately refined some of the thresholds. The 45 WS created a visualization tool to express the timing and risk for multiple locations in a single graphic. Preliminary results on an on-going project by FIT will be included in this paper. This project is developing a new method of assigning the probability interpretation categories and updating the evaluation of the performance of the NHC wind speed probability analysis.
ERIC Educational Resources Information Center
Nyabero, Charles
2016-01-01
The purpose of this article was to explore how course evaluation, the decision-making process, the methodology of evaluation and the various roles of evaluation interact in the process of curriculum development. In the process of this exploration, the characteristics of the types of evaluation, purposes of course evaluation, methodology of evaluation,…
Kramer, Andrew A; Higgins, Thomas L; Zimmerman, Jack E
2014-03-01
To examine the accuracy of the original Mortality Probability Admission Model III, ICU Outcomes Model/National Quality Forum modification of Mortality Probability Admission Model III, and Acute Physiology and Chronic Health Evaluation IVa models for comparing observed and risk-adjusted hospital mortality predictions. Retrospective paired analyses of day 1 hospital mortality predictions using three prognostic models. Fifty-five ICUs at 38 U.S. hospitals from January 2008 to December 2012. Among 174,001 intensive care admissions, 109,926 met model inclusion criteria and 55,304 had data for mortality prediction using all three models. None. We compared patient exclusions and the discrimination, calibration, and accuracy for each model. Acute Physiology and Chronic Health Evaluation IVa excluded 10.7% of all patients, ICU Outcomes Model/National Quality Forum 20.1%, and Mortality Probability Admission Model III 24.1%. Discrimination of Acute Physiology and Chronic Health Evaluation IVa was superior with area under receiver operating curve (0.88) compared with Mortality Probability Admission Model III (0.81) and ICU Outcomes Model/National Quality Forum (0.80). Acute Physiology and Chronic Health Evaluation IVa was better calibrated (lowest Hosmer-Lemeshow statistic). The accuracy of Acute Physiology and Chronic Health Evaluation IVa was superior (adjusted Brier score = 31.0%) to that for Mortality Probability Admission Model III (16.1%) and ICU Outcomes Model/National Quality Forum (17.8%). Compared with observed mortality, Acute Physiology and Chronic Health Evaluation IVa overpredicted mortality by 1.5% and Mortality Probability Admission Model III by 3.1%; ICU Outcomes Model/National Quality Forum underpredicted mortality by 1.2%. Calibration curves showed that Acute Physiology and Chronic Health Evaluation performed well over the entire risk range, unlike the Mortality Probability Admission Model and ICU Outcomes Model/National Quality Forum models. Acute Physiology and Chronic Health Evaluation IVa had better accuracy within patient subgroups and for specific admission diagnoses. Acute Physiology and Chronic Health Evaluation IVa offered the best discrimination and calibration on a large common dataset and excluded fewer patients than Mortality Probability Admission Model III or ICU Outcomes Model/National Quality Forum. The choice of ICU performance benchmarks should be based on a comparison of model accuracy using data for identical patients.
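A minimal sketch of the three comparison metrics named above, on synthetic predictions: discrimination via the area under the ROC curve, accuracy via the Brier score, and calibration via a Hosmer-Lemeshow-style table of observed versus expected deaths by risk decile. The data and decile binning are illustrative, not the study's cohort:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(9)
n = 5000
risk = rng.beta(1.5, 8.0, n)               # synthetic predicted day-1 mortality probabilities
died = rng.random(n) < risk                # synthetic observed outcomes

print("AUC:  ", round(roc_auc_score(died, risk), 3))
print("Brier:", round(brier_score_loss(died, risk), 3))

# Hosmer-Lemeshow-style check: observed vs expected deaths per risk decile.
deciles = np.quantile(risk, np.linspace(0, 1, 11))
bins = np.clip(np.digitize(risk, deciles[1:-1]), 0, 9)
for b in range(10):
    m = bins == b
    print(f"decile {b}: expected={risk[m].sum():7.1f}  observed={int(died[m].sum()):5d}")
```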
Pattie. Jr., Robert Wayne; Adamek, Evan Robert; Brenner, Thomas; ...
2017-08-10
We report on the evaluation of commercial electroless nickel phosphorus (NiP) coatings for ultracold neutron (UCN) transport and storage. The material potential of 50 μm thick NiP coatings on stainless steel and aluminum substrates was measured to be VF = 213(5.2) neV using the time-of-flight spectrometer ASTERIX at the Lujan Center. The loss per bounce probability was measured in pinhole bottling experiments carried out at ultracold neutron sources at the Los Alamos Neutron Science Center and the Institut Laue-Langevin. For these tests a new guide coupling design was used to minimize gaps between the guide sections. The observed UCN loss in the bottle was interpreted in terms of an energy independent effective loss per bounce, which is the appropriate model when gaps in the system and upscattering are the dominant loss mechanisms, yielding a loss per bounce of 1.3(1)×10⁻⁴. We also present a detailed discussion of the pinhole bottling methodology and an energy dependent analysis of the experimental results.
Baker, Mark
2012-01-01
Following a service evaluation methodology, this paper reports on registered nurses' (RNs) and healthcare assistants' (HCAs) perceptions about education and training requirements in order to work with people with complex neurological disabilities. A service evaluation was undertaken to meet the study aim using a non-probability, convenience method of sampling 368 nurses (n=110 RNs, n=258 HCAs) employed between October and November 2008 at one specialist hospital in south-west London in the U.K. The main results show that respondents were clear about the need to develop an education and training programme for RNs and HCAs working in this speciality area (91% of RNs and 94% of HCAs). A variety of topics were identified to be included within a work-based education and training programme, such as positively managing challenging behaviour, moving and handling, working with families. Adults with complex neurological needs have diverse needs and thus nurses working with this patient group require diverse education and training in order to deliver quality patient-focused nursing care. Copyright © 2011 Elsevier Ltd. All rights reserved.
Sun, F; Chen, J; Tong, Q; Zeng, S
2007-01-01
Management of drinking water safety is changing towards an integrated risk assessment and risk management approach that includes all processes in a water supply system from catchment to consumers. However, given the large number of water supply systems in China and the cost of implementing such a risk assessment procedure, there is a necessity to first conduct a strategic screening analysis at a national level. An integrated methodology of risk assessment and screening analysis is thus proposed to evaluate drinking water safety of a conventional water supply system. The violation probability, indicating drinking water safety, is estimated at different locations of a water supply system in terms of permanganate index, ammonia nitrogen, turbidity, residual chlorine and trihalomethanes. Critical parameters with respect to drinking water safety are then identified, based on which an index system is developed to prioritize conventional water supply systems in implementing a detailed risk assessment procedure. The evaluation results are represented as graphic check matrices for the concerned hazards in drinking water, from which the vulnerability of a conventional water supply system is characterized.
NASA Astrophysics Data System (ADS)
Pattie, R. W.; Adamek, E. R.; Brenner, T.; Brandt, A.; Broussard, L. J.; Callahan, N. B.; Clayton, S. M.; Cude-Woods, C.; Currie, S. A.; Geltenbort, P.; Ito, T. M.; Lauer, T.; Liu, C. Y.; Majewski, J.; Makela, M.; Masuda, Y.; Morris, C. L.; Ramsey, J. C.; Salvat, D. J.; Saunders, A.; Schroffenegger, J.; Tang, Z.; Wei, W.; Wang, Z.; Watkins, E.; Young, A. R.; Zeck, B. A.
2017-11-01
We report on the evaluation of commercial electroless nickel phosphorus (NiP) coatings for ultracold neutron (UCN) transport and storage. The material potential of 50 μm thick NiP coatings on stainless steel and aluminum substrates was measured to be VF = 213(5.2) neV using the time-of-flight spectrometer ASTERIX at the Lujan Center. The loss per bounce probability was measured in pinhole bottling experiments carried out at ultracold neutron sources at the Los Alamos Neutron Science Center and the Institut Laue-Langevin. For these tests a new guide coupling design was used to minimize gaps between the guide sections. The observed UCN loss in the bottle was interpreted in terms of an energy independent effective loss per bounce, which is the appropriate model when gaps in the system and upscattering are the dominant loss mechanisms, yielding a loss per bounce of 1.3(1) × 10⁻⁴. We also present a detailed discussion of the pinhole bottling methodology and an energy dependent analysis of the experimental results.
Density matters: Review of approaches to setting organism-based ballast water discharge standards
Lee II,; Frazier,; Ruiz,
2010-01-01
As part of their effort to develop national ballast water discharge standards under NPDES permitting, the Office of Water requested that WED scientists identify and review existing approaches to generating organism-based discharge standards for ballast water. Six potential approaches were identified, and the utility and uncertainties of each approach were evaluated. During the process of reviewing the existing approaches, the WED scientists, in conjunction with scientists at the USGS and the Smithsonian Institution, developed a new approach (per capita invasion probability, or "PCIP") that addresses many of the limitations of the previous methodologies. The PCIP approach allows risk managers to generate quantitative discharge standards using historical invasion rates, ballast water discharge volumes, and ballast water organism concentrations. The statistical power of sampling ballast water, both for the validation of ballast water treatment systems and for ship-board compliance monitoring, is limited with the existing methods, though it should be possible to obtain sufficient samples during treatment validation. The report will go to a National Academy of Sciences expert panel that will use it in their evaluation of approaches to developing ballast water discharge standards for the Office of Water.
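A back-of-envelope sketch of the per capita invasion probability idea as described above: divide the historical invasion count by the total number of organisms discharged over the record, then invert the relationship to find an allowable discharge concentration for a target invasion rate. All numbers below are hypothetical placeholders, not values from the report:

```python
# Per capita invasion probability (PCIP): a hypothetical back-of-envelope sketch.
historical_invasions = 12            # invasions attributed to ballast water over the record (hypothetical)
record_years = 30
discharge_m3_per_year = 5.0e7        # total ballast discharge into the region (hypothetical)
organisms_per_m3 = 1.0e4             # historical mean organism concentration (hypothetical)

organisms_discharged = record_years * discharge_m3_per_year * organisms_per_m3
pcip = historical_invasions / organisms_discharged
print(f"PCIP ≈ {pcip:.2e} invasions per organism discharged")

# Invert to find the concentration meeting a target of <= 0.01 new invasions per year:
target_rate = 0.01
allowed_concentration = target_rate / (pcip * discharge_m3_per_year)
print(f"allowed concentration ≈ {allowed_concentration:.2f} organisms per m^3")
```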
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pattie Jr., Robert Wayne; Adamek, Evan Robert; Brenner, Thomas
We report on the evaluation of commercial electroless nickel phosphorus (NiP) coatings for ultracold neutron (UCN) transport and storage. The material potential of 50 μm thick NiP coatings on stainless steel and aluminum substrates was measured to be VF = 213(5.2) neV using the time-of-flight spectrometer ASTERIX at the Lujan Center. The loss per bounce probability was measured in pinhole bottling experiments carried out at ultracold neutron sources at Los Alamos Neutron Science Center and the Institut Laue-Langevin. For these tests a new guide coupling design was used to minimize gaps between the guide sections. The observed UCN loss in the bottle was interpreted in terms of an energy-independent effective loss per bounce, which is the appropriate model when gaps in the system and upscattering are the dominant loss mechanisms, yielding a loss per bounce of 1.3(1) × 10^-4. In conclusion, we also present a detailed discussion of the pinhole bottling methodology and an energy-dependent analysis of the experimental results.
Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Chen, Yiping; Zhuang, Zhaowen; Cheng, Yongqiang; Deng, Bin; Wang, Liandong; Zeng, Yonghu; Gao, Lei
2014-01-01
This paper offers a compact mechanism to carry out the performance evaluation work for an automatic target recognition (ATR) system: (a) a standard description of the ATR system's output is suggested, a quantity to indicate the operating condition is presented based on the principle of feature extraction in pattern recognition, and a series of indexes to assess the output in different aspects are developed with the application of statistics; (b) performance of the ATR system is interpreted by a quality factor based on knowledge of engineering mathematics; (c) through a novel, probability-based utility called “context-probability” estimation, performance prediction for an ATR system is realized. The simulation result shows that the performance of an ATR system can be accounted for and forecasted by the above-mentioned measures. Compared to existing technologies, the novel method can offer more objective performance conclusions for an ATR system. These conclusions may be helpful in knowing the practical capability of the tested ATR system. At the same time, the generalization performance of the proposed method is good. PMID:24967605
Training Evaluation: An Analysis of the Stakeholders' Evaluation Needs
ERIC Educational Resources Information Center
Guerci, Marco; Vinante, Marco
2011-01-01
Purpose: In recent years, the literature on program evaluation has examined multi-stakeholder evaluation, but training evaluation models and practices have not generally taken this problem into account. The aim of this paper is to fill this gap. Design/methodology/approach: This study identifies intersections between methodologies and approaches…
2017-11-01
The Under-body Blast Methodology (UBM) for the Test and Evaluation (T&E) program was established to provide a capability for the US Army Test and Evaluation Command to assess the vulnerability of vehicles to under-body blast. Finite element (FE) models are part of the current UBM for T&E methodology.
Evaluation Methods Sourcebook.
ERIC Educational Resources Information Center
Love, Arnold J., Ed.
The chapters commissioned for this book describe key aspects of evaluation methodology as they are practiced in a Canadian context, providing representative illustrations of recent developments in evaluation methodology as it is currently applied. The following chapters are included: (1) "Program Evaluation with Limited Fiscal and Human…
Landslide Risk: Economic Valuation in The North-Eastern Zone of Medellin City
NASA Astrophysics Data System (ADS)
Vega, Johnny Alexander; Hidalgo, César Augusto; Johana Marín, Nini
2017-10-01
Natural disasters of a geodynamic nature can cause enormous economic and human losses. The economic costs of a landslide disaster include relocation of communities and physical repair of urban infrastructure. However, when performing a quantitative risk analysis, the indirect economic consequences of such an event are generally not taken into account. A probabilistic methodology that considers several scenarios of hazard and vulnerability to measure the magnitude of the landslide and to quantify the economic costs is proposed. With this approach, it is possible to carry out a quantitative evaluation of landslide risk, allowing the economic losses from a potential disaster to be calculated in an objective, standardized and reproducible way, taking into account the uncertainty of the building costs in the study zone. The possibility of comparing different scenarios facilitates the urban planning process, the optimization of interventions to reduce risk to acceptable levels and an assessment of economic losses according to the magnitude of the damage. For the development and explanation of the proposed methodology, a simple case study is presented, located in the north-eastern zone of the city of Medellín. This area has particular geomorphological characteristics and is also characterized by the presence of several buildings in poor structural condition. The proposed methodology makes it possible to estimate the probable economic losses from earthquake-induced landslides, taking into account the uncertainty of the building costs in the study zone. The estimate obtained shows that structural intervention of the buildings produces a reduction of the order of 21% in the total landslide risk.
Rational decision-making in mental health: the role of systematic reviews.
Gilbody, Simon M.; Petticrew, Mark
1999-09-01
BACKGROUND: "Systematic reviews" have come to be recognized as the most rigorous method of summarizing confusing and often contradictory primary research in a transparent and reproducible manner. Their greatest impact has been in the summarization of epidemiological literature - particularly that relating to clinical effectiveness. Systematic reviews also have a potential to inform rational decision-making in healthcare policy and to form a component of economic evaluation. AIMS OF THE STUDY: This article aims to introduce the rationale behind systematic reviews and, using examples from mental health, to introduce the strengths and limitations of systematic reviews, particularly in informing mental health policy and economic evaluation. METHODS: Examples are selected from recent controversies surrounding the introduction of new psychiatric drugs (anti-depressants and anti-schizophrenia drugs) and methods of delivering psychiatric care in the community (case management and assertive community treatment). The potential for systematic reviews to (i) produce best estimates of clinical efficacy and effectiveness, (ii) aid economic evaluation and policy decision-making and (iii) highlight gaps in the primary research knowledge base are discussed. Lastly examples are selected from outside mental health to show how systematic reviews have a potential to be explicitly used in economic and health policy evaluation. RESULTS: Systematic reviews produce the best estimates of clinical efficacy, which can form an important component of economic evaluation. Importantly, serious methodological flaws and areas of uncertainty in the primary research literature are identified within an explicit framework. Summary indices of clinical effectiveness can be produced, but it is difficult to produce such summary indices of cost effectiveness by pooling economic data from primary studies. Modelling is commonly used in economic and policy evaluation. Here, systematic reviews can provide the best estimates of effectiveness and, importantly, highlight areas of uncertainty that can be used in "sensitivity analysis". DISCUSSION: Systematic reviews are an important recent methodological advance, the potential for which has only begun to be realized in mental health. This use of systematic reviews is probably most advanced in producing critical summaries of clinical effectiveness data. Systematic reviews cannot produce valid and believable conclusions when the primary research literature is of poor quality. An important function of systematic reviews will be in highlighting this poor quality research which is of little use in mental health decision making. IMPLICATIONS FOR HEALTH PROVISION: Health care provision should be both clinically and cost effective. Systematic reviews are a key component in ensuring that this goal is achieved. IMPLICATIONS FOR HEALTH POLICIES: Systematic reviews have potential to inform health policy. Examples presented show that health policy is often made without due consideration of the research evidence. Systematic reviews can provide robust and believable answers, which can help inform rational decision-making. Importantly, systematic reviews can highlight the need for important primary research and can inform the design of this research such that it provides answers that will help in forming healthcare policy. IMPLICATIONS FOR FURTHER RESEARCH: Systematic reviews should precede costly (and often unnecessary) primary research. 
Many areas of health policy and practice have yet to be evaluated using systematic review methodology. Methods for the summarization of economic data are methodologically complex and deserve further research.
Guidelines for reporting evaluations based on observational methodology.
Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2015-01-01
Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.
NASA Astrophysics Data System (ADS)
Sayol, J. M.; Marcos, M.
2018-02-01
This study presents a novel methodology to estimate the impact of local sea level rise and extreme surges and waves in coastal areas under climate change scenarios. The methodology is applied to the Ebro Delta, a valuable and vulnerable low-lying wetland located in the northwestern Mediterranean Sea. Projections of local sea level accounting for all contributions to mean sea level changes, including thermal expansion, dynamic changes, fresh water addition and glacial isostatic adjustment, have been obtained from regionalized sea level projections during the 21st century. Particular attention has been paid to the uncertainties, which have been derived from the spread of the multi-model ensemble combined with seasonal/inter-annual sea level variability from local tide gauge observations. In addition, vertical land movements have also been integrated to estimate local relative sea level rise. On the other hand, regional projections over the Mediterranean basin of storm surges and wind-waves have been used to evaluate changes in extreme events. The compound effects of surges and extreme waves have been quantified using their joint probability distributions. Finally, offshore sea level projections from extreme events superimposed to mean sea level have been propagated onto a high resolution digital elevation model of the study region in order to construct flood hazard maps for mid and end of the 21st century and under two different climate change scenarios. The effect of each contribution has been evaluated in terms of percentage of the area exposed to coastal hazards, which will help to design more efficient protection and adaptation measures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.; Budnitz, Robert J.
If Carbon dioxide Capture and Storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines in order to transport many millions of tonnes of CO 2 annually, with the CO 2 delivered to many thousands of wells that will inject the CO 2 underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, from external forces such as impacts by vehicles or digging equipment, by defects in construction, or from the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The most important consequence of concern is CO 2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a methodology for analyzing the risks of these rare but high-consequence accidents, using a step-by-step probabilistic methodology. A key difference between risks for pipelines and wells is that the former are spatially distributed along the pipe whereas the latter are confined to the vicinity of the well. Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar and provides an analysis both of the annual probabilities of accident sequences of concern and of their consequences, and crucially the methodology provides insights into what measures might be taken to mitigate those accident sequences identified as of concern. Mitigating strategies could address reducing the likelihood of an accident sequence of concern, or reducing the consequences, or some combination. The methodology elucidates both local and integrated risks along the pipeline or at the well providing information useful to decision makers at various levels including local (e.g., property owners and town councils), regional (e.g., county and state representatives), and national levels (federal regulators and corporate proponents).
Methodological Review of Intimate Partner Violence Prevention Research
ERIC Educational Resources Information Center
Murray, Christine E.; Graybeal, Jennifer
2007-01-01
The authors present a methodological review of empirical program evaluation research in the area of intimate partner violence prevention. The authors adapted and utilized criterion-based rating forms to standardize the evaluation of the methodological strengths and weaknesses of each study. The findings indicate that the limited amount of…
Prospect evaluation as a function of numeracy and probability denominator.
Millroth, Philip; Juslin, Peter
2015-05-01
This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
Bottai, Matteo; Tjärnlund, Anna; Santoni, Giola; Werth, Victoria P; Pilkington, Clarissa; de Visser, Marianne; Alfredsson, Lars; Amato, Anthony A; Barohn, Richard J; Liang, Matthew H; Aggarwal, Rohit; Arnardottir, Snjolaug; Chinoy, Hector; Cooper, Robert G; Danko, Katalin; Dimachkie, Mazen M; Feldman, Brian M; García-De La Torre, Ignacio; Gordon, Patrick; Hayashi, Taichi; Katz, James D; Kohsaka, Hitoshi; Lachenbruch, Peter A; Lang, Bianca A; Li, Yuhui; Oddis, Chester V; Olesinka, Marzena; Reed, Ann M; Rutkowska-Sak, Lidia; Sanner, Helga; Selva-O’Callaghan, Albert; Wook Song, Yeong; Ytterberg, Steven R; Miller, Frederick W; Rider, Lisa G; Lundberg, Ingrid E; Amoruso, Maria
2017-01-01
Objective To describe the methodology used to develop new classification criteria for adult and juvenile idiopathic inflammatory myopathies (IIMs) and their major subgroups. Methods An international, multidisciplinary group of myositis experts produced a set of 93 potentially relevant variables to be tested for inclusion in the criteria. Rheumatology, dermatology, neurology and paediatric clinics worldwide collected data on 976 IIM cases (74% adults, 26% children) and 624 non-IIM comparator cases with mimicking conditions (82% adults, 18% children). The participating clinicians classified each case as IIM or non-IIM. Generally, the classification of any given patient was based on few variables, leaving the remaining variables unmeasured. We investigated the strength of the association between all variables and between these and the disease status as determined by the physician. We considered three approaches: (1) a probability-score approach, (2) a sum-of-items approach and (3) a classification-tree approach. Results The approaches yielded several candidate models that were scrutinised with respect to statistical performance and clinical relevance. The probability-score approach showed superior statistical performance and clinical practicability and was therefore preferred over the others. We developed a classification tree for subclassification of patients with IIM. A calculator for electronic devices, such as computers and smartphones, facilitates the use of the European League Against Rheumatism/American College of Rheumatology (EULAR/ACR) classification criteria. Conclusions The new EULAR/ACR classification criteria provide a patient’s probability of having IIM for use in clinical and research settings. The probability is based on a score obtained by summing the weights associated with a set of criteria items. PMID:29177080
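For intuition, a probability-score classifier of this kind can be sketched as a weighted sum of criteria items mapped to a probability through a logistic function; the item names, weights and intercept below are placeholders, not the published EULAR/ACR values:

```python
import math

# Hypothetical item weights (not the published EULAR/ACR weights).
WEIGHTS = {
    "age_onset_18_to_40": 1.3,
    "proximal_muscle_weakness": 0.9,
    "heliotrope_rash": 3.1,
    "elevated_ck": 1.4,
}
INTERCEPT = -5.3  # placeholder

def iim_probability(findings):
    """Map a set of present criteria items to a probability via a logistic score."""
    score = INTERCEPT + sum(WEIGHTS[item] for item in findings if item in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))

p = iim_probability({"proximal_muscle_weakness", "heliotrope_rash", "elevated_ck"})
print(f"probability of IIM ~ {p:.2f}")  # classify as IIM if above a chosen cut-off
```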
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castillo, H.
1982-01-01
The Government of Costa Rica has stated the need for a formal procedure for the evaluation and categorization of an environmental program. Methodological studies were prepared as the basis for the development of the general methodology by which each government or institution can adapt and implement the procedure. The methodology was established by using different techniques according to their contribution to the evaluation process, such as: Systemic Approach, Delphi, and Saaty Methods. The methodology consists of two main parts: 1) evaluation of the environmental aspects by using different techniques; 2) categorization of the environmental aspects by applying the methodology to Costa Rican environmental affairs using questionnaire answers supplied by experts both inside and outside of the country. The second part of the research includes Appendixes in which is presented general information concerning institutions related to environmental affairs; description of the methods used; results of the current status evaluation and its scale; the final scale of categorization; and the questionnaires and a list of experts. The methodology developed in this research will have a beneficial impact on environmental concerns in Costa Rica. As a result of this research, a Commission Office of Environmental Affairs, providing links between consumers, engineers, scientists, and the Government, is recommended. Also there is significant potential use of this methodology in developed countries for a better balancing of the budgets of major research programs such as cancer, heart, and other research areas.
Traumatic brain injury: methodological approaches to estimate health and economic outcomes.
Lu, Juan; Roe, Cecilie; Aas, Eline; Lapane, Kate L; Niemeier, Janet; Arango-Lasprilla, Juan Carlos; Andelic, Nada
2013-12-01
The effort to standardize the methodology and adherence to recommended principles for all economic evaluations has been emphasized in medical literature. The objective of this review is to examine whether economic evaluations in traumatic brain injury (TBI) research have been compliant with existing guidelines. Medline search was performed between January 1, 1995 and August 11, 2012. All original TBI-related full economic evaluations were included in the study. Two authors independently rated each study's methodology and data presentation to determine compliance to the 10 methodological principles recommended by Blackmore et al. Descriptive analysis was used to summarize the data. Inter-rater reliability was assessed with Kappa statistics. A total of 28 studies met the inclusion criteria. Eighteen of these studies described cost-effectiveness, seven cost-benefit, and three cost-utility analyses. The results showed a rapid growth in the number of published articles on the economic impact of TBI since 2000 and an improvement in their methodological quality. However, overall compliance with recommended methodological principles of TBI-related economic evaluation has been deficient. On average, about six of the 10 criteria were followed in these publications, and only two articles met all 10 criteria. These findings call for an increased awareness of the methodological standards that should be followed by investigators both in performance of economic evaluation and in reviews of evaluation reports prior to publication. The results also suggest that all economic evaluations should be made by following the guidelines within a conceptual framework, in order to facilitate evidence-based practices in the field of TBI.
Quantification of effective exoelectrogens by most probable number (MPN) in a microbial fuel cell.
Heidrich, Elizabeth S; Curtis, Thomas P; Woodcock, Stephen; Dolfing, Jan
2016-10-01
The objective of this work was to quantify the number of exoelectrogens in wastewater capable of producing current in a microbial fuel cell by adapting the classical most probable number (MPN) methodology using current production as end point. Inoculating a series of microbial fuel cells with various dilutions of domestic wastewater and with acetate as test substrate yielded an apparent number of exoelectrogens of 17 per ml. Using current as a proxy for activity, the apparent exoelectrogen growth rate was 0.03 h-1. With starch or wastewater as more complex test substrates similar apparent growth rates were obtained, but the apparent MPN-based numbers of exoelectrogens in wastewater were significantly lower, probably because in contrast to acetate, complex substrates require complex food chains to deliver the electrons to the electrodes. Consequently, the apparent MPN is a function of the combined probabilities of members of the food chain being present. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
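A minimal sketch of the underlying MPN calculation, assuming each replicate of inoculum volume v is positive with probability 1 - exp(-c·v) and estimating the concentration c by maximum likelihood over the dilution series; the volumes and outcomes below are illustrative, not the study's data:

```python
import math

def mpn_log_likelihood(conc, dilutions):
    """Log-likelihood of a concentration (organisms/ml) under a Poisson model:
    a replicate of volume v ml is positive with probability 1 - exp(-conc * v)."""
    ll = 0.0
    for volume_ml, n_positive, n_total in dilutions:
        p_pos = 1.0 - math.exp(-conc * volume_ml)
        p_pos = min(max(p_pos, 1e-12), 1 - 1e-12)
        ll += n_positive * math.log(p_pos) + (n_total - n_positive) * math.log(1 - p_pos)
    return ll

def most_probable_number(dilutions, lo=1e-3, hi=1e4, steps=20000):
    """Grid-search MLE of the concentration (simple but adequate for a sketch)."""
    grid = [lo * (hi / lo) ** (i / steps) for i in range(steps + 1)]
    return max(grid, key=lambda c: mpn_log_likelihood(c, dilutions))

# Illustrative dilution series: (inoculum volume in ml, positive cells, replicates).
series = [(1.0, 3, 3), (0.1, 2, 3), (0.01, 1, 3)]
print(f"MPN ~ {most_probable_number(series):.1f} organisms/ml")
```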
ERIC Educational Resources Information Center
Mosher, Paul H.
1979-01-01
Reviews the history, literature, and methodology of collection evaluation or assessment in American research libraries; discusses current problems, tools, and methodology of evaluation; and describes an ongoing collection evaluation program at the Stanford University Libraries. (Author/MBR)
Scheraga, H A; Paine, G H
1986-01-01
We are using a variety of theoretical and computational techniques to study protein structure, protein folding, and higher-order structures. Our earlier work involved treatments of liquid water and aqueous solutions of nonpolar and polar solutes, computations of the stabilities of the fundamental structures of proteins and their packing arrangements, conformations of small cyclic and open-chain peptides, structures of fibrous proteins (collagen), structures of homologous globular proteins, introduction of special procedures as constraints during energy minimization of globular proteins, and structures of enzyme-substrate complexes. Recently, we presented a new methodology for predicting polypeptide structure (described here); the method is based on the calculation of the probable and average conformation of a polypeptide chain by the application of equilibrium statistical mechanics in conjunction with an adaptive, importance sampling Monte Carlo algorithm. As a test, it was applied to Met-enkephalin.
Figures of Merit for Control Verification
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2008-01-01
This paper proposes a methodology for evaluating a controller's ability to satisfy a set of closed-loop specifications when the plant has an arbitrary functional dependency on uncertain parameters. Control verification metrics applicable to deterministic and probabilistic uncertainty models are proposed. These metrics, which result from sizing the largest uncertainty set of a given class for which the specifications are satisfied, enable systematic assessment of competing control alternatives regardless of the methods used to derive them. A particularly attractive feature of the tools derived is that their efficiency and accuracy do not depend on the robustness of the controller. This is in sharp contrast to Monte Carlo based methods where the number of simulations required to accurately approximate the failure probability grows exponentially with its closeness to zero. This framework allows for the integration of complex, high-fidelity simulations of the integrated system and only requires standard optimization algorithms for its implementation.
Commercialization of NESSUS: Status
NASA Technical Reports Server (NTRS)
Thacker, Ben H.; Millwater, Harry R.
1991-01-01
A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the on-going commercialization effort is to begin the transfer of Probabilistic Structural Analysis Method (PSAM) developed technology into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general purpose probabilistic finite element computer program using state of the art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next generation Space Shuttle Main Engine.
Contamination of planets by nonsterile flight hardware.
NASA Technical Reports Server (NTRS)
Wolfson, R. P.; Craven, C. W.
1971-01-01
The various factors about space missions and spacecraft involved in the study of nonsterile space flight hardware with respect to their effects on planetary quarantine are reviewed. It is shown that methodology currently exists to evaluate the various potential contamination sources and to take appropriate steps in the design of spacecraft hardware and mission parameters so that quarantine constraints are met. This work should be done for each program so that the latest knowledge pertaining to various biological questions is utilized, and so that the specific hardware designs of the program can be assessed. The specific recommendations include: (1) biasing the launch trajectory away from the planet to assure against accidental impact of the spacecraft; (2) selecting planetary orbits that meet quarantine requirements - both for accidental impact and for minimizing contamination probabilities due to ejecta; and (3) manufacturing and handling spacecraft under cleanliness conditions assuring minimum bioload.
Ocean outfalls as an alternative to minimizing risks to human and environmental health.
Feitosa, Renato Castiglia
2017-06-01
Submarine outfalls are proposed as an efficient alternative for the final destination of wastewater in densely populated coastal areas, due to the high dispersal capacity and the clearance of organic matter in the marine environment, and because they require small areas for implementation. This paper evaluates the probability of unsuitable bathing conditions in coastal areas near the Ipanema, Barra da Tijuca and Icaraí outfalls based on a computational methodology combining hydrodynamic, pollutant transport, and bacterial decay modelling. The results show that coliform concentrations are strongly influenced by solar radiation and by all factors that attenuate its levels in the marine environment. The aforementioned outfalls do not pollute the coastal areas, and unsuitable bathing conditions are restricted to the vicinity of the effluent discharge points. The pollution observed at the beaches indicates that the contamination comes from the polluted estuarine systems, rivers and canals that flow to the coast.
NASA Astrophysics Data System (ADS)
Verechagin, V.; Kris, R.; Schwarzband, I.; Milstein, A.; Cohen, B.; Shkalim, A.; Levy, S.; Price, D.; Bal, E.
2018-03-01
Over the years, mask and wafer defect dispositioning has become an increasingly challenging and time-consuming task. With design rules getting smaller, OPC getting more complex and scanner illumination taking on free-form shapes, the probability that a user can perform accurate and repeatable classification of defects detected by mask inspection tools into pass/fail bins is decreasing. The critical challenges of mask defect metrology for small nodes (< 30 nm) were reviewed in [1]. While Critical Dimension (CD) variation measurement is still the method of choice for determining a mask defect's future impact on the wafer, the high complexity of OPCs combined with high variability in pattern shapes poses a challenge for any automated CD variation measurement method. In this study, a novel approach for measurement generalization is presented. CD variation assessment performance is evaluated on multiple different complex-shape patterns and is benchmarked against an existing qualified measurement methodology.
Assessing qualitative long-term volcanic hazards at Lanzarote Island (Canary Islands)
NASA Astrophysics Data System (ADS)
Becerril, Laura; Martí, Joan; Bartolini, Stefania; Geyer, Adelina
2017-07-01
Conducting long-term hazard assessment in active volcanic areas is of primary importance for land-use planning and defining emergency plans able to be applied in case of a crisis. A definition of scenario hazard maps helps to mitigate the consequences of future eruptions by anticipating the events that may occur. Lanzarote is an active volcanic island that has hosted the largest (> 1.5 km3 DRE) and longest (6 years) eruption, the Timanfaya eruption (1730-1736), on the Canary Islands in historical times (last 600 years). This eruption brought severe economic losses and forced local people to migrate. In spite of all these facts, no comprehensive hazard assessment or hazard maps have been developed for the island. In this work, we present an integrated long-term volcanic hazard evaluation using a systematic methodology that includes spatial analysis and simulations of the most probable eruptive scenarios.
DENSITY: software for analysing capture-recapture data from passive detector arrays
Efford, M.G.; Dawson, D.K.; Robbins, C.S.
2004-01-01
A general computer-intensive method is described for fitting spatial detection functions to capture-recapture data from arrays of passive detectors such as live traps and mist nets. The method is used to estimate the population density of 10 species of breeding birds sampled by mist-netting in deciduous forest at Patuxent Research Refuge, Laurel, Maryland, U.S.A., from 1961 to 1972. Total density (9.9 ± 0.6 ha-1, mean ± SE) appeared to decline over time (slope -0.41 ± 0.15 ha-1 y-1). The mean precision of annual estimates for all 10 species pooled was acceptable (CV(D) = 14%). Spatial analysis of closed-population capture-recapture data highlighted deficiencies in non-spatial methodologies. For example, effective trapping area cannot be assumed constant when detection probability is variable. Simulation may be used to evaluate alternative designs for mist net arrays where density estimation is a study goal.
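For illustration, spatial capture-recapture methods of this kind typically fit a detection function such as the half-normal curve sketched below, where capture probability falls off with the distance between a detector and an animal's home-range centre; g0 and sigma here are made-up values, not estimates from the Patuxent data:

```python
import math

def half_normal_detection(distance_m, g0=0.3, sigma_m=60.0):
    """Per-occasion probability of detection at a given detector-to-home-range-centre
    distance under a half-normal detection function (a common choice in spatial
    capture-recapture; g0 and sigma are illustrative)."""
    return g0 * math.exp(-(distance_m ** 2) / (2.0 * sigma_m ** 2))

def p_detected_at_least_once(distance_m, n_occasions=10):
    """Probability the animal is caught at least once over repeated occasions."""
    p_miss = (1.0 - half_normal_detection(distance_m)) ** n_occasions
    return 1.0 - p_miss

for d in (0, 50, 100, 200):
    print(f"distance {d:>3} m: per-occasion p = {half_normal_detection(d):.3f}, "
          f"overall p = {p_detected_at_least_once(d):.3f}")
```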
Satellite solar power - Will it pay off
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.
1977-01-01
A cost analysis is presented for front-end investments required for the development of a satellite solar power system. The methodology uses risk analysis techniques to quantify the present state of knowledge relevant to the construction and operation of a satellite solar power station 20 years in the future. Results are used to evaluate the 'expected value' of a three-year research program providing additional information which will be used as a basis for a decision to either continue development of the concept at an increasing funding level or to terminate or drastically alter the program. The program is costed phase by phase, and a decision tree is constructed. The estimated probability of success for the research and studies phase is 0.540. The expected value of a program leading to the construction of 120 systems at a rate of four per year is 12.433 billion dollars.
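The decision-tree logic can be sketched as a probability-weighted sum of branch outcomes minus the phase cost; apart from the 0.540 research-phase success probability quoted above, every number below is a placeholder:

```python
def expected_value(branches, phase_cost):
    """Expected value of a decision node: sum of p * payoff over branches, less cost.
    Payoffs are in billions of dollars; all numbers other than the 0.540 success
    probability quoted in the abstract are purely illustrative."""
    return sum(p * payoff for p, payoff in branches) - phase_cost

# Research/studies phase: succeed with p = 0.540 (abstract), else terminate.
ev_if_success = 30.0    # hypothetical downstream expected value
ev_if_failure = -1.0    # hypothetical sunk development cost
phase_cost = 0.2        # hypothetical cost of the three-year research program

ev = expected_value([(0.540, ev_if_success), (0.460, ev_if_failure)], phase_cost)
print(f"expected value of continuing ~ ${ev:.2f} B")
```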
Probabilistic Evaluation of Blade Impact Damage
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Abumeri, G. H.
2003-01-01
The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are expressed in terms of cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at a 0.999 probability of structural failure and substantial damage tolerance at a 0.01 probability.
Impact of Life Events on the Relapse of Schizophrenic Patients
ERIC Educational Resources Information Center
Hussein, Hassan Ali; Jacoob, Shirooq; Sharour, Loai Abu
2016-01-01
Objectives: To investigate the relationship between stressful life events at the time of relapse in schizophrenic patients at psychiatric hospitals in Baghdad city. Methodology: A purposive (non-probability) sampling of 50 schizophrenic patients who have relapsed was involved in the present study. Data were collected through the use of the…
Factors Affecting Smoking Tendency and Smoking Intensity
ERIC Educational Resources Information Center
David, Nissim Ben; Zion, Uri Ben
2009-01-01
Purpose: The purpose of this paper is to measure the relative effect of relevant explanatory variable on smoking tendency and smoking intensity. Design/methodology/approach: Using survey data collected by the Israeli Bureau of Statistics in 2003-2004, a probit procedure is estimated for analyzing factors that affect the probability of being a…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-06
... methodological approaches to express the dose- response relationship between radiation exposure and CLL. In... promulgated in 2002. The first was that the epidemiological studies did not demonstrate radiation as the cause...'s recent review found the evidence of radiogenicity offered by epidemiology studies to be non...
ERIC Educational Resources Information Center
Belfiore, Phillip J.; Basile, Sarah Pulley; Lee, David L.
2008-01-01
One of the most problematic behaviors in children with developmental disabilities is noncompliance. Although behavioral research has provided strategies to impact noncompliance, oftentimes the methodologies are consequent techniques, which may not be conducive to implementation by the classroom teacher. In this teacher-designed and implemented…
NASA Technical Reports Server (NTRS)
1971-01-01
Developed methodologies and procedures for the reduction of microbial burden on an assembled spacecraft at the time of encapsulation or terminal sterilization are reported. This technology is required for reducing excessive microbial burden on spacecraft components for the purposes of either decreasing planetary contamination probabilities for an orbiter or minimizing the duration of a sterilization process for a lander.
Probabilistic inspection strategies for minimizing service failures
NASA Technical Reports Server (NTRS)
Brot, Abraham
1994-01-01
The INSIM computer program is described which simulates the 'limited fatigue life' environment in which aircraft structures generally operate. The use of INSIM to develop inspection strategies which aim to minimize service failures is demonstrated. Damage-tolerance methodology, inspection thresholds and customized inspections are simulated using the probability of failure as the driving parameter.
Inclusion of Radiation Environment Variability in Total Dose Hardness Assurance Methodology
NASA Technical Reports Server (NTRS)
Xapsos, M. A.; Stauffer, C.; Phan, A.; McClure, S. S.; Ladbury, R. L.; Pellish, J. A.; Campola, M. J.; LaBel, K. A.
2015-01-01
Variability of the space radiation environment is investigated with regard to parts categorization for total dose hardness assurance methods. It is shown that it can have a significant impact. A modified approach is developed that uses current environment models more consistently and replaces the design margin concept with one of failure probability.
USDA-ARS?s Scientific Manuscript database
In recent years, there have been a number of Listeria monocytogenes recalls involving fresh-cut apples, probably contaminated during treatments with antibrowning solutions. In the present study, we used response surface methodology to develop and optimize formulations for reducing L. monocytogenes ...
Simoens, Steven
2013-01-01
Objectives This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474
NASA Astrophysics Data System (ADS)
Kurceren, Ragip; Modestino, James W.
1998-12-01
The use of forward error-control (FEC) coding, possibly in conjunction with ARQ techniques, has emerged as a promising approach for video transport over ATM networks for cell-loss recovery and/or bit error correction, such as might be required for wireless links. Although FEC provides cell-loss recovery capabilities, it also introduces transmission overhead which can itself cause additional cell losses. A methodology is described to maximize the number of video sources multiplexed at a given quality of service (QoS), measured in terms of decoded cell loss probability, using interlaced FEC codes. The transport channel is modelled as a block interference channel (BIC) and the multiplexer as a single-server, deterministic-service, finite buffer supporting N users. Based upon an information-theoretic characterization of the BIC and large deviation bounds on the buffer overflow probability, the described methodology provides theoretically achievable upper limits on the number of sources multiplexed. Performance of specific coding techniques using interlaced nonbinary Reed-Solomon (RS) codes and binary rate-compatible punctured convolutional (RCPC) codes is illustrated.
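For intuition on the overhead-versus-recovery trade-off, the sketch below computes the post-decoding cell-loss probability of an (n, k) Reed-Solomon erasure code applied across cells, assuming independent cell losses; this is a simplification, not the paper's block-interference-channel and multiplexer analysis:

```python
from math import comb

def residual_cell_loss(p_loss, n, k):
    """Post-decoding cell-loss probability for an (n, k) Reed-Solomon erasure code
    applied across cells, assuming independent cell losses: a cell stays lost only
    when its block contains more than n - k losses."""
    t = n - k  # correctable erasures per block
    return sum((j / n) * comb(n, j) * p_loss**j * (1 - p_loss)**(n - j)
               for j in range(t + 1, n + 1))

# Illustrative numbers: raw cell-loss ratio 1e-3, RS(48, 44) across cells.
print(f"decoded cell-loss probability ~ {residual_cell_loss(1e-3, 48, 44):.2e}")
```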
Khakzad, Nima; Khan, Faisal; Amyotte, Paul
2015-07-01
Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents has not been so well-established, partly due to the complexity of such accidents and partly due to low probabilities involved. The issue of low probabilities normally arises from the scarcity of major accidents' relevant data since such accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Teoh, Lay Eng; Khoo, Hooi Ling
2013-09-01
This study deals with two major aspects of airlines, i.e. supply and demand management. The aspect of supply focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airline, while the aspect of demand focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology is outlined in two stages: the Fuzzy Analytic Hierarchy Process is first adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand during the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling could affect the operational profit and fleet management decision of the airlines to varying degrees.
Macro-economic assessment of flood risk in Italy under current and future climate
NASA Astrophysics Data System (ADS)
Carrera, Lorenzo; Koks, Elco; Mysiak, Jaroslav; Aerts, Jeroen; Standardi, Gabriele
2014-05-01
This paper explores an integrated methodology for assessing direct and indirect costs of fluvial flooding to estimate current and future fluvial flood risk in Italy. Our methodology combines a Geographic Information System spatial approach, with a general economic equilibrium approach using a downscaled modified version of a Computable General Equilibrium model at NUTS2 scale. Given the level of uncertainty in the behavior of disaster-affected economies, the simulation considers a wide range of business recovery periods. We calculate expected annual losses for each NUTS2 region, and exceedance probability curves to determine probable maximum losses. Given a certain acceptable level of risk, we describe the conditions of flood protection and business recovery periods under which losses are contained within this limit. Because of the difference between direct costs, which are an overestimation of stock losses, and indirect costs, which represent the macro-economic effects, our results have different policy meanings. While the former is relevant for post-disaster recovery, the latter is more relevant for public policy issues, particularly for cost-benefit analysis and resilience assessment.
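A minimal sketch of the risk metrics mentioned above: the expected annual loss is the area under the loss-exceedance curve, here integrated with the trapezoidal rule over a few return periods; the loss values are illustrative, not the paper's results:

```python
def expected_annual_loss(return_periods_yr, losses):
    """Expected annual loss = area under the exceedance-probability vs loss curve,
    approximated with the trapezoidal rule (illustrative loss values)."""
    probs = [1.0 / rp for rp in return_periods_yr]   # exceedance probabilities
    pairs = sorted(zip(probs, losses))               # ascending probability
    eal = 0.0
    for (p0, l0), (p1, l1) in zip(pairs, pairs[1:]):
        eal += 0.5 * (l0 + l1) * (p1 - p0)
    return eal

# Hypothetical losses (million EUR) for 10-, 50-, 100- and 500-year floods.
print(f"EAL ~ {expected_annual_loss([10, 50, 100, 500], [120, 480, 900, 2400]):.1f} M EUR/yr")
```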
[Adult mortality differentials in Argentina].
Rofman, R
1994-06-01
Adult mortality differentials in Argentina are estimated and analyzed using data from the National Social Security Administration. The study of adult mortality has attracted little attention in developing countries because of the scarcity of reliable statistics and the greater importance assigned to demographic phenomena traditionally associated with development, such as infant mortality and fertility. A sample of 39,421 records of retired persons surviving as of June 30, 1988, was analyzed by age, sex, region of residence, relative amount of pension, and social security fund of membership prior to the consolidation of the system in 1967. The thirteen former funds were grouped into the five categories of government, commerce, industry, self-employed, and other, which were assumed to be proxies for the activity sector in which the individual spent his active life. The sample is not representative of the Argentine population, since it excludes the lowest and highest socioeconomic strata and overrepresents men and urban residents. It is, however, believed to be adequate for explaining mortality differentials for most of the population covered by the social security system. The study methodology was based on the technique of logistic analysis and on the use of regional model life tables developed by Coale and others. To evaluate the effect of the study variables on the probability of dying, a maximum-likelihood regression model was estimated. The model relates the logit of the probability of death between ages 65 and 95 to the available explanatory variables, including their possible interactions. Life tables were constructed by sex, region of residence, previous pension fund, and income. As a test of external consistency, a model including only age and sex as explanatory variables was constructed using the methodology. The results confirmed consistency between the estimated values and other published estimates. A significant conclusion of the study was that social security data are a satisfactory source for study of adult mortality, a finding of importance in cases where vital statistics systems are deficient. Mortality differentials by income level and activity sector were significant, representing up to 11.5 years in life expectancy at age 20 and 4.4 years at age 65. Mortality differentials by region were minor, probably due to the nature of the sample. The lowest observed mortality levels were in own-account workers, independent professionals, and small businessmen.
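The logit model described can be sketched as a logistic function of a linear combination of covariates such as sex, income group and activity sector; the coefficients below are placeholders, not the study's estimates:

```python
import math

# Placeholder coefficients (not the study's estimates).
COEF = {"intercept": 0.8, "male": 0.35, "low_income": 0.25, "self_employed": 0.10}

def prob_death_65_to_95(male, low_income, self_employed):
    """Logit model: log-odds of dying between ages 65 and 95 as a linear function
    of the covariates, mapped back to a probability."""
    logit = (COEF["intercept"] + COEF["male"] * male +
             COEF["low_income"] * low_income + COEF["self_employed"] * self_employed)
    return 1.0 / (1.0 + math.exp(-logit))

print(f"high-income female employee : {prob_death_65_to_95(0, 0, 0):.2f}")
print(f"low-income self-employed man: {prob_death_65_to_95(1, 1, 1):.2f}")
```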
Predicting Vision-Related Disability in Glaucoma.
Abe, Ricardo Y; Diniz-Filho, Alberto; Costa, Vital P; Wu, Zhichao; Medeiros, Felipe A
2018-01-01
To present a new methodology for investigating predictive factors associated with development of vision-related disability in glaucoma. Prospective, observational cohort study. Two hundred thirty-six patients with glaucoma followed up for an average of 4.3±1.5 years. Vision-related disability was assessed by the 25-item National Eye Institute Visual Function Questionnaire (NEI VFQ-25) at baseline and at the end of follow-up. A latent transition analysis model was used to categorize NEI VFQ-25 results and to estimate the probability of developing vision-related disability during follow-up. Patients were tested with standard automated perimetry (SAP) at 6-month intervals, and evaluation of rates of visual field change was performed using mean sensitivity (MS) of the integrated binocular visual field. Baseline disease severity, rate of visual field loss, and duration of follow-up were investigated as predictive factors for development of disability during follow-up. The relationship between baseline and rates of visual field deterioration and the probability of vision-related disability developing during follow-up. At baseline, 67 of 236 (28%) glaucoma patients were classified as disabled based on NEI VFQ-25 results, whereas 169 (72%) were classified as nondisabled. Patients classified as nondisabled at baseline had 14.2% probability of disability developing during follow-up. Rates of visual field loss as estimated by integrated binocular MS were almost 4 times faster for those in whom disability developed versus those in whom it did not (-0.78±1.00 dB/year vs. -0.20±0.47 dB/year, respectively; P < 0.001). In the multivariate model, each 1-dB lower baseline binocular MS was associated with 34% higher odds of disability developing over time (odds ratio [OR], 1.34; 95% confidence interval [CI], 1.06-1.70; P = 0.013). In addition, each 0.5-dB/year faster rate of loss of binocular MS during follow-up was associated with a more than 3.5 times increase in the risk of disability developing (OR, 3.58; 95% CI, 1.56-8.23; P = 0.003). A new methodology for classification and analysis of change in patient-reported quality-of-life outcomes allowed construction of models for predicting vision-related disability in glaucoma. Copyright © 2017 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Five Steps for Improving Evaluation Reports by Using Different Data Analysis Methods.
ERIC Educational Resources Information Center
Thompson, Bruce
Although methodological integrity is not the sole determinant of the value of a program evaluation, decision-makers do have a right, at a minimum, to be able to expect competent work from evaluators. This paper explores five areas where evaluators might improve methodological practices. First, evaluation reports should reflect the limited…
PMP Estimations at Sparsely Controlled Andinian Basins and Climate Change Projections
NASA Astrophysics Data System (ADS)
Lagos Zúñiga, M. A.; Vargas, X.
2012-12-01
Probable Maximum Precipitation (PMP) estimation implies an extensive review of hydrometeorological data and an understanding of precipitation formation processes. Different methodological approaches exist for its estimation, and all of them require a good spatial and temporal representation of storms. Estimating hydrometeorological PMP in sparsely controlled basins is a difficult task, especially if the study area has an important orographic effect due to mountains and mixed precipitation occurs during the most severe storms. The main task of this study is to propose and estimate PMP in a sparsely controlled basin with abrupt topography and mixed hydrology, also analyzing the statistical uncertainties of the estimates and possible climate change effects on them. In this study the PMP estimation under statistical and hydrometeorological approaches (watershed-based and traditional depth-area-duration analysis) was done in a semi-arid zone at Puclaro dam in northern Chile. Due to the lack of good spatial meteorological representation in the study zone, we propose a methodology to account for the orographic effects of the Andes using precipitation patterns based on the RCM PRECIS-DGF and annual isohyetal maps. Estimations were validated against precipitation patterns for given winters, considering snow course and rain gauge data along the prevailing wind direction, with good results. The estimations are also compared with the largest areal storms in the USA, Australia, India and China and with frequency analyses at local rain gauge stations in order to decide on the most appropriate approach for the study zone. Climate change projections were evaluated with the ECHAM5 GCM, owing to its good representation of the seasonality and magnitude of the relevant meteorological variables. Temperature projections for the 2040-2065 period show that there would be a rise in the catchment contributing area that would lead to an increase of the average liquid precipitation over the basin. Temperature projections would also affect the maximization factors in the calculation of the PMP, increasing it up to 126.6% and 62.5% in scenarios A2 and B1, respectively. These projections are important to study because of the implications of the PMP for the hydrologic design of major hydraulic works through the Probable Maximum Flood (PMF). We propose that the methodology presented in this study could also be used in other basins of similar characteristics.
Therapy of a couple with a bipolar spouse.
Witusik, Andrzej; Pietras, Tadeusz
2017-10-23
Qualitative analysis of therapy of a couple in which one partner has bipolar disorder is an important research paradigm in contemporary psychotherapy of mental disorders. The qualitative method is important both from the cognitive point of view and for the evaluation of therapeutic efficacy in the individual, idiographic aspect. The aim of the study is a qualitative analysis of the therapeutic process of a couple in which one partner suffers from bipolar affective disorder. The study of the couple therapy process utilized qualitative research methodology drawing on various psychotherapeutic paradigms, indicating the interrelationships that exist between relapses of the disease and the functioning of the couple. The importance of triangulation processes, the inheritance of transgenerational myths and dysfunctional cognitive patterns in the functional destabilization of a couple with one partner suffering from bipolar affective disorder was indicated. The dysfunctionality of the discussed couple is largely due to the effects of bipolar disorder and related disturbances on marital functioning. The spectrum of autism in the child is probably related both to a genetic predisposition to psychiatric disorders and to the dysfunctionality of the parental dyad. The presence of bipolar affective disorder in the partner's family is also a genetic burden. The wife's aggression probably represents a syndrome of adaptation to disease in the family. Aggression plays a morphostatic role in the couple's integrity. In both families of origin of the spouses, the transgenerational myth placed the woman in the position of a strong and family-oriented person.
NASA Technical Reports Server (NTRS)
Polotzky, Anthony S.; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek
1990-01-01
The development of a controller performance evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems is described. The equations used to obtain the open-loop plant, controller transfer matrices, and return-difference matrices are given. Results of applying the CPE methodology to evaluate MIMO digital flutter suppression systems being tested on an active flexible wing wind-tunnel model are presented to demonstrate the CPE capability.
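As a rough illustration of the quantities a CPE evaluation manipulates, the sketch below forms the return-difference matrix I + G(jω)K(jω) for an assumed toy plant and controller and reports its minimum singular value, a common multivariable stability-margin indicator. The plant, controller and frequency grid are invented; this is not the paper's code.

```python
import numpy as np

def return_difference_min_sv(G, K, omegas):
    """Minimum singular value of the return-difference matrix I + G(jw)K(jw)
    over a set of frequencies; small values flag poor multivariable margins.

    G, K : callables mapping a frequency w [rad/s] to an (n x n) complex matrix.
    """
    n = G(omegas[0]).shape[0]
    sv_min = []
    for w in omegas:
        L = G(w) @ K(w)                                 # open-loop transfer matrix
        sigma = np.linalg.svd(np.eye(n) + L, compute_uv=False)
        sv_min.append(sigma.min())
    return np.array(sv_min)

# Toy 2x2 example: first-order plant channels and a static diagonal controller
G = lambda w: np.array([[1.0 / (1j * w + 1.0), 0.2 / (1j * w + 2.0)],
                        [0.1 / (1j * w + 3.0), 1.0 / (1j * w + 1.5)]])
K = lambda w: np.diag([2.0, 1.5]).astype(complex)

omegas = np.logspace(-1, 2, 50)
print("min sigma(I + GK) over the band:", return_difference_min_sv(G, K, omegas).min())
```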
Measuring political polarization: Twitter shows the two sides of Venezuela
NASA Astrophysics Data System (ADS)
Morales, A. J.; Borondo, J.; Losada, J. C.; Benito, R. M.
2015-03-01
We say that a population is perfectly polarized when it is divided into two groups of the same size holding opposite opinions. In this paper, we propose a methodology to study and measure the emergence of polarization from social interactions. We begin by proposing a model to estimate opinions in which a minority of influential individuals propagate their opinions through a social network. The result of the model is an opinion probability density function. Next, we propose an index to quantify the extent to which the resulting distribution is polarized. Finally, we apply the proposed methodology to a Twitter conversation about the late Venezuelan president, Hugo Chávez, finding good agreement between our results and offline data. Hence, we show that our methodology can detect different degrees of polarization, depending on the structure of the network.
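A minimal sketch of the second step, quantifying polarization from an opinion distribution, is given below. The index used here is our own illustrative formulation (balance of the two opinion camps times the separation of their centres of mass) and is not necessarily the exact index proposed in the paper.

```python
import numpy as np

def polarization_index(opinions):
    """Illustrative polarization index on a [-1, 1] opinion axis: the product of
    (i) how balanced the positive/negative camps are and (ii) how far apart
    their centres of mass lie. Roughly 1 means two equal camps at the extremes."""
    x = np.asarray(opinions, dtype=float)
    pos, neg = x[x > 0], x[x < 0]
    if len(pos) == 0 or len(neg) == 0:
        return 0.0
    balance = 1.0 - abs(len(pos) - len(neg)) / len(x)   # 1 = equal-sized camps
    distance = (pos.mean() - neg.mean()) / 2.0          # 1 = camps at +/- 1
    return balance * distance

rng = np.random.default_rng(0)
bipolar = np.concatenate([rng.normal(0.8, 0.1, 500), rng.normal(-0.8, 0.1, 500)])
centrist = rng.normal(0.0, 0.3, 1000)
print("polarized population:", round(polarization_index(bipolar), 3))
print("centrist population :", round(polarization_index(centrist), 3))
```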
A Methodology for Sustainability Evaluation and Reporting in Higher Education Institutions
ERIC Educational Resources Information Center
Madeira, Ana C.; Carravilla, Maria Antonia; Oliveira, Jose F.; Costa, Carlos A. V.
2011-01-01
The purpose of this paper is to present a methodology that allows higher education institutions (HEIs) to promote, to evaluate and to report on sustainability. The ultimate goal of the afore-mentioned methodology is to help HEIs achieve sustainability. First, a model entitled Sustainability in Higher Education Institutions (SusHEI) that generally…
Probabilistic Design of a Mars Sample Return Earth Entry Vehicle Thermal Protection System
NASA Technical Reports Server (NTRS)
Dec, John A.; Mitcheltree, Robert A.
2002-01-01
The driving requirement for the design of a Mars Sample Return mission is to assure containment of the returned samples. Designing to, and demonstrating compliance with, such a requirement calls for physics-based tools that establish the relationship between engineering sizing margins and probabilities of failure. The traditional method of determining margins on ablative thermal protection systems, while conservative, provides little insight into the actual probability of an over-temperature during flight. The objective of this paper is to describe a new methodology for establishing margins when sizing the thermal protection system (TPS). Results of this Monte Carlo approach are compared with traditional methods.
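The flavour of such a Monte Carlo margin assessment can be sketched as follows. The input distributions, the bondline-temperature response model and the allowable limit are all invented for illustration and do not reflect the actual TPS sizing tools.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical uncertain inputs (distributions and numbers are illustrative only)
heat_load = rng.normal(25.0, 3.0, N)                        # integrated heat load [kJ/cm^2]
char_rate = rng.lognormal(mean=0.0, sigma=0.15, size=N)     # ablator recession multiplier
thickness = rng.normal(2.5, 0.05, N)                        # as-built TPS thickness [cm]

# Hypothetical response model: bondline temperature rises with heat load and
# recession rate, and falls with remaining material thickness.
bondline_temp = 150.0 + 8.0 * heat_load * char_rate / thickness   # [deg C]

limit = 250.0                                               # assumed allowable bondline temperature
p_fail = np.mean(bondline_temp > limit)
print(f"Estimated probability of over-temperature: {p_fail:.4f}")
```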
Using QALYs in telehealth evaluations: a systematic review of methodology and transparency.
Bergmo, Trine S
2014-08-03
The quality-adjusted life-year (QALY) is a recognised outcome measure in health economic evaluations. The QALY incorporates individual preferences and identifies health gains by combining mortality and morbidity into a single index number. A literature review was conducted to examine and discuss the use of QALYs to measure outcomes in telehealth evaluations. Evaluations were identified via a literature search in all relevant databases. Only economic evaluations measuring both costs and QALYs, using primary patient-level data and comparing two or more alternatives, were included. A total of 17 economic evaluations estimating QALYs were identified. All evaluations used validated generic health-related quality of life (HRQoL) instruments to describe health states, and accepted methods for transforming the quality scores into utility values. The methodology varied between the evaluations: four different preference measures were used (EQ-5D, SF-6D, QWB and HUI3), and utility scores were elicited from the general population. Most studies reported the methodology used to calculate QALYs, but the evaluations were less transparent in reporting utility weights at different time points and the variability around utilities and QALYs, and few adjusted for differences in baseline utilities. The QALYs gained in the reviewed evaluations varied from 0.001 to 0.118, implying a small but positive effect of the telehealth interventions on patients' health. The evaluations reported mixed cost-effectiveness results. The use of QALYs in telehealth evaluations has increased over the last few years, and different methodologies and utility measures have been used to calculate them. A more harmonised methodology and choice of utility measure are needed to ensure comparability across telehealth evaluations.
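For readers unfamiliar with the mechanics, a QALY is essentially the area under a utility-versus-time curve. The sketch below shows this calculation with made-up trial arms and a naive baseline adjustment, purely as an illustration of the outcome measure discussed in the review.

```python
import numpy as np

def qalys(utilities, times_years):
    """QALYs as the area under a utility-over-time curve (trapezoidal rule).
    utilities   : health-state utility at each assessment (0 = dead, 1 = full health)
    times_years : assessment times in years from baseline
    """
    u = np.asarray(utilities, dtype=float)
    t = np.asarray(times_years, dtype=float)
    return float(np.sum(0.5 * (u[1:] + u[:-1]) * np.diff(t)))

# Illustrative (invented) trial arms assessed at baseline, 6 and 12 months
t = np.array([0.0, 0.5, 1.0])
telehealth = np.array([0.70, 0.78, 0.80])
usual_care = np.array([0.72, 0.73, 0.74])

gain = qalys(telehealth, t) - qalys(usual_care, t)
# Naive baseline adjustment: subtract the baseline utility difference over follow-up
gain_adj = gain - (telehealth[0] - usual_care[0]) * (t[-1] - t[0])
print(f"Unadjusted QALY gain: {gain:.3f}, baseline-adjusted: {gain_adj:.3f}")
```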
Financial options methodology for analyzing investments in new technology
NASA Technical Reports Server (NTRS)
Wenning, B. D.
1995-01-01
The evaluation of investments in longer-term research and development in emerging technologies must, by the nature of such undertakings, address inherent uncertainties; most notably, future cash flow forecasts carry substantial uncertainty. Conventional present-value methodology, when applied to emerging technologies, severely penalizes such cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Options evaluation methodology adapted from the financial arena has therefore been introduced as applicable to such technology evaluations. Indeed, the characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
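A common way to operationalize this idea is to value the opportunity to defer an investment with a Black-Scholes-type formula, treating the discounted project cash flows as the underlying asset and the investment cost as the strike. The sketch below is such a textbook real-options calculation with invented numbers, not the report's own model.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def real_option_value(pv_cash_flows, investment_cost, years_to_decide,
                      volatility, risk_free_rate):
    """Black-Scholes value of the option to defer an investment (standard
    real-options analogy): project cash-flow PV plays the 'stock', the
    investment cost plays the 'strike'."""
    s, k, t = pv_cash_flows, investment_cost, years_to_decide
    d1 = (log(s / k) + (risk_free_rate + 0.5 * volatility**2) * t) / (volatility * sqrt(t))
    d2 = d1 - volatility * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-risk_free_rate * t) * norm_cdf(d2)

# Illustrative numbers (all assumptions): plain NPV would reject this project
# (100 - 110 < 0), yet the option to wait three years still carries value.
print(real_option_value(pv_cash_flows=100.0, investment_cost=110.0,
                        years_to_decide=3.0, volatility=0.45, risk_free_rate=0.04))
```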
NASA Astrophysics Data System (ADS)
Sabarish, R. Mani; Narasimhan, R.; Chandhru, A. R.; Suribabu, C. R.; Sudharsan, J.; Nithiyanantham, S.
2017-05-01
In the design of irrigation and other hydraulic structures, evaluating the magnitude of extreme rainfall for a specific probability of occurrence is of great importance; the capacity of such structures is usually designed to cater for the probability of occurrence of extreme rainfall during their lifetime. In this study, an extreme-value analysis of rainfall for Tiruchirapalli City in Tamil Nadu was carried out using 100 years of rainfall data and statistical methods. The best-fit probability distribution was evaluated for 1, 2, 3, 4 and 5 days of continuous maximum rainfall, and the goodness of fit was evaluated using the Chi-square test. The results of the goodness-of-fit tests indicate that the log-Pearson type III distribution is the overall best fit for the 1-day maximum rainfall and the consecutive 2-, 3-, 4-, 5- and 6-day maximum rainfall series of Tiruchirapalli. As a reliability check, the forecast maximum rainfalls for the selected return periods were compared with the results of the plotting-position method.
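A hedged sketch of the core fitting step, estimating return-period rainfall from a log-Pearson type III fit to an annual-maximum series, is shown below. The synthetic data and fitting details are illustrative and may differ from the paper's procedure.

```python
import numpy as np
from scipy import stats

def log_pearson3_quantile(annual_max, return_period_years):
    """Fit a log-Pearson type III distribution to an annual-maximum rainfall
    series and return the depth for a given return period (illustrative sketch)."""
    log_x = np.log10(np.asarray(annual_max, dtype=float))
    skew, loc, scale = stats.pearson3.fit(log_x)
    p_non_exceed = 1.0 - 1.0 / return_period_years
    return 10 ** stats.pearson3.ppf(p_non_exceed, skew, loc=loc, scale=scale)

# Synthetic 100-year annual 1-day maximum rainfall series (invented numbers)
rng = np.random.default_rng(1)
series = rng.gumbel(loc=80.0, scale=25.0, size=100)

for T in (10, 50, 100):
    print(f"{T}-year 1-day maximum rainfall: {log_pearson3_quantile(series, T):.1f} mm")
```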
NASA Astrophysics Data System (ADS)
Kergadallan, Xavier; Bernardara, Pietro; Benoit, Michel; Andreewsky, Marc; Weiss, Jérôme
2013-04-01
Estimating the probability of occurrence of extreme sea levels is a central issue for coastal protection. Return periods of sea level including the wave set-up contribution are estimated here at one site: Cherbourg, France, in the English Channel. The methodology follows two steps: first, computation of the joint probability of simultaneous wave height and still-water level; second, interpretation of that joint probability to assess the sea level associated with a given return period. Two approaches were evaluated to compute the joint probability of simultaneous wave height and still-water level: multivariate extreme-value distributions of the logistic type, in which all components of the variables become large simultaneously, and the conditional approach for multivariate extremes, in which only one component has to be large. Two methods were applied to estimate the sea level with wave set-up contribution for a given return period: Monte Carlo simulation, which is more accurate but requires more computation time, and classical ocean-engineering design contours of the inverse-FORM type, which are simpler and allow a more elaborate estimation of the wave set-up part (for example, wave propagation to the coast). We compare the results of the two approaches combined with the two methods. To be able to use both the Monte Carlo simulation and the design-contour method, the wave set-up is estimated with a simple empirical formula. We show the advantages of the conditional approach over the multivariate extreme-value approach when extreme sea levels occur when either the surge or the wave height is large. We discuss the validity of the design-contour method, which is an alternative when the computation of sea levels is too complex for Monte Carlo simulation.
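The Monte Carlo route can be sketched as follows: sample still-water level and wave height from an assumed joint model, add a simple empirical set-up term, and read the return level off the empirical distribution. The copula, marginals, event rate and the 0.2·Hs set-up rule are all illustrative assumptions, not the study's calibrated model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
N = 200_000            # simulated independent storm events
events_per_year = 20   # assumed mean number of events per year (illustrative)

# Hypothetical joint model: Gaussian copula linking still-water level and
# significant wave height; marginals and correlation are illustrative only.
rho = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=N)
u = stats.norm.cdf(z)
still_level = stats.gumbel_r.ppf(u[:, 0], loc=4.0, scale=0.25)   # [m]
wave_height = stats.gumbel_r.ppf(u[:, 1], loc=2.0, scale=0.8)    # Hs [m]

# Simple empirical wave set-up rule of thumb (not the paper's formula)
total_level = still_level + 0.2 * wave_height

# Empirical return level: a T-year event has a per-event exceedance
# probability of 1 / (T * events_per_year).
T = 100
q = 1.0 - 1.0 / (T * events_per_year)
print(f"{T}-year sea level (still water + set-up): {np.quantile(total_level, q):.2f} m")
```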
Bimodal fuzzy analytic hierarchy process (BFAHP) for coronary heart disease risk assessment.
Sabahi, Farnaz
2018-04-04
Rooted deeply in medical multiple-criteria decision-making (MCDM), risk assessment is very important, especially when applied to the risk of being affected by deadly diseases such as coronary heart disease (CHD). CHD risk assessment is a stochastic, uncertain, and highly dynamic process influenced by various known and unknown variables. In recent years, there has been great interest in the fuzzy analytic hierarchy process (FAHP), a popular methodology for dealing with uncertainty in MCDM. This paper proposes a new FAHP, the bimodal fuzzy analytic hierarchy process (BFAHP), which augments fuzzy numbers with two aspects of knowledge, probability and validity, to better deal with uncertainty. In BFAHP, fuzzy validity is computed by aggregating the validities of relevant risk factors based on expert knowledge and collective intelligence. By considering both soft and statistical data, the fuzzy probability of risk factors is computed using the Bayesian formulation. These fuzzy validities and fuzzy probabilities are used to construct a reciprocal comparison matrix and are then aggregated in a pairwise manner for each risk factor and each alternative. BFAHP classifies subjects as affected or not affected by ranking high and low risks. For evaluation, the proposed approach is applied to the risk of being affected by CHD using a real dataset of 152 patients from Iranian hospitals. Simulation results confirm that adding validity in a fuzzy manner increases confidence in the results and is clinically useful, especially in the face of incomplete information, when the predictions are compared with actual outcomes. Applied to CHD risk assessment on this dataset, the proposed BFAHP yields an accuracy above 85% for correct prediction. In addition, this paper finds that diastolic blood pressure in men and high-density lipoprotein in women are more important risk factors for CHD than the others. Copyright © 2018 Elsevier Inc. All rights reserved.
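The classical AHP step underlying this kind of approach, turning pairwise comparisons into priority weights via the principal eigenvector, can be sketched as follows. The scores and factor names are invented, and the fuzzification of matrix entries with probability and validity that defines BFAHP is not reproduced here.

```python
import numpy as np

def ahp_weights(scores):
    """Build a reciprocal pairwise-comparison matrix from per-factor scores and
    return priority weights via the principal eigenvector (classical AHP step)."""
    s = np.asarray(scores, dtype=float)
    A = s[:, None] / s[None, :]          # A[i, j] = s_i / s_j, so A[j, i] = 1 / A[i, j]
    eigvals, eigvecs = np.linalg.eig(A)
    w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    return w / w.sum()

# Illustrative risk-factor scores (e.g., aggregated probability x validity); invented.
factors = ["diastolic_bp", "hdl", "smoking", "age"]
scores = [0.9, 0.8, 0.6, 0.5]
for f, w in zip(factors, ahp_weights(scores)):
    print(f"{f:14s} weight = {w:.3f}")
```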
The threshold algorithm: Description of the methodology and new developments
NASA Astrophysics Data System (ADS)
Neelamraju, Sridhar; Oligschleger, Christina; Schön, J. Christian
2017-10-01
Understanding the dynamics of complex systems requires the investigation of their energy landscape. In particular, the flow of probability on such landscapes is a central feature in visualizing the time evolution of complex systems. To obtain such flows, and the concomitant stable states of the systems and the generalized barriers among them, the threshold algorithm has been developed. Here, we describe the methodology of this approach starting from the fundamental concepts in complex energy landscapes and present recent new developments, the threshold-minimization algorithm and the molecular dynamics threshold algorithm. For applications of these new algorithms, we draw on landscape studies of three disaccharide molecules: lactose, maltose, and sucrose.
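A bare-bones illustration of the basic threshold idea, accepting any move whose energy stays below a prescribed lid and watching which minima become mutually accessible as the lid is raised, is given below on a toy double-well landscape. It omits the quench, sampling and analysis machinery of the actual algorithm.

```python
import numpy as np

def threshold_run(energy, x0, lid, n_steps=20_000, step=0.1, seed=0):
    """Toy threshold run on a 1-D landscape: random walk that accepts a move
    only if its energy stays below the threshold ('lid'); records the lowest
    configuration reached and a crude label of the basins visited."""
    rng = np.random.default_rng(seed)
    x, basins = x0, set()
    best_x, best_e = x0, energy(x0)
    for _ in range(n_steps):
        x_new = x + rng.normal(0.0, step)
        if energy(x_new) <= lid:          # the only acceptance criterion
            x = x_new
            e = energy(x)
            if e < best_e:
                best_x, best_e = x, e
            basins.add(round(x))          # crude basin label for this toy landscape
    return best_x, best_e, basins

# Double-well landscape with minima near x = -1 and x = +1 and a barrier of 1 at x = 0
V = lambda x: (x**2 - 1.0)**2

for lid in (0.5, 1.5):
    bx, be, basins = threshold_run(V, x0=-1.0, lid=lid)
    print(f"lid={lid}: lowest energy {be:.3f} at x={bx:+.2f}, basins visited: {sorted(basins)}")
```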
Target-motion prediction for robotic search and rescue in wilderness environments.
Macwan, Ashish; Nejat, Goldie; Benhabib, Beno
2011-10-01
This paper presents a novel modular methodology for predicting a lost person's (motion) behavior for autonomous coordinated multirobot wilderness search and rescue. The new concept of isoprobability curves is introduced and developed, which represents a unique mechanism for identifying the target's probable location at any given time within the search area while accounting for influences such as terrain topology, target physiology and psychology, clues found, etc. The isoprobability curves are propagated over time and space. The significant tangible benefit of the proposed target-motion prediction methodology is demonstrated through a comparison to a nonprobabilistic approach, as well as through a simulated realistic wilderness search scenario.
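A deliberately crude stand-in for the idea of isoprobability curves is sketched below: with an assumed travel-speed distribution and no terrain, psychology or clue information, each probability level maps to a circular contour around the last known position. The real methodology accounts for all of those influences and will not produce simple circles; the speed distribution and its parameters here are invented.

```python
import numpy as np

def isoprobability_radii(hours_elapsed, probs=(0.25, 0.50, 0.75, 0.95),
                         mean_speed_kmh=1.5, sigma=0.6, seed=0, n=100_000):
    """Map probability levels to circular contour radii around the last known
    position, assuming a log-normal travel-speed distribution and unknown
    direction of travel (toy model only)."""
    rng = np.random.default_rng(seed)
    speeds = rng.lognormal(mean=np.log(mean_speed_kmh), sigma=sigma, size=n)
    radii = np.quantile(speeds, probs) * hours_elapsed
    return dict(zip(probs, radii))

for p, r in isoprobability_radii(hours_elapsed=6.0).items():
    print(f"P={p:.2f} contour radius after 6 h: {r:.1f} km")
```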